The proprietary geocoding system What3words, which assigns a three-word address to every 3-metre square on the planet, has been getting some grief lately. It’s always been somewhat controversial because it’s a closed system, and because of the steps What3words has taken to protect its proprietary database and algorithms: it has issued takedown notices against the compatible, open-source WhatFreeWords (details here), and went so far as to threaten a security researcher over his tweets about it last April. Which, you know, got noticed.
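What3words’s actual algorithm is proprietary and unpublished, but the general idea behind any three-word addressing scheme can be sketched as a mixed-radix encoding: number every grid square, then express that number in base *n*, where *n* is the size of the wordlist. (With a wordlist of roughly 40,000 words, 40,000³ ≈ 6.4 × 10¹³ combinations, enough to cover the Earth’s surface in 3-metre squares.) The tiny wordlist below is purely illustrative:

```python
# Toy sketch of the idea behind three-word geocoding. This is NOT
# What3words's actual (proprietary) algorithm, just a hypothetical
# illustration of how a wordlist can address a numbered grid.

WORDS = ["apple", "brick", "cloud", "drift"]  # real systems use tens of thousands

def cell_to_words(cell_index, words=WORDS):
    """Encode a grid-cell index as three words (mixed-radix, base len(words))."""
    n = len(words)
    assert 0 <= cell_index < n ** 3, "index out of range for a three-word code"
    a, rem = divmod(cell_index, n * n)
    b, c = divmod(rem, n)
    return (words[a], words[b], words[c])

def words_to_cell(triple, words=WORDS):
    """Invert the encoding: three words back to a cell index."""
    n = len(words)
    idx = {w: i for i, w in enumerate(words)}
    a, b, c = (idx[w] for w in triple)
    return (a * n + b) * n + c

code = cell_to_words(27)
print(code)                 # ('brick', 'cloud', 'drift')
print(words_to_cell(code))  # 27
```

Note that nothing here ties nearby cells to dissimilar words, which is precisely where the confusability criticism discussed below comes in.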
It’s also been the subject of several parodies, including what3emojis (emojis instead of words), Four King Maps (four swear words; coverage is limited to the UK and Ireland, which frankly disappoints me) and Maps Mania’s own April Fool’s joke for this year, what2figures, which expresses any point on the globe in just two numbers (I’ll wait).
But more recently it’s come under criticism for assigning similar-sounding word combinations to addresses only a few miles apart: see Andrew Tierney’s blog post (which expands upon this Twitter thread) and What3words’s response. This is especially a problem for first responders trying to locate someone who may have misspoken or mistyped their location, or whose accent led to the words being misheard, resulting in rescue teams being sent to the wrong place.
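The kind of check Tierney’s criticism implies can be sketched with plain edit distance: flag pairs of word triples that a speaker or typist could easily confuse. This is a hypothetical sanity check, not anything What3words or Tierney actually runs; a real check would also model homophones and plural/singular pairs:

```python
# Hypothetical sanity check for confusable three-word addresses:
# flag pairs whose joined forms are only a tiny edit apart.

def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def confusable(addr1, addr2, max_dist=2):
    """True if two three-word addresses differ by only a tiny edit."""
    return levenshtein(".".join(addr1), ".".join(addr2)) <= max_dist

# A plural/singular slip is one character apart -- easy to flag:
print(confusable(("daring", "lion", "race"), ("daring", "lions", "race")))  # True
```

The danger case is exactly when two such confusable triples decode to squares miles apart: a one-character slip silently becomes a multi-mile error.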
Last month it was reported that the X-Mode software development kit, used by many apps, was collecting and selling user location data, with the U.S. military among the buyers. In response, earlier this month both Apple and Google gave developers a deadline to remove X-Mode from their apps: seven days in Google’s case, fourteen in Apple’s. Apple found 100 apps that contained the code; X-Mode claims 400 apps on all platforms, tracking 25 million devices in the U.S. and 40 million elsewhere. The Wall Street Journal story is behind a paywall; see The Verge and MacRumors for summaries.
Motherboard reported last week that the U.S. military was buying location data that originated, among other places, from Muslim prayer and dating apps. The Motherboard exposé details how it happened: how the location data supply chain works, and, for example, how data brokers pay app developers to incorporate their frameworks into apps so that user data can be harvested and sold to buyers like law enforcement and military contractors. Developers may not necessarily be aware of what they’re agreeing to when they adopt those frameworks; then again, nothing obliges them to embed data-harvesting code in their apps in the first place. [Daring Fireball, MetaFilter]
Previously: New York Times: How Location Data Is Gathered, Shared and Sold.
Remember the farm in Kansas that, thanks to an error in MaxMind’s geolocation database, became the default physical location for any IP address in the United States that couldn’t be resolved? It’s happened again, this time to a couple in Pretoria, South Africa, who received online and physical threats and visits from the police because IP addresses that were from Pretoria, but whose precise location couldn’t be resolved any further, defaulted to their front yard. Kashmir Hill, who covered the Kansas incident, has the story for Gizmodo. It’s a fascinating long read that burrows into the sources of geolocation data and the problematic ways in which it’s used.
In this case the problem was traced to the National Geospatial-Intelligence Agency, which assigned the lat/long coordinates for Pretoria to this family’s front yard. The end result: one home becomes the location for one million IP addresses in Pretoria. (The NGA has since changed it.)
The problem here is twofold. First, a failure to account for accuracy radius: a city or a country is represented by a single, precise point at its centre. That’s a real problem when the data point being geotagged can’t be more specific than “Pretoria” or “United States,” because the geotagging is made artificially precise: it’s not “somewhere in Pretoria,” it’s this specific address. Second is the misuse of IP location data. It’s one thing to use a web visitor’s IP address to serve them local ads or to enforce geographical restrictions on content, quite another to use that data for official or vigilante justice. The data, Hill points out, isn’t good enough for that. [MetaFilter]
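The first half of that fix can be made concrete: carry the accuracy radius alongside the coordinates, and refuse to render a city- or country-level result as if it were a street address. This is a minimal sketch; the field names are hypothetical, loosely modelled on the kind of record IP-geolocation databases such as MaxMind’s return:

```python
# Minimal sketch: keep the accuracy radius attached to the coordinates
# so a country-level lookup can never masquerade as a street address.
# Field names are hypothetical, not MaxMind's actual schema.

from dataclasses import dataclass

@dataclass
class GeoResult:
    lat: float
    lon: float
    accuracy_radius_km: float  # "somewhere within this many km of the point"

def describe(result: GeoResult) -> str:
    """Render a lookup honestly instead of as a false-precision point."""
    if result.accuracy_radius_km > 1.0:
        return (f"somewhere within {result.accuracy_radius_km:.0f} km of "
                f"({result.lat:.2f}, {result.lon:.2f}) -- not a specific address")
    return f"near ({result.lat:.5f}, {result.lon:.5f})"

# A country-level resolution for "United States" should read like this,
# not like somebody's front yard:
print(describe(GeoResult(37.751, -97.822, 1000.0)))
```

Consumers of the data then have no excuse for treating “somewhere within 1000 km” as a knock-on-the-door address, which is the second half of the problem Hill identifies.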
Previously: A Geolocation Glitch Creates a ‘Technological Horror Story’.
A long exposé from the New York Times explores just how much location data is collected from mobile apps, to the point where the identity of an anonymous user can be reconstructed from where they’ve been. The key point: whatever purpose the app is collecting your location for (for example, to give you your local weather), that location data may be shared with and sold to other parties.
“Google wants to know where you go so badly that it records your movements even when you explicitly tell it not to,” the Associated Press reports. Their exclusive investigation discovered that “many Google services on Android devices and iPhones store your location data even if you’ve used a privacy setting that says it will prevent Google from doing so.” Basically, turning the “Location History” feature off doesn’t stop Google apps from recording your location at various junctures: your location data may still be found in places like “My Activity” or “Web and App Activity,” for example. Google insists its descriptions are clear; critics are calling Google’s hairsplitting disingenuous, disturbing and wrong.
Not every geographic database uses Null Island. When MaxMind’s geolocation database, which matches IP addresses to physical locations, can only identify an IP address’s country, it uses a default location roughly at the centre of that country. In the case of the United States, it turned out to be Joyce Taylor’s farm in Potwin, Kansas. Fusion’s Kashmir Hill has the horror story that ensued: MaxMind’s database is used by thousands of online services, whose users mistook a default location for a precise address.
For the last decade, Taylor and her renters have been visited by all kinds of mysterious trouble. They’ve been accused of being identity thieves, spammers, scammers and fraudsters. They’ve gotten visited by FBI agents, federal marshals, IRS collectors, ambulances searching for suicidal veterans, and police officers searching for runaway children. They’ve found people scrounging around in their barn. The renters have been doxxed, their names and addresses posted on the internet by vigilantes. Once, someone left a broken toilet in the driveway as a strange, indefinite threat.
As Hill’s article points out, Taylor is far from the only one to be hit by this problem. MaxMind is updating its database to correct this and one other case by moving the default location to a body of water. (I can’t help but think that we will soon start hearing stories about people driving into the lake as a result of this change.) But the underlying problem remains: coordinates are always precise, even when the data they represent isn’t. What’s the solution?