Motherboard reported last week that the U.S. military was buying location data that originated, among other places, from Muslim prayer and dating apps. The Motherboard exposé details how it happened: how the location data supply chain works, and, for example, how data brokers pay app developers to incorporate their frameworks into apps so that user data can be harvested and sold to buyers like law enforcement and military contractors. Developers aren't necessarily aware of what they're agreeing to when they adopt those frameworks, but nothing obliges them to embed data-harvesting code in their apps either. [Daring Fireball, MetaFilter]
Remember the farm in Kansas that, thanks to an error in MaxMind’s geolocation database, became the default physical location for any IP address in the United States that couldn’t be resolved? It’s happened again, this time to a couple in Pretoria, South Africa, who received online and physical threats and visits from the police because IP addresses that were from Pretoria, but whose precise location couldn’t be resolved any further, defaulted to their front yard. Kashmir Hill, who covered the Kansas incident, has the story for Gizmodo. It’s a fascinating long read that burrows into the sources of geolocation data and the problematic ways in which it’s used.
In this case the problem was traced to the National Geospatial-Intelligence Agency, which assigned the lat/long coordinates for Pretoria to this family’s front yard. The end result: one home becomes the location for one million IP addresses in Pretoria. (The NGA has since changed it.)
The problem here is twofold. First, a failure to account for accuracy radius: a city or a country is represented by a single, precise point at its centre. That’s a real problem when the data point being geotagged can’t be more specific than “Pretoria” or “United States,” because the geotagging is made artificially precise: it’s not “somewhere in Pretoria,” it’s this specific address. Second is the misuse of IP location data. It’s one thing to use a web visitor’s IP address to serve them local ads or to enforce geographical restrictions on content, quite another to use that data for official or vigilante justice. The data, Hill points out, isn’t good enough for that. [MetaFilter]
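The accuracy-radius point can be made concrete with a short sketch. This is illustrative, not MaxMind's actual API: the field names are hypothetical, but geolocation databases do ship an accuracy radius alongside each coordinate pair, and honouring it is what turns "this address" back into "somewhere around here."

```python
import math

# Hypothetical geolocation record: a lat/long pair plus an accuracy
# radius in kilometres, as city-level IP databases typically provide.
def describe_fix(lat, lon, accuracy_radius_km):
    """Render an IP geolocation result honestly: a point with an
    accuracy radius is a region, not an address."""
    area_km2 = math.pi * accuracy_radius_km ** 2
    if accuracy_radius_km <= 1:
        return f"near ({lat:.4f}, {lon:.4f})"
    return (f"somewhere within {accuracy_radius_km} km of "
            f"({lat:.2f}, {lon:.2f}), an area of ~{area_km2:,.0f} km²")

# A city-level fix for Pretoria: the coordinates alone look like a street
# address, but the radius says the user could be anywhere in a circle
# covering tens of thousands of square kilometres.
print(describe_fix(-25.7449, 28.1878, 100))
```

Dropping the radius, as the services in Hill's story effectively did, is what converts "somewhere in Pretoria" into one family's front yard.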
A long exposé from the New York Times explores just how much location data is collected from mobile apps, to the point where the identity of an anonymous user can be reconstructed from where they’ve been. The key point: whatever purpose the app is collecting your location for (for example, to give you your local weather), that location data may be shared with and sold to other parties.
“Google wants to know where you go so badly that it records your movements even when you explicitly tell it not to,” the Associated Press reports. Their exclusive investigation discovered that “many Google services on Android devices and iPhones store your location data even if you’ve used a privacy setting that says it will prevent Google from doing so.” Basically, turning the “Location History” feature off doesn’t stop Google apps from recording your location at various junctures: your location data may still be found in places like “My Activity” or “Web and App Activity,” for example. Google insists its descriptions are clear; critics are calling Google’s hairsplitting disingenuous, disturbing and wrong.
In the wake of reports that fitness apps’ user data was exposed and could be used to identify military and intelligence personnel in sensitive areas like bases and deployment zones, U.S. military and defense employees can no longer use geolocation features in devices and apps in operational areas. The new policy was announced last Friday. Also see coverage at Stars and Stripes. [Gizmodo]
Previously: Strava Heat Map Reveals Soldiers’ Locations; Non-Anonymized Strava User Data Is Accessible; Strava, Responding to Security Concerns, Disables Features; Polar Flow User Data Can Be Used to Identify Military and Intelligence Personnel.
Not every geographic database uses Null Island. When MaxMind's geolocation database, which matches IP addresses to physical locations, can only identify an IP address's country, it uses a default location roughly at the centre of that country. In the case of the United States, it turned out to be Joyce Taylor's farm in Potwin, Kansas. Fusion's Kashmir Hill has the horror story that ensued: MaxMind's database is used by thousands of online services, whose users mistook a default location for a precise address.
For the last decade, Taylor and her renters have been visited by all kinds of mysterious trouble. They’ve been accused of being identity thieves, spammers, scammers and fraudsters. They’ve gotten visited by FBI agents, federal marshals, IRS collectors, ambulances searching for suicidal veterans, and police officers searching for runaway children. They’ve found people scrounging around in their barn. The renters have been doxxed, their names and addresses posted on the internet by vigilantes. Once, someone left a broken toilet in the driveway as a strange, indefinite threat.
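The fallback behaviour behind all of this can be sketched in a few lines. This is a toy model, not MaxMind's code, and the record fields are invented for illustration; the coordinates for the U.S. default (38°N, 97°W, near the geographic centre of the contiguous United States) are the widely reported ones that resolved to the Potwin farm.

```python
# Toy model of country-level fallback in an IP geolocation database:
# when a lookup can only resolve the country, return that country's
# single default point, so countless unrelated IPs collapse onto one
# coordinate pair.
COUNTRY_DEFAULTS = {
    "US": (38.0000, -97.0000),  # the old U.S. default: a Kansas farm
}

def locate(ip_record):
    """Return the most specific coordinates available for a lookup."""
    if ip_record.get("coords"):  # precise city-level fix available
        return ip_record["coords"]
    return COUNTRY_DEFAULTS[ip_record["country"]]  # country fallback

# Two unrelated U.S. IPs with no resolvable city both land on the default:
a = locate({"country": "US", "coords": None})
b = locate({"country": "US", "coords": None})
print(a == b)  # True: both "locations" are the same farm
```

The design choice that causes the trouble is visible in the last line: the database answers every country-only query with the same plausible-looking point, and nothing downstream distinguishes that default from a genuine fix.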
As Hill’s article points out, Taylor is far from the only one to be hit by this problem. MaxMind is updating its database to correct this and one other case by moving the default location to a body of water. (I can’t help but think that we will soon start hearing stories about people driving into the lake as a result of this change.) But any set of coordinates is, by its nature, a precise point: there’s no way to say “somewhere in the United States” with a lat/long pair. What’s the solution?