The Electronic Frontier Foundation’s Atlas of Surveillance is an interactive map and searchable database of surveillance technologies used by law enforcement across the United States. “We specifically focused on the most pervasive technologies, including drones, body-worn cameras, face recognition, cell-site simulators, automated license plate readers, predictive policing, camera registries, and gunshot detection. Although we have amassed more than 5,000 datapoints in 3,000 jurisdictions, our research only reveals the tip of the iceberg and underlines the need for journalists and members of the public to continue demanding transparency from criminal justice agencies.” [Maps Mania]
Incognito mode for Google Maps, announced last May, is currently in testing. With the mode enabled, user activity isn’t saved to the user’s Google account. It was made available last week to beta testers using the preview version of the Google Maps Android app.
Meanwhile, Strange Maps looks at the curious lack of Google Street View in Germany and Austria, where privacy concerns are paramount.
Google integrates its maps into its search results: synergy! What, then, is scrappy upstart search engine DuckDuckGo, which makes a point of not tracking its users, to do in response? Answer: use Apple Maps. “We’re excited to announce that map and address-related searches on DuckDuckGo for mobile and desktop are now powered by Apple’s MapKit JS framework, giving you a valuable combination of mapping and privacy.”
A long exposé from the New York Times explores just how much location data is collected from mobile apps, to the point where the identity of an anonymous user can be reconstructed from where they’ve been. The key point: whatever purpose the app is collecting your location for (for example, to give you your local weather), that location data may be shared with and sold to other parties.
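The re-identification risk the Times describes can be illustrated with a toy example: given a stream of timestamped location pings, the most frequent late-night location is a strong proxy for someone’s home address. A minimal sketch with fabricated coordinates and a hypothetical `likely_home` helper (not anything from the Times investigation):

```python
from collections import Counter
from datetime import datetime

# Fabricated "anonymous" pings: (timestamp, lat, lon), rounded to ~10 m.
pings = [
    ("2019-02-01T23:10", 40.7411, -73.9897),  # night -> likely home
    ("2019-02-02T02:45", 40.7411, -73.9897),
    ("2019-02-02T09:30", 40.7527, -73.9772),  # day -> likely workplace
    ("2019-02-02T13:05", 40.7527, -73.9772),
    ("2019-02-03T00:20", 40.7411, -73.9897),
]

def likely_home(pings):
    """Return the most frequent location seen between 22:00 and 06:00."""
    night = Counter()
    for ts, lat, lon in pings:
        hour = datetime.fromisoformat(ts).hour
        if hour >= 22 or hour < 6:
            night[(lat, lon)] += 1
    return night.most_common(1)[0][0] if night else None

print(likely_home(pings))  # the repeated nighttime coordinate
```

Five pings are enough here; with months of real data the same heuristic, plus a reverse address lookup, is how an “anonymous” user becomes a named one.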
Apple now has a fleet of cars collecting data for Apple Maps. Since Apple has been making a point of consumer privacy lately, it has posted a page listing where its cars are going to be in the coming weeks. (AppleInsider notes that some of that data collection is pedestrian-based.) It turns out Google has a page for Street View data collection that includes similar information, though it’s far less granular: windows of several months, whereas Apple tells you where it’ll be within a two-week timeframe.
“Google wants to know where you go so badly that it records your movements even when you explicitly tell it not to,” the Associated Press reports. Their exclusive investigation discovered that “many Google services on Android devices and iPhones store your location data even if you’ve used a privacy setting that says it will prevent Google from doing so.” Basically, turning the “Location History” feature off doesn’t stop Google apps from recording your location at various junctures: your location data may still be found in places like “My Activity” or “Web and App Activity,” for example. Google insists its descriptions are clear; critics are calling Google’s hairsplitting disingenuous, disturbing and wrong.
In the wake of reports that fitness apps’ user data was exposed and could be used to identify military and intelligence personnel in sensitive areas like bases and deployment zones, U.S. military and defense employees can no longer use geolocation features in devices and apps in operational areas. The new policy was announced last Friday. Also see coverage at Stars and Stripes. [Gizmodo]
Previously: Strava Heat Map Reveals Soldiers’ Locations; Non-Anonymized Strava User Data Is Accessible; Strava, Responding to Security Concerns, Disables Features; Polar Flow User Data Can Be Used to Identify Military and Intelligence Personnel.
Remember how in January the mobile fitness app Strava was found to reveal the training routes and user data of military and security personnel? It wasn’t just Strava. A joint investigation by Bellingcat and De Correspondent found that the data for users of the Polar Flow app is even more exposed: even the names and home addresses of military and intelligence personnel working at embassies, bases, intelligence agencies and other sensitive locations could be figured out from the user data. De Correspondent shows how.
Polar, the Finnish company behind the app and service, announced that they were suspending the Explore feature that made the data accessible. They also note, and it’s worth remembering, that Polar data is private by default. If you’re military or intelligence and using a fitness app, what the hell are you doing exposing your location data—especially if you’re in a sensitive location?
The report also contains one hell of a buried lede. They tested other apps, namely Strava, Endomondo and Runkeeper, and, well: “Though it’s harder to identify people and find their home addresses than it is through Polar, we were ultimately able to do so using these apps. In contrast to Polar’s app, there is no indication that people whose profiles are set to private can also be identified in these apps. We informed them of our findings last week.” In other words, this is an industry-wide problem, not just a problem with one or two services. [The Verge]
Strava has reportedly disabled certain features in the wake of the privacy and security issues raised last month, with users reporting that they can no longer create workout segments. In a statement given to The Verge, Strava said: “We are reviewing features that were originally designed for athlete motivation and inspiration to ensure they cannot be compromised by people with bad intent.” [Canadian Cycling Magazine]
More on the privacy issues regarding Strava’s global heat map and its customer data. Now Wired UK is reporting that Strava’s data isn’t anonymous. Because you can compare your results with nearby users, all it takes is a local GPS tracklog—which can be created out of whole cloth, as Steve Loughran’s blog post demonstrates—to see detailed information about users. Wired UK:
By uploading an altered GPS file, it’s possible to de-anonymise the company’s data and show exactly who was exercising inside the walls of some of the world’s most top-secret facilities. Once someone makes a data request for a specific geographic location—a nuclear weapons facility, for example—it’s possible to view the names, running speeds, running routes and heart rates of anyone who shared their fitness data within that area.
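The “altered GPS file” at the heart of this technique is just a GPX tracklog, which anyone can fabricate. A hedged sketch of generating a synthetic one (coordinates, timestamps, and the `synthetic_gpx` helper are made up for illustration; this mirrors the approach Steve Loughran describes, not his actual code):

```python
from datetime import datetime, timedelta
import xml.etree.ElementTree as ET

def synthetic_gpx(lat, lon, points=20, step_m=50):
    """Fabricate a GPX track of `points` trackpoints walking north
    from (lat, lon) in steps of roughly `step_m` metres."""
    start = datetime(2018, 1, 1, 7, 0, 0)
    deg = step_m / 111_320  # ~metres per degree of latitude
    gpx = ET.Element("gpx", version="1.1", creator="sketch")
    seg = ET.SubElement(ET.SubElement(gpx, "trk"), "trkseg")
    for i in range(points):
        pt = ET.SubElement(seg, "trkpt",
                           lat=f"{lat + i * deg:.6f}", lon=f"{lon:.6f}")
        # Plausible 10-second intervals, so the "run" has a believable pace.
        ET.SubElement(pt, "time").text = (
            start + timedelta(seconds=10 * i)).isoformat() + "Z"
    return ET.tostring(gpx, encoding="unicode")

print(synthetic_gpx(51.5074, -0.1278)[:120])
```

Upload a file like this near a location of interest and the service’s segment-matching does the rest: it happily compares your fake run against everyone else’s real ones.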
The leaderboard for an area, the Guardian reports, can be extremely revealing. “The leaderboard for one 600m stretch outside an airbase in Afghanistan, for instance, reveals the full names of more than 50 service members who were stationed there, and the date they ran that stretch. One of the runners set his personal best on 20 January this year, meaning he is almost certainly still stationed there.”
Which makes the security issue regarding military personnel using fitness trackers even worse than simply the anonymous aggregate of the routes they take. Yes, this is very much an unintended and unforeseen consequence of relatively innocuous social sharing bumping up against operational and personal security protocols; and it’s as much on military personnel to, you know, not use GPS-enabled devices that upload their location to a third-party server as it is on companies to have clear and effective privacy controls. This is very much the result of a whole lot of people not thinking things through.
Previously: Strava Heat Map Reveals Soldiers’ Locations.
Strava is a mobile fitness tracking app that uses GPS data from phones and watches. It has access to a lot of data, and has been using that data to create a global heat map showing the paths taken by its cycling and running customers. The map’s most recent update, last November, aggregates user data through September 2017. But analyst Nathan Ruser noticed a problem: in places where local Strava use is low, the map can reveal the paths of people from wealthy western countries—for example, soldiers at U.S. military bases overseas, whether they’re patrolling or simply exercising. (U.S. troops are encouraged to use fitness trackers.) Which is to say, suddenly Strava is a security problem. Details at BBC News and the Washington Post.
Google’s Street View blurs people’s faces for privacy reasons. Licence plates, too. But a tweet by the Guardian’s David Shariatmadari reveals that Google’s algorithm sometimes extends privacy rights to cows.
Great to see Google takes cow privacy seriously pic.twitter.com/ACTBpDwno6
— David Shariatmadari (@D_Shariatmadari) September 13, 2016