How a fitness app became a matter of international security

Reading Time: 5 minutes

Category: Threat Intel


Sometimes a security problem arises from a well-intentioned feature. Polar, the Finnish company known for its smartwatches and fitness wearables, aggregates data from thousands of users for its Polar Flow Explore website. But along with sharing their bike rides and running trails, Polar users were also sharing potentially sensitive locations on this global map.

Imagine Tom, a 32-year-old Dutchman deployed on a mission in Iraq, where the Dutch military is stationed near the airport of Erbil. It is one of the most important areas in Northern Iraq, where the military fights terrorist groups affiliated with Islamic State (IS). Tom goes running in the area around the airport every week, wearing his fitness watch.

Addresses and pictures of 6,460 individuals

Using Polar Flow Explore, Dutch journalists were able to discern that his running route started and ended at a group of houses in a small village in the north of the country, potentially his base. Using his Polar account, his Facebook profile and information found on search engines, the journalists also discovered Tom’s full name, home address and family members (two daughters).

By combining Polar Flow data, social media profiles and other public information, Dutch journalists and research group Bellingcat were able to uncover the names, home addresses and family pictures of 6,460 individuals of 69 different nationalities.

Polar’s well-intentioned functionality – sharing GPS routes and trails with other Polar users and athletes – proved to be a potentially dangerous feature, posing a security threat to men and women working at military bases, nuclear power plants and national security agencies around the world.

White spots on a dark map

Unfortunately, Polar is no exception: in January 2018, sports app Strava suggested that military personnel opt out of its heatmap after it turned out that exercise routes shared online by soldiers could be used to pinpoint overseas facilities. In remote locations such as Afghanistan and Syria, Strava users appear to be almost exclusively foreign military personnel. At a single glance, secret military bases stand out brightly as white spots against Strava’s otherwise dark heatmap.

Seemingly innocent data can easily be misused

These incidents make fitness apps suddenly seem a lot less innocent. In fact, they are potentially dangerous: if a group of journalists can uncover profiles and sensitive locations, then malicious organizations or individuals can use the same data to find military personnel and members of national security services. Information is power, and can be used for extortion, bribery, or worse. Seemingly innocent data can be weaponized.

Public or private?

Polar responded by taking the Polar Flow Explore map offline. The company issued a statement explaining that only users who had set their workouts and activities to ‘public’ appeared on the global map. It emphasized that most Polar users choose ‘private’ or ‘only for followers’ in their privacy settings, rather than ‘public’.

Strava did not take its heatmap down, but issued a statement and a list of recommendations for users. The company also promised changes, including working with military and government officials to address potentially sensitive data, and tasking its engineering and user-experience teams with simplifying the privacy and safety features so that users know how to control their own data.

Risks arise even without hacks or data breaches

The security issues with Polar’s and Strava’s aggregated data collections and heatmaps did not arise from a large-scale hack or a data breach. No military database was breached to find the exact location of training camps and areas of combat. Rather, the issue occurred despite, or rather because of, a well-intentioned feature in applications intended for private use.

Part of the problem lies with end users, many of whom are unaware that their public data might fall into the wrong hands. Defense and national security organizations bear responsibility as well: they should set policies and provide guidance for employees who use apps that handle personal or location-sensitive data, raising awareness and reminding users to check their privacy settings.

IoT devices: blurring the line between our professional and personal lives

According to Joepke van der Linden, squad lead and CISO at ON2IT, what we are seeing now is only the tip of the iceberg. In her former role as regulatory affairs counsellor at companies such as Ziggo, Vodafone and T-Mobile, Van der Linden encountered potential security risks associated with cellphone usage in foreign countries.

“A corporate cellphone is a well-understood piece of technology for which guidelines can be drafted fairly easily. But the new breed of IoT devices is blurring the line between our professional and personal lives. Connected cars, smartwatches and even wifi-connected fridges or voice assistants exponentially increase the attack surface for hackers looking for sensitive data by invading the private lives of employees. Organizations need to prepare themselves for this irreversible trend.”

Anonymously aggregated data mostly harmless?

It did not take a hacker to discover secret American bases in Afghanistan or the exact location of a nuclear plant in the Netherlands. Rather, the security issue arose from the notion that anonymously aggregated data is mostly harmless, while in fact it can be used for profiling. If the Strava and Polar cases make one thing clear, it is that there is a disconnect between the mundane innocence of using a fitness app and the real-world repercussions of its data.