Heartbeats And Headaches: Wearable Tech’s Privacy Problem

For many of us, wearable technology has become a part of everyday life, a way to track key health metrics to improve our overall wellbeing. But a new report from vpnMentor has suggested that these health benefits come at a significant cost to our privacy.  

Tracked Health Data is a Regulatory Gray Area 

According to the report, 90% of wearable devices monitor at least one health and wellness metric, making it the most widely collected data category. 71% of devices track heart rate, while 56% measure blood oxygen levels. A smaller but growing share monitor glucose, skin temperature, and stress.

In clinical contexts, this type of data would be protected under HIPAA. But consumer wearables sit outside those safeguards, potentially leaving users exposed. If compromised, such data could allow insurers or employers to discriminate against affected individuals, or cybercriminals could use it for extortion schemes.

The vpnMentor report cites one case in which City of Chicago employees were compelled to provide biometric screening data in a wellness program, only to discover their data was shared with multiple companies without clear consent – triggering a class-action lawsuit over privacy violations.

Fitness Records Aren’t as Harmless as They Seem 

While fitness records seem low risk, they can reveal sensitive personal information. In 2021, researchers discovered an unsecured GetHealth database containing 61 million records from Fitbit, Apple HealthKit, and other platforms. The data connected fitness records with PII such as names and dates of birth.

Cybercriminals could use this information for targeted phishing, spear-phishing attacks on high-value individuals, or even blackmail tied to health conditions.  

GPS and Geolocation Data Poses Security Risks 

According to vpnMentor, 63% of wearables track location data. When organizations or users fail to secure that data properly, it can create serious risks.

In 2018, Strava, a fitness app that tracks users’ running and cycling routes, inadvertently revealed the location and movement patterns of US and allied military personnel. Bases in Afghanistan, Syria, and other conflict zones showed up on visual heatmaps because soldiers were logging workouts while deployed. Analysts were even able to infer patrol routes and daily routines.  

Polar, another fitness app, faced a similar scandal the same year. Its public activity maps didn’t just reveal general routes but allowed anyone with an account to zoom in on individual users. The app exposed names, addresses, and time-stamped GPS workout logs.  
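
To see why innocuous workout logs become so revealing in aggregate, consider a minimal sketch (in Python, with entirely made-up coordinates – not data from either incident) of how repeated GPS fixes cluster into the kind of heatmap cells analysts mined in the Strava case:

```python
from collections import Counter

# Hypothetical GPS fixes (lat, lon) from workout logs; in a real leak these
# would number in the billions, as in Strava's global heatmap.
gps_fixes = [
    (34.52891, 69.17234),  # repeated laps around the same compound
    (34.52893, 69.17230),
    (34.52890, 69.17238),
    (48.85661, 2.35222),   # a one-off city run
]

# Snap each fix to a roughly 100 m grid cell by rounding the coordinates.
cells = Counter((round(lat, 3), round(lon, 3)) for lat, lon in gps_fixes)

# Cells with many fixes betray routines: the same loop run daily stands out
# even though no single log names a base or a person.
for cell, count in cells.most_common():
    print(f"grid cell {cell}: {count} fixes")
```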

Excessive Data Sharing Practices Proliferate 

Seven major wearable brands – including Samsung, Huawei, Xiaomi, Amazfit, and Meta – explicitly share or sell user data to advertisers. Meta goes even further, feeding biometric inputs like gaze tracking into its ad and AI engines, with limited opt-out availability.

Meanwhile, 55% of brands share de-identified data with researchers. While such sharing is often framed as benign, re-identification remains a serious threat once the data is combined with other datasets. Without strict controls, anonymization can quickly unravel.
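
How quickly "anonymous" data unravels is easy to demonstrate. The sketch below (Python, with fabricated records) links de-identified fitness data to a public roster using only a handful of quasi-identifiers – the same linkage technique behind classic re-identification attacks:

```python
# Hypothetical illustration of quasi-identifier linkage: the fitness records
# carry no names, but ZIP code, birth year, and sex are enough to join them
# against a public roster and recover identities.
fitness_records = [
    {"zip": "60601", "birth_year": 1984, "sex": "F", "resting_hr": 51},
    {"zip": "60601", "birth_year": 1990, "sex": "M", "resting_hr": 77},
]

public_roster = [
    {"name": "J. Doe", "zip": "60601", "birth_year": 1984, "sex": "F"},
]

quasi_ids = ("zip", "birth_year", "sex")
for record in fitness_records:
    key = tuple(record[q] for q in quasi_ids)
    for person in public_roster:
        if tuple(person[q] for q in quasi_ids) == key:
            # A unique match re-identifies the "anonymous" record.
            print(f'{person["name"]}: resting heart rate {record["resting_hr"]}')
```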

Wearables Raise Surveillance Concerns 

Employee wellness programs may sound like a nice office perk, but vpnMentor argues that, when wearables are involved, they could constitute undue surveillance. WHOOP, for example, markets dashboards that let coaches and HR managers monitor heart rate, strain, and recovery in real time.

What’s more, once collected, sensitive data could be subpoenaed, leaked, or repurposed in ways employees never consented to. Mozilla has already flagged six leading wearables as “Privacy Not Included,” citing weak transparency and minimal user control. 

Building Stronger Defenses 

The bottom line, according to vpnMentor, is that protections for consumer wearables should align more closely with the regulations surrounding clinical data. Key measures include:

  • Transparency at the variable level: Users should know exactly what data is collected and how it is used.
  • Local processing: Sensitive data should remain on-device whenever possible (see the sketch after this list).
  • HIPAA-like protections for consumer wearables: Closing the regulatory loophole would extend baseline security to the consumer market. 
  • User empowerment: Clear opt-out options, strict data minimization, and third-party privacy audits should be industry standards. 
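
As a concrete illustration of the local-processing and data-minimization principles above, here is a minimal Python sketch (hypothetical field names, no real vendor API) of a wearable that ships only a coarse daily aggregate off-device instead of the raw sensor stream:

```python
from statistics import mean

# Hypothetical on-device pipeline: raw heart-rate samples never leave the
# wearable; only a coarse daily summary is transmitted.
raw_heart_rate_samples = [62, 64, 118, 131, 127, 71, 58]  # stays local

def daily_summary(samples: list[int]) -> dict:
    """Reduce a day's raw stream to the minimum the service needs."""
    return {
        "avg_hr": round(mean(samples)),
        "max_hr": max(samples),
        "sample_count": len(samples),
    }

payload = daily_summary(raw_heart_rate_samples)
# Only the aggregate crosses the network; the raw stream is discarded.
print(payload)  # {'avg_hr': 90, 'max_hr': 131, 'sample_count': 7}
```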

It’s important to view wearables as what they really are: devices capable of capturing and exposing some of the most intimate personal information. Their security should reflect that fact.   


Josh Breaker-Rolfe

Josh is a Content writer at Bora. He graduated with a degree in Journalism in 2021 and has a background in cybersecurity PR. He’s written on a wide range of topics, from AI to Zero Trust, and is particularly interested in the impacts of cybersecurity on the wider economy.

The opinions expressed in this post belong to the individual contributors and do not necessarily reflect the views of Information Security Buzz.
