Long before the Supreme Court overturned the constitutional right to an abortion, digital privacy hawks had flagged issues surrounding menstrual-tracking apps. Absent tougher data protections, these experts warned, information from the apps could be used to target users with ads or even determine insurance coverage or loan rates.
SCOTUS’s toppling of Roe v. Wade last month ratcheted up concerns to a new level: that app users’ fertility, missed periods and more could be used against them in criminal and civil proceedings as evidence that they’ve had an abortion. As recent studies show, such fears are not unfounded.
A research team at the Organization for the Review of Care and Health Apps (ORCHA), a company that tests health apps for the U.K.’s National Health Service, examined the privacy policies of 25 of the most popular period-tracker apps and found that 84% share data with third parties. Nearly two-thirds of them hand it over to authorities to meet legal obligations, according to results published last week.
This includes details of sexual activity, contraception use and period timings. Contact details of users who tracked their cycles were also being sold as marketing contact lists, with nearly 70% of apps that share data saying they do so for marketing purposes. Just two, Fitbit and Natural Cycles, were recommended as safe and secure. ORCHA didn’t name those that scored poorly.
Consumer Reports, which conducted its own evaluation this year, did name names. The organization tested the most popular period-tracking apps in the U.S., and found that Flo, Clue, Stardust, Period Calendar and Period Tracker all faltered when it comes to privacy. CR recommended only three apps: Euki, Drip and Periodical, all of which store data locally and don’t allow third-party tracking.
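The local-only design CR favored can be sketched in a few lines. This is a purely hypothetical illustration, not any named app’s actual code: cycle data lives in an on-device SQLite database, and nothing is ever transmitted to a server or third party.

```python
import sqlite3

# Hypothetical sketch of a local-only tracker: entries stay in an on-device
# SQLite database and are never sent over the network. (":memory:" keeps this
# demo self-contained; a real app would use a file inside its private storage.)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cycles (start_date TEXT, note TEXT)")
conn.execute("INSERT INTO cycles VALUES (?, ?)", ("2022-07-01", "period started"))
conn.commit()

# All reads happen locally, too -- there is no analytics or ad SDK in the loop.
rows = conn.execute("SELECT start_date, note FROM cycles").fetchall()
```

The design choice is simple: if the data never leaves the device, there is nothing for marketers, data brokers or subpoenas aimed at the app maker to collect.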
The lack of privacy of period-tracking apps should come as no surprise, says Eric Perakslis, chief science and digital officer for the Duke Clinical Research Institute and professor at Duke School of Medicine.
Perakslis co-led a study last year with privacy-focused group the Light Collective looking at the data shared by genetic-testing and digital-medicine companies. They found that several of the companies sent customer information to Facebook for ad targeting, frequently in violation of their own written policies. His team also found issues with where and when users of health apps and websites are asked for their permission to share such data.
“Advertising technology is probably the most ubiquitous software on the Internet,” said Perakslis. “How does Facebook or Google make money? They pretty much make it through ads.”
The lack of proper standards governing ad tech is an industry-wide problem. Marketing, he points out, was also the reason a third of the top 100 hospitals were recently found to be sharing data with Facebook.
Most consumers are on their own in learning how these processes work. The Federal Trade Commission and the Federal Communications Commission haven’t exactly jumped all over Big Tech.
“There aren’t hundreds of cases pending where the Breach Notification Rule through the FTC has been enforced,” said Perakslis. “HHS is enforcing the HIPAA stuff, but it’s not a crime where there’s been a lot of examples or punishments meted out yet.”
But the erosion of women’s reproductive rights in America has put period-tracking apps in the spotlight, and for good reason. These apps are used by about a third of women, according to a Kaiser Family Foundation survey fielded in 2019.
Flo, for instance, has a reported 200 million users worldwide. According to a 2021 settlement with the Federal Trade Commission, Flo handed users’ health information out to numerous third-party marketing and analytics firms, including the aforementioned social media giants.
Targeted ads are one type of harm. “Being presented with targeted content can be intrusive and a problem if someone doesn’t have complete personal and ambient privacy,” explained Perakslis. That is often the reality for the most vulnerable in society, whose devices are not always their own and may be shared with others.
Of even greater concern now, with more than two dozen states moving to outlaw abortion, is that health and pregnancy information stored in an app and then passed on to others could prompt someone to look for a person, or prosecute them or their doctor.
“That’s been the concern of activists for many years, that this behavioral advertising network would eventually become an incredibly useful tool in surveillance and in an authoritarian crackdown,” said Cooper Quintin, senior staff technologist with the Electronic Frontier Foundation.
There’s precedent for using big data in this manner. In 2016, for instance, Rewire noted that an anti-abortion group was targeting ads to women in and around Planned Parenthood centers based on location data. The fact that smartphones can be turned into advanced surveillance devices — complete with GPS, camera and microphone — makes fertility tracking especially fraught.
“The last time that abortion was illegal in the U.S., you didn’t have a smartphone constantly logging everybody’s location,” said Quintin. “You didn’t have a giant behavioral advertising network. You didn’t have a giant surveillance network. It’s a much different scenario this time around.”
The Flo case highlights one reason why unauthorized data sharing may continue to occur: the potentially specious ways health apps handle consumer information may fall outside the FTC’s definition of a health data breach. The agency said it is reviewing the rule and considering seeking public comments.
Women’s health apps are the “tip of the iceberg” when it comes to security of health data, observed Fatima Ahmed, ORCHA’s clinical lead for maternity and women’s health, in a statement about the study results.
Nor is sharing data with third parties the only privacy issue researchers have found. “Even app developers who promise to stop sharing names and addresses, for example, should be aware that people can be identified by an IP address,” Ahmed added.
“De-identification is almost like a parlor game at this point, it’s so easy to re-identify people,” added Perakslis.
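The “parlor game” Perakslis describes can be sketched concretely. In this illustration (the names and records are invented), a “de-identified” health dataset keeps quasi-identifiers such as ZIP code, birth date and sex, and a simple join against a public dataset — a voter roll or marketing list — puts names back on the records:

```python
# Hypothetical data: a "de-identified" health dataset with names removed,
# but quasi-identifiers (ZIP, birth date, sex) left in place.
deidentified_records = [
    {"zip": "02139", "dob": "1990-07-31", "sex": "F", "note": "missed period"},
    {"zip": "60614", "dob": "1985-03-12", "sex": "F", "note": "regular cycle"},
]

# A separate public dataset links the same quasi-identifiers to names.
public_records = [
    {"name": "Alice Example", "zip": "02139", "dob": "1990-07-31", "sex": "F"},
    {"name": "Bria Example", "zip": "60614", "dob": "1985-03-12", "sex": "F"},
]

def reidentify(deid, public):
    """Join the two datasets on quasi-identifiers; any unique match
    re-attaches a name to a supposedly anonymous health record."""
    key = lambda r: (r["zip"], r["dob"], r["sex"])
    index = {}
    for p in public:
        index.setdefault(key(p), []).append(p["name"])
    matched = {}
    for r in deid:
        names = index.get(key(r), [])
        if len(names) == 1:  # unique match -> record is re-identified
            matched[names[0]] = r["note"]
    return matched

result = reidentify(deidentified_records, public_records)
```

Because combinations like ZIP plus birth date plus sex are often unique to a single person, stripping names alone offers little real protection — which is why an IP address, as Ahmed notes, can be enough.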
When contacted, he said, companies his team studied were usually not aware of the disconnect between their consumer-facing privacy statements and what their ad technology was actually doing. He recommends marketers find out.
“How many buy the software and then hire a forensics expert to go in and see everything it’s actually doing? Zero,” he said. “Because they trust the people selling the stuff that it only does certain things. At the end of the day, you will be held responsible for what’s in your [tech] stack, just like a biopharma company will be held responsible for what it puts on shelves or on the supply line.”
Perakslis, who worked in the pharma industry for nearly two decades, also suggested marketers consider setting up the equivalent of an institutional review board to ensure their data practices are on the up and up.
“Have an ethics board; work with them,” he urged. “There are people who can help you not make the bad mistakes.”
Companies should think about doing good data science, checking into the provenance and lineage of health information. “You wouldn’t for a second want to have illicit substances come into your supply line — stolen or impure products, reused syringes,” he said. “They need to think about not allowing it in their data. They need to be thinking about the fact that this data may have been exploited, may have been stolen, taken without consent, all these different things.”
Quintin was even more pointed:
“I hope that this is a moment of reckoning for the advertising industry,” he said. “I hope that a lot of people in the industry will seriously consider whether or not they want to be oppression’s little helper.”
This story first appeared on mmm-online.com.