Social media users have posted ideas for protecting people’s reproductive privacy after the Supreme Court overturned Roe v. Wade, including entering “junk” data into apps designed for menstrual cycle tracking.
People use period-tracking apps to predict their next period, talk with their doctor about their cycle, and determine when they are fertile. Users log everything from cravings to period flow, and the apps provide predictions based on these inputs. The predictions help with simple decisions, such as when to buy tampons next, and provide life-changing observations, such as whether you’re pregnant.
The argument for submitting junk data is that it pollutes the apps’ data and confuses their algorithms, making it difficult or impossible for authorities or vigilantes to use the data to invade people’s privacy. However, that argument does not hold up.
As researchers who develop and evaluate technologies that help people manage their health, we analyze how app companies collect data from their users to provide useful services. We know that for popular period-tracking apps, millions of people would have to enter junk data to even nudge the algorithm.
Junk data is also a form of “noise,” an inherent problem that developers design algorithms to be robust against. Even if junk data “confused” the algorithm or provided too much data for authorities to sift through, the success would be short-lived: the app would become less accurate for its intended purpose, and people would stop using it.
Moreover, it would not solve existing privacy problems, because people’s digital footprints are everywhere, from internet searches to phone app use and location tracking. This is why advice urging people to delete their period-tracking apps is well meaning but misses the mark.
How the apps work
When you first open one of these apps, you enter your age, the date of your last period, how long your cycle is, and what type of birth control you use. Some apps connect to other apps, such as physical activity trackers. As you use the app, you record relevant information, including when your period starts, cramps, discharge consistency, cravings, sex drive, sexual activity, mood and flow heaviness.
Once you give your data to the period-tracking app’s company, it’s unclear exactly what happens to it, because the algorithms are proprietary and part of the company’s business model. Some apps ask for the user’s cycle length, which many people don’t actually know. Indeed, researchers found that 25.3% of people said their cycle had the oft-cited duration of 28 days; however, only 12.4% actually had a 28-day cycle. So if an app uses the data you enter to make predictions about you, it may take a few cycles for the app to calculate your cycle length and more accurately estimate the phases of your cycle.
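The companies’ actual algorithms are proprietary, but the basic idea of learning a cycle length from a few logged periods can be sketched simply. In this hypothetical example (the function name and the dates are our own illustration, not any app’s real code), the estimate is just the average gap between logged period start dates:

```python
from datetime import date

def estimate_cycle_length(period_starts: list[date]) -> float:
    """Estimate cycle length as the average gap, in days,
    between consecutive logged period start dates."""
    gaps = [(b - a).days for a, b in zip(period_starts, period_starts[1:])]
    return sum(gaps) / len(gaps)

# After a few logged cycles, the estimate reflects the user's real pattern,
# even if they initially reported the oft-cited 28 days.
logs = [date(2022, 1, 3), date(2022, 2, 4), date(2022, 3, 7), date(2022, 4, 7)]
print(round(estimate_cycle_length(logs), 2))  # gaps of 32, 31 and 31 days -> 31.33
```

This is why it takes a few cycles for an app to get accurate: with only one or two logged periods, there is little to average over.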
An app can also make predictions based on data the company has collected from all its users, or based on your demographics. For example, the app’s algorithm may know that a person with a higher body mass index can have a 36-day cycle. Or it can use a hybrid approach that makes predictions based on your data but compares them against the company’s large dataset of all its users to tell you what’s typical, such as that a majority of people report having cramps just before their period.
What submitting junk data does
If you regularly use a period-tracking app and start feeding it inaccurate data, the app’s personalized predictions, such as when your next period will occur, will also become inaccurate. If your cycle is 28 days and you start logging that your cycle is now 36 days, the app should adapt, even though the new information is incorrect.
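A minimal sketch makes the point concrete. Assuming, purely for illustration, that an app predicts your next cycle as the mean of your most recent logged cycles, junk entries drag only that user’s own predictions off course:

```python
def predicted_cycle(recent_cycles: list[float], window: int = 6) -> float:
    """Predict the next cycle length as the mean of the most recent entries."""
    recent = recent_cycles[-window:]
    return sum(recent) / len(recent)

history = [28, 28, 29, 28, 28, 28]      # honest logging
print(round(predicted_cycle(history), 2))  # ~28.17: accurate for this user

history += [36, 36, 36]                  # junk entries
print(predicted_cycle(history))          # 32.0: the app adapts toward the junk,
                                         # so only this user's predictions suffer
```

The app dutifully learns the lie; the person submitting junk data is mostly sabotaging their own predictions.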
But what about the data in aggregate? The easiest way to combine data from multiple users is to average it. For example, the most popular period-tracking app, Flo, has an estimated 230 million users. Imagine three cases: a single user; the average of 230 million users; and the average of 230 million users plus 3.5 million users submitting junk data.
An individual’s data may contain noise, but the underlying trend becomes clearer when averaged over many users, because averaging smooths the noise out. Junk data is just another kind of noise. The difference between the clean and polluted averages is noticeable, but the general trend in the data is still clear.
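The back-of-the-envelope arithmetic behind this example can be checked directly. The specific numbers below are illustrative assumptions, not Flo’s actual figures: 230 million genuine users whose logged cycles average 29.3 days, plus 3.5 million junk users logging wildly-off values that average 50 days.

```python
# Hypothetical figures for illustration only.
genuine_users, genuine_mean = 230_000_000, 29.3   # real users, avg cycle (days)
junk_users, junk_mean = 3_500_000, 50.0           # junk submitters, avg junk value

# Weighted average of the combined population.
polluted_mean = (genuine_users * genuine_mean + junk_users * junk_mean) / (
    genuine_users + junk_users
)

print(round(polluted_mean, 2))                 # 29.61: the average barely moves
print(round(polluted_mean - genuine_mean, 2))  # 0.31: a shift of about a third of a day
```

Even millions of junk submitters, logging values far outside the normal range, shift the population average by a fraction of a day.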
This simple example illustrates three problems. People submitting junk data are unlikely to affect predictions for any individual app user. It would take an extraordinary amount of coordinated effort to shift the underlying signal across the entire population. And even if that happened, poisoning the data risks rendering the app useless for those who rely on it.
Other approaches to protect privacy
In response to people’s concerns about their menstrual data being used against them, some period-tracking apps have made public statements about creating an anonymous mode, using end-to-end encryption and complying with European privacy laws.
The security of any “anonymous mode” depends on what it actually does. Flo’s statement says the company will de-identify data by removing names, email addresses and technical identifiers. Removing names and email addresses is a good start, but the company doesn’t define what it means by technical identifiers.
With Texas paving the way for residents to legally sue anyone who helps someone else get an abortion, and with 87% of people in the U.S. identifiable from minimal demographic information such as ZIP code, gender and date of birth, any demographic data or identifier has the potential to harm people seeking reproductive health care. There is a huge market for user data, mainly for targeted advertising, that makes it possible to learn a frightening amount about almost anyone in the U.S.
While end-to-end encryption and the European General Data Protection Regulation (GDPR) can protect your data from legal inquiries, unfortunately none of these solutions address the digital footprints everyone leaves behind through everyday use of technology. Even users’ search histories can reveal how far along they are in a pregnancy.
What do we really need?
Rather than brainstorming ways to circumvent technology to reduce potential harm and legal trouble, we believe that people should advocate for digital privacy protections and restrictions on data use and sharing. Companies should effectively communicate with their users and get feedback from them about how their data is being used, their risk of exposure to potential harm, and the value of their data to the company.
People have been concerned about digital data collection for years. However, in a post-Roe world, routine health tracking may put more people at legal risk.
Article by Katie Siek, professor and chair of computer science, Indiana University; Alexander L. Hayes, Ph.D. student in health informatics, Indiana University; and Zaidat Ibrahim, Ph.D. student in health informatics, Indiana University