Privacy and Wearables – opportunities and ethical challenges in tracking your intimate data

Women are in danger of their intimate data being used without their informed consent. Intimate data is tracked and sold to third parties, which then target us with personalized ads.

Photo by Ehimetalor Akhere Unuabona on Unsplash

We’ve all heard a lot about privacy, whether we’re being bombarded by cookie permissions in browsers, agreeing to impossible terms and conditions, or using biometrics such as a thumbprint or facial recognition to access services. It’s a bit tiresome, really – we just want to use a convenient app and not worry about what our data is being used for.

Photo by Firmbee.com on Unsplash

However, today’s world is one of “surveillance capitalism”, where our data is routinely sold to third parties that then send us personalized ads. Sometimes this profiling backfires and delivers something we don’t want to see.

Take, for example, a woman using a menstruation-tracking app who records that she is pregnant, then has a miscarriage, and is subsequently shown endless ads for baby clothes, toys, food, diapers, and more, each more heartbreaking than the last.

There’s no way to turn this off. The app has already delivered her “big news” to third parties, so she’s stuck with a profile: a pregnant woman who probably wants to buy things for her new baby.

Photo by engin akyurt on Unsplash

Femtech is a category of technology that caters to people with uteruses: those who menstruate, those who are trying to conceive (or trying not to), those experiencing menopause, and more. It deals in intimate data.

You might use an app to track your menstrual cycle, one which helps you do Kegel exercises, or one which helps you track when you get a hot flash. These apps are long overdue.

Since smart products like fitness trackers, step counters, and smartwatches started emerging on the market, wearable and app developers have shied away from offering tracking options for certain kinds of data which have historically been considered inappropriate to talk about in public – menstruation, fertility, and menopause.

Women’s issues. Things society doesn’t typically discuss. But then, slowly and surely, femtech started to emerge, and suddenly it’s commonplace to discuss which menstruation app one is using and the pros and cons of its user experience.

However, alongside the joy of finally being able to care for our own health, women and menstruators are in danger of their data being used without their informed consent. Privacy is a real issue. As Michele Estrin Gilman, Venable Professor of Law at the University of Baltimore School of Law and Co-Director of the Center on Applied Feminism, states:

“Menstruation is being monetized and surveilled, with the voluntary participation of millions of women”.

(Gilman, 2021)

Intimate data isn’t just when you last got your period. 

It’s your sexual preferences, the food you eat, your emotional state, and whether or not you have breast tenderness, headaches, or are feeling frisky. If you are using biometric sensors, it’s your heart rate, how much exercise you are getting, the state of your cervical fluid, or your temperature if you’re testing for fertility.

And maybe you’re like many people who would readily discuss these things over dinner. But maybe you’re in a difficult and dangerous domestic-abuse situation, where you are worried about sharing this information for fear of the repercussions.

Or perhaps your app knows you are pregnant but for whatever reason, that isn’t the best choice for you and your body and you need to get an abortion – if you’re in the USA right now, that’s a very difficult situation to be facing. In these cases and many more, your privacy is vital to your wellbeing. 

Photo from Clue

Am I abnormal? 

Perhaps one of the greatest problems with femtech is that apps must generalize to accommodate thousands or millions of users. A menstrual cycle is not always 28 days, and apps do allow for this, but only just. People’s bodies are different. You might have noticed we use the word ‘menstruators’ here; the term is borrowed from Gilman’s work, where it covers a range of people beyond those who identify as women.

In a recent article by Catriona McMillan (2022), this problem of femtech normalizing one idea of a “woman” is expanded upon: 

“These technologies seemingly operate on the assumption that users conform to particular social, sexual, and bodily norms…These apps are generally based on general assumptions about women’s bodies, with very little input on the wide variety of ways in which female processes can occur (eg length of the menstrual cycle). … Further, the above concerns regarding what counts as ‘data accuracy’ evidences femtech industry’s clear lack of understanding of, and responsiveness to, its diverse (potential) user base, some, but not all, of which identify as women.”

(McMillan, 2022)

What this means is that if these apps gather all your data and give you predictions based on their understanding of what a woman should be, then we have a problem, because not everyone is the same. People have different cycle lengths, different fertility needs, and different menopause symptoms.
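
To make this concrete, here is a minimal sketch in Python – with entirely invented dates, not taken from any real app – of the difference between a naive predictor that assumes the “textbook” 28-day cycle and one that at least learns from the user’s own logged history:

```python
from datetime import date, timedelta

# Hypothetical period start dates logged by one user.
# This user's cycle averages about 33 days, not the "textbook" 28.
period_starts = [
    date(2022, 1, 3),
    date(2022, 2, 6),
    date(2022, 3, 10),
    date(2022, 4, 13),
]

def predict_fixed(last_start: date, cycle_days: int = 28) -> date:
    """Naive prediction: assume every user has a 28-day cycle."""
    return last_start + timedelta(days=cycle_days)

def predict_personal(starts: list[date]) -> date:
    """Personalized prediction: use this user's own average cycle length."""
    gaps = [(b - a).days for a, b in zip(starts, starts[1:])]
    avg = round(sum(gaps) / len(gaps))
    return starts[-1] + timedelta(days=avg)

print(predict_fixed(period_starts[-1]))  # 2022-05-11
print(predict_personal(period_starts))   # 2022-05-16, five days later
```

Even the “personalized” version here is only a crude average; real cycles vary from month to month, which is exactly why one-size-fits-all predictions can mislead.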

When data from these apps is used to present a ‘norm’ – this is what you should be like – it becomes harmful to people for whom this norm doesn’t apply, whether because of their hormonal range, race, culture, gender, or other background.

Of course I didn’t read that. 

The terms and conditions for using any app or technology (or anything, really, for that matter!) are difficult to understand and offer only two options: opt in, or opt out and don’t use the thing at all.

Instead of companies being held responsible for how they use data, people are made responsible for protecting their own privacy, even when they can’t necessarily understand the language used in the terms and conditions, or the implications of how their data is used.

And while our exact profiles aren’t sent off to third parties, there is often enough information to, in the worst case, trace the data back to the user – or, in most cases, simply to sell the user’s data for profit.

Photo: Pixabay

Now this might seem like a fair trade – many apps are free, after all. But we really need to start being more critical and ask ourselves why they are free. Why is Facebook free?

It’s not out of the kindness of their hearts. It’s because every action taken on these platforms is sent as data to third parties to form incredibly detailed profiles of people, which are then used to sell you – the user – products and services. We are selling ourselves without even realizing it.

As the Feminist Digital Justice (FDJ) report from 2021 points out, much of the intimate data collected by menstruation-tracking apps has little to do with period prediction. It ends up being used for “targeted advertising and market research and not just for app operation” (FDJ, 2021).

They further point out that many of these apps – and they specifically name Flo, MyFLO, and Period Tracker – use vague legal phrasing, with terms such as “might include” or “such as”, without ever naming exactly what your data is being used for.

Many of these apps are careful about how they word things: by ‘anonymizing’ the user, they claim not to be dealing with personally identifiable data, and thus they navigate the boundaries of the GDPR.
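
To see how fragile such ‘anonymization’ can be, here is a minimal, hypothetical sketch in Python (every name, field, and record below is invented for illustration). Hashing away the direct identifier still leaves quasi-identifiers such as birth year and postcode, which can often be joined with another dataset to re-identify the person:

```python
import hashlib

# A hypothetical "anonymized" record as an app might share it:
# the direct identifier is hashed, but quasi-identifiers remain.
raw_record = {
    "user": "jane.doe@example.com",
    "birth_year": 1991,
    "postcode": "2200",
    "logged": ["pregnant", "nausea"],
}

def pseudonymize(record: dict) -> dict:
    """Replace the direct identifier with a hash. This is
    pseudonymization, not true anonymization: everything else
    in the record is left untouched."""
    out = dict(record)
    out["user"] = hashlib.sha256(record["user"].encode()).hexdigest()[:12]
    return out

shared = pseudonymize(raw_record)

# A third party holding another dataset (say, a marketing list
# with names, birth years, and postcodes) can re-link the record:
marketing_list = [
    {"name": "Jane Doe", "birth_year": 1991, "postcode": "2200"},
    {"name": "Ann Smith", "birth_year": 1984, "postcode": "8000"},
]

matches = [
    person for person in marketing_list
    if person["birth_year"] == shared["birth_year"]
    and person["postcode"] == shared["postcode"]
]
print(matches)  # [{'name': 'Jane Doe', ...}] – re-identified
```

This is why the GDPR treats pseudonymized data differently from truly anonymous data: as long as quasi-identifiers survive, the ‘anonymous’ record can, in the worst case, be traced back to the user.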

One exception, as highlighted by the FDJ report, is the Clue app: “In addition, Clue also seems to hold an ethical line with respect to third-party data sharing; its privacy policy states that the app shares de-identified personal data with “carefully vetted researchers to advance female health studies”, and provides users a contact email ID, in case they want an opt-out option from their data being included in such research.” (Read more in the detailed report from FDJ here.)

Photo: Clue

You might not mind that your data is being used for studies, or even to sell you products. But besides the aforementioned unfortunate case of the woman who had a miscarriage and was profiled incorrectly, there are other problematic uses of data.

Ovia, an app for businesses to use with their employees, gathers data about employees’ personal ‘wellbeing’, and this has the potential to impact their careers, health insurance, and more (Wells, 2019). A classic case outside of femtech is the connected toothbrush: a toothbrush that measures not only whether you brush, but when and how well.

This data is then used by your dental insurance company (Higginbotham, 2015). Or take the John Hancock Vitality program, which uses Fitbit data to offer incentives such as rewards for steps taken or activities completed.

Anyone who has seen Black Mirror’s “Nosedive” episode has seen where things like this can quickly lead – for now only in science fiction, but with the data readily available, and already tied to insurance programs and benefits, where does our willingness to give up our data end?

And while the majority of our readers are based in the Nordics, where we still have a sense of trust in our insurance, health care, and government, the world (including the Nordics) is shifting towards a more data-driven society.

We rely on data to make decisions. Your bank, your insurer, and your grocery store all use data about their users to adjust pricing and services based on user behaviour. Much of this data comes either from directly monitoring interactions with the service provider or from analyzing third-party data, such as that coming from your apps.

(For a brief overview – in Danish – of the Internet of Things, see Dansk Industri’s page here.) Therefore, it matters where your data is being sent. This article by Consumer Reports outlines the problem well:

“It could, for instance, affect your ability to obtain life insurance and how much you pay for that coverage, increase the interest rate you’re charged on loans, and even leave you vulnerable to workplace discrimination. And because you usually don’t know who has your data, you may never know if you’ve experienced any of those harms.”

(Rosato, 2020) 

We’re still learning about apps

Apps are still relatively new for all of us. The majority of smartphone users are still in the phase where they know not to click on strange links, but will otherwise try an app and see how it goes. They don’t necessarily know to review an app’s permissions (why does it need access to your contacts or photos?), and furthermore, they may not have considered whether the app has been reviewed or approved by any health authority.

There is also a liability problem with apps. What if the app gets your period prediction wrong and you end up with an unplanned pregnancy? The app developer is not responsible – by design – because we agree to impossible terms in order to use these services. Furthermore, an app may request permissions that it doesn’t actually need, and this data – everything from your contacts to your messages, photos, and files – can also be sent to third parties.

It is important that the people developing femtech – whether apps, hardware, or communities – adhere to a strict set of ethics and conduct thoughtful, thorough user-centered design and testing, in which the people who will use these new technologies are not only considered and designed for, but designed with, so that their actual needs are heard and integrated.

Better business models

What is needed is a change in the business model – in how the apps make money. It’s unreasonable to expect people to develop something for our convenience without being paid; we wouldn’t work for years for free, so why do we expect it from apps? We’ve been trained to expect free apps – from Facebook to Instagram to photo-editing apps – and we don’t typically pay for them. Instead, we expect them to be free, and perhaps they should be: not everyone can afford to pay for apps, even when an app helps them with their health. Funding from public institutions or community-driven funding (as was the case with both Drip and Periodical) are two ways to make this business model possible.

There is hope! 

Just as we celebrated when female health-focused apps started to emerge, we can appreciate that organizations such as Feminist Digital Justice and the Center on Applied Feminism are working towards ensuring data privacy and demanding that these apps take responsibility. As consumers, we can start to be a little more aware of what we’re agreeing to, and be more conscious about choosing ethically inclined apps. We can ask our local governments and organizations to enable people to create affordable or free femtech apps through grants and funding.

Women and menstruators represent half the population. It is vitally important that these people’s intimate data remains private and does not become the property of someone else who profits from it, or whose use of it negatively affects the life of the person providing it.

Photo by Ehimetalor Akhere Unuabona on Unsplash

As a final note – this is the tip of an iceberg that covers privacy, data, new technologies, and, not least, gender politics and the implications of the gendered misuse of data. This article provides only a brief overview; for more in-depth information, please refer to our reference list below.


Further reading and references:

Feminist Digital Justice Project. Data Subjects in the Femtech Matrix: A Feminist Political Economy Analysis of the Global Menstruapps Market (Issue Paper 6). (2021). Available at: https://itforchange.net/sites/default/files/1620/FDJ-Issue-Paper-6-Data-Subjects-In-the-Femtech-Matrix-IT-for-Change.pdf 

Femtech Live. “Data Privacy: How FemTech Should be Thinking”. (2021). Available at: https://femtech.live/data-privacy-how-femtech-should-be-thinking/ 

Gilman, Michele Estrin. “Periods for profit and the rise of menstrual surveillance.” Colum. J. Gender & L. 41 (2021): 100. Available at: https://heinonline.org/HOL/LandingPage?handle=hein.journals/coljgl41&div=13&id=&page= 

Higginbotham, Stacey. “Meet a startup building an insurance business around a connected toothbrush”. Fortune.com. (2015). Available at: https://fortune.com/2015/06/26/connected-toothbrush-insurance/ 

McKinsey. “The dawn of the femtech revolution”. (2022). Available at: https://www.mckinsey.com/industries/healthcare-systems-and-services/our-insights/the-dawn-of-the-femtech-revolution

McMillan, Catriona. Monitoring Female Fertility Through ‘Femtech’: The Need for a Whole-System Approach to Regulation, Medical Law Review, 2022; fwac006, Available at: https://doi.org/10.1093/medlaw/fwac006

Oliver, Elizabeth and Lerner, Avi. Empowerment, privacy and the rise of FemTech. (2021) Available at: https://www.bionews.org.uk/page_160731

Rosas, Celia. The Future is Femtech: Privacy and Data Security Issues Surrounding Femtech Applications, 15 Hastings Bus. L.J. 319 (2019). Available at: https://repository.uchastings.edu/hastings_business_law_journal/vol15/iss2/5

Rosato, Donna. “What Your Period Tracker App Knows About You”. Consumer Reports. (2020). Available at: https://www.consumerreports.org/health-privacy/what-your-period-tracker-app-knows-about-you-a8701683935/ 

Wells, Rachel. “Your Pregnancy App May Be Selling Your Data—to Your Boss”. Glamour.com. (2019). Available at: https://www.glamour.com/story/your-pregnancy-app-may-be-selling-your-datato-your-boss 
