Your Body Data Should Work Like a Co-pilot, Not a Spy

A friend of mine opened her period-tracking app one morning and felt something shift. Not the app, her relationship to it. She'd been using it for years, logging symptoms, moods, intimacy. The app knew her body's patterns better than she did. Then she read about the Flo verdict. Meta had been collecting menstrual health data from millions of users through tracking code embedded in the app. A jury ruled it illegal interception. The penalties could reach billions.

She didn't delete the app immediately. But she stopped logging honestly. Hot flashes? She'd note them in a paper journal instead. Sex? She left that field blank. The app became less useful because she couldn't trust where the information went. The relationship had changed.

That moment, when you realize something intimate has been made extractive, isn't paranoia. It's pattern recognition. And the pattern is everywhere once you start looking.

When Wellness Tech Stops Working With You

The Flo case wasn't an outlier. It was a symptom.

In 2023, researchers at Duke discovered that data brokers were openly selling lists of people with depression, anxiety, and other mental health conditions. Some lists included names and addresses. Prices ranged from $275 for small samples to $75,000 for annual subscriptions. You could literally buy a marketing list of people struggling with depression. Lists like these make it easy to target people at their most vulnerable with high-interest financial offers, unregulated supplements, or other predatory advertising they would never knowingly opt into. None of this data came from HIPAA-protected sources. It came from apps, browsing history, and behavioral signals that people had no idea were being collected and sold.

That same year, hospitals faced lawsuits after patients discovered that Meta Pixels, tiny pieces of tracking code, were embedded in patient portals. When you scheduled an appointment or clicked through your medical records, that activity was being transmitted to advertising platforms. The hospitals claimed it was for analytics. The patients felt surveilled during some of the most vulnerable moments of their lives.

These aren't edge cases. A 2022 study published in JMIR reviewed 23 popular women's health apps and found that 87% shared data with third parties. Thirteen percent collected data before users even consented. Only 70% had a visible privacy policy at all. And these weren't obscure apps; the study reviewed top-downloaded women's health apps on the major app stores.

The infrastructure isn't broken. It's working exactly as designed: to turn your body into data that serves someone else's business model.

The Breach Isn't Technical, It's Relational

In the first essay of this series, we talked about the difference between a playlist and a pulse. Your playlist is a preference; it reveals taste, maybe mood. Your pulse is your body. It reveals when you're stressed, when you're recovering, when something might be wrong. The intimacy isn't comparable.

When wellness tools treat your pulse like your playlist, feeding both into the same advertising machinery and applying the same surveillance logic, they break something fundamental. Not just privacy in the legal sense, but trust in the human sense.

My friend who stopped logging her hot flashes wasn't worried about a specific harm. She couldn't articulate exactly what Meta might do with behavioral metadata about when she opened her period tracker. But she knew the relationship had changed. The app had gone from co-pilot to informant.

That shift happens in an instant, and it's nearly impossible to reverse. When people discover their intimate data has been shared without their real understanding, they don't just change their privacy settings. They disengage. They log less. They lie. Or they leave entirely.

This is why the surveillance model isn't just ethically problematic; it's strategically self-defeating. The more accurate your data needs to be, the more it requires genuine trust. And trust, once broken by the discovery that your co-pilot was actually a spy, doesn't recover with a revised privacy policy.

What a Co-pilot Actually Looks Like

A co-pilot helps you navigate. It reads the instruments, spots patterns you might miss, suggests course corrections. But it doesn't report your route to someone else. It doesn't sell your flight plan. And it definitely doesn't use your altitude to serve you ads for oxygen masks.

That metaphor isn't just rhetorical. It describes a fundamentally different technical architecture.

When your wellness data works like a co-pilot, it processes information locally, on your device, where you control it. It identifies patterns that matter to you: recovery trends, sleep quality over time, how your body responds to different routines. It helps you understand what's working and what needs adjustment. But it doesn't require that data to leave your control to be useful.

This approach is sometimes called "local-first" or "privacy by design," but those terms make it sound more complicated than it is. The core idea is simple: your body data should serve your goals, not someone else's.

When you track sleep in your forties and that data informs insights in your sixties, it should do so without ever living on a server that could be breached, sold, or subpoenaed. When you notice patterns between stress and recovery, the system should highlight them without transmitting raw biometric signals to analytics platforms. When you choose to share information with a doctor or partner, that should be an active decision, not a default buried in fine print.
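To make that concrete, here's a minimal sketch in Python of what a local-first store might look like. The names are hypothetical (LocalWellnessStore, export_for, a wellness.json file), not any particular product's API; the point is that entries are written only to device storage, patterns are computed on-device, and nothing leaves unless the user explicitly asks for an export.

```python
# A minimal local-first sketch: all data lives in a file on the device,
# and nothing is transmitted anywhere. Sharing is a deliberate, user-initiated
# export rather than a background sync. All names here are illustrative.

import json
from datetime import date
from pathlib import Path


class LocalWellnessStore:
    """Stores daily entries on-device; contains no network code at all."""

    def __init__(self, path: Path):
        self.path = path
        self.entries = json.loads(path.read_text()) if path.exists() else []

    def log(self, day: date, **fields):
        """Append an entry (e.g. sleep_hours=6.5, stress=3) and save locally."""
        self.entries.append({"date": day.isoformat(), **fields})
        self.path.write_text(json.dumps(self.entries, indent=2))

    def trend(self, field: str, window: int = 7):
        """Average of the last `window` values for a field, computed on-device."""
        values = [e[field] for e in self.entries if field in e][-window:]
        return sum(values) / len(values) if values else None

    def export_for(self, recipient: str) -> str:
        """Sharing is an explicit act: the user asks for a bundle to hand over.
        Nothing is sent; the caller decides what to do with the result."""
        return json.dumps({"shared_with": recipient, "entries": self.entries})


if __name__ == "__main__":
    store = LocalWellnessStore(Path("wellness.json"))
    store.log(date.today(), sleep_hours=6.5, stress=3)
    print("7-day sleep average:", store.trend("sleep_hours"))
```

Notice that sharing in this sketch is just a return value handed to the user, not a network call made on their behalf. That's the design choice the co-pilot metaphor points at.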

This isn't about hiding. It's about belonging to yourself.

Why Architecture Matters More Than Policy

Every major wellness tech privacy scandal of the past five years has involved companies that had privacy policies. Flo had one. The hospitals with Meta Pixels had them. The apps selling data to brokers had them. The policies existed, and they were technically accurate. Users had, in the legal sense, consented.

But consent theater isn't consent. When a privacy policy requires a graduate degree to parse, when sharing is opt-out instead of opt-in, and when the default setting is "share everything," the policy becomes camouflage for extraction.

Real privacy doesn't come from better policies. It comes from architecture that makes harmful practices impossible by design.

When data never leaves your device, there's no server to breach. When there are no third-party tracking SDKs embedded in the product, there's no hidden pipeline for information to leak through. When sharing requires an explicit choice rather than an overlooked checkbox, consent becomes meaningful.
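One way to make "impossible by design" more than a slogan is to enforce it in the build itself. Here's a hedged sketch, assuming a hypothetical on-device package called wellness_core and an illustrative list of forbidden libraries: a test that fails the moment networking or tracking code appears where it shouldn't.

```python
# A sketch of an architectural guardrail: a test that fails the build if the
# on-device analysis package ever imports networking or analytics libraries.
# The package name (wellness_core) and the forbidden list are illustrative.

from pathlib import Path

FORBIDDEN = ("import requests", "import urllib", "import socket",
             "import firebase_admin", "facebook_business")


def test_analysis_code_has_no_network_or_tracking_imports():
    for source_file in Path("wellness_core").rglob("*.py"):
        text = source_file.read_text()
        for marker in FORBIDDEN:
            assert marker not in text, (
                f"{source_file} references '{marker}'; "
                "analysis code must stay local-only"
            )
```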

This architectural approach does more than protect privacy; it creates the foundation for long-term relationships. If you're building technology meant to serve someone from age 45 to 75, you need infrastructure that can maintain trust across decades. That means systems where the user retains control, where data doesn't accumulate in ways that create compounding risk, where the business model doesn't depend on monetizing intimacy.

This is also why privacy-by-design creates sustainable competitive advantage. Trust isn't a feature you can add later. It's a structural attribute that emerges from how the system is built. Companies that understand this are creating moats that matter: the kind that compound rather than erode over time.

The market is starting to recognize this. Health and fitness apps already convert at 30–43% in app stores, significantly higher than most categories, which suggests users are willing to pay for tools they trust. Meanwhile, regulatory pressure continues to mount. The Flo verdict wasn't a fluke: several U.S. states have now passed dedicated consumer health data laws, part of a broader shift toward tighter protections. The legal and social costs of surveillance-based models are becoming unsustainable.

Trust Unlocks What Surveillance Can't

Here's what shifts when your wellness data actually works like a co-pilot:

You use it honestly. You log the symptoms that matter, track the patterns that feel significant, share the context that makes the data meaningful. You're not performing wellness for an algorithm. You're working with a tool that serves you.

That honesty makes personalization possible, not the marketing kind that targets you, but the kind that actually adapts to how your body works. When the system knows that your recovery takes longer after high-stress weeks, it can adjust expectations and suggestions. When it recognizes that your sleep quality drops before you feel the physical effects, it can flag the pattern early.
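As a rough illustration of that kind of on-device pattern recognition, the sketch below, using made-up field names like stress and recovery_days, compares recovery after high-stress weeks against calmer ones using nothing but locally stored entries.

```python
# Illustrative on-device pattern check: does recovery take longer after
# high-stress weeks? Entries are local dicts with hypothetical "stress"
# (1-5) and "recovery_days" fields; nothing here touches a network.

def average_recovery_by_stress(entries, stress_threshold=4):
    def avg(values):
        return sum(values) / len(values) if values else None

    high = [e["recovery_days"] for e in entries if e["stress"] >= stress_threshold]
    low = [e["recovery_days"] for e in entries if e["stress"] < stress_threshold]
    return {"after_high_stress": avg(high), "after_low_stress": avg(low)}


weeks = [
    {"stress": 5, "recovery_days": 3.0},
    {"stress": 2, "recovery_days": 1.5},
    {"stress": 4, "recovery_days": 2.5},
    {"stress": 1, "recovery_days": 1.0},
]
print(average_recovery_by_stress(weeks))
# -> {'after_high_stress': 2.75, 'after_low_stress': 1.25}
```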

But that kind of genuine, useful personalization only works if you trust the tool enough to give it accurate information. And you only trust it when you know where the data goes, and where it doesn't.

This is the foundation that makes adaptation possible. Not just tracking what you do, but understanding what your body is telling you and helping you respond. That's where we're headed next: how technology that understands context can guide without controlling, illuminate without exposing, and help you maintain the kind of wellness that actually lasts.

Reclaiming Partnership

My friend went back to paper. Not because she's anti-technology, but because she couldn't find a digital tool she could trust with that level of intimacy. She tracks her cycles in a notebook now, the way her mother did. In a post-Roe landscape where reproductive data can be subpoenaed in some jurisdictions, the stakes aren't theoretical. It works, but she's lost the pattern recognition, the ability to see trends over time, the early signals that a good system could illuminate.

She's waiting for technology that works the way it should have all along, where the architecture makes trust possible, not just promised. She isn't alone. Millions of people are waiting for the same thing: tools that treat the body as something to serve, not something to harvest.

Wellness technology should feel like partnership, not surveillance. It should amplify your ability to understand and respond to your body, not turn your body into a product. When your data works like a co-pilot instead of a spy, something fundamental changes: the tool becomes trustworthy not because of what it says, but because of what it can't do.

Because technology that knows you without owning you transforms wellness into something else entirely: a relationship that lasts.

That opens up a question worth asking: What becomes possible when your wellness tech actually knows you, and only you know it back?

That's where adaptation lives.

→ Next in the series: Adaptive Wellness: When Technology Learns Your Body's Language

_________________________

1 Jury Finds Meta Liable for Collecting Private Reproductive Data, National Law Review, August 2025. https://natlawreview.com/article/jury-finds-meta-liable-collecting-private-reproductive-health-data, and Reuters: https://www.reuters.com/legal/government/class-action-trial-looms-meta-flo-could-face-mind-boggling-damages-2025-07-15/

2 Kim, Joanne. "Data Brokers and the Sale of Americans' Mental Health Data," Duke Sanford School of Public Policy, February 2023. https://techpolicy.sanford.duke.edu/data-brokers-and-the-sale-of-americans-mental-health-data/

3 Aurora Health Agrees To $12.25M Settlement in Tracking Pixel Suit, Milberg LLP, September 2024. https://milberg.com/news/aurora-health-data-breach-proposed-settlement/; The Markup investigation (June 2022), https://themarkup.org/pixel-hunt/2022/06/16/facebook-is-receiving-sensitive-medical-information-from-hospital-websites/; cited in multiple lawsuits.

4 Alfawzan, Najd, et al. "Privacy, Data Sharing, and Data Security Policies of Women's mHealth Apps: Scoping Review and Content Analysis," JMIR mHealth and uHealth, May 6, 2022. DOI: 10.2196/33735

5 App Store Conversion Rate By Category in 2025, Adapty (citing Statista 2022 and AppTweak 2024 data). https://adapty.io/blog/app-store-conversion-rate/