Your Data, Your Body: Why Cirdia's Privacy Approach Matters


Every transformative technology begins with a moment of clarity—that instant when you realize the status quo isn't just inconvenient, but fundamentally misaligned with human dignity. For me, that moment came when I systematically compared how different wearable companies handle our most intimate body data.

The Revelation: When I Compared the Fine Print

Unlike most people, I've always been drawn to legal agreements. There's something fascinating about the precision of language, the careful boundaries being established, the dance between protection and permission. I'm that odd person who actually enjoys reading terms of service—each one a window into corporate values and priorities.

So I embarked on a comprehensive audit—downloading, printing, and meticulously comparing the privacy policies and terms of service from every major player in the wearable space. I created spreadsheets tracking key provisions, highlighted concerning clauses, and mapped how permissions flowed across their ecosystems.

What I discovered was deeply troubling. These companies weren't just collecting data; they were claiming sweeping rights to use, modify, publish, and share our most intimate biometric information however they wanted. Most disturbing was the near-total silence on AI training: few companies make any clear promise not to feed your heart rate, sleep patterns, and stress levels into larger systems.

The Crucial Distinction: Biometric Data vs. Personal Information

What became abundantly clear through my research was a fundamental problem in how companies handle our most sensitive information. Most wearable companies make no meaningful distinction between your biometric data—the intimate measurements of your body's functions—and other personal information like your email address or birthday.

This false equivalence creates the foundation for deeply problematic data practices. When your heart rate variability is treated with the same privacy protections as your zip code, something has gone terribly wrong.

The Cirdia Difference: Privacy as Our Foundation

At Cirdia, we've taken a fundamentally different approach. We believe your biometric data deserves special protection—not just in policy documents, but in how our entire system is designed. As a Public Benefit Corporation chartered in Colorado, we have a legal obligation that goes beyond profit maximization. Our corporate charter explicitly commits us to:

"Empowering User Agency: Building systems that respect users as competent decision-makers by providing transparent opt-in processes, collaborative research opportunities, and meaningful control over their wellness journey and data sharing choices."

This isn't just aspiration—it's a legally binding commitment that shapes every aspect of our product and business.

We make a clear legal and technical distinction between your biometric data and your account information. Your body's measurements receive fundamentally different privacy protections than your email address or other account details. This isn't just semantic—it shapes how our entire system is built, with local-first processing that keeps your raw biometric data on your device, not our servers.

While some companies like Apple and Oura have taken steps in the right direction, Cirdia was created from the ground up with privacy and user agency as our core design principles—not features added later or compromises made within a different business model.

From Personal Revelation to Industry Contrast

Let me share what I discovered when I actually took the time to decode what other companies are claiming rights to do:

Fitbit/Google: Your Body as a Data Mine

Fitbit's terms state explicitly that when you share content through their services, you grant them "the right to use, copy, modify, publicly display, publicly perform, reproduce, translate, create derivative works from, and distribute your content". That heartfelt journal entry about your health struggles? That's now Google's content to use as they see fit.

Even more concerning, when you use Fitbit with a Google account, your data is handled according to Google's privacy practices. This means your biometric data becomes part of Google's vast data ecosystem, the same one powering their advertising empire. While Google has publicly said it won't combine your biometrics with ads, that promise is not legally binding in either their terms of service or their privacy policy, which means they could reverse course at any time without legal consequence.

WHOOP: The Surveillance Business Model

Despite what the sleek marketing suggests, WHOOP's privacy policy reveals a business model built on surveillance. They collect data through cookies and other automated technologies, including tracking your interactions over time across the web and other services. They and their advertising partners use this information to serve you targeted ads.

Your sleep patterns, recovery metrics, and heart rate variability become inputs for advertising algorithms. WHOOP shares your data with "service providers, vendors who advertise our Services or other WHOOP products, security and fraud prevention consultants, analytics providers, and staff augmentation and contract personnel". That's an extraordinarily broad network of third parties gaining access to your most intimate biometric data.

Most concerning is that WHOOP makes no technical or legal distinction between your biometric data and regular personal information—it's all treated as a resource to be leveraged for business purposes.

Samsung: No Control, No Choice

Samsung's Health terms of service are particularly aggressive in diminishing user control. They "EXPRESSLY DISCLAIM ANY AND ALL LIABILITY" for how your health information is used, while simultaneously reserving the right to remove or disable access to the service, its content, or your content at any time and without notice.

In other words, they can delete your health history at any time with no warning, but accept no responsibility for how that same data is used or shared. This imbalance of power is staggering.

Xiaomi: Your Data, Their Ecosystem

Xiaomi's approach to privacy is particularly alarming for global users. Their policy states that they may use and combine the information they collect about you with data from the other services and features you use, your devices, and other sources "to provide you with a better experience". This broad combination of data across services creates comprehensive user profiles.

Making no distinction between biometric data and other personal information, Xiaomi freely shares your information with affiliates and third parties for marketing purposes.

The Real-World Impact of These Policies

These terms and policies aren't just abstract legal frameworks—they have concrete impacts on people's lives:

  1. Health Insurance Discrimination: When biometric data is sold or shared with data brokers, it can ultimately influence insurance algorithms, potentially resulting in higher premiums or denied coverage based on activity patterns.
  2. Sensitive Life Event Exposure: We've all heard stories of targeted ads revealing pregnancies before women were ready to share the news. These aren't urban myths—they're the predictable outcome of health data being fed into advertising systems.
  3. Location Tracking Without Consent: Many fitness apps track your precise location even when not needed for functionality. This creates detailed maps of your movements, habits, and patterns.
  4. Perpetual Data Retention: Most fitness platforms retain your data indefinitely, even after you've deleted your account. This creates permanent digital shadows of our physical existence that we can never fully reclaim.

Our Design Principles: Better by Design, Not Just Better Terms

Local-First Architecture: Privacy by Design

Unlike traditional wearables that require cloud processing, Cirdia uses a local-first approach. Your data is processed primarily on your device or your phone. By design, Cirdia does not store your raw biometric data on centralized servers.

This architectural choice isn't just more private—it's more resilient. You don't lose access to your health insights when servers go down or companies change policies.
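To make the local-first idea concrete, here is a minimal, hypothetical sketch (not Cirdia's actual code) of how a standard heart-rate-variability metric, RMSSD, can be computed entirely on a device from raw RR intervals, so that only the derived insight, never the raw readings, would ever need to be shared:

```python
import math

def rmssd(rr_intervals_ms):
    """Compute RMSSD, a common heart-rate-variability metric, from
    successive RR intervals in milliseconds. Runs entirely on-device:
    no network calls, and the raw intervals never leave this function."""
    if len(rr_intervals_ms) < 2:
        raise ValueError("need at least two RR intervals")
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Example: beats varying slightly around 800 ms
print(round(rmssd([812, 795, 804, 790, 810]), 1))  # prints 15.5
```

The point of the sketch is the boundary it draws: the sensitive input stays local, and only a single summary number crosses it.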

Transparent Algorithms

All algorithms used in our App are open source or auditable. This transparency extends to how we communicate insights about your health—no black box recommendations or unexplainable guidance.

When it comes to AI and machine learning, we've reimagined the approach entirely. If we ever incorporate model training using your data, we would only do so through a distributed, local-first framework that keeps your raw biometric data on your device. This means the insights and patterns can improve our collective understanding without your biometric information ever leaving your personal sphere of control.

This isn't a technical limitation—it's a deliberate architectural choice that aligns with our core values. We believe distributed AI approaches that respect data boundaries are not just more private but ultimately more innovative, drawing insights from diverse experiences while honoring individual autonomy.
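For readers curious what "distributed, local-first" training can look like in practice, here is a toy federated-averaging-style sketch. Everything in it is illustrative, not Cirdia's API or model: each device fits a tiny model on its own private data and shares only a parameter update, which a coordinator averages without ever seeing the raw readings.

```python
# Hypothetical sketch of local-first model improvement. Each device
# trains on its own data and shares only a small parameter update,
# never the raw readings themselves (federated-averaging style).

def local_update(weight, samples, lr=0.01):
    """One gradient-descent step for the model y ~ weight * x,
    computed entirely on the device that owns `samples`."""
    grad = sum(2 * x * (weight * x - y) for x, y in samples) / len(samples)
    return weight - lr * grad  # only this scalar leaves the device

def aggregate(updates):
    """Coordinator averages parameter updates; it never sees raw data."""
    return sum(updates) / len(updates)

shared_weight = 0.0
device_data = [
    [(1.0, 2.1), (2.0, 3.9)],  # device A's private readings
    [(1.5, 3.0), (3.0, 6.2)],  # device B's private readings
]
for _ in range(50):  # a few federated rounds
    updates = [local_update(shared_weight, d) for d in device_data]
    shared_weight = aggregate(updates)
print(round(shared_weight, 2))  # converges near 2.0
```

The shared model improves from everyone's experience, yet the only thing that ever crosses the device boundary is an aggregate update, which is the property the paragraph above describes.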

Data Ownership in Practice

You own your data. The App offers tools to visualize, export, or delete your data at any time. This isn't just rhetoric—it's built into how our technology works.

When you choose to share data with Cirdia for research or product improvement, we follow the principle of data minimization—collecting only what's necessary, for the specific purpose you've consented to, and only for the duration required.

Building a Movement, Not Just a Product

The wearable device industry has normalized deeply problematic data practices by burying them in legal documents and making them seem inevitable. At Cirdia, we're not just building another device with marginally better terms. We're reimagining what the relationship between technology companies and users should look like: one founded on respect, transparency, and genuine partnership.

When our market research showed that women wanted "a fitness tracker that doesn't sell you out" and to "stay in their bodies, not on their phones," they weren't just expressing product preferences. They were articulating a vision for a fundamentally different relationship with technology—one that enhances their embodied experience rather than extracting value from it.

Join Us in Reimagining Wearable Privacy

Your intimate body data—your heartbeats, sleep patterns, stress levels, and activity—deserves better than becoming inputs for advertising algorithms or assets on corporate balance sheets.

At Cirdia, we're committed to proving that better approaches aren't just possible—they're essential for the future of ethical technology. Your data, like your body, should always remain yours to control.

Because wellness isn't about optimization or gamification. It's about presence, autonomy, and living fully in your body. And technology should serve that vision, not undermine it.

Join our community: https://cirdia.com

Mary Camacho is CEO and Co-founder of Cirdia, a Public Benefit Corporation reimagining ethical wellness technology. This post is part of our ongoing commitment to transparency about how we approach privacy and data governance.