
We love our wellness apps: they track steps, monitor sleep, count calories, and encourage mindfulness. But beneath the glossy interface lie many pitfalls — especially for people whose bodies, lifestyles, or skin tones were not centered in the design of those algorithms or hardware.
If you’ve ever felt frustrated by inaccurate data, confusing feedback, or an app that doesn’t seem made for you, this article is for you. We’ll unpack how wellness apps can backfire, especially for folks from Black, Indigenous, and People of Color (BIPOC) communities and other underrepresented groups, and how to use or demand better tools.
Skewed Datasets & Underrepresented Bodies
Many wellness and health apps are built, trained, tested, or validated on data from homogeneous or more privileged populations. This means large groups (by race, ethnicity, age, body size, or socioeconomic status) are missing or underrepresented, and the models and recommendations that result reflect this bias.
- A report by the Center for Democracy & Technology (CDT) called Heal-gorithms identifies how datasets often overrepresent white, young, educated, or otherwise more privileged users — creating tools that work “well” for those groups but less so for those outside them.
- A McGill University-led study found that many symptom-checking apps give less accurate diagnoses for underrepresented ethnic/racial populations due to biased or narrow data.
Why this matters:
- Recommendations like “ideal body weight,” “normal heart rate ranges,” or “healthy skin types” may be calibrated for lighter skin tones, narrower body sizes, or Western diets.
- When your body diverges from the “norm” assumed by the app, the feedback can be useless or even harmful — causing discouragement, misinformation, or health mismanagement.

Assumptions About Ideal Bodies, Diets, & Movement
Beyond data, many wellness apps carry embedded assumptions:
- What an “ideal” body looks like
- What kinds of movement count as “good exercise”
- Which foods count as healthy, and whose cuisines those assumptions reflect
These assumptions can subtly, or not so subtly, exclude or shame people whose preferences, culture, or body chemistry differ from the designers’ defaults.
Examples:
- An app whose calorie-count feedback only recognizes Western foods (lettuce, quinoa, apples) may miscount or simply not understand foods that are staples in your culture.
- Fitness apps that default to high-intensity exercise (HIIT, running, etc.) may not consider that some users have mobility challenges, chronic conditions, or different physical preferences.
- Body image: apps sometimes promote thinness, visible abs, or certain body shapes as ideal, which can worsen self-esteem issues.
These assumptions can lead users away from sustainable practices and toward self-criticism or behavior that isn’t realistic long-term.
Poor Calibration for Darker Skin in Sensors, Cameras, & Heart Rate Monitors
This is one of the more concrete and well-documented issues. Many wearables and smartphone sensors rely on optical, light-based techniques such as photoplethysmography (PPG), in which LEDs shine light into the skin and the reflected signal is used to estimate heart rate, blood oxygen (SpO₂), or skin temperature. These sensors are typically calibrated around the light reflection and absorption of lighter skin, so readings tend to be less accurate on skin with more melanin, on tattooed skin, or in people with higher BMI.
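To make the mechanism concrete, here is a minimal, hypothetical simulation (assuming Python with NumPy and SciPy) of a pulse signal being read by a naive peak-counting heart-rate estimator. The waveform, noise level, and amplitudes are invented for illustration only; the point is simply that when less light returns to the sensor, the same noise floor swamps the pulse and the estimate drifts.

```python
# Illustrative only: a toy PPG-style signal, not any device's real pipeline.
import numpy as np
from scipy.signal import find_peaks

fs = 50                        # sample rate in Hz, roughly typical for wrist sensors
t = np.arange(0, 30, 1 / fs)   # 30 seconds of data
true_bpm = 72
pulse = np.sin(2 * np.pi * (true_bpm / 60) * t)   # idealized pulse waveform

rng = np.random.default_rng(0)
noise = rng.normal(scale=0.3, size=t.size)        # fixed sensor/motion noise

# Weaker returned light (more absorption by melanin, tattoos, thicker tissue)
# shows up as a smaller pulse amplitude against the same noise floor.
for amplitude in (1.0, 0.5, 0.2, 0.1):
    signal = amplitude * pulse + noise
    peaks, _ = find_peaks(signal, distance=int(0.5 * fs), prominence=0.5 * amplitude)
    est_bpm = len(peaks) / (t[-1] / 60)
    print(f"signal amplitude {amplitude:.1f}: estimated {est_bpm:.0f} bpm (true {true_bpm})")
```

With a strong signal the naive estimator lands near the true rate; as the amplitude shrinks it increasingly counts noise, and the number on the screen stops meaning much.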
Evidence / Cases:
- A modeling study showed that wearables like the Apple Watch Series 5, Fitbit Versa 2, and Polar M600 detect signals unevenly on darker skin and in people with obesity: the signals become weaker, making heart rate readings, SpO₂, and related metrics less reliable.
- Fitbit faces a proposed class-action lawsuit alleging that its devices read SpO₂ inaccurately for users with darker skin, even though they were marketed as able to measure oxygen saturation reliably.
- Research comparing wearable devices shows that many are less accurate for heart rate among people of color.
Why it backfires:
- Inaccurate heart rate or SpO₂ readings can lead to misguided health decisions: underestimating risk, ignoring warning signs, or worse.
- Users may distrust all their data or feel invalidated (“it must be me, not the device”).
Reinforcement Loops & Feedback-Based Errors
Wellness apps often use real-time feedback (badges, streaks, visible charts) to motivate users. This feedback loop can be very effective, but it becomes problematic when:
- The feedback is based on flawed data (see above). If your heart rate reading is wrong, everything derived from it (calories burned, recovery metrics, etc.) is also off; a worked example follows this list.
- The app nudges users to do more, always “increase,” or compare to others, rather than listen to their body. Streaks and achievements can create pressure, guilt, or anxiety.
- Feedback loops do not account for rest, recovery, or context (life stress, illness, weather, etc.). For instance, telling someone with chronic illness or menopausal changes that their “sleep quality” is “poor” day after day because the sensor misreads their skin or their movement can lead to discouragement rather than helpful change.
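As a concrete example of that first point, here is a short sketch of how a heart-rate sensor that under-reads quietly skews the calorie number an app displays. It uses one published regression that estimates energy expenditure from heart rate (Keytel et al., 2005); the user profile and the size of the sensor error are made up, and real apps use their own proprietary models, so treat this purely as an illustration of error propagation.

```python
# Illustration of error propagation: a biased heart-rate reading biases "calories burned."
# The formula is the Keytel et al. (2005) regression; real apps use their own models.

def kcal_per_min(hr_bpm: float, weight_kg: float, age_yr: float, female: bool) -> float:
    """Estimated energy expenditure (kcal/min) from heart rate."""
    if female:
        return (-20.4022 + 0.4472 * hr_bpm - 0.1263 * weight_kg + 0.074 * age_yr) / 4.184
    return (-55.0969 + 0.6309 * hr_bpm + 0.1988 * weight_kg + 0.2017 * age_yr) / 4.184

true_hr, measured_hr = 150, 135        # suppose the optical sensor under-reads by 15 bpm
weight_kg, age, minutes = 70, 35, 45   # hypothetical user and workout length

actual = kcal_per_min(true_hr, weight_kg, age, female=True) * minutes
displayed = kcal_per_min(measured_hr, weight_kg, age, female=True) * minutes
print(f"actual ~{actual:.0f} kcal, app displays ~{displayed:.0f} kcal "
      f"({100 * (displayed - actual) / actual:+.0f}%)")
```

In this made-up scenario the displayed burn comes out meaningfully lower than the actual figure, and that gap then feeds every downstream number: goals, streaks, and “calories remaining” math.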

How to Audit Apps & Hold Them to Better Standards
Given all this, what can you do — as a user — to protect yourself, and help push the industry toward better design?
Here are practical tips, plus what good developers/companies should be doing:
For Users
- Check the documentation: Does the app mention what populations/skin tones/body types the training data includes?
- Test sensor accuracy: Periodically compare the heart rate from your wearable with a manual pulse count or a medical-grade device, and watch for errors that consistently point in one direction (see the sketch after this list).
- Use feedback carefully: Don’t let streaks or “ideal” metrics guilt you. If the app flags something as “poor”, ask: “Is the sensor, my context, or my body type contributing?”
- Track patterns, not perfection: Use the app as one data stream among many (how you feel, your rest, your lab/clinical results) rather than relying solely on numerical feedback.
- Choose tools with transparency: Open privacy policies, documentation of algorithms, clear disclaimers about limits or errors — especially regarding skin tone or data diversity.
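If you want to make the “test sensor accuracy” tip above systematic, a few paired readings and some basic arithmetic are enough. This sketch (plain Python, with made-up numbers you would replace with your own logs) checks whether your wearable’s error looks like random noise or a consistent bias in one direction.

```python
# Replace these made-up readings with your own: wearable value vs. a manual 60-second
# pulse count (or a clinical device) taken at the same moment.
wearable = [68, 75, 82, 120, 95, 71]    # bpm reported by the app/watch
reference = [72, 80, 88, 131, 104, 76]  # bpm from manual count or medical device

errors = [w - r for w, r in zip(wearable, reference)]
mean_bias = sum(errors) / len(errors)
mean_abs_pct = 100 * sum(abs(e) / r for e, r in zip(errors, reference)) / len(errors)

print(f"per-reading error (bpm): {errors}")
print(f"average bias: {mean_bias:+.1f} bpm, mean absolute error: {mean_abs_pct:.1f}%")
if all(e < 0 for e in errors) or all(e > 0 for e in errors):
    print("Every reading errs in the same direction: that looks like bias, not random noise.")
```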
For Developers / Companies
(What you should see more of, push for, or require in app development and regulation.)
- Include diverse datasets in training: skin tones, body sizes, ages, health statuses, and different races/ethnicities.
- Test hardware sensors (PPG, optical sensors, etc.) across a wide range of users before releasing to market.
- Document limitations clearly: e.g., “This sleep tracker is less accurate on darker skin tones, over tattoos, or at higher BMI.”
- Allow customization or calibration: let users supply information about skin tone or body type so the algorithms can adjust or compensate (a rough sketch of this idea follows this list).
- Rethink feedback design: create sane defaults that encourage rest and variation, don’t punish recovery days or “non-ideal” performance, and respect human context (illness, cycles, aging).
- Establish third-party review or audits: similar to how medical devices are regulated, or at least transparent peer review within digital health and AI fairness research.
- Data privacy and security: ensure user data is not exposed, misused, or sold without clear consent; ensure anonymization is meaningful.
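To illustrate the customization/calibration point above, here is a hypothetical sketch of what user-assisted calibration could look like: the user records a handful of readings against a trusted reference, and the app fits a simple per-user correction that it applies to future raw values. This is an invented example of the idea, written in Python, not any vendor’s actual calibration pipeline.

```python
from statistics import mean

def fit_linear_correction(raw: list[float], reference: list[float]) -> tuple[float, float]:
    """Least-squares fit of reference ≈ slope * raw + offset from paired readings."""
    rx, ry = mean(raw), mean(reference)
    num = sum((x - rx) * (y - ry) for x, y in zip(raw, reference))
    den = sum((x - rx) ** 2 for x in raw)
    slope = num / den
    return slope, ry - slope * rx

# Example: this user's device consistently under-reads heart rate against a reference.
raw_hr = [64.0, 78.0, 91.0, 118.0]
reference_hr = [70.0, 85.0, 99.0, 128.0]

slope, offset = fit_linear_correction(raw_hr, reference_hr)
print(f"per-user correction: corrected_hr = {slope:.2f} * raw_hr + {offset:.1f}")
print("corrected readings:", [round(slope * x + offset, 1) for x in raw_hr])
```

A correction this simple won’t fix a sensor that barely picks up a signal, but it shows how a small amount of user input could let software compensate for hardware that was tuned to someone else’s body.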
What to Watch Out For: Red Flags
Here are warning signs your wellness app might be working against you:
- Claims of “one size fits all” wellness or “works for everyone” without clarifying limits.
- Lack of clarity about who was included in testing (skin tone, ethnicity, body size).
- Data that seems obviously incorrect: heart rate spikes when you’re resting; SpO₂ readings that don’t match clinical devices, etc.
- Constant reminders, pings, or “bad” metrics that shame rather than encourage; frequent alerts or streak-loss penalties that punish rest.
- Hidden costs: features that require a “premium subscription” to access more accurate or transparent data.
Toward Better Wellness Tech: A Balanced Mindset
It’s not about rejecting wellness apps altogether — many are powerful tools for awareness, motivation, and tracking improvement. But using them well means staying critical, listening to your body, and understanding that technology can’t fully replace lived experience, medical advice, or cultural nuance.
- Use apps as assistants, not authorities.
- Be ready to adjust or switch apps when the fit isn’t right.
- Share your experience: companies improve when users from underrepresented groups demand better calibration and transparency.






