BlackDoctor.org
Where Wellness & Culture Connect



The Surprising Ways Your Wellness App Is Working Against You


We love our wellness apps: they track steps, monitor sleep, count calories, and encourage mindfulness. But beneath the glossy interface lie many pitfalls — especially for people whose bodies, lifestyles, or skin tones were not centered in the design of those algorithms or hardware.

If you’ve ever felt frustrated by inaccurate data, confusing feedback, or an app that doesn’t seem made for you, this article is for you. We’ll unpack how wellness apps can backfire, especially for folks from Black, Indigenous, and People of Color (BIPOC) communities and other underrepresented groups — and how to use, or demand, better tools.

Skewed Datasets & Underrepresented Bodies

Many wellness and health apps are built, trained, tested, or validated on data from homogeneous or more privileged populations. This means large groups (by race, ethnicity, age, body size, or socioeconomic status) are missing or underrepresented — and the models and recommendations that result reflect this bias.


  • A report by the Center for Democracy & Technology (CDT) called Heal-gorithms identifies how datasets often overrepresent white, young, educated, or otherwise more privileged users — creating tools that work “well” for those groups but less so for those outside them. 
  • A McGill University-led study found that many symptom-checking apps give less accurate diagnoses for underrepresented ethnic/racial populations due to biased or narrow data. 

Why this matters:

  • Recommendations like “ideal body weight,” “normal heart rate ranges,” or “healthy skin types” may be calibrated for lighter skin tones, narrower body sizes, or Western diets.
  • When your body diverges from the “norm” assumed by the app, the feedback can be useless or even harmful — causing discouragement, misinformation, or health mismanagement.
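The effect of a skewed dataset on a "normal range" can be sketched in a few lines. This is an illustrative toy example with hypothetical numbers, not real clinical data: when one group dominates the sample used to derive the range, healthy members of the underrepresented group get flagged as "abnormal."

```python
import statistics

# Illustrative only: hypothetical resting heart rates (bpm) for two groups.
group_a = [62, 65, 68, 70, 72, 74, 75, 77]   # overrepresented in the sample
group_b = [70, 74, 78, 80, 82, 85, 88, 90]   # underrepresented in the sample

# The "training" sample includes all of group A but only two group B members.
training_sample = group_a + group_b[:2]

mean = statistics.mean(training_sample)
sd = statistics.stdev(training_sample)
normal_range = (mean - 2 * sd, mean + 2 * sd)  # a common "mean ± 2 SD" rule

# How many healthy group B members would the app flag as "abnormal"?
flagged = [hr for hr in group_b if not normal_range[0] <= hr <= normal_range[1]]
print(f"'Normal' range: {normal_range[0]:.0f}-{normal_range[1]:.0f} bpm")
print(f"Group B members flagged as abnormal: {len(flagged)} of {len(group_b)}")
```

With these made-up numbers, half of group B falls outside the derived "normal" range — not because anything is wrong with them, but because the sample barely included them.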

Assumptions About Ideal Bodies, Diets, & Motions

Beyond data, many wellness apps carry embedded assumptions:


  • What an “ideal” body looks like
  • What kinds of movement count as “good exercise”
  • What foods are healthy/cultural

These assumptions can subtly — or not so subtly — exclude or shame people whose preferences, culture, or body chemistry differ from what the designers assumed.

Examples:

  • An app that only gives calorie-count feedback based on Western foods (lettuce, quinoa, apples) may not understand foods that are staples in your culture.
  • Fitness apps may expect high-intensity exercise (“HIIT,” “running,” etc.) without considering that some users have mobility challenges, chronic conditions, or different physical preferences.
  • Body image: apps sometimes promote thinness, visible abs, or certain body shapes as ideal, which can worsen self-esteem issues.

These assumptions can lead users away from sustainable practices and toward self-criticism or behavior that isn’t realistic long-term.


Poor Calibration for Darker Skin in Sensors, Cameras, Heart Rate Monitors

This is one of the more concrete and well-documented issues. Many wearables and smartphone sensors rely on optical, light-based methods (LEDs, photoplethysmography (PPG), etc.) to read heart rate, blood oxygen (SpO₂), or skin temperature. Their calibration often assumes light reflection and absorption levels typical of lighter skin, so readings are less reliable on darker, more melanin-rich skin, on skin with tattoos, or at higher BMI.
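The physical intuition can be sketched with a toy Beer-Lambert-style model. This is a deliberately simplified illustration, not the calibration any real device uses, and the absorbance values are hypothetical: more melanin absorbs more of the LED's light, so the small pulsatile (AC) signal the sensor actually measures shrinks.

```python
import math

def detected_ppg_amplitude(emitted_power, absorbance, pulse_modulation=0.02):
    """Toy sketch: light returning to the photodiode falls off exponentially
    with total absorbance (Beer-Lambert); the pulsatile AC component the
    sensor reads is a small fraction of that returning light."""
    returning_light = emitted_power * math.exp(-absorbance)
    return returning_light * pulse_modulation

# Hypothetical absorbance values, for illustration only (not clinical data)
for label, absorbance in [("lower melanin absorbance", 0.5),
                          ("higher melanin absorbance", 2.0)]:
    ac = detected_ppg_amplitude(1.0, absorbance)
    print(f"{label}: AC signal ~ {ac:.4f} (arbitrary units)")
```

Under this sketch, the higher-absorbance case returns a several-times-weaker signal, which is why noise, motion, and ambient light corrupt readings more easily on darker skin unless the hardware compensates.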

Evidence / Cases:

  • Modeling research showed that wearables like the Apple Watch Series 5, Fitbit Versa 2, and Polar M600 detect signals unevenly in darker skin or in people with obesity: signals become weaker, making heart rate, SpO₂, and related readings less reliable.
  • Fitbit is being sued in a proposed class action alleging that its devices read SpO₂ inaccurately for users with darker skin, even though they were marketed as able to measure oxygen saturation reliably.
  • Research comparing wearable devices shows that many are less accurate for heart rate among people of color. 

Why it backfires:

  • Inaccurate heart rate or SpO₂ readings can lead to misguided health decisions: underestimating risk, ignoring warning signs, or worse.
  • Users may distrust all their data or feel invalidated (“it must be me, not the device”).

Reinforcement Loops & Feedback-Based Errors

Wellness apps often use real-time feedback (badges, streaks, visible charts) to motivate users. This feedback loop can be very effective—but also problematic when:

  • The feedback is based on flawed data (see above). If your heart rate reading is wrong, the metrics derived from it (calories burned, recovery scores, etc.) are also off.
  • The app nudges users to do more, always “increase,” or compare to others, rather than listen to their body. Streaks and achievements can create pressure, guilt, or anxiety.
  • Feedback loops do not account for rest, recovery, or context (life stress, illness, weather, etc.). For instance, telling someone with chronic illness or menopausal changes that their “sleep quality” is “poor” day after day because the sensor misreads their skin or their movement can lead to discouragement rather than helpful change.
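The streak problem above is a design choice, not a law of nature. Here is a minimal sketch, with hypothetical function names, contrasting naive streak logic that resets on any missed workout with a rest-aware variant that treats an intentional recovery day as adherence:

```python
def update_streak(streak, activity):
    """Naive logic: any day without a workout resets the streak to zero,
    implicitly punishing rest."""
    return streak + 1 if activity == "workout" else 0

def update_streak_rest_aware(streak, activity):
    """Rest-aware variant: a logged rest day maintains the streak;
    only an unlogged day resets it."""
    if activity == "workout":
        return streak + 1
    if activity == "rest":
        return streak  # recovery counts as adherence, not failure
    return 0

# One hypothetical week with two deliberate rest days
week = ["workout", "workout", "rest", "workout", "workout", "rest", "workout"]

naive = rest_aware = 0
for day in week:
    naive = update_streak(naive, day)
    rest_aware = update_streak_rest_aware(rest_aware, day)

print(f"naive streak after the week: {naive}")
print(f"rest-aware streak after the week: {rest_aware}")
```

The naive counter ends the week at 1 despite five workouts, while the rest-aware counter reflects the actual training done — the same behavior, scored two very different ways.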

How to Audit Apps & Hold Them to Better Standards

Given all this, what can you do — as a user — to protect yourself, and help push the industry toward better design?

Here are practical tips, plus what good developers/companies should be doing:

For Users

  • Check the documentation: Does the app mention what populations/skin tones/body types the training data includes?
  • Test sensor accuracy: For example, compare the heart rate from a wearable with your manual pulse or a medical device regularly. Notice consistent errors?
  • Use feedback carefully: Don’t let streaks or “ideal” metrics guilt you. If the app flags something as “poor”, ask: “Is the sensor, my context, or my body type contributing?”
  • Track patterns, not perfection: Treat the app as one data stream among many (how you feel, your rest, your lab/clinical results) rather than relying solely on numerical feedback.
  • Choose tools with transparency: Open privacy policies, documentation of algorithms, clear disclaimers about limits or errors — especially regarding skin tone or data diversity.
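The "test sensor accuracy" tip above can be done with pen and paper, but a few lines make the idea concrete. This sketch uses hypothetical paired readings — the wearable's heart rate next to a manual 60-second pulse count — and checks whether the device has a consistent bias rather than just random scatter:

```python
# Hypothetical paired readings (bpm): (wearable, manual 60-second count)
readings = [
    (92, 88), (75, 71), (110, 102), (68, 66), (84, 79),
]

errors = [wearable - manual for wearable, manual in readings]
bias = sum(errors) / len(errors)                       # signed average error
mean_abs_error = sum(abs(e) for e in errors) / len(errors)

print(f"average bias: {bias:+.1f} bpm")   # a consistent sign = systematic error
print(f"mean absolute error: {mean_abs_error:.1f} bpm")
```

If every error has the same sign (here the wearable reads high every time), the problem is systematic — a calibration issue worth reporting — rather than ordinary noise.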

For Developers / Companies

(What you should expect, push for, or require in app development and regulation.)

  • Include diverse datasets in training: skin tones, body sizes, ages, health statuses, and different races/ethnicities.
  • Test hardware sensors (PPG, optical sensors, etc.) across a wide range of users before releasing to market.
  • Make limitations clear in documentation: e.g., “This sleep tracker is less accurate on darker skin tones, on tattooed skin, or at higher BMI.”
  • Allow customization or calibration: user input about skin tone/body type that helps the algorithms adjust or compensate.
  • Regulate feedback design: create sensible defaults that encourage variation, don’t punish rest or non-ideal performance, and respect human context (illness, cycles, aging).
  • Establish third-party review or audit: similar to how medical devices are regulated — or at least transparent peer review in digital health / AI fairness research.
  • Data privacy and security: ensure user data is not exposed, misused, or sold without clear consent; ensure anonymization is meaningful.
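The "test across a wide range of users" point has a simple auditing counterpart: report accuracy per subgroup, not just overall. A minimal sketch, using entirely hypothetical validation records (reported group, device reading, clinical reference reading):

```python
from collections import defaultdict

# Hypothetical validation records: (group, device bpm, reference bpm)
records = [
    ("lighter skin", 72, 70), ("lighter skin", 88, 87), ("lighter skin", 65, 66),
    ("darker skin", 80, 72), ("darker skin", 95, 86), ("darker skin", 70, 63),
]

errors_by_group = defaultdict(list)
for group, device, reference in records:
    errors_by_group[group].append(abs(device - reference))

# Mean absolute error per subgroup, the number an audit should surface
mae_by_group = {g: sum(errs) / len(errs) for g, errs in errors_by_group.items()}
for group, mae in mae_by_group.items():
    print(f"{group}: mean absolute error {mae:.1f} bpm (n={len(errors_by_group[group])})")
```

An overall average would hide the gap these made-up numbers show between subgroups; a per-group report makes it visible, so it can block release or at least be disclosed.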

What to Watch Out For: Red Flags

Here are warning signs your wellness app might be working against you:

  • Claims of “one size fits all” wellness or “works for everyone” without clarifying limits.
  • Lack of clarity about who was included in testing (skin tone, ethnicity, body size).
  • Data that seems obviously incorrect: heart rate spikes when you’re resting; SpO₂ readings that don’t match clinical devices, etc.
  • Constant reminders, pings, or “bad” metrics that shame rather than encourage, or streak mechanics that penalize rest.
  • Hidden costs: features that require a “premium subscription” to access more accurate or transparent data.

Toward Better Wellness Tech: A Balanced Mindset

It’s not about rejecting wellness apps altogether — many are powerful tools for awareness, motivation, and tracking improvement. But using them well means staying critical, listening to your body, and understanding that technology can’t fully replace lived experience, medical advice, or cultural nuance.

  • Use apps as assistants, not authorities.
  • Be ready to adjust or switch apps when the fit isn’t right.
  • Share your experience: companies improve when users from underrepresented groups demand better calibration and transparency.
By Dominique Lambright | Published October 9, 2025




Copyright © 2025, Black Doctor, Inc. All rights reserved.