Dark Metrics: Illuminating the Negative Impact of Digital Health Design
Summary
Traditional design metrics and KPIs are often geared towards measuring product success. Dark metrics challenge this paradigm by proactively measuring the unintended yet harmful psychological, social, and physical effects of our technologies. Examples within digital health are plentiful, from accelerating burnout among clinicians to widening racial disparities in quality of care. We can only reach the height of our most courageous solutions when we expose our deepest failures.
Key Insights
• Traditional product metrics, like the Google HEART framework, often miss broader impacts on users, focusing narrowly on product success rather than holistic well-being.
• Dark Metrics is a framework designed to measure negative unintended effects in digital health across four dimensions: disempowerment, exclusion, addiction, and distraction.
• Disempowerment occurs when technology removes users’ autonomy, such as opaque black-box AI systems that undermine clinician or patient decision-making.
• Exclusion can be subtle: algorithms that rely on biased proxy variables, like healthcare cost, can reproduce racial disparities without explicit intent.
• Racial equity in design can be assessed using heuristics or rubrics co-created by diverse teams, as demonstrated by Raven’s IBM colleagues Dre Barbara, Sherees Cooper, and Morgan Foreman.
• Addiction to technology is an overused concept in consumer health; distinguishing healthy from excessive use requires linking engagement data to well-being measures.
• Distraction from core tasks is common in clinical environments when new technology disrupts workflows, as evidenced by studies with ER staff and clinical trial recruitment tools.
• Ethics frameworks like the Institute for the Future’s Ethical OS help anticipate risks such as surveillance, bias, and data control, which inform Dark Metrics design principles.
• Engaging diverse stakeholders and including co-creation early in research helps uncover biases and unintended consequences before launch.
• Addressing negative impacts requires transparency with clients and a strong ethical posture, even when business priorities conflict with user protection.
Notable Quotes
"The traditional product metrics focus narrowly on the product or near-term impact but fail to capture what success means for the whole person."
"Within IBM Watson, we prefer the term augmented intelligence rather than artificial intelligence to emphasize support, not takeover."
"An AI algorithm that didn’t explicitly consider race still produced racial disparities by using healthcare costs as a proxy."
"I am a Black person, but I do not have every Black experience. Not having experienced something is not proof that it doesn’t exist."
"The difference between technology and slavery is that slaves are fully aware they are not free."
"Doctors want to help people, not be on a machine all day; many health technologies are more distracting than helpful."
"We can assess distraction by observing time spent on screens versus with patients, and self-reported mental effort and stress."
"It’s important to ask, before any new release, what are all the things that could possibly go wrong?"
"Our jobs are to protect users from harm. If clients don’t care about side effects, it may be necessary to draw a line and walk away."
"Storytelling is highly effective in helping stakeholders understand the complete performance of products, including the darker sides."