Dignity Is the Missing Metric
In 2020, Oakland’s #OaklandUndivided initiative distributed over 36,000 laptops and 11,000 Wi-Fi hotspots to public school students. Connectivity among students reached 98%, but many still lacked quiet places to study. Others faced unreliable internet or had teachers untrained in using the devices. The initiative closed the access gap, but not the learning one.
Students weren’t just handed tools; they were also left to figure them out, often without support. That’s not merely a tech issue. It’s a design failure that reveals a dignity gap.
While we can’t anticipate every barrier, failing to follow up on what worked and what didn’t risks mistaking effort for impact. Donating laptops or funding Wi-Fi may close the access gap, but without support, training, or feedback, the real goal—lasting learning—can remain out of reach.
Dignity is the difference between a solution used with confidence and a solution quietly abandoned. Just because a solution is used doesn’t mean it was chosen or trusted. Did people understand how to use it? Were they part of shaping how it was delivered? Would they choose it again today? These aren’t soft questions. They are the foundation of lasting impact.
When Metrics Miss What Matters
In impact work, data often tells the story. However, when that story excludes the people it’s meant to serve, its meaning can be lost.
Such tools exist. IRIS+ standardizes impact metrics; GIIRS evaluates governance, labor, and environmental standards; and Lean Data, developed by Acumen, gathers user feedback through short mobile surveys with questions like “Did this improve your life?” and “Would you recommend it?” Lean Data shows that capturing meaningful outcomes is possible. Yet data that reflects user experience remains rare: according to 60 Decibels’ 2023 Microfinance Index, covering 32 countries, over 80% of financial service providers don’t collect consistent customer feedback. That matters. Without feedback, impact reports focus on what was delivered, not on whether it met people’s needs.
Consider a microloan program that reports a 95% repayment rate. On paper, it looks like the model is working. However, without hearing from borrowers, what’s left out? Were they growing their businesses, or making hard trade-offs to stay on track? Did repayment reflect confidence, or quiet pressure? The numbers may suggest one kind of success; lived reality may suggest another. Without feedback, it’s hard to know whether the product helped people move forward or just hold on.
That distinction matters—not just for what appears in the data, but for what people lived through. When feedback is missing, even the clearest metrics can point in the wrong direction. Dignity begins by listening.
When Assumptions Shape Design
Design begins long before anything is built or delivered. When lived reality isn’t part of that early process, even well-meant solutions can fall short of upholding dignity.
Back in 2007, the Kenya Slum Upgrading Programme relocated families from Kibera, a dense informal settlement in Nairobi, to nearby apartments. Many soon returned to Kibera: higher rents and smaller living spaces made the move unsustainable. The program had overlooked economic realities, and without dignity, progress unraveled.
In the brief window between relocation and return, the intervention may have looked promising. Promising to whom, and on what terms? Did families stay because the housing worked, or because they hoped it eventually would? What trade-offs shaped their decisions, and were those recognized by the program designers? When people began moving back, was that read as feedback or dismissed as resistance?
What would success have looked like, not on a chart, but in someone’s daily life? It’s a reminder that thoughtful design begins not with assumptions, but with listening—early, often, and with humility.
When Dignity Is Built In
Sometimes, dignity doesn’t come from changing the entire system, but from small, thoughtful shifts that make people feel seen and valued.
PEG Africa, a West African solar company, used Lean Data after noticing drop-offs in usage. It found that customer satisfaction depended more on communication with agents than on hardware quality. After retraining staff to improve those interactions, the company saw retention and repayment rates improve. Dignity was restored through clarity, and results followed.
In Tanzania, Zola Electric lets customers pay via mobile money in Kiswahili. Local agents fix issues, and calls are answered by nearby staff. The result? Fewer defaults, faster issue resolution, and higher rates of long-term use. The system works because it reflects how people live and communicate. Dignity is built into the design, and it shows.
Since 2018, women farmers in Cambodia’s Takeo Province have received solar irrigation systems through a UN initiative. Adoption rose only when loans matched income cycles, training fit the farmers’ schedules, and users could give feedback. Once those changes were made, loan repayments increased, crop yields rose, and fewer systems were abandoned. When dignity entered the design, the technology gained traction and stayed in use.
Metrics That Reflect Experience
Tools exist that place lived experience at the center, moving beyond usage and reach to help us better understand dignity. Among them:
- Lean Data uses mobile surveys to capture users’ views on value and usability.
- Most Significant Change uses participatory storytelling to surface the changes people consider most important.
- Outcome Harvesting starts from observed changes and works backward to how an intervention contributed to them.
These aren’t fringe methods; they’re reshaping how decisions are made. They mark a shift from tracking delivery to understanding reality and honoring dignity.
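To make the contrast concrete, here is a minimal sketch in Python of how a Lean Data-style “Would you recommend it?” answer can be rolled up into a Net Promoter Score and reported alongside a raw delivery count. The data, names, and survey of five users below are hypothetical illustrations, not Acumen’s actual tooling; only the standard NPS thresholds (promoters 9–10, detractors 0–6) are conventional.

```python
# Minimal sketch (hypothetical data, not Acumen's actual tooling):
# turning "Would you recommend it?" answers on a 0-10 scale into a
# Net Promoter Score, reported next to a raw delivery count.
from dataclasses import dataclass

@dataclass
class SurveyResponse:
    user_id: str
    recommend_score: int  # 0-10 answer to "Would you recommend it?"

def net_promoter_score(responses: list[SurveyResponse]) -> float:
    """Percent promoters (9-10) minus percent detractors (0-6)."""
    if not responses:
        raise ValueError("no responses to score")
    promoters = sum(1 for r in responses if r.recommend_score >= 9)
    detractors = sum(1 for r in responses if r.recommend_score <= 6)
    return 100.0 * (promoters - detractors) / len(responses)

# Hypothetical program: 5,000 units delivered, five users surveyed.
responses = [
    SurveyResponse("u1", 10),
    SurveyResponse("u2", 9),
    SurveyResponse("u3", 8),
    SurveyResponse("u4", 6),
    SurveyResponse("u5", 3),
]
print("Delivered: 5,000 units")                      # what went out
print(f"NPS: {net_promoter_score(responses):+.0f}")  # whether it was valued
```

The point isn’t the formula. It’s that the second number can’t be computed at all unless someone asked.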
When reading impact reports, look beyond the numbers. Ask:
- Did users shape the solution, or was it imposed?
- Were costs and support clearly explained?
- Would users recommend it, or hesitate, knowing its flaws?
Adoption, then, is not proof of dignity, but often reflects it, especially when people choose freely to keep using a product.
What If Users Defined Success?
True impact isn’t defined by what was handed out; it’s found in what continues to matter. Dashboards offer valuable insight into reach, but they often miss whether a product earned trust or found a meaningful place in people’s lives. That knowledge lives in daily experience, in the choice, made again each day, to keep using a product or to let it go.
So when you see “5,000 lives served,” pause. Ask what the number says about lived experience:
- Did people keep using the product?
- Who found it valuable—and who didn’t?
- What wasn’t measured?
Reader Question:
What might change if users defined success and the metrics to measure it?