
Beyond Good Intentions: What AI Bias Reveals About Building Better Tech Systems
When algorithms trained on inequality make decisions, they don't fix disparities—they encode them
A new research paper on AI in fintech, The Gendered Algorithm (arXiv:2504.07312), shows how gender bias can seep into supposedly "objective" credit scoring systems. The models studied offered women fewer and smaller loans, not because of creditworthiness, but because the algorithms were trained on historical data shaped by inequality.
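The mechanism is easy to demonstrate. Here is a minimal, synthetic sketch: the data and the simple group-average "model" below are invented for illustration and are not taken from the paper. Every applicant is given identical creditworthiness by construction, but the historical records grant women smaller loans; a model fit to that history faithfully reproduces the gap.

```python
import random

random.seed(0)

# Synthetic historical loan data (illustrative only, not from the study):
# every applicant has identical creditworthiness, but past decisions
# granted women smaller loans on average.
history = []
for _ in range(1000):
    gender = random.choice(["F", "M"])
    creditworthiness = 0.7  # identical for everyone by construction
    past_loan = 10000 * creditworthiness * (0.8 if gender == "F" else 1.0)
    history.append((gender, creditworthiness, past_loan))

# A naive "model": predict the average past loan for each gender group.
def fit_group_means(data):
    totals, counts = {}, {}
    for gender, _, loan in data:
        totals[gender] = totals.get(gender, 0.0) + loan
        counts[gender] = counts.get(gender, 0) + 1
    return {g: totals[g] / counts[g] for g in totals}

model = fit_group_means(history)

# Despite identical creditworthiness, the model reproduces the gap.
print(f"Predicted loan for women: {model['F']:.0f}")  # 5600
print(f"Predicted loan for men:   {model['M']:.0f}")  # 7000
```

Nothing in the code "decides" to discriminate; the disparity lives entirely in the training data, which is exactly the warning the research raises.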
This isn’t just about fintech. It’s a warning for anyone designing systems meant to serve the public good.
Neutral Technology Doesn’t Exist
At Digitunity, we don't assume that technology is neutral. When we build systems, such as the upgrade of our Digitunity Connect platform with integrated conversational AI, we do so with an intentional focus on equity, practicality, and the realities of how people live and work.
We help coalitions and partners turn systems and technology into real solutions that serve people's needs.
But even the best tools and plans fall short when the systems behind them reinforce the wrong outcomes.
What This Means for the Digital Opportunity Field
Assumptions matter. Tech solutions often assume neutrality. “Devices are good, donations help.” But who receives the devices? Who is ready to use them? Who is left out?
Design choices shape outcomes. The way we distribute devices reveals our priorities: who qualifies, which partners we support, and how we structure implementation. Intentional design leads to sustainable results.
Catalysts must ask better questions. Digitunity is not a direct service provider. We help shape the field. This means questioning what we think we know, listening to voices that are often ignored, and building systems that actually work.
Learning from the Past Helps Us Build Better Systems
History reminds us that new technologies often reinforce old patterns. As noted in a recent GovTech article, when systems are designed without accountability, they tend to replicate existing disparities.
This is as true for past tools as it is for today's AI-driven platforms. To build systems that genuinely help people, we must carry forward the lessons of what hasn't worked, rather than creating new tools for their own sake or for novelty.
How Digitunity Applies This Thinking
We use a systems-thinking approach across our work:
Platform development: The Digitunity Connect platform features conversational AI and self-guided tools. These tools make it easy for everyone to donate and reuse computers, no matter their tech skills.
Local project design: We assist communities with state and regional efforts, helping them create delivery models that bridge infrastructure gaps and promote broad, meaningful participation in the digital economy.
Research and field guidance: We gather data on device supply, population needs, service capacity, and impact. These insights guide smart, sustainable technology programs that meet real needs in real communities.
We are not just moving computers. We are helping communities build long-term access to computers, tools, support, and digital opportunity.
A Call to Build with Intention
This research offers a warning, but also a call to action.
We need more than good intent. We need intentional systems, and systems change that does more than close gaps on paper: change that creates real, usable pathways to participate in today's digital world.
Let’s stop assuming our systems are neutral.
Let’s build them to reflect the people they’re meant to serve.
If you're designing systems meant to create opportunity, whether platforms, programs, or partnerships, let's talk about building with intention, not just good intent.

