
Credit Scoring for the Unseen Economy

The Farabi AI Manifesto


Over a thousand years ago, a philosopher named Abu Nasr Al-Farabi wrote that the purpose of knowledge is to give strength to the intellect and light to the path of justice. He believed that a just society is one where systems serve all people, not just the privileged few.

We named our company after him because we believe the same thing.


The problem we refuse to ignore

Right now, over 70% of adults in Sub-Saharan Africa are unbanked or underbanked. 80% of informal workers have no formal credit history. The traders, artisans, farmers, and small business owners who form the backbone of our economies are invisible to the financial system. Not because they aren't creditworthy, but because the systems designed to measure creditworthiness were never designed to see them.

Traditional credit scoring was built for a world of salaried employees, bank statements, and bureau records. It works well for those already inside the system. For thin-file and no-file applicants, it returns a blank. And a blank means "no." Over 50% of creditworthy applicants are rejected simply because their financial lives don't fit the model's assumptions.

Worse, these models inherit biases from historical data across demographics and regions, reinforcing the very inequities they should correct. The result is a self-reinforcing cycle: denied applicants cannot build credit history and are pushed further from financial inclusion.

Alternative data offers a path forward, but most current efforts are marred by poor consent practices and irresponsible data use. And even when alternative data is used, final decisions remain opaque, making outcomes hard to explain or challenge.

This is the problem Farabi AI was built to solve.

What we believe

A thin file is not the absence of creditworthiness. It is the result of looking through the wrong lens.

Every person leaves a trail of financial behaviour. Mobile money transactions. Informal credit and repayment patterns. Airtime purchases. Utility payments. Rent paid to landlords in cash. These signals are rich, consistent, and deeply revealing of a person's relationship with money. Traditional models ignore them. We don't.

Alternative data done right requires consent done right.

We are not the first company to look at alternative data for credit scoring. But too many have done it irresponsibly, harvesting data without meaningful consent and using it in ways applicants never agreed to. At Farabi AI, every data point is user-authorised. Consent is not a checkbox. It is the starting point.

AI should expand access, not entrench bias.

Many credit scoring models inherit the biases embedded in the data they're trained on. They penalise geography, gender, and informality: the very factors that correlate with exclusion, not risk. At Farabi AI, fairness constraints are built into our model architecture, not bolted on as an afterthought. We actively measure and reduce demographic bias, targeting up to 30% reduction through constrained optimisation.
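To make "constrained optimisation" concrete, here is a minimal, self-contained sketch of one common approach: folding a demographic-parity penalty into the training loss of a simple logistic scoring model, so the fairness constraint shapes the learned weights rather than being applied after the fact. The data, penalty weight, and model are illustrative assumptions, not Farabi's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic applicants: two features plus a binary group attribute.
# The group shifts the features, so a naive model scores the groups apart.
n = 2000
group = rng.integers(0, 2, n)
x = rng.normal(size=(n, 2)) + group[:, None] * 0.8
y = (x[:, 0] + 0.5 * x[:, 1] + rng.normal(scale=0.5, size=n) > 0.8).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(lam, steps=2000, lr=0.1):
    """Logistic regression with a demographic-parity penalty.

    lam scales a squared penalty on the gap between the mean predicted
    score of the two groups; its gradient is folded into the same
    descent loop as the log-loss, so fairness constrains the weights
    directly instead of being bolted on afterwards.
    """
    w, b = np.zeros(2), 0.0
    for _ in range(steps):
        p = sigmoid(x @ w + b)
        # Standard log-loss gradient.
        g = p - y
        grad_w, grad_b = x.T @ g / n, g.mean()
        # Parity penalty: (mean score group 1 - mean score group 0)^2.
        gap = p[group == 1].mean() - p[group == 0].mean()
        s = p * (1 - p)  # derivative of the sigmoid
        d_gap_w = (x[group == 1] * s[group == 1][:, None]).mean(axis=0) \
                - (x[group == 0] * s[group == 0][:, None]).mean(axis=0)
        d_gap_b = s[group == 1].mean() - s[group == 0].mean()
        w -= lr * (grad_w + 2 * lam * gap * d_gap_w)
        b -= lr * (grad_b + 2 * lam * gap * d_gap_b)
    p = sigmoid(x @ w + b)
    return abs(p[group == 1].mean() - p[group == 0].mean())

gap_plain = train(lam=0.0)   # unconstrained baseline
gap_fair = train(lam=5.0)    # parity-constrained
print(f"parity gap without constraint: {gap_plain:.3f}")
print(f"parity gap with constraint:    {gap_fair:.3f}")
```

On this synthetic data the penalised model closes most of the score gap between groups; production systems would measure several fairness criteria, not parity alone.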

Every decision deserves an explanation.

We reject black-box scoring. Every Farabi score comes with transparent reason codes. Not just a number, but the "why" behind it. Applicants can see what drives their score and receive personalised, actionable steps to improve it. Lenders can see the decision drivers and audit them. If a score can't be explained, it shouldn't be trusted.

Ethical infrastructure should be open infrastructure.

Farabi AI is open-source. This is a deliberate choice, not an afterthought. When a system makes decisions that shape people's access to credit, housing, and opportunity, its logic should not be hidden behind proprietary walls. Open source means anyone can inspect our models, verify our fairness claims, and hold us accountable. It means researchers across Africa can build on our work instead of starting from scratch. It means regulators can audit what we do without relying on our word alone. Trust in AI is not built by asking people to believe you. It is built by showing your work.

What we're building

Farabi AI is an open-source, ethical credit scoring engine that uses machine learning and responsibly sourced alternative data to generate fair, transparent credit scores for thin-file and no-file applicants.

Our model learns from mobile money transactions and informal credit and repayment behaviour, converting them into measurable signals of creditworthiness. This increases approval rates by up to 40% for previously invisible borrowers.
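As a rough illustration of turning a transaction trail into measurable signals, the sketch below derives a few plausible features (average monthly inflow, inflow volatility, repayment count) from raw records. The record shape and feature names are hypothetical, chosen for illustration, not Farabi's actual feature set.

```python
from dataclasses import dataclass
from datetime import date
from statistics import pstdev

@dataclass
class Txn:
    day: date
    amount: float   # positive = inflow, negative = outflow
    kind: str       # e.g. "wallet", "loan_repayment", "airtime"

def extract_signals(txns):
    """Turn a raw transaction trail into model-ready signals."""
    inflows = [t.amount for t in txns if t.amount > 0]
    repayments = [t for t in txns if t.kind == "loan_repayment"]
    months = {(t.day.year, t.day.month) for t in txns}
    return {
        "avg_monthly_inflow": sum(inflows) / max(len(months), 1),
        "inflow_volatility": pstdev(inflows) if len(inflows) > 1 else 0.0,
        "repayment_count": len(repayments),
        "active_months": len(months),
    }

# A short illustrative history: wallet inflows plus regular repayments.
history = [
    Txn(date(2024, 1, 5), 120.0, "wallet"),
    Txn(date(2024, 1, 20), -30.0, "loan_repayment"),
    Txn(date(2024, 2, 4), 150.0, "wallet"),
    Txn(date(2024, 2, 19), -30.0, "loan_repayment"),
]
signals = extract_signals(history)
print(signals)
```

Signals like these can feed any downstream scoring model; the point is that regularity and consistency, not bureau records, carry the predictive weight.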

For individuals, we provide a clear path to credit visibility through the Farabi App. Applicants onboard, authorise access to their transaction data, and receive a credit score with full transparency into what drives it and how to improve it. The app is accessible via USSD, IVR, and mobile, with built-in encryption to protect user data.

For lenders, we provide the Farabi Dashboard: a web-based interface to access pre-scored, consent-verified applicants with full visibility into credit scores, risk tiers, and decision drivers.

For developers, we provide the Farabi API: a RESTful interface to integrate ethical scoring directly into lending infrastructure. Submit data, receive a credit score, risk tier, and reason codes.
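To show what "score, risk tier, and reason codes" might look like to an integrating developer, here is a minimal sketch that parses a hypothetical JSON response body. The field names and codes are illustrative assumptions; the real Farabi API schema may differ.

```python
import json

# A hypothetical scoring response, shaped for illustration only.
raw = json.dumps({
    "score": 684,
    "risk_tier": "B",
    "reason_codes": [
        {"code": "RC01", "detail": "Consistent mobile money inflows"},
        {"code": "RC07", "detail": "Short repayment history"},
    ],
})

def parse_score_response(body: str):
    """Unpack a scoring response into its three decision artefacts."""
    data = json.loads(body)
    return (
        data["score"],
        data["risk_tier"],
        [r["code"] for r in data["reason_codes"]],
    )

score, tier, codes = parse_score_response(raw)
print(score, tier, codes)  # 684 B ['RC01', 'RC07']
```

Because every response carries reason codes alongside the number, a lender's decisioning logic can log and audit the "why" as easily as the "what".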

One intelligence. Three interfaces. Built for Africa's diversity, from a basic phone to enterprise-grade integrations.

Why now

Africa's financial landscape is shifting. Mobile money adoption is accelerating. Digital transaction volumes are growing exponentially. The data to score the unscored already exists, scattered across telecom networks, mobile wallets, and digital payment platforms. What's been missing is a system that can read this data ethically, score it fairly, and explain its decisions transparently.

That system is Farabi AI.

Our commitment

We are building Farabi AI with a set of commitments we will not compromise:

  • Ethics first. Every model we ship will be evaluated for fairness before it's evaluated for performance.

  • Transparency always. We will never deploy a model we can't explain.

  • Consent by design. No one will be scored without their knowledge and permission. No data will be used beyond what the applicant authorises.

  • Bias accountability. We will publish how we measure and reduce bias, and hold ourselves to those standards publicly.

  • Open by default. Our scoring engine is open-source. We believe the systems that determine financial access should be inspectable, auditable, and improvable by anyone.

  • Africa-first. We are building for the markets that need this most, with the people who understand them best.

An invitation

If you're an individual who has been told "no" by a system that never tried to understand you, we see you.

If you're a lender who knows there are creditworthy people your current models are missing, we can help.

If you're a developer who believes financial infrastructure should be fair by default, build with us.

We're just getting started. Join our waitlist at myfarabi.com and be part of what comes next.


"The purpose of knowledge is to give strength to the intellect and light to the path of justice." — Al-Farabi
