KOSA AI: Technology should be inclusive of ages, genders and races
Meet Layla and Sonali, a founding duo with a shared passion for driving positive impact through technology. With KOSA AI, they want to help companies be more conscious about the technology they use and build, and make it more inclusive.
As AI is on the rise and its impact on our lives becomes increasingly clear, many policymakers and companies have started asking themselves: How can we make it inclusive and free of bias?
You may have seen Twitter users testing and proving hypotheses of racial bias in image previews. That may seem harmless, and the bias unintentional, but the underlying issues pose a serious threat when similar biases are prevalent in software used for hiring, healthcare or the criminal justice system. And that may just be the tip of the iceberg.
Layla Li and Sonali Sanghrajka have started KOSA AI to be part of the solution. The two met while taking part in the Antler Nairobi incubator program. Having worked at several tech companies building automated decision-making systems, Layla realized that too often the “human factors” were not considered, leading to negative consequences.
“When you belong to several minority groups yourself, you really notice when the negative impact comes into play. For example, at the incubator in Nairobi, thermal scanners were used to check temperatures because of Covid. And everywhere we went, the detection worked better on my lighter skin than it did on the Africans in our group”, says Layla, CEO of KOSA AI.
Companies should care now, or they will have to soon
Layla approached Sonali, who had already spent more than ten years working with global healthcare and pharmaceutical companies such as Johnson & Johnson, and she was quickly sold on starting KOSA AI together.
“I was aware of the idea already, and I know the impact this can have. Especially with companies wanting to come into the African market”, says Sonali.
With that shared passion for driving positive impact through technology, they started KOSA AI with the ambition of helping companies move from positive values and good intentions to a real understanding of how they can become ethically AI-driven.
Pointing to the growing academic and media interest, and to increasing regulation and policy support, especially from the European Union, they suggest that all companies start caring now rather than waiting for new rules to come into play.
“Companies should care for two reasons. One, employees care. A study found that one in six digital talents left their company because they didn’t want to create harmful tech. Two, upcoming EU regulations will force them to care anyway, so they might as well start caring now”, says Layla.
KOSA already has pilot projects running with big clients and a team of five people based in locations as diverse as Amsterdam, Nairobi and New York. APX invested in KOSA AI in April 2021, and the team is looking to leverage the APX network of experts and partners.
You can follow and connect with KOSA AI via LinkedIn, Twitter, Instagram and www.kosa.ai.
Why APX invested:
KOSA addresses the problem that algorithms are trained on huge historical data sets that contain racial and gender biases. To mitigate the disadvantages that result from such algorithms, and the reputational risks they pose to companies, KOSA integrates seamlessly into existing machine learning workflows and lets business decision-makers quickly understand the tool’s impact and how “fair” the company’s programs are.
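To give a rough sense of what a fairness check inside a machine learning workflow can look like (a minimal, hypothetical sketch; the demographic-parity metric, the hiring example and the group labels are assumptions for illustration, not KOSA’s actual product), one could compare a model’s positive-prediction rates across demographic groups:

```python
from collections import defaultdict

def selection_rates(predictions, groups):
    """Positive-prediction (selection) rate per demographic group.

    predictions: iterable of 0/1 model outputs (e.g. 1 = "invite to interview").
    groups: iterable of group labels aligned with predictions.
    """
    totals, positives = defaultdict(int), defaultdict(int)
    for pred, group in zip(predictions, groups):
        totals[group] += 1
        positives[group] += pred
    return {g: positives[g] / totals[g] for g in totals}

def demographic_parity_gap(predictions, groups):
    """Largest difference in selection rate between any two groups.

    A gap of 0 means every group is selected at the same rate;
    larger gaps flag potential structural bias worth investigating.
    """
    rates = selection_rates(predictions, groups)
    return max(rates.values()) - min(rates.values())

# Toy example: a hiring model's decisions for two hypothetical groups.
preds = [1, 0, 1, 1, 0, 0, 1, 0]
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]
print(selection_rates(preds, groups))         # {'A': 0.75, 'B': 0.25}
print(demographic_parity_gap(preds, groups))  # 0.5
```

A real audit would of course look at many more metrics and data slices, but even a single number like this selection-rate gap makes the question “how fair is our model?” concrete for a non-technical audience.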
KOSA is thus entering a market that is growing rapidly and gaining relevance, especially in the Western world. With a team of young founders with a strong professional track record and a trusted partner like APX at their side, KOSA sets out to become the go-to solution for tackling structural bias in AI.