Artificial intelligence is the most consequential technology of our time. It makes decisions about who gets a loan, who gets parole, whose face is recognised and whose is not, whose voice the system understands and whose it mishears. We are building systems of enormous power with very little agreement about what values should govern them, whose knowledge should inform them, or what consequences should follow when they go wrong.
We have been here before. Not with computers, but with the underlying problem: how do you build a system that makes consequential decisions justly, accounts for its errors, and remains answerable to the people it affects? The Yoruba Ifa corpus solved this problem three thousand years ago. Its 256 odu, its decision pathways, encode a binary logic system whose mathematical structure is identical to the binary logic underlying modern computing. But unlike modern computing, Ifa was never separated from ethics. Consequence, reciprocity and accountability were not added later as a compliance layer. They were the architecture from the beginning.
This talk explores what African knowledge systems, from the Kemetic concept of Ma'at as a framework of cosmic accountability, to the Ifa system of consequence and wisdom, to the Ubuntu philosophy of relational responsibility, have to say about the AI systems we are building now. Not as metaphor. Not as inspiration. As working models whose structural logic predates, and in several respects surpasses, the ethical frameworks currently being proposed by the world's leading technology institutions. Chinenye Egbuna Ikwuemesi draws on her published work in African mythology, her research at the Afrodeities Institute and 25 years of transformation leadership inside complex organisations to make this argument with rigour, accessibility and considerable urgency.
Audiences leave understanding that the question is not whether AI can be ethical. The question is whose ethics, built on whose knowledge, answerable to whom. And that the most sophisticated existing answers to that question were developed in Africa, long before Silicon Valley decided it had invented the problem.
SUITABLE FOR: Corporate teams, technology and innovation conferences, leadership away days, universities and colleges, ethics and governance forums, science and humanities crossover events, Black History Month programming.
LENGTH: 45 to 90 minutes with question and answer session.
CATEGORY: Science and technology.
TAGS: Artificial intelligence, African philosophy, ethics, knowledge systems.
Chinenye Egbuna Ikwuemesi spent 25 years transforming complex organisations. Then she went looking for older models, and found them exactly where history said there was nothing to find.
She is the author of "Nigerian Mythology: The Shadow Sky," "Meet the Orisas" and "The Girl Who Climbed the Tree," and the founder of the Afrodeities Institute, which treats African myths, philosophies and governance traditions as serious civilisational blueprints rather than colourful anecdotes.
Her talks take audiences into worlds most of us were never taught existed. The Kemetic concept of Ma'at, a framework of cosmic balance and accountability that predates Greek philosophy by two thousand years. The Yoruba Ifa system, whose mathematical architecture is structurally identical to the binary logic underlying modern computing. The West African empire of Wagadu, whose priests engineered a memory system sophisticated enough to outlast civilisational collapse.
These are not origin stories dressed up as inspiration. They are working models with something urgent to say about how we build institutions, make decisions, think about AI, and decide whose knowledge counts. Audiences leave with a different map of the world in their heads, a new set of questions about the systems they inhabit, and consistently, a strong desire to keep talking long after the formal session ends.