Decolonizing AI: Indigenous Voices Breaking Through the Digital Divide

Brian Ritchie, kama.ai

June is National Indigenous History Month here in Canada, a month that seeks to bring to light the voices, stories, and history of Indigenous Peoples. It is these very voices that have been continually sidelined in the progression of this country, these stories that have struggled to be remembered, and this history that stands as a stark warning of mistakes we must not make again.

For hundreds of years, Indigenous peoples worldwide have been affected by the consequences of colonialism. We’ve seen it in the creation of residential schools, the Sixties Scoop, and the continued bias and discrimination against Indigenous peoples. And now we see it, too, in the growing digital divide.

The “modern world” seeks to progress; it seeks to innovate and create new and exciting technologies to make our world and our lives better. In doing so, we must acknowledge the mistakes of history and ensure they are not repeated as we move forward. Our ever-advancing technological society tends to build complicated and exclusive technology, understood and profited from only by those privileged enough to have grown up with it. We are seeing an industry whose data can carry unconscious biases, catering to mainstream society and often lacking diversity and inclusion. So where does that leave Indigenous people?

The AI industry, to date, is advancing on the basis of heavy technical requirements, complex data sourcing and refinement tasks, and deeply technical machine learning models riddled with unconscious biases that favor those who programmed them and those who supplied the data that powers them. We see data-driven applications in which the values, cultures, and emotions of diverse users are not taken into account. How can we hope to automate our world when we do not understand or engage the people we are automating for? Consultation has become a fundamental concept in dealing with Indigenous communities, and it is done relatively well by industrial proponents of construction and resource projects that may affect the traditional lands of Indigenous Peoples. But consultation is not a concept upheld in the advancement of technology or in the automation of services that Indigenous or other minority communities may use or consume.

We risk another form of colonialism: a new digital colonial space is emerging, albeit unintentionally, in which minority communities are falling further behind; a digital world where Indigenous and minority voices are not at the forefront, either as developers or as consumers. So how will we combat this? We can start by joining rather than opposing. Digital progress should no longer be dictated by the technical elite; we can choose to join with Indigenous peoples and minority voices to ensure that ethical, inclusive, and accessible technology is created for the betterment of all societies and cultures.

We can strive for a desired future state of technology, a state of AI, that reflects the values of Indigenous and minority communities as well as the social mainstream. As MIT Technology Review’s “Artificial intelligence is creating a new colonial world order” puts it: “… the stories reveal how AI is impoverishing the communities and countries that don’t have a say in its development—the same communities and countries already impoverished by former colonial empires. They also suggest how AI could be so much more—a way for the historically dispossessed to reassert their culture, their voice, and their right to determine their own future.” As with all things, darkness brings with it the opportunity for light. Our new digital world does not have to follow our past; we can pave the way toward something new, something better.

Our vision at kama.ai has always been to create the most ethically, emotionally, and experientially aware conversational intelligence, improving the quality of and trust in the human-AI experience. While we have far to go on our journey, we have made progress with the trial, and now deployment, of Virtual Health Assistants for Indigenous Communities. In a trial earlier this year with Chapleau Cree and Moose Cree First Nations, we saw great value in the assistants’ emotional understanding and in their delivery of relevant, high-quality responses to community members accessing health and cultural information. This is only the beginning of our work in 2022: we will continue to deploy our platform to connect First Nations with government and industry throughout the remainder of the year, while continuing our efforts to make AI development and use more inclusive of Indigenous, diverse, and minority communities.

Though this month has given us a space to speak up and acknowledge Indigenous history, history is not tied to the month of June; it is with us every day. We will not stop on July 1st, turn our heads, and walk away from this new colonial problem. Instead, we will take this goal, this message, and continue to pave the way for diversity and inclusivity in this industry. We implore those with the power and the heart to follow to do the same.

To understand how kama.ai’s Designed Experiential Intelligence can empower your enterprise, social, or environmental cause, please fill out a request on our contact form, and we will arrange a time to talk.