Responsible AI in Action – Episode 15: Human Agency, Values, and Culturally Attuned AI, with Dr. Kem-Laurin Lubin, University of Waterloo

In this episode of Responsible AI in Action, Charles Dimov is joined by Dr. Kem-Laurin Lubin, researcher, designer, and professor at the University of Waterloo, to explore how AI systems shape human agency, encode cultural values, and influence decision-making at scale.

Dr. Lubin brings over two decades of experience in technology alongside her research in computational rhetoric and design. Her work sits at the intersection of AI, human-centered design, and critical inquiry into how systems characterize individuals and communities. As AI adoption accelerates, this conversation examines why responsible AI must begin far earlier than most organizations realize, well before procurement, and even before systems are built.

The discussion challenges the assumption that AI is neutral, highlighting how systems inherit the values of their creators and embed them into decision-making processes that affect real lives. From hiring systems and financial services to healthcare and therapy tools, Dr. Lubin explains how AI-driven characterization can shape opportunity, access, and outcomes, often without visibility or accountability.

From the concept of a values audit to the need for a technological Hippocratic oath, this episode explores how organizations can design and adopt AI systems that preserve human agency, respect cultural context, and avoid unintended harm.

Episode Highlights

In this conversation, Charles and Dr. Lubin explore how responsible AI requires a deeper understanding of what is exchanged when humans interact with intelligent systems. While AI delivers speed and efficiency, it can also extract human agency, shifting decision-making, judgment, and expertise away from individuals and into systems that are not always transparent or equitable.

The episode also examines the risks of algorithmic characterization, where AI systems build profiles of individuals based on data proxies, influencing outcomes in hiring, banking, and other critical domains. Dr. Lubin emphasizes that without intentional design, these systems can reinforce bias and limit opportunity at scale.

Key insights include the importance of conducting values audits before adopting AI tools, the need for culturally attuned system design across diverse contexts, and why technologists must take on greater responsibility for the societal impact of the systems they build.

Watch the full episode now:

Key themes include:

  • Why responsible AI must begin before procurement, not at deployment
  • The concept of a values audit and understanding whose values are embedded in AI systems
  • How AI interactions can erode human agency, decision-making, and expertise
  • The risks of algorithmic characterization in hiring, finance, and other decision systems
  • Why AI is not neutral and often reflects Eurocentric and dominant cultural perspectives
  • What culturally attuned AI looks like in practice across global and community contexts
  • The limitations of AI systems that fail to account for relational and community-based worldviews
  • Why technologists need a Hippocratic oath for AI to guide ethical system design
  • The growing normalization of data exposure and loss of privacy in digital systems

As organizations continue to integrate AI into core operations, the challenge is no longer just about capability; it is about responsibility. Preserving human agency, ensuring cultural alignment, and embedding ethical accountability into system design will define the next phase of AI adoption.

Responsible AI is not just about what systems can do. It is about understanding what they take, whose values they reflect, and how they shape the choices we make.

🎧 Watch Now

Learn more about Dr. Kem-Laurin Lubin, University of Waterloo

Follow on LinkedIn: Dr. Kem-Laurin Lubin