Podcast: How Humans Bias AI


Kris Hammond, Chief Scientist, Narrative Science

In this AI Podcast, Kris Hammond from Narrative Science explains that while it's easy to think of AI as cold, unbiased, and objective, it is also very good at reflecting our own biases back at us.

“I am not saying that we should give ourselves over to algorithmic decision-making. We should always remember that just as the machine is free of the cognitive biases that often defeat us, we have information about the world that the machine does not. My argument is that, with intelligent systems, we now have the opportunity to be genuinely smarter.”

Kris Hammond focuses on R&D as the Chief Scientist and co-founder of Narrative Science. His main priority is defining the future of Advanced NLG, the democratization of data-rich information, and how language will drive both interactive communications and access to the Internet of Things (IoT). In addition to being Chief Scientist, Kris is a professor of Computer Science at Northwestern University. Prior to Northwestern, Kris founded the University of Chicago's Artificial Intelligence Laboratory. His research has always focused on artificial intelligence, machine-generated content, and context-driven information systems. Kris previously sat on a United Nations policy committee run by the United Nations Institute for Disarmament Research (UNIDIR). He received his PhD from Yale.

Recommended Reading: Hammond wrote a fascinating editorial at Recode about human bias and AI in the context of policymaking for autonomous weapons.

Download the MP3

Sign up for our insideHPC Newsletter