Beyond Bias: Contextualizing “Ethical AI” Within the History of Race, Exploitation and Innovation in Medical Research

Dec 14, 2019

About

Data-driven decision-making regimes, often branded as “artificial intelligence,” are rapidly proliferating across high-stakes contexts such as medicine. These data regimes have come under increased scrutiny, as critics point out the myriad ways in which they reproduce or even amplify pre-existing societal biases. As such, the nascent field of AI ethics has embraced bias as the primary anchor point for its efforts to produce more equitable algorithmic systems. This talk will challenge that approach by exploring the ways that race-based exploitation has historically served as the bedrock for cutting-edge research in medicine. The speaker will draw on historical examples of ethical failures in medicine, such as the Tuskegee Syphilis Study, to explore the limits of bias and inclusion as the primary framing for ethical research. She will then draw parallels to contemporary efforts to improve the fairness of medical AI through the inclusion of underrepresented groups. The aim of this talk is to expand the conversation around “ethical AI” to include structural considerations that threaten to undermine the noble goal of creating more equitable medical interventions via artificial intelligence.

About NeurIPS 2019

Neural Information Processing Systems (NeurIPS) is a multi-track machine learning and computational neuroscience conference that includes invited talks, demonstrations, symposia, and oral and poster presentations of refereed papers. Following the conference, workshops provide a less formal setting.
