Bad Proxies

Dec 14, 2019

About

The choice of convenient, seemingly effective proxies for ground truth can be an important source of algorithmic bias in many contexts. We illustrate this with an empirical example from health care, where commercial prediction algorithms are used to identify and help patients with complex health needs. We show that a widely used algorithm, typical of this industry-wide approach and affecting millions of patients, exhibits significant racial bias: at a given risk score, Black patients are considerably sicker than White patients, as evidenced by signs of uncontrolled illnesses. Remedying this disparity would increase the share of Black patients receiving additional help from 17.7% to 46.5%. The bias arises because the algorithm predicts health care costs rather than illness, but unequal access to care means that less is spent caring for Black patients than for White patients. Thus, despite appearing to be an effective proxy for health by some measures of predictive accuracy, the cost target produces large racial biases.
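The kind of audit described above can be illustrated with a small sketch: within each risk-score bin, compare a direct measure of illness across racial groups; if the score were an unbiased proxy for health, the within-bin gap would be near zero. The column names, the synthetic data, and the use of chronic-condition counts as the illness measure are illustrative assumptions for this sketch, not the talk's actual dataset or code.

```python
# Hypothetical sketch: check whether patients with the same risk score
# are equally sick across groups. Data and column names are made up.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 10_000
df = pd.DataFrame({
    "risk_score": rng.uniform(0, 100, n),            # algorithm's cost-based risk prediction
    "race": rng.choice(["black", "white"], n),
    "active_chronic_conditions": rng.poisson(2, n),  # stand-in measure of true illness
})

# Bin patients by risk-score decile, then compare mean illness by race within each bin.
df["risk_decile"] = pd.qcut(df["risk_score"], 10, labels=False)
calibration = (
    df.groupby(["risk_decile", "race"])["active_chronic_conditions"]
      .mean()
      .unstack("race")
)
calibration["gap"] = calibration["black"] - calibration["white"]
print(calibration)
# An unbiased proxy for health would give a gap near 0 in every decile;
# the finding reported above is that Black patients are sicker at the same score.
```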


About NIPS 2019

Neural Information Processing Systems (NeurIPS) is a multi-track machine learning and computational neuroscience conference that includes invited talks, demonstrations, symposia and oral and poster presentations of refereed papers. Following the conference, there are workshops which provide a less formal setting.
