Image: Still from Coded Bias, featuring Joy Buolamwini
Virtual Online Discussion
Community Event
  US Mountain Time

Algorithms play an increasingly prominent role in societal decision-making across a variety of settings. Online streaming services use them to recommend new music, movies, or television shows; criminal justice courts use them, controversially, to predict the future behavior of someone accused or convicted of a crime. Their proponents claim that they are objective and accurate, and they are often presented as sophisticated and mysterious. But they're not infallible: even the most carefully designed algorithms may produce biased outcomes, and blind trust in those programs can cause, perpetuate, or even amplify societal problems.

In partnership with the Center for Contemporary Arts, SFI proudly presents a conversation with Cristopher Moore and Melanie Moses on the complex societal issues surrounding artificial intelligence and the fight for transparency on the new frontier of algorithmic justice. Moderated by the CCA's Jacqueline Frank, the discussion springs from the recent documentary Coded Bias, for which registrants will receive a streaming link a few days before the discussion.

Coded Bias follows MIT Media Lab researcher Joy Buolamwini's investigation of widespread bias in algorithms, which began when she discovered that most facial-recognition software does not accurately identify darker-skinned faces. What does it mean when artificial intelligence governs our liberties, and what are the consequences for the people it is biased against?

Click here to reserve your $12 tickets for the film and follow-up conversation.