INTERNATIONAL STORY COURTESY OF THE BIG ISSUE AUSTRALIA / INSP.NGO

SLAVE TO THE ALGORITHM

BY CHER TAN

Shalini Kantayya's new documentary Coded Bias is a deep dive into the algorithms that are increasingly shaping the way we live our lives. It makes clear that, contrary to popular opinion, technology is not neutral. The reality is that its biases are working their way into every part of our daily lives — and often with negative consequences.

IN MARCH 2016, Microsoft released a bot that made its debut on apps like Twitter. The technology company hoped the bot, named Tay, would gain "conversational understanding" — meaning that the more a human being chatted with it, the smarter it would get. But as a result of Tay's programming (Microsoft did not implement key safeguards), it proved easy for people to feed it offensive content. Within 24 hours of its launch, Tay was shut down as its output morphed from "Humans are super cool!" into dozens of misogynistic, racist, and fascist tweets.

While Tay remains a rudimentary example of how AI can backfire, a bigger question is whether technology can detect the biases inherent in its own code — particularly when that code is created by a group of people who don't reflect the diversity of the global population. As software algorithms become increasingly all-encompassing, who will end up bearing the consequences of their discrimination?

Enter Coded Bias, a new documentary by the award-winning Brooklyn-based filmmaker Shalini Kantayya, which highlights the insidious ways technology further entrenches the racial and gender-based prejudices already present in society.

"All of my work as a filmmaker explores how disruptive technologies make the world less or more fair… [But] I don't think I was prepared to fall that far down the rabbit hole," Kantayya says when asked about the impetus behind her documentary.
"It really was this incredible discovery."

Coded Bias takes viewers down that same rabbit hole. It follows the journey of MIT computer scientist Joy Buolamwini from her shocking initial discovery of the flaws inherent in Amazon's Rekognition software (she had to put on a white mask for her African-American face to be detected) to her founding of the Algorithmic Justice League (AJL), an organization that works to highlight the social implications and harms of AI. This is juxtaposed with talking-head interviews with data-rights experts such as Safiya Umoja Noble, Zeynep Tufekci, and Weapons of Math Destruction author Cathy O'Neil, all of whom are engaged in similar battles for a freer technological landscape.

The documentary makes plain Big Tech's hold on everyday life. Often marketed under the guise of "connection", "community" and "convenience", platforms such as Facebook and Google (to name but two) harvest individual data that is then sold to advertisers, government institutions such as the FBI, and other corporations.

"Algorithms can impact things like who gets hired, who gets healthcare, who gets into college, who gets a longer prison sentence," Kantayya says. "They're already making such important decisions about human destiny. Computers are not unbiased, and we've sort of put them in the position of being our gods."

Numerous case studies in Coded Bias underscore this. In what is referred to as "algorithmic determinism" — where an algorithm makes uniform decisions regardless of its variables — Daniel Santos, a schoolteacher in Houston, received a damning evaluation despite his consistent track record of excellence. Further afield, a facial-recognition trial deployed by police in Britain saw Black teenagers mistaken for wanted felons. In China, a burgeoning social credit system threatens to take over every facet of a citizen's life.
Through what O'Neil terms "algorithmic obedience training", facial recognition is required there for even mundane activities like shopping and taking the train. The system delivers a "score", and "rights" can be withdrawn depending on that score.

The fact that there is barely any regulation around the inner workings of AI is cause for concern. "We don't have basic understanding and literacy around these algorithms that we use every day and how they impact our lives," Kantayya explains. "The truth is that we actually need the space to be regulated the way television is regulated."

Indeed, if structural inequalities such as racism are "becoming mechanized [and] robotized", as apartheid historian Patric Tariq Mellet says in the documentary, what can individuals do to reverse this? Kantayya has her answer: "I think the only way is through laws. A small group of people can make a difference. I've seen that with my allies in the making of the film."

It's undeniable: the rigorous campaigning Buolamwini engages in throughout Coded Bias bore fruit in June this year, when the United States introduced legislation to ban federal use of facial recognition. The AJL's work has also seen Microsoft recently state that it will not sell facial-recognition software to police departments until laws regulate it, and Amazon place a one-year pause on the sale of its facial-recognition technology.

"This is a sea change that we never thought was possible when I started making the film," Kantayya continues. "And it happened because of the women in my film. We owe them a debt of gratitude.

"I hope this is what people glean when they watch the film: that a small group of people can make a big change." ■

DENVER VOICE September 2020


