Last week, we learned that President Trump thinks that Google searches are biased against him and perhaps should be regulated.
There's no evidence to support the president's accusation, but we do know that Google searches are biased in any number of ways.
We’ve seen several examples of the way Google’s search algorithms reflect and reinforce racist stereotypes. And Google is hardly alone. Algorithms increasingly determine what we see and have access to online and in the wider world.
These algorithms are often secret and unregulated, and, intentionally or not, they can be racist, sexist, or otherwise prejudiced.
Cathy O'Neil is a mathematician who has worked in academia and the finance sector. She's written a book called Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy.
“I would argue that the Internet itself is set up to classify us by gender, race, class, education level, consumer power, all that kind of thing,” she told Living Lab Radio.
“The reason it's set up that way is because… we exchange our private data for services like e-mail and Web browsing.”
The entire environment of our online world, she argues, ends up being a prejudiced machine, and sometimes the algorithms help companies break the law.
“There are anti-discrimination laws in housing, in employment, in credit… in insurance which are essentially being ignored in the context of big data,” she said. “Basically all these industries have figured out that big data helps them target much more precisely the kind of customer they want and avoid the customers they don't want and they're just going for it.”
Companies have taken a “catch us if you can” attitude toward federal regulators, O’Neil said.
This kind of bias in computer models has real consequences, she said.
“We're not just predicting the future… we're causing the future,” O’Neil said. “When we say ‘you don't look like someone who's gonna pay back a loan,’ we're not just making predictions. We're actually refusing to let that person be part of the financial system.”
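The feedback loop O’Neil describes can be sketched in a few lines of code. The Python below is a toy illustration, not any real lender’s model: the credit-score cutoff and the year-to-year score drifts are invented numbers, chosen only to show how a prediction that doubles as a gatekeeping decision can make itself come true.

```python
# A minimal sketch (all numbers hypothetical) of a prediction that
# causes the future: the model's output decides who gets credit,
# which in turn shapes the applicant's future score.

APPROVAL_THRESHOLD = 650  # hypothetical credit-score cutoff

def decide(score: int) -> bool:
    """The model's 'prediction' doubles as a gatekeeping decision."""
    return score >= APPROVAL_THRESHOLD

def next_year_score(score: int, approved: bool) -> int:
    # Assumption for illustration: access to credit lets people build
    # a history (score drifts up); denial keeps them out of the system
    # (score drifts down). The drift sizes are made up.
    return score + 15 if approved else score - 15

applicant = 640  # starts just below the cutoff
for year in range(5):
    approved = decide(applicant)
    print(f"year {year}: score={applicant}, approved={approved}")
    applicant = next_year_score(applicant, approved)
# The applicant never crosses the threshold: the denial lowered the
# score, which guaranteed the next denial.
```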
The solution? O’Neil says the public should push back.
“We should see more organized actions,” she said. “We just have blind trust in these algorithms that are actually quite terrible. What we need to do is demand that science is put into data science right now. We need to ask very basic questions of evidence like, ‘for whom does this algorithm fail?’”
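That question can be made concrete with a simple audit. The Python sketch below uses invented records and group labels; the point it demonstrates is that a single overall accuracy figure can hide very different error rates for different groups of people.

```python
# A toy audit in the spirit of "for whom does this algorithm fail?"
# Records are fabricated for illustration.

from collections import defaultdict

# (group, model_prediction, actual_outcome) -- hypothetical records
records = [
    ("A", 1, 1), ("A", 1, 1), ("A", 0, 0), ("A", 1, 0),
    ("B", 0, 1), ("B", 0, 1), ("B", 1, 1), ("B", 0, 0),
]

totals = defaultdict(int)
errors = defaultdict(int)
for group, predicted, actual in records:
    totals[group] += 1
    if predicted != actual:
        errors[group] += 1

overall = sum(errors.values()) / len(records)
print(f"overall error rate: {overall:.0%}")
for group in sorted(totals):
    print(f"group {group}: error rate {errors[group] / totals[group]:.0%}")
# With this made-up data, group A sees a 25% error rate while group B
# sees 50% -- the "same" model fails one group twice as often.
```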