Social Media Content Moderation Is Not Neutral, USF Researcher Says

Photo courtesy of Ali Alkhatib.

After a mob of pro-Trump agitators stormed the U.S. Capitol last week, forcing a delay in the certification of the electoral vote for president, Twitter blocked President Trump from posting, and other platforms soon followed suit, citing concerns that his posts might incite further violence. Parler, an alternative to Twitter with a reputation for more permissive content rules, has been removed from Apple’s App Store and Google Play, and Amazon halted services to the platform, essentially taking it offline.

Ali Alkhatib is a research fellow at the University of San Francisco’s Center for Applied Data Ethics, which studies ethical issues in tech such as unjust bias, surveillance and disinformation. With a background in both the social and computer sciences, he has been considering how these platforms make and enforce rules from a social rather than a technical perspective.

“All of the technical systems that we build are things to support, or I guess — I hate to use this word — to disrupt the ways that people do live their lives,” Alkhatib said. “That’s not to say that technology is a bad thing or anything like that, but that it exists in society, and that it acts on society in these ways.”

The riot at the Capitol made it painfully evident that online discourse can and does have real-life consequences. But that has always been true, Alkhatib said, and pretending otherwise is a luxury only the privileged can afford.

“The view that things that happen on the internet are all sandboxed and sort of like games, or sort of playful, inconsequential things, is the sort of thing that one says when they are insulated from the reality of consequences,” he said.

Alkhatib said the technology, rules and algorithms that govern online platforms — including safeguards against abuse or violence — are not neutral. 

“What ends up happening is that the groups of people who already have power end up using those bureaucratic institutions and those structures to perpetuate violence,” he said, likening platform rule enforcement to real-life law enforcement, with the same racial and social biases.

“Unless you are building this specifically with the marginalized and vulnerable groups, it’s hard to build any system like this that does anything but further oppress people who are already under the thumb of various other structures and various other bureaucracies and powers,” he said.

He also said he hopes users of these platforms will grow more comfortable demanding the changes they want to see and advocating for improvements, much as residents might petition a city to fix a pothole or another quality-of-life problem.

“We don’t need computer science degrees or philosophy degrees or any of that stuff to be able to say, you know, ‘Twitter shouldn’t run this way,’” Alkhatib said. “‘This is not working for us and it’s causing harm for us. And the fact that I am a human being is enough to entitle me to have a say in how my life is mediated and run.’”
