Ross Professors: No Easy Answers for Social Media Companies as They Face Scrutiny

Professor Erik Gordon and Lecturer Marcus Collins explain the difficulties ahead for companies like Facebook, Twitter, and Google.

Executives from Facebook, Google, and Twitter were taken to task by a congressional committee investigating Russian influence in the 2016 election. Members of Congress asked tough questions, but as Michigan Ross’ Erik Gordon and Marcus Collins explain, there are no easy answers:

Professor Erik Gordon: A Can’t-Win/Can-Win Scenario for Social Media Companies

Media channels like Facebook and Twitter that are open to all content contributors face some challenges they can’t win and one they can win, but have botched.

They can’t win the challenge of being both a channel for real-time content created by a billion users and a channel whose content is censored. To win that challenge, each company would need a billion carefully trained employees to monitor the billion content creators and quickly decide which content they will allow and which content they will censor.

If the companies can’t hire and train a billion censors, they will have to develop software more powerful than any currently known to the smartest experts, including national intelligence agencies – agencies that have not been able to stop or even identify all of the content that interests them.

Even if we woke up tomorrow to a breakthrough in software, the companies couldn’t win the challenge of censoring what one group thinks should be censored without another group complaining that its rights are being trampled.

Who will decide what content is sufficiently dangerous to society to warrant being censored? If a candidate’s supporter tweets that more than a thousand people showed up at the candidate’s speech when there were 100 people, is that news Twitter somehow should know is fake and therefore should censor? If someone posts on Facebook that a law voted for by a candidate put a hundred people in her town out of work, is Facebook supposed to know whether it is fake news?

Censorship decisions are made by whoever has the most power. Censorship will strip the people with less power, by definition the disenfranchised, of their chance to be heard peaceably. Beyond the moral question, we should ask what the disenfranchised will do to be heard if Facebook and Twitter are forced to block what they or the government decide to block. The classic recourse is to turn to protests and riots.

It may be impossible for Facebook and Twitter to meet the technical challenge of spotting fake news, and it may be dangerous for them to censor news that somebody gets to decide is fake, but there is one challenge the companies can meet. They can meet the challenge of doing more to identify posts that obviously are questionable. Wikipedia, for example, uses volunteers who post warnings about content that seems unsupported or suspect.

Political ads in a U.S. campaign paid for in rubles are easy to spot and easy to classify as suspect. They gave politicians easy shots at Facebook and Twitter. The companies undermine the persuasiveness of their position that it is impossible to spot everything when they don’t spot even the obvious.

On the other hand, politicians should be careful what they push for. Do they want Facebook and Twitter to more carefully scrutinize and label posts supporting their re-election bids? Does the U.S. want other countries to more carefully scrutinize the U.S.’s efforts to influence elections in other countries?

Do we want to go back to the system in which a small number of powerful media companies that are not open to content from all of us decide what we will and will not be told? Voters re-elected Franklin Roosevelt without knowing about his weakened health because media companies agreed they wouldn’t report it. Prior to Facebook and Twitter giving all of us a voice, presidents were elected by voters who didn’t know information the traditional media companies decided wasn’t news. Do we prefer having a few powerful companies decide what we will hear over having open channels that allow us to hear a billion people, even when some of them are posting baloney?

And do we want the government deciding what Facebook and Twitter cannot let us read and see? That is a possibility, but it is a possibility that seems more Russian or Chinese than American.

Lecturer Marcus Collins: Do We Really Want to Turn Social Media Into the TSA?

It’s terribly difficult to police the platforms (that is, the environments where people connect with their people) without policing people. And it’s nearly impossible to do so without creating a new burden for the innocent.

Think of it like the Transportation Security Administration: There are a few bad apples in the bunch, and since we can’t spot them as easily as we’d like, we have to put the burden on everyone to be searched and delayed. But, ultimately, it creates an unpleasant environment.

And not unlike the TSA and air travel, there aren’t many alternatives outside of Google, Facebook, and Twitter for people to connect in the way they’ve become accustomed to.

These three platforms could create filters and signals that identify potentially nefarious behavior, but it comes at a cost: people will have to censor themselves. Again, not unlike what we do when we fly.

The good thing in this case, if there is one, is that these platforms (particularly Facebook and Google) are very good at identifying behavior. Whether it is what you do individually or what you do with your peers, these technologies have mastered the art of curation based on behavioral signals. The hope is that they would be just as effective at identifying suspicious behaviors and handling them accordingly.

Erik Gordon is a clinical assistant professor at Michigan Ross with a focus on entrepreneurship and technology commercialization. Marcus Collins is a lecturer of marketing and senior vice president of social engagement at Doner, a Southfield, Mich.-based advertising agency.

Media Contact: michiganrosspr@umich.edu