Is it time to regulate Facebook?
The government shouldn’t regulate speech on private platforms, but it should rein in tech that is destructive to mental health and democracy.
I’ve argued against regulation of social media on numerous occasions. The recent testimony of Facebook whistleblower Frances Haugen, however, is persuasive evidence that something needs to be done about the company.
To be clear, I still don’t support rewriting Section 230 to mandate fairness on social media. I agree with Facebook and Twitter’s decisions to ban Donald Trump from their platforms and to enforce their own community standards. As I’ve said in the past, these are private platforms and they have the right to set their own rules. That was true even when Facebook deleted a political page that I had operated for more than a decade.
Rather than viewpoint discrimination, the real danger of social media is the prospect of the platforms spreading mental illness and deepening our political divisions. These dangers are among the claims made by Haugen, many of which are backed up by internal company documents that she took before leaving Facebook last May.
While it is not the government’s job to regulate the hurt feelings of private citizens who are ejected from private platforms, public health and national security do fall directly under government jurisdiction. Haugen’s allegations, along with documents obtained by the Wall Street Journal, show that Facebook negatively affects both mental health and national security. Unlike free speech claims against private entities, these are areas where government regulation is appropriate.
Internal documents released over the past few weeks show that Facebook targeted teenage girls with its Instagram app even though it was aware that its algorithms steered them to topics that caused about a third to experience anxiety and depression. Alarmingly, about six percent of teens who reported suicidal thoughts said that the idea originated on Instagram. Reports that some teens say Instagram improves their self-image do not overcome this chilling fact.
Beyond that, engineers design social media sites to be addictive. They want to keep you engaged so that you keep scrolling and keep seeing ads. We engage on social media at the expense of real world interactions. As a result, we are both more connected and more isolated than ever before.
On top of that, Facebook’s attempts to increase “meaningful social interactions” (MSIs) between family members and friends backfired in a big way. An algorithm change meant to encourage people to interact rather than just read content online caused content creators to shift toward “clickbait” posts that were full of outrage and sensationalism. These posts generated lots of comments and shares, but they also generated a lot of anger.
As David French recently pointed out, Democrats and Republicans agree on a lot of issues yet both sides see the other as radicals who threaten America’s very existence. The sensationalist clickbait makes us focus on our differences rather than what we have in common. This drives us farther apart and deepens our divisions.
“Our approach has had unhealthy side effects on important slices of public content, such as politics and news,” a team of Facebook scientists said in documents obtained by the Wall Street Journal, calling the problem “an increasing liability.”
Further, Facebook was aware that its platform was being used for nefarious purposes: sex trafficking, incitement of violence against minorities, organ selling, pornography, governments quashing political dissent, and, of course, the widespread dissemination of misinformation and conspiracy theories. There is also evidence that Facebook was used by the January 6 insurrection plotters. Leaked documents show that Facebook knew about these problems and failed to take action to stop them.
“I saw Facebook repeatedly encounter conflicts between its own profit and our safety. Facebook consistently resolved these conflicts in favor of its own profits,” Haugen said in her testimony. “As long as Facebook is operating in the shadows, hiding its research from public scrutiny, it is unaccountable. Until the incentives change, Facebook will not change.”
The picture painted by Haugen’s testimony is of a corporation bent on profits at all costs. The revelations about Facebook’s internal workings and the bevy of coverups make the company seem to rival, if not outpace, the tobacco companies in the damage it does worldwide.
At the same time, Facebook allows us to exist in bubbles of confirmation bias. Unless they choose to seek out opposing points of view, users need never encounter a contradictory opinion, at least not one that isn’t shouted down by likeminded users in the comments. Most people would rather have their beliefs affirmed than questioned, so massive blind spots to reality have developed.
The question is what to do about it. The problem is made more complicated by the fact that Facebook is heavily used by a large number of small businesses which rely on the platform for marketing and sales. Around the world, 200 million businesses use Facebook’s various tools. At this point, shutting the company down would be damaging to the economy. The six-hour shutdown on Monday cost some businesses thousands of dollars.
I’m no tech expert, but there are a few changes that seem like obvious ways to start fixing the problem. First, trash the algorithms. If you’re on Facebook, you’ve probably noticed that you never see posts from a lot of your friends. That’s because Facebook’s algorithms reward posts that generate engagement in terms of likes, comments, and shares. This also rewards controversial and anger-inducing posts from internet trolls, however.
A quick fix would be to just let posts appear on users’ walls in chronological order. If users want to seek out clickbait that’s one thing, but it’s another entirely to shove it in their faces.
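To illustrate the difference, here is a minimal sketch in Python. This is not Facebook’s actual ranking code; the scoring weights and example posts are entirely hypothetical, but they show how engagement-weighted ranking can surface outrage bait while a chronological sort simply shows the newest posts first:

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int   # seconds since epoch
    likes: int
    comments: int
    shares: int

def engagement_score(post: Post) -> int:
    # Hypothetical weights: comments and shares count for more than likes,
    # so posts that provoke replies and resharing rise to the top.
    return post.likes + 5 * post.comments + 10 * post.shares

posts = [
    Post("friend_a", timestamp=100, likes=20, comments=1, shares=0),  # ordinary update
    Post("troll",    timestamp=50,  likes=5,  comments=40, shares=12),  # outrage bait
    Post("friend_b", timestamp=200, likes=2,  comments=0, shares=0),  # most recent post
]

# Engagement-ranked feed: the troll's post wins despite being oldest.
ranked_feed = sorted(posts, key=engagement_score, reverse=True)

# Chronological feed: newest first, no reward for provoking anger.
chronological_feed = sorted(posts, key=lambda p: p.timestamp, reverse=True)

print([p.author for p in ranked_feed])         # troll first
print([p.author for p in chronological_feed])  # friend_b first
```

The point of the sketch is that the choice of sort key is a policy decision: ranking by engagement structurally favors whatever content generates the most reaction, regardless of its effect on users.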
In the meantime, users can help themselves by not engaging with trolls. If we don’t feed the trolls, engagement is limited and their posts won’t be prioritized as highly.
Second, as Haugen pointed out, Facebook’s tools designed to deter the spread of misinformation are woefully inadequate. We’ve probably all had the experience of having an innocuous post removed or flagged because it tripped some aspect of the algorithms.
Haugen said that Facebook is “overly reliant on artificial intelligence systems that they themselves say will likely never get more than 10 to 20 percent of the [malicious] content.” Judging from the anti-vax, stolen election, and pandemic fascism posts that I see as I scroll, she’s probably right.
Better AI and a much larger team of human moderators could solve this problem. It would take an army to police Facebook’s 1.8 billion daily active users, but the company can afford it. Facebook’s profit in 2020 was $32 billion.
For years, Facebook’s Mark Zuckerberg has said that the company welcomes federal regulation of the social media industry. In response to Haugen’s testimony this week, the company put out another statement that argued that Facebook was being mischaracterized but nevertheless called for regulation.
“We agree on one thing; it’s time to create standard rules for the internet,” Facebook’s statement said. “Instead of expecting the industry to make societal decisions that belong to legislators, it’s time for Congress to act.”
The congressional Facebook hearings did show two things. One is that the evidence shows Facebook needs to be held accountable for its actions and decision-making. The second is that we cannot trust Mark Zuckerberg and Facebook to hold themselves accountable.
Now, if you’ll excuse me, I’m off to schedule The Racket’s social media posts for the day.
I'd say that the best thing that the gov't could do is force Facebook (the social media network) to divest itself of Instagram, WhatsApp, and most importantly Messenger.
At this point, the social media network (the part with the feeds) is largely immune from competition from another social media network, not because their social media network is so valuable, but because they either acquired the competition (Instagram) or they control complementary products that DO produce a great deal of value (Messenger, WhatsApp) but keep people locked into the social media platform. And the incentives of the complements are to lift the social media network, as that's where the advertising revenue comes from. If the messaging components were split from the social media network, there would be much less incentive to engage with the toxic part of the platform, which would allow competitors to do to Facebook what it did to MySpace. Snapchat and TikTok are "kind of" attacking that angle, but they are more akin to a different kind of social media network (one that's primarily video-driven) than trying to better serve the other (read: OLD) constituencies that Facebook has captured.
And if you're an entrepreneur seeking to build a better Facebook, the FIRST thing that you need to build after you get your basic interface up and running is a mechanism for people to move their content from Facebook (that users can download themselves) to your network with a minimum loss of fidelity. Because while social connections are a strong force that Facebook uses to engender lock-in, a person's chat histories, posts, and contact lists also lock users into a platform. Finally, if you are doing this, get in touch via GitHub, because I've been writing a TON of open-source commercial-friendly code[1] to extract and process Facebook data on behalf of some of my research clients who are ALSO looking into behavior on Facebook and a variety of social networks.
[1] https://github.com/audacious-software/Passive-Data-Kit-External-Data/blob/master/importers/facebook.py
Great article. On your point about adding more human moderators, you should watch this harrowing interview with a former Facebook moderator to see how inadequate and abusive that side of the business can be for those involved. https://m.youtube.com/watch?v=cHGbWn6iwHw