Facebook’s reputational crisis in recent weeks is renewing calls for greater regulatory oversight of the company and, in some cases, proposals to establish alternative, democratically controlled social media platforms.

This week, a Leger poll found that 78 per cent of Canadians believe Facebook amplifies hate speech, and 58 per cent think the company — which also owns Instagram and WhatsApp — should be broken up or regulated. Nearly one-third of Canadians feel the platform has a negative impact on their lives.

These unfavourable public perceptions of the social media giant follow a U.S. Senate testimony earlier this month by Facebook whistleblower Frances Haugen, who accused the company of knowingly allowing its products to stoke division, promote eating disorders among teenage girls and destabilize democracies.

"The company's leadership knows how to make Facebook and Instagram safer, but won't make the necessary changes because they put their astronomical profits before people,” Haugen told the Senate Oct. 5.

The day before Haugen’s testimony, an hours-long outage took Facebook and its family of apps offline, raising questions about how reliant much of the internet has become on the platform for everything from reading the news to communicating with friends and family.

“When something we've become accustomed to disappears or is non-functional, and makes us aware of our reliance on it, it gives an opportunity to ask questions about do we really want to be so reliant on it,” James Turk, director of the Centre for Free Expression at X University, told The Maple.

Turk noted that a large number of small businesses rely on Facebook and Instagram to advertise and communicate with customers, meaning the domino effect of the outage was widespread.

Part of the problem, Turk explained, is that Facebook’s apps are so deeply integrated that when one fails, the others are at risk of falling with it. He noted that during the outage, even Facebook staff could not access internal communication channels or use their badges to enter company buildings and conference rooms.

Facebook acquired Instagram in 2012 and WhatsApp in 2014. At the time, Turk said, observers were concerned about the company’s growing control over social media, prompting calls for antitrust legislation (U.S. federal laws intended to promote competition and prevent corporate monopolies).

“There were various antitrust initiatives that were started and others were considered, and Facebook made it very clear that it was going to work aggressively to integrate Instagram and WhatsApp into Facebook in a way that they couldn't be disentangled as part of an antitrust initiative,” said Turk.

“In part, they succeeded to this point so that when Facebook goes down, it takes down everything else.”

Turk said these moves have made it much harder for regulators to disaggregate Facebook by forcing it to sell Instagram and WhatsApp. However, in Turk’s view, the main problem with the social media giant is not that it is so large, but rather the fact that such a basic and widely used service uses an algorithm that is subject to little public oversight and accountability.

“In most other sectors, when one company becomes so large with regard to something that's so important to the public, they often are brought under public control, or public regulation, whether it be hydroelectricity or water,” said Turk. “Part of the solution, I think, is a requirement that Facebook be more transparent about its algorithm, and there be some regulation.”

“(Facebook has) sort of become the public space for discourse, and for news and for content, and so its internal, private black-box decisions about how it's going to allocate what each of us gets to see has enormous impact, and is totally beyond the reach of any kind of democratic decision-making by people of any country. That's the problem,” he added.

Haugen’s testimony reinforced longstanding accusations that Facebook’s algorithm prioritizes hyperbole, extremism and controversy, giving users a distorted perception of the news.

The social media platform does this, said Turk, not for ideological reasons, but for “pure profit,” because users are more likely to stay on the site if they are agitated or engaged by provocative content.

“They know they make money by keeping people's eyes on their site, and what keeps people's eyes on their site is not factual, contemplative, thoughtful discussion of issues. It's outrage; it's extreme,” said Turk.

On the one hand, Turk continued, many rely on Facebook to aggregate free news content. On the other, he noted, Facebook’s algorithm has all but destroyed the traditional business model for many news organizations, forcing some publications to install paywalls and thereby cutting off a number of trustworthy sources for those who cannot afford steep subscription fees.

Still, said Turk, plenty of reliable news content remains freely available. Under these circumstances, he explained, “if people become less reliant on (Facebook), so much the better from my point of view.”

Public Sector Has A Role To Play

For some, calls for regulation and antitrust legislation are important steps, but do not go far enough. Paris Marx, a media Ph.D. student at the University of Auckland and host of the Tech Won’t Save Us podcast, told The Maple that the public sector has a key role to play in building alternatives to Facebook.

Marx cited the example of British author Dan Hind’s proposal for a publicly run digital co-operative in the United Kingdom, which, Hind writes, would “take primary responsibility for developing digital resources with which to articulate and inform a revived democracy,” and “act as a space for egalitarian collaboration as well as rapid technical innovation.”

Such an institution, Marx explained, could create new digital tools and social media platforms, and “ensure that those tools are not designed in ways that are about creating profit for a particular company and locking us into a company's business model, but it's actually around serving the public (and) responding to public needs.”

“It's about enhancing community relations; enhancing the public wealth of communities,” Marx added.

Establishing a new public institution would be better than simply nationalizing Facebook, Marx explained, because Facebook has for more than a decade developed deeply entrenched systems and practices that are harmful to society.

“I think it would be more fruitful to start from the ground up ... instead of taking over a platform that has a lot of baggage,” said Marx.

Other social media platforms, such as Parler and Gab (both used primarily by people with far-right political beliefs), were launched from within the private sector in attempts to rival Facebook and Twitter, but failed to gain any real traction. A public provider would have a better chance of success, explained Marx, because it would have the power of the state behind it.

“That is where I think that antitrust and regulatory measures also come in,” said Marx. “I think it's not just about creating a public platform, but it's also about using measures like competition policy and light regulation to break up Facebook.”

Marx noted that during her Senate testimony, Haugen emphasized that she does not think Facebook is irredeemably broken, and does not support antitrust legislation or other measures that would break up the platform. Her proposals, therefore, need to be understood in that light, said Marx.

“Her proposals are largely around creating a regulatory framework that proposes to provide more transparency into Facebook's actions, into its algorithms, and more accountability for Facebook with the actions that it takes,” explained Marx. “I would argue that doesn't go nearly far enough.”

Marx said that by placing emphasis on a few very egregious problems with Facebook, rather than fundamental issues that apply to the entire tech industry, Haugen’s proposals would risk allowing the rest of the sector to largely continue with business as usual.

“I think we need to question whether we do actually want to save Facebook, and whether that is something that is socially useful, socially desirable,” Marx added.

Others Say Facebook Is Too Centralized

Other critics say the centralization of Facebook’s operations is itself a serious problem that has created an extremely fragile system.

Matt Hatfield, campaign director at OpenMedia, told The Maple: “I think (the outage) was a good reminder of the risks of centralizing so many of our communications to a platform. If you go back in the history of the internet, it was originally set up as a decentralized information source that was mostly very resilient to disruption.”

“We have allowed the internet to become highly, highly centralized, in a way that's very, very inflexible and very vulnerable,” Hatfield added.

This month’s outage, he explained, had a particularly severe impact on people in developing countries, where WhatsApp and Facebook are the primary platforms for communication. For much of the world's population, he added, calls to simply log off and delete one's Facebook account in light of the platform's harmful effects on society are not a realistic option.

“There's a growing movement for platforms to be encouraged or potentially legislatively forced to be more open,” said Hatfield, adding: “I think (the outage) just really highlights not necessarily that we need to do away with Facebook as we need to make it much easier to function without Facebook.”

Alex Cosh is the managing editor of The Maple.