Social Networks Need To Be More Transparent and Responsible


Whistleblower Frances Haugen’s testimony to the U.S. Senate, together with The Wall Street Journal’s series of articles drawing on the thousands of pages of internal research and communications that Haugen made public, could transform the debate about the need to regulate social networks. It should also clarify the measures Canada needs to take to oversee them.

Of course, controversies surrounding Facebook and other digital platforms abound. There have been other whistleblowers, a good number of leaks and regular public hearings all over the world. This time, however, the situation is different, for three reasons.

First, even though civil society leaders, scientists, investigative reporters and political analysts have been documenting the harmful effects of social media for a while now, Haugen gave us detailed evidence from the companies’ own internal research. Now we know that they knew. Social media is especially harmful to our children’s well-being, helps hateful speech proliferate and compromises the integrity of our democratic norms. And while these effects are felt in the United States, Canada and Europe, they are even worse in the Global South, where the majority of Facebook’s users live but which receives only a fraction of the resources set aside for moderation.

Second, the computer engineer shed light on the source of these problems. Facebook and other social networks would like us to believe that they are simply mirrors of society, reflecting our prejudices, divisions and social ills. Quite to the contrary, they play a central role in perpetuating them. Their algorithms shape our behavior; they exert an enormous, and often harmful, influence on the messages and the people their users see and hear; and they are calibrated to hold our attention.

Too often, the content designed to capture our attention has harmful effects, and the operators of social media know it.

Third, the documents Haugen unveiled, like her testimony, show the limits and failures of self-regulation. They reveal that when forced to choose between mitigating the dangers its own research has identified and maximizing profits, Facebook often opts for the latter. None of that is particularly surprising, given that the company is one of the most profitable in history. Since its founding, it has focused essentially on its own growth, notably by building its incentive structures around that goal.

In short, the whistleblower has refocused the debate on the heart of the problem: business decisions, product design and incentive structures that too often favor profit and growth at the expense of public safety and democratic responsibility. Rather than dwelling on one symptom of a structural problem, namely hateful speech, Haugen is asking governments, with good reason, to impose measures on the companies in question that will make them more transparent and responsible. Fortunately, many of the regulatory mechanisms needed to address the problem already exist.

The Canadian government has been working for a year on legislation to address the spread of hateful speech online.

The desire to act is easy to understand; hateful speech affects politicians and the public alike, minority and marginalized groups in particular.

There are no doubt reasonable measures that can be taken to crack down on speech already deemed illegal, as we recommended last year. Although the proposed law provides for a governance structure to rein in social networks (a new regulatory body coupled with a panel of experts to guide it), the government has been criticized for casting these authorities in the role of public censor and for focusing too heavily on the symptom of the problem (hateful speech) while ignoring its causes (the reach of the platforms and their incentive structures).

We think the time has come for the government to give the new body the power to oversee transparency and responsibility, as Haugen is asking. That is why, this year, the Canadian Commission on Democratic Expression and the Citizens’ Assembly on Democratic Expression (an initiative of the Public Policy Forum) have worked on policies that would make social media operators more transparent and responsible for their own conduct and the consequences of their actions.

The idea is not new. It is precisely what we demand of every other industry.

We do not simply ask Pfizer to be careful when developing a COVID-19 vaccine; we demand proof that the vaccine is safe. For social media, such measures could include better data sharing and greater transparency around advertising, rigorous auditing of algorithms, commissioned studies on harmful effects and risks, and new ways of holding the companies concerned accountable.

This approach would minimize the risk of infringing on individual freedoms, since it recommends imposing here the same kinds of measures already imposed on other sectors. Greater transparency and responsibility will help clean up our public sphere and strengthen democracy. This is where governments ought to begin.
