Facebook, Google and Politics


Research demonstrates the power of social networks and search engines to manipulate public opinion without the public noticing.

On the cover of the first April issue of the British magazine The Economist, Mark Zuckerberg posed like a Caesar, with the earth's globe at his feet and the motto "coniunge et impera," "unite and rule," a clever inversion of the maxim "divide and rule" attributed to Philip of Macedonia. The magazine was referring to dominance of the next era of computing, not literal political supremacy, but why not? Information technology has stopped being just a sector of the economy and become the woof* on which not only virtually all production, commerce and finance are woven, but also culture, daily life and the power of the state.

Zuckerberg's employees perceive this clearly. One month before that edition hit the newsstands, they held an internal poll on questions to be raised at the weekly discussion with the boss. One of them was: "What is the responsibility of Facebook in helping prevent Donald Trump from being president in 2017?" It is not known how Zuckerberg responded, but a leak of the poll by the technology blog Gizmodo sparked a debate over how much Facebook and Google can influence politics.

The answer is worrying: Yes, they can drastically change the course of elections; at least in the United States there is no legal mechanism to stop it, and it is difficult to prove this type of intervention even after the election is over.

One can only count on the companies' informal promises not to do this, promises worth about as much as the ones they make to protect their users' privacy.

Since 2008, Facebook has displayed an "I voted" button every election day in the U.S. When someone clicks it, a message with his name and photo is sent to all his friends. In 2010, researchers at the University of California concluded that the campaign brought at least 340,000 additional voters to the polls, 0.6 percent of total turnout. In another experiment, the network increased the number of hard news stories at the top of the news feeds of 1.9 million users in the three months before the 2012 elections, and with that raised their electoral participation by 3 percent.**

It would be enough for the company to use the profile data at its disposal to identify users' political inclinations, encourage voters in tune with its own interests, and steer those with contrary views toward leisure tips and entertainment journalism. And this without even touching its unrestricted power to promote or bury posts and users, answerable to no one, or to impose strict and arbitrary rules on what content is acceptable.

In the U.S., 64 percent of adults use Facebook and 30 percent access news through it, an audience far larger than that of any newspaper, broadcast or cable network. In Brazil, eight in ten internet users are on Facebook. Today, publications that opt to post directly on Facebook receive 25 percent of their traffic through it.

And these numbers only tend to grow. Zuckerberg's goal is to gradually eliminate reasons to visit other sites and to isolate users from the rest of the internet, which is already a reality for those who use the Free Basics or internet.org service offered in several peripheral countries.

Even more impressive is the potential of Google. Starting in 2013, psychologists Robert Epstein and Ronald E. Robertson conducted experiments whose results were published in the Aug. 18, 2015 edition of PNAS, an official publication of the National Academy of Sciences in the United States.

Initially they ran simulated elections in the U.S., in which each participant gave an opinion before and after researching the candidates with a fictitious search engine. In an experiment with 2,000 participants from all 50 states, the candidate favored by the ranking gained an average advantage of 37 percent, ranging from 15 percent for "people with no specific party" and 17.9 percent for "moderate Independent women" up to 73 percent for "moderate Libertarians" and 80 percent among "moderate Republicans." In another experiment, biased search results changed opinions about the validity of hydraulic fracturing (the controversial technique for extracting gas and oil) by 33.9 percent.

What remained was to test whether this maneuver would work in a real election, in which voters would have access to other sources of information and some prior knowledge of the candidates. For imaginable reasons, this experiment was not done in the U.S., but it was in the 2014 Indian elections, in which the Hindu fundamentalist Narendra Modi defeated the center-left candidates Rahul Gandhi and Arvind Kejriwal and became prime minister. The 2,150 voters subjected to the manipulation received $1 to $4 for participating.

There were groups that leaned against the direction pushed by the search engine, for example -11.8 percent for "conservative women," but they totaled just 3.1 percent of the sample, and even with their inclusion, the average shift was 12.3 percent of the vote in the intended direction. For the rest, the distortion varied from zero for "employed women without a political ideology" to 72 percent for "unemployed men from Kerala," with an average of 24.5 percent. Even the 0.05 percent who said they suspected manipulation had their votes skewed. And this was the result of a single search; what if a candidate were favored for weeks or months?

Google, Epstein explains, has become the main gateway to knowledge and is generally good at placing the information sought in the top positions. About 50 percent of our clicks go to the first two items and more than 90 percent to the first ten. Because we constantly use it to check facts in routine searches, we are conditioned to trust the first results and to discard the rest as irrelevant or doubtful.
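The mechanism is easy to see in a toy simulation. The sketch below (my illustration, not from Epstein's study) assumes a click-probability curve that falls off steeply by rank, roughly matching the "50 percent on the first two, 90 percent on the first ten" pattern described above, and compares how much exposure one candidate's pages get when they alternate with the opponent's versus when they occupy the top slots. All numbers and names here are hypothetical.

```python
import random

# Assumed click probability by rank (rank 0 = first result); roughly
# matches "~50% on the first two items, >90% on the first ten."
CLICK_PROB = [0.30, 0.20, 0.12, 0.09, 0.07, 0.05, 0.04, 0.03, 0.03, 0.02]

def chosen_rank(rng):
    """Pick which search result a simulated user clicks, weighted by rank."""
    r = rng.random()
    cum = 0.0
    for rank, p in enumerate(CLICK_PROB):
        cum += p
        if r < cum:
            return rank
    return len(CLICK_PROB) - 1  # residual probability mass: last result

def exposure_share(favored_ranks, trials=100_000, seed=1):
    """Fraction of clicks landing on pages that favor one candidate."""
    rng = random.Random(seed)
    hits = sum(chosen_rank(rng) in favored_ranks for _ in range(trials))
    return hits / trials

# Neutral ranking: favorable pages alternate with unfavorable ones.
neutral = exposure_share({0, 2, 4, 6, 8})
# Biased ranking: the same five favorable pages moved to the top slots.
biased = exposure_share({0, 1, 2, 3, 4})
print(f"neutral exposure: {neutral:.2f}, biased exposure: {biased:.2f}")
```

Note that the biased ranking hides nothing and shows the same ten pages; merely reordering them shifts a large share of what users actually read, which is why this kind of intervention is so hard to detect from the outside.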

If the company wanted, it could use its database to identify likely undecided voters (and also the "againsts") and serve them personalized rankings favoring a candidate. This would not just be very difficult to detect; it would be legal. Not only the targeting but also the right to keep it secret is protected by the freedom of speech and of the press guaranteed by the First Amendment to the U.S. Constitution.

The researchers estimate that Google could currently swing the result of 25 percent of national elections in the world without anyone noticing. In the case of the U.S., where most presidential elections have been won by margins of about 7.6 percent (3.9 percent in 2012), Epstein estimates that Google could deliver 2.6 million to 10.4 million votes, 2 percent to 8 percent of the probable total, to Hillary Clinton in secret and without leaving evidence.

There may be those who believe the ends justify the means in this case, but the power of Silicon Valley to decide electoral results and the "truth" about issues of public interest should worry us even more than the risk of a Trump government. Just as shopping centers, private spaces that substitute for squares and parks, have the power to dictate their own rules of conduct so as to exclude collective demonstrations or even the presence of anyone considered undesirable, social networks and search engines privatize the space of public debate and impose their own rules on it.

If it is questionable for a large newspaper to deny space to an issue, candidate or current of opinion, it is much more serious when this policy comes from a social network or search engine on which users increasingly depend to find out what the media is saying. Worse still when that same network knows the political and social profile of each user and can give or deny him information according to what suits its private interests. In the very near future, no democracy will be real if its networks, search engines and other means of connection are not subject to supervision and public scrutiny. "Coniunge et impera."

*Translator’s note: “Woof” is a weaving term. The “warp” are the fixed strings on a loom and the “woof” represents the strings woven across, at right angles to the warp to form a fabric.

**Translator’s note: More details on these experiments can be found here and here.

Translated by Jane Dorwart (BA Anthropology, BS Musical Composition, Diploma in Computer Programming; Portuguese translator).
