Money, technology and power combine in a potent mix that arouses astonishment and animosity. The mixture becomes even more compelling when it involves the possibility of manipulating the wills of hundreds of millions of people. All of this is present in the complaints about the use of data from 50 million Facebook users. It is a scandal that, for the time being, has cost the company $36 billion, the value its shares lost in a few days. But there is some exaggeration and mythologizing in that story of conspiracy and deception.
It has been known for several years that Cambridge Analytica used data obtained from Facebook to try to influence election outcomes. The news that has circulated for a couple of weeks is not, strictly speaking, fresh. On Dec. 11, 2015, The Guardian reported that Republican presidential candidate Ted Cruz had hired the company to create psychological profiles by mining vast databases taken from Facebook without users’ consent. Later, the company’s director, Alexander Nix, boasted of having contributed to Donald Trump’s triumph in 2016 and, before that, to the British right wing’s victory in the referendum to break with Europe.
Created in 2013 with the resources of millionaire Robert Mercer, a Republican backer linked to Stephen Bannon, the former Trump adviser known as a promoter of fake news, Cambridge Analytica is said to have refined the technique of “behavioral microtargeting” to steer the vote in different countries. In January 2017, journalists Hannes Grassegger and Mikael Krogerus explained, in “The Data That Turned the World Upside Down,” published on vice.com, the persuasion techniques developed by Michal Kosinski of the Psychometrics Centre at the University of Cambridge.
The psychologist and data scientist Kosinski has worked since 2008 with information obtained online to build predictive models. Using an application called “MyPersonality,” Kosinski sent questionnaires to Facebook users. His findings were released in April 2013 in the paper “Private Traits and Attributes Are Predictable from Digital Records of Human Behavior,” published in the Proceedings of the National Academy of Sciences of the United States.
Grassegger and Krogerus explained that the Polish specialist “proved that, on the basis of an average of 68 Facebook ‘likes’ by a user, it was possible to predict the user’s skin color (with 95 percent accuracy), the user’s sexual orientation (88 percent accuracy), and the user’s affiliation with the Democratic or Republican party (85 percent). But it didn’t stop there. Intelligence, religious affiliation, as well as alcohol, cigarette and drug use, could all be determined.”
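To make the technique concrete, here is a minimal sketch of trait prediction from “likes”: a plain linear classifier trained on a binary user-by-page matrix. The data are synthetic and the printed accuracy is meaningless; this illustrates the generic approach, not Kosinski’s actual model or dataset.

```python
# Minimal sketch of trait prediction from Facebook-style "likes".
# Synthetic data only -- not Kosinski's model or dataset.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_users, n_pages = 5000, 300

# Row = user, column = page; 1 means the user "liked" that page.
likes = rng.integers(0, 2, size=(n_users, n_pages))

# Invent a hidden association between likes and a binary trait.
hidden = rng.normal(size=n_pages)
trait = (likes @ hidden + rng.normal(size=n_users)) > 0

X_train, X_test, y_train, y_test = train_test_split(
    likes, trait, test_size=0.2, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```

With real likes, correlations of this kind are what yield the accuracies the article quotes; the machinery itself is off the shelf.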
The traces we leave on social networks such as Facebook are indications of the preferences, opinions and biographies that define us. With enough traces, it is possible to know a great deal about a person. But it does not follow that those who hold such information can use it to orient, much less modify, our political opinions.
These studies attracted the attention of Aleksandr Kogan, an assistant professor in the Department of Psychology at the University of Cambridge, who at the beginning of 2014 asked Kosinski to collaborate with him at Strategic Communication Laboratories. When Kosinski learned that SCL wanted to use his methodology to influence electoral behavior, he declined the invitation. Around that time, SCL created a subsidiary, Cambridge Analytica. For his part, Kogan, who was born in Moldova but studied in the United States, went to work in Singapore and changed his name. For some years he was known as Aleksandr Specter.
All of that information was published more than a year ago. What has now been confirmed is that, with funding from Cambridge Analytica, Kogan obtained data from 270,000 people on Facebook and, through them, the data of those users’ online friends. In this way he gathered information on 50 million people, an average of roughly 185 friends for each user who took the quiz. From those records, he was able to construct psychological profiles of some 30 million people.
Kogan compiled that information with the help of a Facebook application called “thisisyourdigitallife,” which asked users to take a personality test. In that way, it was easy to gather information about age, gender, sexual preferences and other traits. Kogan obtained Facebook’s authorization to extract that information, saying he would use it for academic purposes. The data collection itself was not illegal, but the data was then delivered to a third party, Cambridge Analytica, as part of a business arrangement. It is now known that Facebook executives had been aware since 2015 of how Kogan used that information.
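A toy sketch, with an invented network, of why the numbers grew so fast: each user who installed the app exposed not only their own record but their friends’ records as well. The figures below are arbitrary, chosen only to mirror the roughly 185-to-1 ratio implied by the article; nothing here reflects Facebook’s actual API.

```python
# Toy sketch of friend-graph harvesting on an invented network.
# Not Facebook's API; only the expansion arithmetic is illustrated.
import random

random.seed(0)
population = range(5_000_000)             # accounts in the toy network
quiz_takers = random.sample(population, k=2_700)

harvested = set(quiz_takers)
for user in quiz_takers:
    # Each consenting user drags a circle of friends into the dataset;
    # randint(70, 300) averages ~185 friends, the article's implied ratio.
    friends = random.sample(population, k=random.randint(70, 300))
    harvested.update(friends)

print(f"{len(quiz_takers):,} quiz takers exposed {len(harvested):,} profiles")
```

Scaled up by a factor of a hundred, the same arithmetic turns 270,000 consenting users into some 50 million harvested profiles.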
The construction of profiles based on preferences posted openly on Facebook, as well as on responses to questionnaires such as Kogan’s, allows the creation of algorithms, that is, small computer programs, that place practically individualized messages online. If I say on Facebook that I like red cars, it is very likely that vehicles of that color will appear on my wall on that network. And if other users who prefer those cars also like James Bond movies, the algorithms will present products associated with that personality profile.
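As a minimal sketch of that association logic, with invented profiles: count how often interests co-occur across users, then surface the interests (and hence the products) most often seen alongside a given one. Real ad systems are far more elaborate; this only reproduces the red-cars-and-James-Bond example in miniature.

```python
# Minimal sketch of interest co-occurrence targeting (invented data):
# if users who like red cars also tend to like James Bond movies,
# show Bond-associated products to red-car fans.
from collections import Counter
from itertools import combinations

profiles = {
    "ana":   {"red cars", "james bond movies"},
    "bruno": {"red cars", "james bond movies", "watches"},
    "carla": {"red cars", "gardening"},
    "diego": {"gardening", "cooking"},
}

# Count how often each pair of interests appears in the same profile.
co_occurs = Counter()
for interests in profiles.values():
    for a, b in combinations(sorted(interests), 2):
        co_occurs[(a, b)] += 1
        co_occurs[(b, a)] += 1

def suggest(interest, k=2):
    """Interests most often seen alongside the given one."""
    linked = {b: n for (a, b), n in co_occurs.items() if a == interest}
    return sorted(linked, key=linked.get, reverse=True)[:k]

print(suggest("red cars"))   # e.g. ['james bond movies', 'watches'] or similar
```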
The political opinions people express on social networks usually reflect sympathies and antipathies that were already formed. Research on mass media has shown that people make political decisions under a complex variety of circumstances that include, among other things, each decision-maker’s experience and environment, the weight given to the opinions of close friends, and changes in the political scene. In the United States, for example, Trump did not win the presidential election because of any manipulation of the individualized preferences of Facebook users. What the Cambridge Analytica data made possible was to organize the candidate’s campaign so that he frequented the places where voters were already willing to vote for him.
All of that was already known. Moreover, two weekends ago, The Observer of London published the revelations of Christopher Wylie, a specialist in predicting trends and a former employee of Cambridge Analytica, who revealed that data from 50 million users had been used. Wylie, a 28-year-old Canadian, says that in addition to participating in the handling of this information, he attended meetings between Nix and the millionaire Mercer at the end of 2013, at which Nix sought to persuade Mercer to invest $15 million in Cambridge Analytica. Wylie later left the company.
Wylie’s revelations have fed the investigations already under way in the European Union and the United States into the handling of massive data for electoral purposes. For its part, Facebook, acknowledging that Cambridge Analytica had been a client, canceled its dealings with the firm. On March 20, several videos of Nix, recorded surreptitiously by Channel 4 in the United Kingdom, were released. On the tapes, the director of Cambridge Analytica boasted that the company’s resources for compromising its clients’ political opponents included blackmail and bribes, in addition to the employment of “very beautiful” Ukrainian girls. “I find that works very well,” Nix said on the tape. Nix was fired from Cambridge Analytica, and his statements, obtained by journalistically questionable methods, indicate that not all of the company’s work rests on sophisticated computer algorithms.
The discontent of Facebook users is understandable. Nobody likes having their data known, much less manipulated. But do not forget that when we join that network, we accept the company’s use of “any content” we place there. As part of its data policy, Facebook warns: “We use the information we have to improve our advertising and measurement systems in order to show relevant ads.” There is no deception in that warning. What the Facebook contract with users does not expressly state is the possibility that those data will be transferred to another company.
The mess at Facebook underscores that the handling of personal data merits more precise rules and, above all, broad transparency. But keep in mind, too, the carelessness, or naiveté, of most social network users. We are uncomfortable that corporations have so much information about us, yet every day we feed them with the data we display on these platforms.
The data collected by Kogan, Nix, Wylie and Cambridge Analytica were obtained legally, from users who agreed to download and use the application and answered its questionnaires with uninformed innocence. Facebook was not hacked. Incidentally, those 50 million accounts are just 2.3 percent of Facebook’s 2.13 billion users worldwide.
To suppose that the 30 million people whose profiles were processed by Cambridge Analytica were manipulated into voting one way or another is a simplification without substance. But the case does highlight the risks of making decisions, commercial and above all political, about a society by trying to understand and act on it through big data harvested from computer networks.
To suppose that all citizens who share some preferences will always have the same ideological or political orientations is to ignore the singularity of each individual and the changes in society. That assumption leads to the belief that, within the diversity revealed by personal preferences, there is no further diversity in decisions about public affairs. Understanding the possibilities, but also the limitations, of managing big data is one of the new dilemmas for democracy.