User data: Facebook has failed yet again to protect its users' data. Is this the last straw?
"We have a responsibility to protect your data, and if we can't, then we don't deserve to serve you." With those words, Mark Zuckerberg seemed perfectly contrite for a moment. According to the Facebook founder, just a few large changes are necessary: "The good news is that the most important actions to prevent this from happening again today we have already taken years ago," he said. This response was posted on Facebook on Wednesday after four days of deafening radio silence, the end of a train accident in slow motion that could be followed live via the internet.
The data scandal surrounding Cambridge Analytica has grown into the biggest crisis Facebook has faced so far. The toll after one week: the chief security officer quit, the company's market value plummeted by more than $40 billion, and founder Zuckerberg may have to testify under oath before Congress and the British parliament.
Is this just the umpteenth Facebook scandal that will blow over again? It does not look that way. Hidden in the seventh paragraph, but perhaps the most important sentence in Zuckerberg's statement, was this: "... it was (...) a breach of trust between Facebook and the people who share their data with us and expect us to protect it." Trust is crucial for Facebook: only when people trust you will they share their entire personal lives on your platform. The consequences of this breach of trust could be far greater than Zuckerberg now seems to realize. Is this the last straw for Facebook's users and for legislators?
The latest crisis began last weekend, when a whistleblower explained in The Guardian and The New York Times how the British company Cambridge Analytica had secretly obtained data from 50 million American Facebook profiles. According to the whistleblower, Cambridge Analytica used that data to build a "psychological warfare tool." With the data, political messages could be tailored precisely to the political preferences and psychological weaknesses of individuals, deduced from their posts, likes and friends. The company worked, among others, for Donald Trump.
Even though Facebook had known about the scandal for two years, it did little to effectively stop the abuse and failed to inform users. In fact, until shortly before publication, Facebook threatened journalists at The Guardian and The New York Times with a lawsuit. It also turned out that a person directly involved in the scandal had gone on to work at Facebook. Multiple official investigations are now underway.
Drastic Change Needed
Facebook is drawing harsh criticism, and not just from its customary critics. Roger McNamee, an early investor in Facebook and Zuckerberg's former mentor, actively sought out the American media. "There's been an increasing understanding that when you're using Facebook, a lot of bad things are going to happen to you, as a user," he said. According to McNamee, drastic change is needed, or the survival of the company will be in danger. The greater loss this week is the loss of trust: this crisis of public trust "is going to destroy the company," McNamee said.
Brian Acton, co-founder of WhatsApp — ironically, a Facebook subsidiary — tweeted: "It is time. #deletefacebook." The hashtag trended, although no data exist on how many people actually left.
The scandal touches a raw nerve. It is not the first time that, in a moment of crisis, Facebook has done exactly the opposite of what was needed. In November 2016, just after Trump had been elected following a campaign riddled with more untruths than any before it, Zuckerberg said: "... the idea that fake news on Facebook (...) influenced the election in any way is a pretty crazy idea." Only months later, corrected by a congressional investigation and scientific studies, did he walk that statement back. Zuckerberg also initially ignored warnings about polarization, Russian bots, social media addiction and various privacy violations.
Zuckerberg did take steps to curb the spread of fake news, and last year he published a grandiose "manifesto" in which he acknowledged responsibility for the data and well-being of 2.2 billion users and wrote that he wants Facebook to be a "connecting" and "inclusive" social network.
But this week, things went wrong yet again. It happens so often that the question arises: is this type of problem inevitable as long as Facebook's business model is based on massive data surveillance? That is what Harvard researcher Zeynep Tufekci argues. "The real problem is that billions of dollars are being made at the expense of the health of our public sphere and our politics," Tufekci wrote.
The stakes — the integrity of the democratic process — seem sky-high in light of the Cambridge Analytica scandal. This year there are elections in Brazil and India, two of the world's largest democracies and two of Facebook's biggest markets. In various countries, there are calls to ban political ads on Facebook entirely. In the European Union, a stricter privacy law — the General Data Protection Regulation — takes effect in May. In Europe, blind trust is already being replaced by stricter laws.
Zuckerberg has one important trump card.* As the owner of WhatsApp and Instagram, Facebook practically has a monopoly over social media. This has particularly enabled Facebook to get away with a great deal for years.
Emblematic of the entire crisis is an instant-message conversation from 2004. Zuckerberg observed with amazement that users shared all kinds of things with him. "I don't know why," he wrote. "Dumb fucks."
*Translator's note: No pun intended.