
Facebook won't give up its insidious practices without a fight

Facebook CEO Mark Zuckerberg waits to testify before a joint Senate Judiciary and Commerce committee hearing in 2018 in Washington, DC. Photo/Getty.

In the wake of the live-streamed attacks on the Christchurch mosques, criticism of Facebook rightly reached a crescendo this week.

In Menlo Park boardrooms, decorated in Facebook blue, more discussion about New Zealand will have taken place this week than at any other time in the company’s history, culminating today in news that Facebook will review its live-streaming policy.

Why hasn’t it just suspended live-streaming full stop until it sorts out its content-moderation issues, as many have quite rationally suggested? Because even back in 2015, in the nascent days of Facebook’s live-streaming service, when only celebrities and public figures could access it, it became clear that live video significantly increased engagement with Facebook content.

Facebook is not going to unplug that powerful attention-grabber without a fight – one bigger than the fight that has been brewing this week. Neil Finn may have quit social media in protest, a handful of New Zealand advertisers pulled their social adverts and Milford Asset Management dumped its Facebook shares.

But even the Cambridge Analytica scandal didn’t massively damage Facebook. It continued on, pretty much with business as usual, while ramping up its PR and lobbying campaign.

The inadvertent live-streaming of mass murder has sparked a visceral response not just here, but around the world. Experts I’ve spoken to in Australia and the US this week were genuinely upset and angry at what had happened.

But it has also focused the discussion on the failings of content moderation, which all social media networks are understandably struggling with. We should be angrier about the long-term behaviour-modification techniques Facebook has been perfecting over the years, glimpses of which we occasionally see when Facebook’s researchers publish in peer-reviewed journals or internal documents are leaked.

Emotional contagion

The last time New Zealand featured prominently in Facebook-related news was in 2017, when The Australian newspaper obtained a 23-page confidential document showing that Facebook’s Australian office had conducted research on Australian and New Zealand high-school and tertiary students, attempting to identify emotionally vulnerable and insecure youth who might need a “confidence boost”.

That could be useful if Facebook had an altruistic goal in mind: helping at-risk teenagers in conjunction with mental-health agencies. But the report, prepared by two Facebook executives, David Fernandez and Andy Sinn, using internal Facebook data not publicly available, was written to serve the interests of the social network’s advertisers.

Using its algorithms to monitor newsfeed posts and photos and a young user’s interaction with content through comments, likes and shares, Facebook detailed in the report how it was able to ascertain a person’s emotional state – categorising them with tags such as “anxious”, “nervous”, “defeated”, “stressed” and “useless”.
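To make that concrete, here is a deliberately crude sketch of the kind of mapping the report implies: posts and engagement signals in, emotional-state tags out. The keyword lists and function below are entirely hypothetical; Facebook’s actual classifiers are undisclosed and far more sophisticated.

```python
# Hypothetical sketch only. This illustrates the *kind* of mapping the
# leaked report describes, not Facebook's real (undisclosed) system.

EMOTION_KEYWORDS = {
    "anxious":  {"worried", "nervous", "can't sleep", "on edge"},
    "defeated": {"give up", "pointless", "failed", "no use"},
    "stressed": {"overwhelmed", "too much", "deadline", "exhausted"},
    "useless":  {"useless", "worthless", "not good enough"},
}

def tag_emotions(posts: list[str]) -> dict[str, int]:
    """Count naive keyword hits per emotional tag across a user's posts."""
    counts = {tag: 0 for tag in EMOTION_KEYWORDS}
    for post in posts:
        text = post.lower()
        for tag, keywords in EMOTION_KEYWORDS.items():
            if any(kw in text for kw in keywords):
                counts[tag] += 1
    return counts

posts = ["Deadline tomorrow and I'm exhausted", "I failed again, what's the point"]
print(tag_emotions(posts))  # {'anxious': 0, 'defeated': 1, 'stressed': 1, 'useless': 0}
```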

The document was to be shared with advertisers under a non-disclosure agreement and noted that Facebook had the power to target over 6.4 million "high schoolers, tertiary students," and "young Australians and New Zealander...in the workforce".

Facebook even revealed how it could track the change in a young person’s emotions over the course of the week.

“Anticipatory emotions are more likely to be expressed early in the week, while reflective emotions increase on the weekend,” the report read, according to The Australian, which never published it.

“Monday-Thursday is about building confidence; the weekend is for broadcasting achievements.”

The clear implication was that advertisers could use Facebook to exploit the “biology of buying” to convert young people into customers.

Facebook initially apologised and said the report represented a “process failure” in the company. But it later defended it, saying the research was “intended to help marketers understand how people express themselves”, not to target ads at them.

Surveillance capitalism

It wasn’t the first time Facebook users had become unwitting participants in an experiment designed to understand their behaviour so it could be exploited for commercial gain.

In her weighty new book, The Age of Surveillance Capitalism: The Fight For A Human Future At The New Frontier of Power, social psychologist Dr Shoshana Zuboff details how Facebook and Google, in particular, have built a massive behavioural-science capability to mine our data for insights about us that can be used by advertisers.

“Facebook owns an unprecedented means of behaviour modification that operates covertly, at scale, and in the absence of social or legal mechanisms of agreement, contest and control,” she writes.

“At no other time in history have private corporations of unprecedented wealth and power enjoyed the free exercise of economies of action supported by a pervasive global architecture of ubiquitous computational knowledge and control constructed and maintained by all the advanced scientific know-how that money can buy.”

The book details Facebook’s previous experiments, some of which it proudly and naively presented to the world. There was the ‘emotional contagion’ research, conducted in 2012 and involving nearly 700,000 Facebook users.

It effectively showed that Facebook could change how its users behaved in their own posting by altering the emotional expressions in the posts displayed in their newsfeeds. The effect was small but evident: whether people were steered toward happier or sadder content, their own tone changed to reflect their newsfeed.

Spread across millions of social media users, those subtle manipulations could have major impacts.
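As a rough illustration of how such an effect is measured (not the study’s actual code or data), one compares the rate of emotion-laden words in what users post after their feeds are altered against a control group whose feeds were left alone:

```python
# Illustrative sketch with made-up numbers, not the 2014 study's data.
# The published experiment compared the emotional content of users' own
# posts after positive or negative posts were suppressed in their feeds.

def positive_word_rate(posts: list[str], positive_words: set[str]) -> float:
    """Fraction of all words across posts that are 'positive'."""
    words = [w.strip(".,!?").lower() for p in posts for w in p.split()]
    return sum(w in positive_words for w in words) / max(len(words), 1)

POSITIVE = {"happy", "great", "love", "wonderful", "excited"}

control   = ["Love this wonderful day", "Feeling great and excited"]
treatment = ["Another dull commute", "Meh, nothing great today"]  # negativity-boosted feed

effect = positive_word_rate(treatment, POSITIVE) - positive_word_rate(control, POSITIVE)
print(f"Shift in positive-word rate: {effect:+.3f}")
```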

The research, published in the prestigious journal Proceedings of the National Academy of Sciences in 2014, immediately met with controversy in the research community because Facebook had manipulated people’s newsfeeds without informed consent, in some cases triggering negative emotions.

The backlash led Facebook to review its research processes and ethics. An earlier experiment, conducted during the 2010 midterm elections in the US, showed that by manipulating voting-related messages in newsfeeds, Facebook could influence how many people participated in the democratic process.

Its researchers, who published their results in the journal Nature, had inserted a single voting-related message into the newsfeeds of a staggering 60 million US Facebook users on November 2, 2010 – election day. Comparing groups that received differing messages with a control group that received none at all, they were able to show that they could persuade people to vote. Displaying information on where to vote, and photos of a user’s friends who had flagged themselves as having voted, led to more users clicking Facebook’s “I Voted” button.

The researchers estimated that the voting-related messages sent 60,000 extra voters to the polls in 2010, with a further 280,000 voting as part of the social contagion effect – having seen and been influenced by friends who had advertised the fact they’d voted.
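The back-of-envelope arithmetic behind an estimate like that is simple. The turnout lift below is an assumed figure, chosen only to be consistent with the 60,000 quoted above rather than taken from the study:

```python
# Illustrative arithmetic only: the turnout lift is a made-up figure
# chosen to be consistent with the ~60,000 direct effect quoted above.

users_shown_message = 60_000_000   # newsfeeds that carried the voting message
turnout_lift = 0.001               # assumed +0.1 percentage-point lift in turnout

direct_extra_voters = users_shown_message * turnout_lift
print(f"Estimated extra voters (direct): {direct_extra_voters:,.0f}")  # 60,000

# The researchers attributed a further ~280,000 voters to social contagion:
# people influenced by friends who had advertised the fact they'd voted.
```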

While Facebook had not tried to nudge voters toward a Democratic or Republican vote, the size of voter turnout itself can make a big difference to an election outcome. The experiment also showed the staggering power of the social network to influence elections – an ability Cambridge Analytica was later found to have tried to exploit, using Facebook data in an attempt to influence the 2016 US presidential election.

These are just two examples of thousands of experiments Facebook researchers have conducted over the years, the vast majority of which haven’t been made public.

Their results, when occasionally aired, aren’t as confronting as live-streamed mass murder, but they represent the real underlying problem with Facebook and its largest rivals.

An un-mutable crescendo

As Zuboff puts it: “In declaring the right to modify human action secretly and for profit, surveillance capitalism effectively exiles us from our own behaviour, shifting the focus of control over the future tense from ‘I will’ to ‘You will’.”

We were all able to decide for ourselves our level of outrage over the gunman’s 17-minute video spreading across the world’s largest social network. But it is the subtle tweaks and manipulations designed to modify our behaviour in the service of advertisers that are more insidious and potent.

The answer to all of this is perhaps best summed up by former Facebook executive Antonio Garcia-Martinez, who around 2012 was working on Facebook’s data and adverts platform.

“The hard reality is that Facebook will never try to limit such use of their data unless the public uproar reaches such a crescendo as to be un-mutable,” he wrote in the Guardian in the wake of the emotional contagion research coming to light.

“They’ll slip that trap as soon as they can. And why shouldn’t they? At least in the case of ads, the data and the clickthrough rates are on their side.”

The Age of Surveillance Capitalism: The Fight For A Human Future At The New Frontier of Power, Shoshana Zuboff, $55, Allen & Unwin.