Next week’s ‘Christchurch Call’ summit in Paris will see Prime Minister Jacinda Ardern in the international spotlight again, as she joins French president Emmanuel Macron in leading an effort to stop the spread of violence and extremism online.
We are unlikely to see the seeds of new legislation formulated in Paris, but rather some action points the various tech companies will sign up to, a mix of voluntary company policy changes and technical measures that the governments involved will agree to monitor.
The high-level phone calls have been made to soften up Mark Zuckerberg and his opposite numbers at the social media giants. The policy wonks have a draft agreement ready to go. But even the best outcome from Paris (action to ensure the next terror attack won't be beamed out via a service that was designed for live-streaming birthday parties) will be insufficient in the scheme of things.
Digital threats to democracy
In fact, the tech companies will secretly relish the chance to prostrate themselves at the feet of world leaders looking for their own political quick wins. The summit’s focus on the moderation of online content diverts attention away from the more fundamental issues the digital giants pose to society.
Those issues are well canvassed in the new Law Foundation-funded report Digital Threats to Democracy, released today, one of the most in-depth reports published locally to date on the negative side effects accompanying the rise of digital media.
The problems, the report explains, have three drivers: the sheer power a small handful of ‘monopoly platforms’ have amassed; the lack of transparency around the computer algorithms that underpin them; and the attention economy that has turned our social media data points into digital gold for Silicon Valley.
“There have been plenty of signs that more government action was needed for quite some time,” says the report’s lead author Marianne Elliott, the co-director of research, policy and communication think-tank The Workshop.
“I find it difficult to make peace with the fact that it took an event like Christchurch to get the government's attention on the issue.”
New Zealand has taken a hands-off approach to regulating the activities of big tech companies, even while Australia has introduced laws to hold social media companies to account for spreading violent material, and the European Union has hammered Google and others with billions of euros in fines for anti-competitive behaviour and privacy breaches.
The oft-repeated argument, including from Ardern, is that New Zealand is simply too small to make a difference. Even worse, with digital platforms now so integral to the economy, we could risk Facebook, Google or Amazon reducing or blocking local access to some of their services if the cost of doing business here increased significantly.
“It's likely a risk that's at the forefront of their thinking when they are considering how to hold these companies to account,” admits Elliott.
One of the report’s key recommendations is to explore new “smart” regulation and legislation to apply the “downward pressure” required to force social media companies to change their behaviour.
But enforcing our existing laws would go a long way as well. From Google breaching court suppression orders with its automated publishing systems to Facebook refusing to assist the Privacy Commissioner with his privacy investigations, tech companies have flouted New Zealand law with few consequences.
“Even our existing regulatory frameworks are not being respected,” says Elliott.
Would a regulatory crackdown see them turn their backs on us?
“It's untested, to be honest,” she says.
“These are very big companies. They have been met with some hefty fines in other jurisdictions. We need to test some of those assumptions.”
Regulation could come on multiple fronts, from privacy and data protection to anti-trust action to lessen the digital platforms’ stranglehold on online advertising. But while the report features a thorough literature review detailing the evidence of harm and risks posed by the digital platforms, there’s very little research to guide us on what the implications of reining in Big Tech would be.
However, Elliott says there are plenty of examples of where the companies have changed to comply with the law.
“Facebook has hundreds of human moderators in Germany ensuring that the content in Germany meets German regulatory standards, particularly around Holocaust denial,” she says.
“The only way they'll do it is if it costs them a significant amount of money if they don't do it.”
Harder to tackle, but essential to reducing harm in the digital realm, was a more ethical approach to the design of digital platforms. The incentives to continue mining our data to yield insights to attract advertisers, and to keep us scrolling, swiping and watching in their apps, were incredibly powerful.
But change could come from within, says Elliott, with growing signs that Silicon Valley’s workforce, highly-paid but often employed on tenuous, short-term contracts, was increasingly questioning the direction of their companies.
“These are people who understand exactly what is inside the ‘black box’. They have genuine ethical concerns about what they are building.”
Sensing the growing heat of regulation and pressure from their own employees, even the most profit-driven companies have been smart enough to move before they are pushed. The Time Well Spent movement, created by former Google product manager Tristan Harris, who became disillusioned at the psychological ‘hacks’ developers in the Valley were using to keep people on their platforms for longer, has resulted in Google and Apple introducing digital wellbeing features on their phone operating systems.
“You are already seeing Facebook moving away from the algorithmically-curated newsfeed to private messaging, where people have more control over what they see and where it is coming from,” says Elliott.
Concerns raised by engineers and software developers over uses of emerging technologies like artificial intelligence and facial recognition have spurred Microsoft and Google to rethink some of the projects they are working on, particularly those with a military or government application.
The other pressure point was collective consumer action, something Elliott believes “will happen, but not quickly enough”. The #deletefacebook movement that briefly peaked in the wake of the Cambridge Analytica data scandal ultimately didn’t dent the growth of Facebook’s user base, which now sits north of 2.3 billion people.
Part of the problem was that a lack of transparency around how online products and algorithms worked made it hard for the average Facebook or Google user to know what was at the heart of their discontent.
“Consumers who demand more ethical eggs know exactly what it is that makes caged eggs unethical to them. They can name it, they can describe it.”
The secrets of Google’s search engine algorithms and Facebook’s systems for organising posts in our newsfeeds were the intellectual property on which the companies were built. But researchers interviewed by Elliott saw merit in the idea of a publicly-funded independent watchdog, which could scrutinise proprietary social network systems on our behalf.
“It feels like an idea that has legs, but it would have to be staffed by people who previously worked for those companies in order to understand what they were seeing,” says Elliott.
Likewise, an independent body could counter the “consent fatigue” that had set in among users of online services greeted with privacy policies and terms and conditions, by overseeing the use and sharing of types of data.
“As a citizen I could create a privacy profile which would set out my preferences for the types of data I'm willing to share and with whom.”
The idea of an identity verification system was explored in the report as a means of trying to counter the spread of misinformation and hate speech online. Aside from the fact that anonymity gave a voice on the internet to people who might otherwise be persecuted if their identity was known, such a system would require us to put our faith in tech companies that had let us down repeatedly on data security.
“They are incredibly irresponsible with the data,” says Elliott.
“I wouldn't trust them with my real name if my real name needed to be protected.”
Elliott hopes the Christchurch Call will at least look beyond content moderation on social networks to identify ways of preventing people from disappearing into a filter bubble of hate speech and becoming radicalised.
Beyond Paris, New Zealand had to address its fragmented approach to tackling issues thrown up in the digital realm. Elliott says a new team at the Department of Internal Affairs has emerged to finally look more broadly at the issues.
But capacity across government to deal with the myriad issues thrown up in the digital realm was thin.
“There are a lot of tack-on solutions, including on content moderation, that aren't going to get us where we need to get to,” says Elliott.
“The harm is less obvious than pollution or a major oil spill. But it is real harm and it has to be taken seriously by government.”
The study’s recommendations:
- Restore a genuinely multi-stakeholder approach to internet governance, including meaningful mechanisms for collective engagement by citizens/users;
- Refresh antitrust and competition regulation, taxation regimes and related enforcement mechanisms to align them across like-minded liberal democracies and restore competitive fairness;
- Recommit to publicly funded democratic infrastructure including public interest media and the online platforms that afford citizen participation and deliberation;
- Regulate for greater transparency and accountability from the platforms, including algorithmic transparency and accountability for verifying the sources of political advertising;
- Revisit regulation of privacy and data protection to better protect indigenous rights to data sovereignty and redress the failures of a consent-based approach to data management; and
- Recalibrate policies and protections to address not only individual rights and privacy but also collective impact and wellbeing.