Seeing a New Zealand flag flying at a neo-fascist rally in Germany in 2015 prompted Auckland University of Technology researcher David Hall to ask why violent radicalisation was affecting even his fellow Kiwis. He spoke to two specialists at the cutting edge of international counterterrorism.
It is possible, however, to use the hatemongers’ own online tools to reverse their radicalisation, including by offering them counselling. According to Vidhya Ramalingam, co-founder of Moonshot CVE, a company that specialises in online methods of countering violent extremism, we can plant seeds of doubt within hateful worldviews.
Ramalingam led the European Union’s first cross-government initiative on far-right terrorism and extremism, in response to the 2011 attacks in Norway. On April 30, she testified to the United States House Committee on Foreign Affairs about the global threat of white nationalism.
Moonshot’s strategy is to work with the dynamics of the internet, rather than against them. It has developed artificial intelligence (AI) tools to identify extremist individuals and groups by their online behaviour. Ramalingam says people on the path to radicalisation leave some sort of digital footprint, and not just on big, public platforms such as YouTube, Twitter and Facebook. Even “dark web” imageboards such as 4chan and 8chan are essentially public information, she says.
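Conceptually, behaviour-based identification amounts to weighted signal scoring. The toy sketch below illustrates only the general idea; the signal names, weights and threshold are invented, and nothing here reflects Moonshot’s actual tooling.

```python
# Toy behavioural-footprint scorer (illustrative only, not Moonshot's AI):
# weight simple online signals and flag accounts whose combined score
# crosses a threshold for human review. All names and weights are invented.

SIGNAL_WEIGHTS = {
    "posts_in_extremist_forum": 3.0,
    "shares_violent_content": 4.0,
    "follows_flagged_accounts": 1.0,
}
REVIEW_THRESHOLD = 5.0

def risk_score(signals: dict[str, int]) -> float:
    """Sum the weighted counts of observed signals for one account."""
    return sum(SIGNAL_WEIGHTS.get(name, 0.0) * count
               for name, count in signals.items())

account = {"posts_in_extremist_forum": 1, "follows_flagged_accounts": 3}
score = risk_score(account)  # 3.0 + 3 * 1.0 = 6.0
print(score >= REVIEW_THRESHOLD)  # flagged for review
```

A real system would, of course, learn such weights from data rather than hand-code them; the point is only that the raw material is public behaviour, not private surveillance.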
The next step is to engage and interact. This engagement can be passive; for example, targeted online advertisements for counselling services. Using controlled trials in the US, Moonshot found people looking to join or engage with violent far-right organisations were more than twice as likely as the general American population to click on mental-health ads. Committed extremists were 48% more likely. This does not mean violent extremism is merely a product of mental-health problems, but it does imply poor mental health is a risk factor. Targeted support offers an opportunity to divert people from the path of further radicalisation, reducing the likelihood of violent actions.
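The comparison behind those figures is a simple ratio of click-through rates. In the sketch below, the raw counts are invented for illustration; the article reports only the resulting ratios.

```python
# Back-of-envelope version of the comparison reported above.
# The raw counts are invented; only the ratio matters.

def click_through_rate(clicks: int, impressions: int) -> float:
    """Fraction of ad impressions that led to a click."""
    return clicks / impressions

general_population = click_through_rate(20, 10_000)
seeking_extremists = click_through_rate(45, 10_000)

ratio = seeking_extremists / general_population
print(ratio)  # > 2, i.e. "more than twice as likely"
```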
The Redirect Method, which Moonshot developed in partnership with Jigsaw, Google’s technology incubator, is similar. A pilot focused on Isis’ online audience used Google’s AdWords targeting tools to redirect searches to YouTube videos that told a less heroic story of Isis’ incompetence, injustice and illegitimacy. During the eight-week pilot in 2015, the videos reached more than 300,000 viewers, planting seeds of doubt that might disrupt the extremist filter bubble.
Interventions can also be more proactive. Moonshot deploys teams of social workers to initiate and sustain conversations with online extremists. “The ultimate objective is to transition that conversation from an online one to an offline face-to-face meeting, which is where it connects with what [German deradicalisation expert] Daniel Koehler does.”
What Koehler, director of the German Institute of Radicalisation and Deradicalisation Studies (Girds), and Ramalingam have in common is an emphasis on the civic and domestic spheres as neglected dimensions for countering extremism. After an atrocity such as the Christchurch mosque attacks, Ramalingam says “the instinct is to assume that the only agencies that can act are governments or tech companies”, usually by restricting hateful material.
She says takedowns are warranted in egregious cases, such as explicit incitement of violence against a community. But she says a lot of far-right extremist content won’t meet this threshold, at least partly because of its reliance on memes, metaphors, allusions and obscure in-jokes. “Even if you take down the content, the person who created it still exists.” At the same time, government censorship “plays into the narrative [of victimisation] that they use to recruit others”.
So, Moonshot represents “a second option, which is about engagement with individuals before they go on to commit acts of violence”. Rather than striving to stamp out all hate speech – an unrealistic goal in any case – this approach “relies on information being in the public domain. It’s a response that should have freedom-of-expression activists behind it.”
Of the Christchurch Call Paris summit, Ramalingam says she hopes the initiatives go beyond state regulation. There are other effective interventions possible to “proactively interact with individuals going down this path”, she says. The Global Internet Forum to Counter Terrorism, an existing partnership between tech firms and governments, already focuses on restrictive measures. But these address only “the tip of the iceberg”, the most serious, obvious offenders, leaving untouched the larger group of people en route to radicalisation.
Ramalingam remains convinced radicalised people can change. She says it just requires a sense of care for people who may seem to least deserve it.
According to Koehler, who is both an academic researcher and a family counsellor, the “lone-wolf terrorist” is a myth with regard to violent right-wing extremism. On the contrary, we should be listening for the howling of the pack to provide timely and effective interventions. He works directly with white supremacists and violent jihadis to achieve their “exit” from extremist organisations.
Koehler’s publications, which include a policy briefing this year on far-right extremism for the International Centre for Counter-Terrorism in The Hague, are essential texts in the global battle against violent radicalisation. He says right-wing terrorism is poorly understood, especially the networks of support it relies on.
“The concept of the ‘lone wolf’ can be regarded as thoroughly disproven by in-depth research studies looking at lone-actor attacks,” Koehler says. “Almost all of them had intense contact with other groups or social networks, either offline or online.”
This is exemplified by the Christchurch mosque gunman, whose manifesto is a conglomeration of shout-outs, in-jokes and white supremacist myth-making, tailored to a niche audience, including those on 8chan who cheered on the mosque atrocities as they were livestreamed on Facebook.
White supremacism has always been highly international and interconnected among different movements, groups and leaders, he says. “In the 1970s and 1980s, for example, US neo-nazis were regularly travelling to Germany, providing the Germans with propaganda material that was illegal there. The only limits white supremacists have regarding international connections are language and resources. The last one, unfortunately, becomes more and more obsolete through social media.”
Although far-right terrorist violence had declined over previous decades, recent years have brought evidence of an upswing. In Europe, right-wing terrorist attacks rose by 43% between 2016 and 2017. In the US, they doubled over the same period. Though this spike coincides with the presidency of Donald Trump, the average rate of attacks was already on the rise, increasing threefold during the second term of his predecessor, Barack Obama. Arrests are also increasing.
As the New York Police Department’s intelligence division reported in the wake of the September 2001 terror attacks in the US, the internet is a driver and enabler of violent radicalisation because it provides resources such as technical know-how and tactical expertise under a cloak of anonymity.
This not only empowers established terror organisations, but also facilitates what Koehler terms “hive terrorism”, where individuals with no direct ties to extremist groups can advance their violent ends in a seemingly spontaneous way.
“Radicalisation is essentially a process of depluralisation of political ideals and values. There usually is a decrease in perceived alternative solutions to a certain problem and an increase in the ideologically defined urgency to act. This creates the ticking time bomb of violent radicalisation.”
Whose voice is that?
The echo chambers of the internet exacerbate this problem, he says. By participating in subject-specific online forums, or by building social networks of like-minded individuals, users become detached from the pluralism of real political life. This is further reinforced by the internet’s “veil of objectivity”, which can have a normalising effect on extremist ideologies.
So-called “filter bubbles” have a similar effect, where online information is personalised according to a person’s search history or click behaviour. Whether it’s Google Search, Facebook’s News Feed or YouTube’s video recommendations, internet portals are rigged to deliver information that reinforces the revealed preferences of users, then supply content that sits ever further along the ideological spectrum. The growing fanaticism of anti-1080 and anti-vaccine protesters is a different manifestation of the same problem.
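The feedback loop at the heart of a filter bubble can be simulated in a few lines. The sketch below is not any platform’s real recommender, just a minimal illustration of how click-driven personalisation narrows what a user sees.

```python
# Minimal filter-bubble simulation (illustrative only, not a real
# recommender): each click raises a topic's weight, so recommendations
# increasingly come from whatever the user already engaged with.
from collections import Counter

def recommend(click_history: list[str], catalogue: list[str]) -> str:
    """Recommend the catalogue topic the user has clicked most often."""
    weights = Counter(click_history)
    return max(catalogue, key=lambda topic: weights.get(topic, 0))

catalogue = ["news", "sport", "conspiracy", "cooking"]
history = ["news", "conspiracy", "conspiracy"]

for _ in range(3):
    pick = recommend(history, catalogue)
    history.append(pick)  # the user clicks what was recommended

print(history)  # the loop keeps reinforcing the same topic
```

After three rounds, every new recommendation is the topic the user was already ahead on; nothing in the loop ever reintroduces the alternatives.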
The literature on radicalisation is clear about one thing: the causes of violent extremism are complex. There’s no single trigger, but rather an alignment of factors at the individual, group and society levels.
Not everyone who comes into contact with violent ideas or extremist organisations, or experiences geopolitical events, will acquire a lethal hatred towards certain populations. How these factors align to lead to violent outcomes will be unique to every individual, although not without patterns.
Koehler says it follows that the responses to counter violent extremism also need to be complex and aimed at all its developmental levels: the individual, group and society. Besides intervention and prevention for high-risk offenders, it will take mentoring, vocational training, education and strategies for building self-awareness and positive identities to reduce the triggers for radicalisation.
His family counselling work aims at reintegrating radicalised individuals into the security of domestic life. He says it’s crucial the counter-programmes do not all involve expanding the scope of the state, because governmental intervention can reinforce a person’s sense of grievance and persecution.
Koehler believes early, pre-criminal intervention programmes should be community-led in most cases, and those for high-risk offenders or individuals already convicted of terrorism should be government-led. He says there’s room for public-private partnerships, too, to ensure counterterrorism isn’t too dependent on the crude, coercive powers of the government. Development of programmes should be encouraged among families, in social networks, workplaces, schools and sports clubs, as well as in other community-based activities, to alter the environment where the alienation that fosters radicalism can start.
Koehler says criminalising acts of speech “has only limited reach regarding countering radicalisation”. Moreover, hate speech can be a “window” that alerts friends and family to signs of a person becoming alienated and radicalised. When someone is talking about certain ideas and expressing fear, anger or hatred, others can recognise the signs and respond.
For New Zealand, recognising this broader spectrum of possible interventions is vital. It is about more than managing incoming threats, such as that posed by the Australian-born gunman charged over the Christchurch mosque shootings. It is also about taking responsibility for outgoing threats such as Mark Taylor, the “Kiwi jihadi” who posed a threat to Syrian citizens by joining Isis, and reducing the risk of our own domestic extremists turning violent. Violent extremism is a globalised problem that requires a globalised response.
This article was first published in the May 25, 2019 issue of the New Zealand Listener.