This Researcher Juggled Five Different Identities to Go Undercover With Far-Right and Islamist Extremists. Here’s What She Found – TIME
Wearing a blond wig and walking through the streets of central Vienna in October 2017, Julia Ebner reminded herself of her new identity: Jennifer Mayer, an Austrian philosophy student currently studying abroad in London. It was one of five different identities that Ebner, an Austrian researcher specializing in online radicalization and cumulative extremism, adopted in order to infiltrate far-right and Islamist extremist networks. That day in October, she met a local recruiter for Generation Identity (GI), the European equivalent of the American alt-right, which is mostly an online political group that rejects mainstream politics and espouses ideas of white nationalism. GI is the main proponent of the Great Replacement Theory, the baseless idea that white populations are being deliberately replaced through migration and the growth of minority communities. The theory has inspired several recent extremist attacks, including the murder of 51 people in Christchurch, New Zealand last March, and the mass shooting at a Walmart in El Paso, Texas last August, which left 22 people dead.
The meeting with GI's local leader proved to be significant. Ebner learned how important the group considered social media to its strategy for expanding and recruiting members in schools, public baths and other public venues that young people visit. She found out that GI was planning to launch an app, Patriotic Peer, that would connect a "silent majority" (in the leader's words), and that the project was funded by donations from around the world.
Securing the meeting wasn't easy. It took several months of setting up credible accounts within the various GI networks online and a couple of weeks of messaging with GI members. But it was necessary for Ebner's research: the 28-year-old is a resident research fellow at the Institute for Strategic Dialogue, a London-based think tank that develops responses to all forms of hate and extremism. She has advised the U.N., parliamentary working groups, frontline workers and tech firms on issues around radicalization, and her first book, The Rage: The Vicious Circle of Islamist and Far-Right Extremism, was published in 2017.
Two years ago, Ebner started to feel like she had reached the limits of her insights into the world of extremism. She wanted to find out how extremists recruit members, how they mobilize them to commit violence, and why people join and stay in the movements. Ebner believed she could only get her answers by being a part of these groups. Over the past two years, she has spent much of her spare time talking to people on online forums. They include a Discord group used by the alt-right to coordinate the violent Charlottesville rally in August 2017; the Tradwives (short for Traditional Wives), a network of some 30,000 far-right women who perceive gender roles in terms of a marketplace where women are sellers and men buyers of sex; and an online trolling army, Reconquista Germanica, which was active in the 2017 German federal election.
Ebner, whose new book Going Dark: The Secret Social Lives of Extremists is published Feb. 20, spoke with TIME about what she discovered. The conversation below has been edited for length and clarity.
Ebner: My first attempt at creating and maintaining a credible profile didn't work. I was kicked out of a group and had to start all over again.
I found switching between different identities stressful and confusing. Remembering exactly what I had said in my online profiles, previous chats and real-life conversations in these various roles could get challenging. Sometimes staying in my role and not being able to talk back as my real self was also difficult. There were many moments when I wanted to debunk a crazy conspiracy theory, or say "you're not funny!" instead of laughing at a racist joke, or convince younger members to cease their involvement with a group.
As you'd imagine, I made plenty of stupid mistakes. Dropping my real credit card was only one of them. Once I even signed an email with "Julia" instead of "Jenni." I'm not a professional MI5 agent; I did some acting in high school, but going undercover didn't come naturally to me.
I received some tips from a friend who has done undercover investigations himself and has also trained people to infiltrate dangerous groups. I probably did appear nervous, but I imagine most people who go to a first recruitment meeting with a white nationalist group leader would be, so I didn't think it would seem too suspicious.
In many cases, they offer an escape from loneliness and a solution to grievances or fears. A lot of the time it was a fear of a relative loss of status, which the networks blamed on migration and changing demographics. They offered easy explanations, oversimplified rationalizations, for complex social and political issues.
The networks also offered support, consolation and counselling. They can turn into a kind of family. Some people spend so much time online that I doubt they socialize in the real world.
On the surface, there was no clear profile. Users were from different age groups, social classes, educational backgrounds and, depending on the group, different ethnic backgrounds. The lowest common denominator was people who were in a moment of crisis. The recruiters did a good job of tailoring their propaganda to pick up vulnerable individuals. The Tradwives reached women who had relationship grievances, Islamist extremists recruited alienated Muslims who'd experienced discrimination, and white supremacists exploited people who had security concerns.
It was a major part of the recruiters' strategy. White supremacist networks, like the European far right, have a clear step-by-step radicalization manual, which they call "recruiting strategies." The Tradwives, for example, made themselves seem like a self-help group, and I think that's what attracted women from different ideological backgrounds, even those who don't subscribe to traditional gender roles.
Some groups, the European Trolling Army for instance, had tightly organized hierarchical structures. Neo-Nazi groups often have military-like structures; positions in the groups are even named after military ranks, and a person could rise to the top by running hate campaigns against political opponents.
Other networks, like the ones used by the perpetrators of the Christchurch attack and the attack in Halle, Germany last October, had looser structures. They would get together on an opportunistic basis when they saw that something could be gained by cross-border cooperation. They use their own vocabulary and insider references when they decide to collaborate on a campaign or a media stunt. The Matrix is one of many internet-culture references they draw on, which range from Japanese anime to Taylor Swift. And they would be very effective at advancing these operations.
Far-right groups have undergone a rebranding and have reframed the ideas held by traditional neo-Nazis. Generation Identity use euphemisms like "ethnopluralism" instead of racial segregation or apartheid, and combine video game language with racial slurs, creating their own satirical language.
Not only are extremist groups better at spreading their real ideologies behind satirical memes, they're also being given a platform by politicians. Language that mirrors that used by proponents of conspiracy theories like the Great Replacement is retweeted by politicians and repeated in their campaigns. This is likely to become more prevalent in the next few months in the run-up to the U.S. presidential election. The 2016 U.S. election proved to be one of the key turning points in uniting far-right groups globally.
Trans-Atlantic cooperation between the far right in Europe and the alt-right in the U.S. has been growing. Some of the ideologies that inspired GI and other far-right groups have been propagated by leading far-right figures in the U.S. And the European far right has adopted some of the strategies of gamification and propaganda used by the American alt-right. They both see themselves as fighters in a war against "white genocide" or the Great Replacement, and there is a loyalty between them that makes the old idea of narrow ultranationalism obsolete.
One of the biggest problems is in the infrastructure of social media and tech companies. Algorithms give priority to content that maximizes our attention and to content that causes anger and indignation. It's like handing a megaphone to extremists. It's allowed fringe views to get a much bigger audience. Developments in deepfakes, cyber warfare and hacking campaigns are likely to help extremists refine their strategies.
Firstly, we need a global legal framework that forces all the tech companies, not just the big ones but also the fringe networks, like 8chan and 4chan, to remove content that could inspire terrorism. After the shootings in Christchurch and Halle, the documents (the manifestos left behind by the perpetrators) were translated into several languages and shared in the fringe corners of the internet. We need a global approach because people can always find a way to circumvent national laws.
But content removal alone won't work. In my book I suggest 10 solutions for 2020; these include more digital literacy programs in education settings, which can enhance critical thinking skills, help Internet users spot manipulation and ultimately weaken extremists. We also need more deradicalization projects that use social media analysis to identify and engage with radicalized individuals. Counter-disinformation initiatives with the help of fact-checkers and social media campaigners could be formed, as has been done in the Baltics, to debunk online manipulation.
Technology and society are intertwined, so our response has to be integrated. We need an alliance that spans not only politicians and tech firms, but also civil society and social workers.