The violence in August this year in Charlottesville shocked the US public and sparked a debate about the influence of extreme right movements on the internet. The central question was where the limits to freedom of speech lie.
Just a few days after the far-right marches, the major web hosting company GoDaddy cut off The Daily Stormer, probably the largest neo-Nazi website in the US, with over 300,000 registered users. Authors on the site had repeatedly made insulting remarks about Heather Heyer, the 32-year-old killed when a rightwing extremist ploughed a car into a group of counter-protesters. Google then blocked the website editor’s attempt to switch to one of its servers. The site then appeared for a time under a Russian domain name.
The choice of Russia was no coincidence. In their ongoing search for sympathizers online, far-right groups all too gladly use Russian web hosting services and network providers. In Russia, they are subject to comparatively few restrictions on racism, anti-Semitism and other forms of hate speech. Using Russian web hosts also allows them to avoid prosecution by the authorities in their home countries.
However, the virtual emigration of rightwing extremists does not mean that their political influence is restricted to Russia – on the contrary. Because their content is not censored in Russia, they can disseminate their propaganda even more widely than elsewhere.
Western democracies in particular face huge challenges: on the one hand, the internet offers far-right movements a range of exceedingly powerful propaganda instruments. With comparatively little effort, they can use social networks, websites and chats to reach a worldwide public. On the other hand, laws against hate speech are inherently national in jurisdiction, and the global internet hampers their enforcement.
Virtual hate speech in Germany
Prosecutions for incitement to hatred and defamation have increased sharply in Germany in recent years. In 2014, German police recorded 2,670 cases of incitement; two years later, that figure had more than doubled to 6,514. There are two main reasons for this. On the one hand, far-right propaganda is now spread primarily via Facebook, Twitter and similar platforms, where users report offensive statements to the authorities more frequently. On the other hand, the political tone has coarsened dramatically since the beginning of the refugee crisis: ‘Drop dead, you faggot’; ‘Merkel should be stoned’; ‘Gas the lot of them’ – today, this kind of language is all over the social networks.
Xenophobia, racism and anti-Europeanism rose sharply in Germany at the height of the global refugee crisis in summer 2015. This was also when the rightwing populist party Alternative for Germany (AfD) began to climb the opinion polls – hitting 20 per cent on some occasions. The German government desperately needed a means to stem this racist tide, which to a great extent was expressed via online hate campaigns.
In autumn 2015, the Social Democrat minister of justice, Heiko Maas, set up a task force bringing together representatives of Facebook, Google and Twitter, alongside numerous NGOs. The committee came up with a set of recommendations for dealing with online hatred. However, these remained largely ineffectual: according to one analysis at the beginning of 2017, YouTube deleted around 90 per cent of content reported by users, but Facebook only 39 per cent and Twitter just one per cent.
The ‘Network Enforcement Law’
For that reason, in June 2017, the Bundestag passed the so-called ‘Network Enforcement Law’ (Netzwerkdurchsetzungsgesetz – NetzDG). This confirmed that voluntary self-regulation of social networks had definitively failed. The NetzDG is supposed to force social network operators to delete ‘fake news’ and hate speech. According to justice minister Maas, freedom of speech would thereby be protected, ‘for today […] online defamation and threats are silencing people and they are thus being robbed of their freedom of speech by these criminal acts’.
The NetzDG, which comes into effect on 1 October 2017, forces network operators to delete within 24 hours any ‘obviously illegal’ content reported to them. Where content is not unequivocally illegal, the law allows a period of seven days in which to act. Only in exceptional cases can this limit be exceeded. In cases of systematic failure to enforce the law, the employees responsible risk fines of up to €5 million; companies can be fined – if only in theory – up to €50 million. Additionally, network operators must establish a system through which users can report potentially illegal content. Any social network with more than two million users must also appoint a contact person for the authorities. This move spells an end to the ‘facelessness’ of Facebook and co.
Incitement to hatred: Lessons from Nazism
In Germany, restrictions on free speech are much tighter than in the United States. Simply put, in the United States freedom of speech includes the right to disseminate lies, whereas in Germany it is ‘freedom of opinion’ that is protected. This does not cover statements that constitute incitement to hatred or defamation. Under German criminal law, you are guilty of defamation if, against your better knowledge, you assert or disseminate an untrue statement relating to another person that is likely to cause that person to become an object of contempt, to lower public opinion of them, or to threaten their reputation. Convictions carry a prison sentence of up to five years.
Incitement to hatred, on the other hand, covers breaches of the peace through incitement of hatred against parts of the population, through calling for violent or repressive measures against them, or through attacking their human dignity through abuse, malicious slander or defamation. This offence includes publicly approving of, trivializing or denying the Holocaust. Convictions carry a prison sentence of between three months and five years.
The incitement paragraph in its current form originated in an amendment to the criminal law code in 1960. It was the Adenauer government’s response to a wave of anti-Semitic offences, including arson attacks on synagogues. In subsequent decades, the section has been repeatedly amended and tightened. In 1994, the ‘simple’ denial of the Holocaust – i.e. denial without explicit identification with Nazi ideology – was also criminalized as incitement to hatred.
Restricting freedom of opinion
In principle, it is to be welcomed that policy-makers are taking strong action against fake news and online hate speech. The law against incitement to hatred is intended to prevent the emergence of a climate of opinion where certain groups are aggressively excluded and where they could also become the victims of physical violence. In each case, it is incumbent upon public prosecutors and the courts to weigh up the constitutional right to freedom of opinion against the safeguarding of the public peace. However, it is precisely here that the NetzDG may do more harm than good.
Civil rights and internet activists have criticized the NetzDG for allegedly restricting freedom of opinion instead of protecting it. Normally, it is the job of public prosecutors or courts to determine whether specific statements constitute defamation or incitement to hatred. According to the new law however, this decision is now to be left to the social networks themselves. The government is therefore appointing these corporations, which are already parties to each case, as both judges and opinion police.
Given the sheer mass of decisions that they are supposed to take, social media service providers are confronted with a Herculean task: the German ministry of justice has estimated that social networks receive over 500,000 complaints annually, including, among other things, reports of hate speech. This is equal to the total number of crimes recorded every year in the whole of Berlin, for example.
As a rule, it is not a lawyer at Facebook who decides whether content is to be deleted, but a ‘content moderator’. In Germany, around 700 employees of the Bertelsmann subsidiary Arvato are responsible for moderating Facebook – for an hourly rate of just over the €8.84 minimum wage. Working under assembly line conditions, their job is to decide which posts – ranging from naked body parts to videos of sadistic violence – should and should not be removed from users’ newsfeeds. Until now, moderation was based on Facebook’s ‘community standards’, a secret catalogue containing several hundred rules regulating the deletion of content. From now on, however, Facebook will also have to consult the German penal code.
It is clear what the consequences will be. In order to avoid costs and lengthy court cases, not to mention the draconian fines that hang over the heads of their employees, social media companies will delete reported comments if in doubt. Conscientious assessment based on legal criteria is precluded. Given that most users would be reluctant to cover the costs of challenging the decision to remove their content, an arbitrary culture of deletion will emerge that contradicts the guarantee of freedom of opinion made in Article 5 of Germany’s Basic Law.
A struggling judiciary
A broad alliance of trade associations, internet campaign groups and civil rights organizations warned the Bundestag in advance not to entrust Facebook and similar companies with ‘the state’s task of deciding on the legality of content’. Neither should enforcement be allowed ‘to fail because of the judiciary’s lack of resources’.
For this is where the next problem lies. The German courts are already massively overstretched. According to the German Association of Judges, there is a shortage of around 2,000 judges and public prosecutors nationwide, with criminal justice particularly affected. Furthermore, there is still no public prosecution office specifically responsible for digital hate speech. It would need a great deal more time and money before the judiciary had sufficient resources to cope with the flood of reports.
Even if the resources were at some point to materialize, there is no guarantee that hate speech would actually disappear from the internet. For judges and public prosecutors also find it difficult to glean from the law on incitement what exactly constitutes a criminal act. They have to carefully examine ‘the context in which such statements are found and interpret them in the light of freedom of opinion, which also protects drastic, exaggerated and polemical statements’, explains the Berlin lawyer Ali B. Norouzi.
The difficulty of contextual interpretation is also why many cases are dropped or suspended – including the case of Lutz Bachmann, founder of the Islamophobic and rightwing Pegida movement. After the sexual attacks on women in Cologne on New Year’s Eve 2015/16, the perpetrators of which included numerous refugees, Bachmann sold T-shirts with the slogan ‘rapefugees not welcome’. The Green Party politician Jürgen Kasek brought charges against Bachmann for incitement to hatred, arguing that the slogan presented refugees per se as rapists and thus fuelled xenophobia. Leipzig’s public prosecutor did not share this opinion and interpreted the words literally to mean that refugees who commit rape are not welcome. In the public prosecutor’s eyes, there were no grounds for charges under the law on incitement to hatred.
Prosecuting incitement to hatred is even more difficult when offences are committed abroad – albeit for different reasons. According to the law on incitement, these are to be prosecuted no differently from domestic offences, regardless of whether the offenders are resident in Germany or not. Here, too, the condition is that such offences impact on the public peace in Germany and injure the human dignity of its inhabitants. As such, it suffices for criminal content to be accessible on the internet in Germany.
In order for there to be a conviction, it must be proved beyond reasonable doubt that the statement is attributable to the accused. This is relatively easy to establish if the offender makes their statements in public with witnesses present. On the internet though, it is not so simple to prove whether a particular person is indeed the author of a statement – especially if the social network operator refuses to cooperate with the German prosecution authorities.
VK.com: Fleeing to Russia
This is another reason why open and organized rightwing extremism is increasingly migrating to the so-called Runet – above all to VK.com. The multilingual social network has become the new home of far-right groups banned from Facebook, which are now linking up with rightwing movements from other countries.
VK.com was founded in October 2006 by the brothers Pavel and Nikolai Durov as VKontakte and looks like a Facebook clone. It has over 400 million Russian-language users, making it Russia’s most popular website and the fifth largest website worldwide. Close to 70 per cent of its visitors come from the Russian-speaking world; however, the network has also established itself in Germany, albeit on a far smaller scale. Around two per cent of visitors are from Germany, equating to around 14 million visits a year.
On VKontakte, German Nazis, anti-Semites and racists can deny the Holocaust, mock victims of the Shoah and insult non-white people and Jews – all for a German public. While the social network does have conditions of use that forbid racist content, for example, it is extremely rare for posts to be deleted. In Germany, the political use of the swastika is forbidden, however on VK.com Nazis can use this and other symbols without fear of censorship. What is more, all this is publicly visible: unlike Facebook, most VK webpages are directly accessible without having to register and log in.
What was once the largest far-right Facebook page on the German-language web, Anonymous.Kollektiv, migrated to the Russian web in spring 2016. The page had over two million followers. By way of comparison, Germany’s two largest parties, the CDU and SPD, have just 300,000 followers between them. Alongside all manner of conspiracy theories and Russian propaganda, the Anonymous.Kollektiv page primarily contained incitements to racial hatred towards refugees. Refugees were described as a ‘sex-hungry, paedophile horde’ or ‘human trash’, for example. Videos were linked to that trivialized the Holocaust.
The site’s operator was probably Mario Rönsch. Though relatively little is known about him, Rönsch is known to come from Erfurt in Thuringia and is thought to have been an AfD member until around 2014. He has multiple previous convictions – including for fraud and delay in filing for insolvency. In the past, he liked to be seen alongside Jürgen Elsässer, a figurehead of the new Right scene and editor-in-chief of Compact magazine. The latter’s writing was also promoted on the Anonymous.Kollektiv website. When, after numerous complaints, Anonymous.Kollektiv’s Facebook page was blocked in May 2016, the page switched to VK.com.
The content on the new page mainly comes from the website anonymousnews.ru, which evidently belongs to Rönsch too. Until the beginning of 2017, he also ran Migrantenschreck.ru, an online weapons shop. Among other things on offer was the MS55 handgun, whose ‘muzzle energy’ could fell any ‘asylum supporter’. The German security forces had difficulty closing the shop down because its data was stored on a Russian server and the weapons were distributed from Hungary, where they are not forbidden. It was only when the German authorities directly pursued Rönsch’s German customers that he shut up shop.
Rönsch has meanwhile gone underground. Although his VK page has ‘only’ 50,000 followers, it continues to play an important role for the far-right and rightwing populist community on VK.com. In addition to Anonymous.Kollektiv and Compact, VK.com also plays host to the ‘identitärian movement’, the AfD, the rightwing publisher Kopp, the portal ‘PI-news’ (‘Politically Incorrect News’), Pegida and its founder Lutz Bachmann, as well as numerous far-right brotherhoods and pro-Hitler groups.
In the Kremlin’s pocket
VK.com could, like Facebook, clamp down on racist and anti-Semitic hate speech – were the Kremlin not pulling the strings. Some time ago, the Russian government took over the social network. Quite how this happened is unclear. However it is certain that VKontakte founder Pavel Durov entered the sights of the Putin regime and was put under increasing pressure. In April 2014, he resigned as VKontakte’s CEO. ‘Complete control over VKontakte is being passed to Igor Sechin and Alisher Usmanov’, Durov wrote on his VK profile at the time.
Both of the men he referred to have a high profile: Usmanov is a Russian oligarch born in Uzbekistan and the largest shareholder in the investment group Mail.ru. The group bought its first shares in VKontakte in 2010 and in 2014 became the network’s sole owner. Sechin is executive chairman of the oil company Rosneft and considered the second most powerful man in Russia. Like Usmanov, he is among Vladimir Putin’s closest advisers.
According to Roman Dobrokhotov, the editor-in-chief of the Kremlin-critical portal The Insider, Russia’s domestic intelligence service, the FSB, probably has full access to the network’s infrastructure and content. However, the Russian government has little interest in sanctioning rightwing extremism; news critical of Putin is the much higher priority.
The struggle against the ‘transnational white class’
The fact that the Kremlin appears to ignore rightwing extremists means that they can continue posting racist and anti-Semitic content on VK.com, insulting individuals and spreading fake news. Legal measures like the German law against incitement to hatred have already proved to be blunt weapons. The NetzDG will likewise fail to solve the problem of hate speech and fake news; instead, it will push them into other corners of the internet. Far from falling silent after being barred from Facebook, far-right movements have merely adapted their propaganda strategy to the new circumstances.
Far-right extremists in Germany increasingly use VK.com to radicalize users. The network also has a strong centripetal effect on the far-right scenes of other countries. At the same time, the far-right continues to recruit supporters on Facebook, using various codes and caricatures to escape censorship. ‘Down with the West, down with the Lügenpresse [‘lying press’] and down with the “Jewbook” ’ – these are the slogans with which they attract sympathizers on Facebook and then entice them onto VK.com. On the Russian network, the new Volksgenossen (or ‘comrades’) are then integrated into the groups organized by rightwing extremists.
There is little the German authorities can do but look on. Their hands are tied, since they receive no user data from VK.com. The most the German police can do is pursue individual offences – provided they can establish the identity of the author of illegal posts. They seldom succeed in doing so.
It is by no means only rightwing extremists in Germany who cluster around VK.com. The US far-right is also flocking to the social network: white supremacists, alt-right members and neo-Nazis. In the early 2000s, the Ku Klux Klan held dozens of meetings every year all over the country. Such gatherings have since become rare; instead, networking activities have largely shifted to the internet. This has prompted the sociologist and cyber-racism researcher Jessie Daniels to warn of the formation of a ‘transnational white class’: the World Wide Web not only allows rightwing extremists from all countries to recruit largely anonymously and with few resources, but also enables them to network with like-minded people all over the world.
The fatal consequences of this development became apparent in June 2015, when Dylann Roof shot nine people, all of them African Americans, at a bible class in a church in Charleston, South Carolina. As far as is known, Roof maintained no contact at all with radical right movements in his neighbourhood. Instead, he had been radicalized on the internet and regularly visited the website of the ‘Council of Conservative Citizens’. The Council propagates the idea of white domination and sees black people as a ‘retrograde species of humanity’. Roof evidently also regularly read and commented on content on The Daily Stormer – the ‘world’s biggest hate site’ (Heidi Beirich).
In order to halt hate speech and strengthen measures against far-right networking, one thing above all is needed alongside opposition and engagement on the part of civil society: cooperation between hosting providers and between national regulatory authorities. That this is both possible and effective is shown by The Daily Stormer’s attempts to find a digital hiding place. The portal only managed to operate under a Russian domain name for a short time following the violence in Charlottesville. After just a few hours, the Russian supervisory authority Roskomnadzor denied the site permission to register in its part of the internet – public pressure had evidently become too great.
The Daily Stormer had no choice but to migrate to the dark web – the internet’s underworld. While this offers anonymity and protection, content in this part of the internet cannot easily be found: search engines like Google do not link to it, and special software is needed to gain access. Rightwing hate speech that publicly calls for or legitimates violence now faces at least one significant obstacle.