The fact that Meta and Alphabet succeeded in overturning the obligation, imposed by the Network Enforcement Act (NetzDG), to report criminal content is a setback for lawmakers, even if central aspects will hopefully still be implemented at the European level through the Digital Services Act. The decision of the Cologne Administrative Court shows once again that digital corporations continue to play cat and mouse with overburdened authorities, and successfully so. The fragmented legal responsibilities (spread across the EU, the federal government, and the states) and the tangle of existing law make it easy for digital corporations to exploit such dodges and evade urgently needed regulation.
They are thus buying ever more time in their steadily advancing takeover of our media system. The year 2021 marks an epochal threshold here: for the first time, more advertising money was invested in digital media than in all analog media combined. Advertising spend is an excellent indicator of where total audience attention lies, because advertisers invest where users consume content.
The platforms will therefore be allowed to keep growing largely unregulated, while the analog media continue to erode. Since 2021 at the latest, digital media have been the leading media. The legislative perspective of the NetzDG, the Telemedia Act, and also the Digital Markets Act and the Digital Services Act does not yet match this reality. Such initiatives attempt to remedy, after the fact, deficiencies that were baked into the legal foundations at a time when the platforms still enjoyed the leniency accorded to newcomers and their business models were not yet taken seriously, which is why they enjoy a unique liability privilege to this day.
Currently, the legislator allows platforms to operate on a business principle under which they may offer a “program” of criminal content unchallenged (for example, false statements of fact, defamation, slander, incitement to hatred, incitement to commit a crime, defamatory criticism, Holocaust denial, and so on).
Massive unequal treatment
To illustrate with a recent example: if podcaster Joe Rogan offers his content on Spotify, the platform is not responsible for it, no matter how many millions it pays in royalties or how much profit it makes from the content. If Joe Rogan offered the same content on RTL, for example, RTL would have to assume full distributor liability. First, this is massive unequal treatment of media companies that compete directly with each other. Second, it is inconsistent for a platform to assume economic responsibility for content (through fees or advertising, for example) while denying responsibility for that same content. This is so irrational that it has to be spelled out for once:
1. We enable platforms’ business models that monetize criminal content.
2. This results in massive social problems (such as the currently high number of vaccination opponents, causing massive economic damage to society); these problems arise causally from the platforms’ business model.
3. We allow platforms to generate billions in profits from a “program” with criminal content through advertising or fees.
4. Afterwards, by contrast, we make the platforms’ social problems our own, for example when authorities try to enforce the NetzDG’s obligation to report criminal content.
5. The digital corporations evade this (on the grounds that they are based in Ireland and no obligation to report hate postings exists there).
This will only change if we reject the systematic exploitation of regulatory loopholes by digital corporations and make the emerging social problems the platforms’ problems as well. That, in turn, will only happen if we have the political courage to question the liability privilege wherever monetization is involved.
Therefore, the currently prevailing unequal legal treatment should be ended. Ironically, it actively penalizes the analog media of press and broadcasting for checking what they publish against common journalistic criteria and standards, while the flourishing digital corporations check nothing and are exempted from liability on top of it (with exceptions such as the new Copyright Directive).
Instead, any form of monetization through advertising should be read as a clear signal that a company has “made the content its own” (an idea that conflicts with Section 10 of the current Telemedia Act). To put it bluntly: whoever assumes economic responsibility must necessarily also bear responsibility for the content.
All platforms would remain free to introduce additional alternative offerings or feeds that carry no distributor liability (to give it a playful name: “Facebook / YouTube unfiltered”). In such a “program” or feed, however, there could then be no economic monetization through advertising. In this way, every user would remain free to express opinions (and do so “unfiltered”) within such platform offerings.
This possibility allows us to reject the platforms’ reflexive appeals to “freedom of opinion”, especially since the digital corporations have long been in bad company here. For easily transparent reasons, many new platforms that disseminate radical right-wing positions cast themselves as champions against “censorship” and guardians of the grail of “freedom of speech”: Parler, Gettr, Gab, MeWe, Telegram, or Trump’s Truth Social.
It is obviously one thing to enable the free expression of opinions. It is quite another to earn money with this (in case of doubt illegal) content and, on top of that, to take an already badly damaged society hostage in the name of freedom of expression.
Indeed, the closer one gets to this phenomenon of monetization, the harder it becomes to draw a clear dividing line beyond which responsibility for the distribution of illegal content could meaningfully be suspended. If Spotify pays a podcaster like Joe Rogan $100 million for content, there is a strong case that Spotify has “made that content its own” and consequently should be liable like a television broadcaster. Ethically this is clear; legally, it is not.
Platforms must take responsibility
What about a prominent YouTuber who receives from YouTube a share of the advertising revenue generated by their content? Here, too, YouTube pays the creator for the content. Is YouTube then not also responsible for it?
The situation is different again with Facebook or Instagram, but here, too, the platform monetizes all transmitted content through advertising. Can it really be argued that the company has not “made this content its own”?
So there is something to be said for bringing the aspect of monetization into the debate, because it helps break the current stalemate in an ideologically deadlocked dispute (control versus freedom). This is precisely what could open up new options for constructive solutions. For example, it would be easy to define specific forms of monetization as a clear signal that a company has “made the content its own.”
Such rules, with defined thresholds and limits, could unleash a new and innovative dynamic in the social media market. The platforms would finally be forced to “grow up” in line with their current market position and take responsibility. And because this thorny problem would then, for the first time, really be their problem, we may assume that they would swiftly propose constructive solutions and quickly demonstrate their much-lauded innovative strength.