One of the main goals of the new acts is to improve security online and to limit the spread of illegal content, such as dangerous disinformation and hate speech.
All providers of digital services, regardless of which category they fall into, will need to fulfill minimal requirements, such as transparency in spreading information, respect for the fundamental rights of their users, cooperation with relevant national institutions, and the provision of a point of contact and an official legal representative.
Very large online platforms and search engines, however, meaning those used by more than 10 per cent of the EU's total population, will be under the direct surveillance of the European Commission, Charles Manoury, a digital economy press officer for the European Commission, told BIRN.
“The very large online platforms and very large search engines will have four months to adapt to the Digital Services Act upon being designated by the European Commission,” Manoury said.
He added that by January 2024, the governments of EU member states will need to introduce new rules regulating smaller platforms and the non-systemic aspects of the larger platforms' operations.
According to Trpevska, the large online platforms and search engines will be subjected to the strictest public scrutiny to make sure that they respect the rules.
“Those platforms carry the biggest risk of spreading disinformation, hate speech and other harmful or illegal content. The member states will have the major role in surveying their operations, with the support of the European Digital Services Board, while surveillance of the largest platforms will be an obligation of the European Commission,” she added.
Platforms will not be obliged to check content before publishing it, nor will they bear responsibility for what their users publish. Their main obligation will be to react immediately if certain content is reported as illegal.
“The spread of disinformation is a profitable business for online platforms. The more provocative and shocking the content spread through the platforms, the longer the users stay on them. Besides, the automated ‘recommendation systems’ and the algorithms used by the platforms are designed to serve users similar content to ones that they’ve already viewed,” Trpevska explained.
She recalled the testimony of a Meta whistleblower to the European Parliament, who revealed how the platform uses its content-recommendation algorithms in ways that encourage the spread of disinformation and violent content.
He said that when extreme content and disinformation appear in their feeds, users tend to view them more, and therefore stay longer on the platform, generating more income.
The rules on content will apply to all platforms, including those that already run pre-publication checks. Some platforms have introduced active content control and can refuse to publish certain content or remind users that it is not allowed and violates their rules. It was unclear whether they could be held responsible if illegal content slips through, but the new act establishes that they will also enjoy immunity.
Besides having to react immediately when requested, very large online platforms and search engines will be obliged to publish reports on content moderation and to give the European Commission access to all the data needed to monitor their compliance with the Digital Services Act. In some cases, that means allowing access to their algorithms and their premises. They will also have to pay a supervisory fee to the Commission. Fines for non-compliance can reach 6 per cent of global revenue.
Companies like Meta and Google, for example, have reported annual revenues of 120 to 260 billion US dollars, meaning a 6 per cent fine could amount to between roughly 7 and 16 billion dollars, close to the annual GDP of North Macedonia.
‘Trusted flaggers’ to report illegal content