HAKOM is already implementing the EU Act on the removal of illegal content on the Internet

Source: Hina / M.E. Habek
The Croatian Regulatory Authority for Network Industries (HAKOM) has been appointed Digital Services Coordinator for the implementation of the EU Digital Services Act, which aims to remove illegal content from the internet, including hate speech, disinformation, and content harmful to children, Hina reports.

Once the Parliament adopts the expected law implementing the EU Act, HAKOM will also need to issue regulations, including one on granting “trusted flagger” status to Croatian institutions, organizations, or associations responsible for determining what constitutes, for example, hate speech, the spread of fake news and disinformation, or content harmful to children and minors online.

As HAKOM highlights, part of their task as a coordinator is to “aggregate” the work of the relevant institutions that issue orders to act against illegal content online – the State Attorney’s Office (DORH), the Ministry of the Interior (MUP), the State Inspectorate, the Customs Administration, the Ministry of Health, and, especially in the case of hate speech, the Agency for Electronic Media (AEM) – and to deliver their consolidated reports to the European Commission.

Additionally, HAKOM will be able to impose fines on online platforms and their authorized representatives for non-compliance with the law, ranging from €6,630 to €66,360.

HAKOM emphasizes that there are three ways to remove illegal content from the internet. The first is when a platform acts on its own initiative: based on its algorithms and the measures it has implemented to assess systemic risks, it “recognizes” and removes illegal content.

“Algorithms already work quite well when it comes to removing illegal content related to copyright, as well as detecting and removing child pornography,” notes Domagoj Maričić, assistant director of HAKOM, in an interview with Hina.

The second way to “clean” the internet is through orders from the aforementioned Croatian institutions. These orders are formal and are sent directly to the platform, regardless of the country in which the platform or its representative is registered. The platform is obliged to comply and to notify HAKOM.

Maričić also explains the third approach, which involves trusted flaggers, who are to be granted status based on their application to HAKOM after the law is passed. “Since there is no central body that monitors all illegal content on the internet, trusted flaggers are broadly defined – from public authorities to associations that deal with this and know how to recognize illegal content,” says Maričić.

Platforms must prioritize the removal of content reported by trusted flaggers, and if there are complaints about a flagger’s work, it will be examined whether the flagger is abusing that role. So far, particular attention from both the general and expert public has focused on removing disinformation campaigns, fake news, and political bots during election campaigns.

“Our next step is a meeting with the State Election Commission to establish a protocol for recognizing disinformation – who informs whom and how, and which platforms are identified as generating such illegal content,” says Lidija Antonić, a legal expert at HAKOM.

Recently, she adds, a meeting of the National Cooperation Network was held, where this was discussed with representatives from the Office of the National Security Council (UVNS), the Agency for Personal Data Protection (AZOP), CARNET, the Agency for Electronic Media (AEM), the State Election Commission, the Ministry of the Interior, the Information Systems Security Bureau (ZSIS), the Central State Office for the Development of the Digital Society (SDURDD), and the Ministry of Justice, Administration and Digital Transformation.

HAKOM notes that the EU Digital Services Act (DSA) is already being applied in Croatia, highlighting that in May, Meta removed about 150 fake profiles connected to members of the HDZ Youth on Facebook and Instagram following a report by GONG. “Meta is applying the DSA and already treats GONG as a sort of trusted flagger. Given its long-standing reputation for election monitoring, we don’t see why GONG shouldn’t soon meet the criteria for trusted flagger status,” Maričić explains.

He adds that tackling disinformation and fake news online, hate speech, and content harmful to children will require contributions from public institutions, NGOs, and fact-checkers acting as “trusted flaggers.”

“We cannot designate who will be a trusted flagger without receiving applications from interested parties. There is interest, especially in the protection of children and minors. And the principle is simple – everything that is illegal offline is also illegal online,” concludes Lidija Antonić from HAKOM.

The Center for Safer Internet (CSI), Croatia’s leading organization in protecting children from abuse and sexual exploitation online, confirmed their interest in obtaining trusted flagger status to Hina.

“We are actively advocating for the designation of our Center as a ‘trusted flagger’ under the proposed law, and this possibility has been discussed multiple times in our conversations with HAKOM. Cooperation at this level would greatly contribute to creating a safer online environment for children and youth,” CSI stated.

On the other hand, the Office of the Ombudsman for Children expressed hope that the Act’s implementation would contribute to a high level of protection for minors online but also noted that “the Office currently has no information about potential cooperation with HAKOM regarding online content regulation.”

The Office of the Ombudsman also commented on hate speech and disinformation online, pointing out that the draft law does not provide detailed conditions for trusted flaggers.

“One of the key questions will be the balance between freedom of expression and hate speech, and the related freedom of information, as well as the right to an effective legal remedy, the right to privacy, and the issue of disinformation. We will focus on human rights while considering these issues through the lens of the rule of law in Croatia,” the office concludes, noting that “it is currently unclear who should act, on what, and how,” and that this needs to be clearly prescribed before the second reading of the draft law.

Finally, the Agency for Electronic Media (AEM), so far the only body legally mandated to decide what constitutes hate speech in the media, and also the lead on Croatia’s fact-checking network project, commented on its cooperation with HAKOM.

“The modalities of cooperation between HAKOM and the Agency for Electronic Media regarding the implementation of the Digital Services Act are being defined. Further information will be provided as soon as it becomes available,” AEM stated.