Ireland’s media regulator said it has begun an investigation into ByteDance-owned TikTok and Microsoft-owned LinkedIn over concerns about their content reporting mechanisms.
The authority said the platforms’ systems may not be easy to access and may not allow people to report child sexual abuse material anonymously, as required under the EU’s Digital Services Act.
“In the case of these platforms, there is reason to suspect that their illegal content reporting mechanisms are not easy to access or user-friendly, do not allow people to report child sexual abuse material anonymously, as required by the DSA, and that the design of their interfaces may deter people from reporting content as illegal,” said the Irish regulator’s digital services commissioner, John Evans.
Harmful content
The DSA, which places additional requirements on very large online platforms, assigns enforcement of parts of the law to the country in which a company has its EU headquarters.
If a company is found to be violating the DSA, the regulator can impose a fine of up to 6 percent of its annual turnover.
The investigations were spurred by a review of online providers’ compliance with the DSA’s reporting-mechanism requirements, the regulator said.
Evans said a number of other platforms have made significant changes to their reporting mechanisms for illegal or harmful content after engaging with the regulator. He added that the regulator has requested further information from some of them and is not ruling out further action.
Stringent rules
The regulator began its first probe under the DSA last month with a formal investigation into social media platform X over its content moderation mechanisms, citing concerns that users are not being given a chance to appeal decisions and that complaint-handling systems are difficult to access.
The probe of X was based on a user complaint and information provided by nonprofit HateAid, which took legal action against X in 2023 on behalf of a Berlin-based researcher who was repeatedly banned from the platform.