Italy’s competition and consumer authority, the AGCM, has fined TikTok €10 million (almost $11 million) following a probe into algorithmic safety concerns.
The authority opened an investigation last year into a “French scar” challenge in which users of the platform were reported to have shared videos of marks on their faces made by pinching their skin.
In a press release Thursday, the AGCM said three regional companies in the ByteDance group, Ireland-based TikTok Technology Limited, TikTok Information Technologies UK Limited and TikTok Italy Srl, had been sanctioned for what it summarized as an “unfair business practice.”
“The company has failed to implement appropriate mechanisms to monitor content published on the platform, particularly those that may threaten the safety of minors and vulnerable individuals. Furthermore, this content is systematically re-proposed to users as a result of their algorithmic profiling, stimulating an ever-increasing use of the social network,” the AGCM wrote.
The authority said its investigation confirmed TikTok’s responsibility in disseminating content “likely to threaten the psycho-physical safety of users, especially if minor and vulnerable,” such as videos related to the “French scar” challenge. It also found the platform did not take adequate measures to prevent the spread of such content and said it failed to fully comply with its own platform guidelines.
The AGCM also criticized how TikTok applies the guidelines — which it says are applied “without adequately accounting for the particular vulnerability of adolescents.” It pointed out, for instance, that teens’ brains are still developing and young people may be especially at risk, as they can be susceptible to peer pressure to emulate group behavior in order to fit in socially.
The authority’s remarks particularly highlight the role of TikTok’s recommendation system in spreading “potentially dangerous” content, pointing to the platform’s incentive to drive engagement and increase user interactions and time spent on the service in order to boost ad revenue. The system powers TikTok’s “For You” and “Followed” feeds and is, by default, based on algorithmic profiling of users, tracking their digital activity to determine what content to show them.
“This causes undue conditioning of users who are stimulated to use the platform more and more,” the AGCM suggested, in another remark that’s notable for being critical of engagement driven by profiling-based content feeds.
We’ve reached out to the authority with questions. But its negative assessment of the risks of algorithmic profiling looks interesting in light of renewed calls by some lawmakers in Europe for profiling-based content feeds to be off by default.
Civil society groups, such as the ICCL, also argue this would shut off the outrage tap that ad-funded social media platforms monetize through engagement-focused recommender systems, which have a secondary effect of amplifying division and undermining societal cohesion for profit.
TikTok disputes the AGCM’s decision to issue a penalty.
In a statement, the platform sought to play down the regulator’s assessment of the algorithmic risks posed to minors and vulnerable individuals by framing the intervention as related to a single controversial but small-scale challenge. Here’s what TikTok told us:
We disagree with this decision. The so-called “French Scar” content averaged just 100 daily searches in Italy prior to the AGCM’s announcement last year, and we long ago restricted visibility of this content to U18s, and also made it ineligible for the For You feed.
While the Italian enforcement is limited to one EU member state, the European Commission is responsible for overseeing TikTok’s compliance with algorithmic accountability and transparency provisions in the pan-EU Digital Services Act (DSA) — where penalties for noncompliance can scale up to 6% of global annual turnover. TikTok was designated as a very large platform under the DSA back in April last year, with compliance expected by late summer.
One notable change as a result of the DSA is TikTok offering users non-profiling-based feeds. However, these alternative feeds are off by default — meaning users remain subject to AI-based tracking and profiling unless they take action themselves to switch them off.
Last month the EU opened a formal investigation of TikTok, citing addictive design, harmful content and the protection of minors as among its areas of focus. That procedure remains ongoing.
TikTok has said it looks forward to the opportunity to provide the Commission with a detailed explanation of its approach to safeguarding minors.
However, the company has had a number of earlier run-ins with regional enforcers concerned about child safety in recent years, including a child safeguarding intervention by the Italian data protection authority; a fine of €345 million last fall over data protection failures also related to minors; and long-running complaints from consumer protection groups worried about minor safety and profiling.
TikTok also faces the possibility of increased regulation by member state–level agencies applying the bloc’s Audiovisual Media Services Directive, such as Ireland’s Coimisiún na Meán, which has been considering applying rules to video sharing platforms that would require recommender algorithms based on profiling to be turned off by default.
The picture is no brighter for the platform over in the U.S., either, as lawmakers have just proposed a bill to ban TikTok unless it cuts ties with Chinese parent ByteDance, citing national security concerns and the potential for the platform’s tracking and profiling of users to give a foreign government a route to manipulate Americans.