by Almira Louise S. Martinez, Reporter
TikTok, a short-form video social media platform, is encouraging its Filipino users to report misinformation and harmful content on the platform in line with the Philippines’ 2025 midterm elections.
“In a worldwide community, it’s natural for people to have different opinions but our goal is to operate on a shared set of facts and reality,” Peachy A. Paderna, Philippine Public Policy Manager at TikTok, told reporters on Thursday.
In January, the social media platform launched an in-app Philippine Elections Center site in partnership with the Commission on Elections (COMELEC), the National Citizens’ Movement for Free Elections (NAMFREL), and the Legal Network for Truthful Elections (LENTE) to curb the spread of misinformation and promote reliable, trustworthy election-related content.
TikTok’s Philippine Elections Center site houses verified “critical election resources” such as voting procedures, polling locations, key election dates, and other essential election-related information.

According to Ms. Paderna, the platform’s community guidelines are based on three key themes to ensure the protection of its users: balancing harm prevention and expression, embracing human dignity, and ensuring that actions are fair.
“We also rely on the larger TikTok community to help us spot content that we may not have caught in the initial phase,” Ms. Paderna added.
Although the social media company has over 40,000 professionals and automated moderation technology handling content review, Ms. Paderna said users are still encouraged to report harmful content.
Users can find the in-app report button under the platform’s share feature. Violence, hate and harassment, self-harm, nudity, and misinformation are among the available reasons for filing a report.
“We want to make sure that our community of users stays protected even as we promote a diversity of ideas on the platform,” she said. “We don’t allow misinformation that may cause significant harm to individuals or society, regardless of intent.”
Reported accounts and videos
From July to September 2024, the video hosting site took down 4.5 million videos in the Philippines, of which 99.7% were removed proactively due to violations of the platform’s community guidelines. In addition, 98% of the reported videos were removed within 24 hours.
“When content is taken down or acted on by our enforcement team, that doesn’t necessarily mean that the [content creator’s] account will be taken down all the time,” Ms. Paderna said.
Getting banned on TikTok depends on the gravity of the violations. “Sometimes all it takes is one post, sometimes it takes multiple posts,” the TikTok executive added.
The severity of violations can be categorized as significant or moderate harm. Content that results in severe forms of physical harm, such as life-threatening injury or death, falls under “significant harm.” Meanwhile, moderate harm covers false or misleading content about treatments or prevention of health-related issues that would not lead to life-threatening outcomes.
Ms. Paderna noted that mass reporting does not help get a video or account removed from the platform.
“One thing that we want to remind everybody is that it’s not a matter of how many people report one account,” she said. “We don’t need multiple reports to take down or pay attention to a violation.”
“We want to make sure that actions are fair so that when we take enforcement action, it’s always done in a fair way, it’s always just, it’s always rational,” Ms. Paderna said.