To combat misinformation, TikTok now notifies users who share videos containing questionable information. The app warns that the content may include claims that have not been verified by the platform.
With the new feature, TikTok aims to encourage people to think before publishing content containing incorrect information. According to the social network, initial tests reduced the sharing of flagged videos by 24%.
Even after seeing the notification, the user can still publish the video. However, its distribution will be reduced, and a message indicating that the content may contain unverified information is displayed to other users.
The warning is also shown to users who share a video previously flagged as containing questionable information. They too can choose to publish the content, but it will not appear on their followers' timelines.
Unlike other social networks, TikTok takes a stricter stance on disinformation. Today, the platform partners with third-party fact-checking organizations and immediately removes content that violates its policies.
Fight against misinformation
In recent years, various social networks have worked to combat disinformation. For example, Twitter encourages users to open the link and read the content before sharing it with followers.
Facebook currently displays a warning before anyone posts information about COVID-19 or articles that may be out of date. Instagram likewise shows a message linked to the Ministry of Health in Stories about the coronavirus.
However, fact-checking efforts are still prone to failure. A recent example was the spread of fake news surrounding the US Capitol invasion, which forced social networks to make belated decisions.