Meta, the parent company of Instagram, is gearing up for two significant elections in the U.S. and India next year by implementing a fact-checking program on its social networking platform, Threads. Currently, fact-check ratings from Facebook and Instagram carry over to Threads. The company's goal, however, is to empower fact-checking partners to independently review and rate misinformation directly on the Threads app.
Meta set to start fact-checking posts on Threads by 2024
In a recent blog post, Meta announced a forthcoming feature for Threads users in the U.S., allowing them to adjust the level of demotion for fact-checked posts. This new capability will enable users to increase, decrease, or maintain the same demotion settings, aligning with preferences set on Instagram. Notably, settings applied to avoid sensitive content on Instagram will carry over to Threads, providing a seamless user experience across both platforms.
While Meta has historically avoided amplifying news on Threads, the upcoming elections necessitate a more proactive approach to combating misinformation. Adam Mosseri, head of Instagram, clarified in October that Threads is not “anti-news” but emphasized that the platform won’t actively amplify news content. Consistent with this cautious stance, the platform continues to implement measures such as blocking searches for specific keywords, including “covid” and “covid-19,” as reported in September.
In an effort to enhance user engagement and information dissemination, Threads is introducing new features, including tags (which omit the hash symbol) and trending topics, the latter of which has yet to roll out. These additions give users more ways to search for and share information within Threads, fostering a more dynamic and interactive experience. Meta’s cautious approach to addressing misinformation stems from past challenges on its platforms.
Taking the proactive step to curb misinformation
By taking early steps to address this issue, the company aims to mitigate the impact of misinformation, especially during crucial election periods. However, the success of these initiatives hinges on Meta’s ability to provide detailed information about the fact-checking program. Users and stakeholders will likely raise numerous questions about the nature of post-labeling, the display of correct information, and the overall effectiveness of the fact-checking measures.
Despite Threads’ historical avoidance of news content, the introduction of a fact-checking program signals a shift toward greater involvement in monitoring and regulating information on the platform. How Meta answers those questions as it navigates this terrain will shape how the program is received.
Meta’s proactive steps to introduce a fact-checking program on Threads ahead of major elections reflect the company’s commitment to curbing misinformation. As the platform evolves with new features and policies, it remains to be seen how effective these measures will be in preserving the integrity of information shared on Threads during critical periods of political activity. Users can expect further updates and details from Meta as the fact-checking program is developed and rolled out.