For many, Twitter is a place for self-expression and exercising freedom of speech. It is also inevitably a place for the mass spread of misinformation.
But not for long.
With the election just 25 days away, Twitter announced a series of policy updates Friday aimed at countering the spread of misinformation.
Starting October 20, users will be prompted to add their own comment before retweeting a post, in hopes of slowing the amplification of tweets that may be false or misleading.
This significant change will last through November 3.
Twitter will also change the algorithm that determines what appears on timelines and in trends, no longer recommending tweets from accounts users don’t follow, The Hill reports.
Trends will only show in the “For You” tab if they have additional context attached to them.
“Twitter has a critical role to play in protecting the integrity of the election conversation, and we encourage candidates, campaigns, news outlets and voters to use Twitter respectfully and to recognize our collective responsibility to the electorate to guarantee a safe, fair and legitimate democratic process this November,” Twitter executives Vijaya Gadde and Kayvon Beykpour wrote in a blog post.
Twitter is adding a new feature that hides misleading tweets from popular accounts.
If users try to share content that the platform has flagged as false, a notice will warn them that they are about to share inaccurate information.
Furthermore, any tweet related to the election that declares a premature victory or encourages interference in the voting process, especially if it includes calls to violence, will be removed, according to The Hill.
While many of these changes will be temporary, the new rules are an effort to avoid a repeat of the 2016 election, during which many users were able to spread misinformation and dangerous content freely.