YouTube bans trolls for up to 24 hours
Critics have long claimed that YouTube’s comments section is riddled with spam and toxicity. Now the streaming video service is trying to weed out some of the worst offenders, albeit for a short time.
YouTube has updated its policy on spam and abusive comments, introducing short-term bans for repeat offenders. A new feature will first warn users who violate the Community Guidelines; if they persist, their commenting privileges will be suspended for up to 24 hours.
“Our testing has shown that these warnings/timeouts reduce the likelihood of users leaving hurtful comments again,” the Alphabet property said in a statement.
The changes affect only English-language comments for now, though YouTube hopes to expand to other languages in the coming months. The company says the goal is to protect creators from trolls while providing transparency to people whose comments are removed.
(In the past, offensive comments were simply removed, but there were no real consequences for the people who left them.)
As part of the cleanup, YouTube says it has also improved bot detection in live chats, which it hopes will stem the tide of spam and promotions that often accompany live-event discussions. The company adds that it is refining its machine learning models to remove spam in regular comments as well.
The changes come three years after YouTube banned comments on most videos made by children to protect young creators from potential predatory behavior.
In the first six months of 2022, YouTube removed over 1.1 billion spam comments.
The company notes that spammers and trolls often change their tactics, which can confuse the algorithms and even lead to inappropriate alerts.
“Reducing spam and abuse in comments and live chat is an ongoing effort, so these updates will continue as we continue to adapt to new trends,” it said.