YouTube is unveiling new features to encourage respectful discussions. The video streaming platform will prompt users to reconsider before posting potentially offensive comments.
Before posting such a comment, the commenter will be asked whether they want to post it as written or take a moment to revise it.
According to a report by technology site The Verge, the prompt will appear only when YouTube’s AI-based system identifies a comment as potentially offensive.
Joanna Wright, vice president of product management at YouTube, said the company would also test a new filter in YouTube Studio to help content creators better manage comments and engage with viewers.
With that filter enabled, potentially inappropriate and hurtful comments will be automatically held for review.
In a blog post on Thursday, Wright said, “Content creators should never have to read the comments if they don’t want to. We’re also developing comment moderation tools to make the process easier for creators.”
According to YouTube, the number of hateful comments it removes each day has increased 47-fold since the beginning of 2019.