Twitter will now encourage you to pause and reconsider before you post an unkind reply to someone else.
The company has been testing a feature that identifies potentially harmful, offensive and inconsiderate replies and prompts users to reconsider their Tweet before sending it. The feature rolled out to all English-language users Wednesday, according to a blog post written by Anita Butler, director of product design, and Alberto Parrella, a product manager.
In its tests, the product team found that 34% of users changed or deleted their reply after seeing the prompt. Those same users were 11% less likely to Tweet unkind things in the future, and were themselves less likely to receive harmful or offensive replies in response.
Based on concerns that arose during testing, the team incorporated a few new factors that shape when the prompts occur, including the relationship between the replier and the original poster, scenarios where otherwise offensive language may be reclaimed by underrepresented communities, and new ways to detect profanity more accurately.
Twitter has emphasized user choice and conversational “health” in its product announcements over the last year, focusing on tools that could make the platform a friendlier place for users. The company also hired a conversational safety team lead and has been expanding that team.