Elon Musk’s Twitter has disbanded its Trust and Safety Council, which was formed in 2016 to help the platform combat hate speech, child exploitation, suicide, and self-harm. The council comprised around 100 independent civil rights and human rights groups that provided valuable insight into these issues.
The council was slated to meet with Twitter representatives on Monday night. Just hours before the meeting, Twitter notified the group by email that it was disbanding the council, news that came as a shock to many of its members.
The council members who shared screenshots of the email chose to remain anonymous, wary of potential retaliation. The email stated that Twitter is “reevaluating how best to bring external insights” and that the council might not be “the best structure” for achieving those goals. “We are confident that our focused efforts to make Twitter a secure and informative platform will be more effective than ever. We value your feedback and look forward to hearing from you about how we can best achieve this shared goal,” the email noted, signed simply “Twitter.”
The volunteer group gave Twitter expert advice on combating hate, mistreatment, and other abuses. Despite that expertise, however, the group had no authority to make decisions or review content disputes; its role was purely advisory. Shortly after acquiring Twitter for $44 billion in late October, Musk initially suggested forming a new “content moderation council” that would be responsible for major decisions, but he later reconsidered and withdrew the proposal.
Alex Holmes, a member of Twitter’s Trust and Safety Council, tweeted that over the years many volunteers had devoted their time to advising Twitter personnel on online harms and safety issues when asked, and that contrary to popular belief, they were never tasked with governance decisions or judgments. On Thursday, the San Francisco-based company officially confirmed the Monday meeting with the council; its email also invited members to “an open conversation and Q&A” with Twitter personnel, including Ella Irwin, who is now in charge of trust and safety.
On the same day, three council members announced their resignations in a public statement posted on Twitter: “It is clear from our observation that in contradiction to Elon Musk’s assertions, safety and welfare amongst Twitter users has been declining.” Following Musk’s public criticism, the recently departed council members faced a barrage of online attacks accusing them of failing to prevent child sexual exploitation on Twitter. Musk angrily tweeted that it was a heinous offense for those in charge to have ignored child exploitation for so long.
As the attacks on the council escalated, many of its members emailed Twitter requesting that it stop making misleading statements about the council’s role. The email said that the false claims made by Twitter executives were putting past and present council members in jeopardy.
The Trust and Safety Council included a specialized advisory group on child exploitation, comprising the Rati Foundation, the National Center for Missing & Exploited Children, and YAKIN (Youth Adult Survivors & Kin in Need). This collective was dedicated to ensuring that the platform’s most vulnerable users were safeguarded from harm.
On Monday, Patricia Cartes, who had been responsible for forming the council in 2016 while employed at Twitter, said that its termination signals a complete lack of oversight and accountability. Cartes said the company had sought to give the council a global orientation, assembling experts from around the world who could raise concerns about how new Twitter policies or products might affect their communities.
She contrasted that approach with Musk’s current practice of polling his Twitter followers before making changes to how content is managed. “He doesn’t emphasize much what the experts think,” she said.