Bluesky ramps up content moderation as millions join the platform

More users means more problems, and social media wunderkind Bluesky is no exception. On Monday, Bluesky announced new moderation efforts to address a surge in concerning user content amid its explosive growth.

In an exclusive with Platformer, Bluesky explained that it would quadruple its content moderation team, currently a 25-person contracted workforce, to curb a worrisome influx of child sexual abuse material (CSAM) and other content that violates the site's community guidelines — cases that have so far slipped through the existing moderation systems and warrant human oversight. "The surge in new users has brought with it concomitant growth in the number of tricky, disturbing, and outright bizarre edge cases that the trust and safety team must contend with," the company wrote. "In all of 2023, Bluesky had two confirmed cases of CSAM posted on the network. It had eight confirmed cases on Monday alone."

More broadly, the platform is navigating an explosion in user reports with an extremely small workforce. On Nov. 15, the platform posted that it was receiving 3,000 reports per hour, compared with 360,000 reports in all of 2023. "We're triaging this large queue so the most harmful content such as CSAM is removed quickly. With this significant influx of users, we've also seen increased spam, scam, and trolling activity — you may have seen some of this yourself," the platform wrote at the time.