Bluesky Announces Moderation Changes Focused on Better Tracking and Improved Transparency

Decentralized social network Bluesky, a competitor to X and Threads, announced on Wednesday that it is making changes to its moderation process. Specifically, the company said it is updating how it tracks violations of its community guidelines and enforces its policies. The changes include new reporting categories in the app, an updated “strike” system for violations, and more guidance for those who break the rules.
The moderation changes are rolling out with the latest version of the Bluesky app (v. 1.110), which also includes a dark mode app icon and redesigned controls over who can reply to your posts.
The company says the moderation updates are a result of Bluesky’s rapid growth and the need for “clear standards and expectations for how people treat each other” on the platform.
“On Bluesky, people meet and fall in love, are discovered as artists, and debate niche topics in cozy corners. At the same time, some of us have developed a habit of saying things behind screens that we would never say in person,” the company shared in a statement, explaining the changes.
However, the news also follows a recent moderation controversy on the platform, in which a user was suspended for a comment that Bluesky interpreted as a threat of violence. Author and influencer Sarah Kendzior wrote in a post on Bluesky that she wanted to “shoot the author of this article just to watch him die” – a reference to a lyric from Johnny Cash’s “Folsom Prison Blues.” The reference fit because she was commenting on an article about Johnny Cash that she disliked.
Bluesky’s team said Kendzior was suspended because she expressed “a desire to shoot the author of the article” — an overly literal reading of her comment.
With the updated rules, Bluesky appears to be working to ensure the platform retains a sense of community and doesn’t devolve into the toxicity that now fuels X, where snide comments, dunks, and hateful comments are often the norm.
For starters, Bluesky is expanding the reporting options on posts from six to nine categories, allowing users to report issues more precisely and helping moderators act more quickly on critical reports. For example, users can now report things like youth harassment or bullying and eating disorder content, which helps Bluesky comply with the multitude of new laws designed to protect minors online. Users can also report content potentially related to human trafficking, which addresses requirements of the UK’s Online Safety Act.
To support these changes, Bluesky has upgraded its internal tools to automatically track violations and enforcement actions in one place. The system will also ensure that users receive clear information about what happened and where their account stands.
The company notes that it is not changing what it enforces; it has simply improved its tooling so that enforcement can be more consistent and transparent.
As part of this, Bluesky’s warning system will now assign flagged content a severity rating, which helps dictate the enforcement action taken. For example, content flagged as “critical risk” would result in a permanent ban, while other content may be rated low, medium, or high severity. And if an account accumulates violations, the user could risk a permanent ban instead of a temporary suspension.
Additionally, the company says users will be notified when they are subject to enforcement action with information about which community guidelines they have violated, the assigned severity level, the total number of violations, how close they are to the next account-level action threshold, and the duration and end date of any suspension. Enforcement actions can also be appealed, the company said.
The changes also follow Bluesky’s rollout of updated community guidelines in October, as part of its broader goal to become more aggressive with moderation and enforcement on the platform.
But even as the company emphasizes its stricter rules, some Bluesky users remain unhappy that the company is still allowing a user widely criticized for his writing on trans issues to keep his account on the platform. This controversy erupted again in October, when Bluesky CEO Jay Graber appeared to dismiss user criticism in a handful of posts.
At the root of the problem is how Bluesky wants to be perceived versus how it actually is today.
The company doesn’t want to be known as just a left-wing or liberal version of Twitter; it wants to be a home where many different communities can grow their networks and thrive, without the problems of a centralized social network. However, much of the community that adopted Bluesky did so because they no longer felt represented on Twitter/X, which has become more right-wing under new owner Elon Musk.
In addition to wanting to shape its image, Bluesky must balance its goals with a growing number of laws and regulations that require social platforms to protect their users from harm or face potentially serious consequences, such as massive fines. For example, earlier this year, Bluesky blocked its service in Mississippi, saying it didn’t have the resources to comply with the state’s age assurance law, which could fine the network up to $10,000 per user for noncompliance.
