Trust, Safety, And Communication
Techdirt. 2023-07-21
One thing that should be evident by now is that every online community eventually learns that it needs some level of “trust & safety” or basic moderation. And they quickly find that things are a lot more complex than they seem from the outside. Just try to moderate a medium-sized Facebook Group if you want an example. Or play our Moderator Mayhem game.
It’s always interesting to see the points at which community organizers realize this and try to figure out how they’re going to handle this issue or that issue — and begin to realize what an impossible task it is. And while some people think that it’s now been long enough that any new community should have “the basics” figured out, it’s important to recognize that (1) there are always new problems, and (2) the “simple” problems are often a lot more complex than they seem. On top of that, there are thousands of things that any new community “should” have, and at some point the people building them need to weigh “releasing something” against “having every feature in place.” You can disagree with where the line is drawn, but everyone has to draw a line somewhere.
I was thinking about all this over the last week or so, as there was some discussion after the (still invite-only beta) Bluesky ran into issues with a username filter (specifically, the filter allowed users to sign up with slurs as their usernames). This is, obviously, not good.
The debate on Bluesky morphed over the course of a few days from criticism of the pretty major omissions on the filter list to criticism of the lack of communication from the company and its (normally communicative) employees. Basically the entire company went silent, followed eventually by some bland, “corporate”-sounding responses that went against the “poasting” style the team had embraced earlier. Indeed, the sudden silence from the team stood out even more given its normal willingness to engage in all sorts of ways on just about everything else. Going from super talkative to silent at the moment of notable controversy is, perhaps, the opposite of a compelling communications strategy.
Still, it’s somewhat understandable when looked at in context. The team has repeatedly talked about how much it needs to accomplish in building both a protocol (which may change the nature of some of these issues) and its own platform as a reference app of that protocol. The service is still in beta for a reason. And when there are thousands of trust & safety things you need to set up in addition to building the platform and the service, raising money, finding a business model, and everything else, it can get a little overwhelming. And that’s especially true when the company has made earlier moves and statements suggesting it takes these issues seriously and is working on solutions. So, when things blow up because a few things were missed, it can feel like an attack. The team believes its heart is in the right place, and it’s trying to balance the variety of things it needs to do, and yet… it’s still getting yelled at.
But, alas, this is the general rule when you run any sort of online community: you will get yelled at, and at some point you need to decide which issues to deal with and what to focus on. Getting yelled at sucks. And it often makes people clam up. Of course, the obvious (and very true!) counter is that having to deal with hate, abuse, and racism also sucks. And it also makes people want to clam up. So if you take a job building a social network, you’re signing up for this specific kind of abuse, and you need to be ready for it in order to protect others from theirs.
Back in May, I wrote a piece about social media Nazi bars, tradeoffs, and the impossibility of content moderation at scale, which I think remains quite relevant here. There are always tradeoffs. And unlike, say, Substack (which is much larger and much better resourced), I’ve seen no indication that the Bluesky team is simply abdicating its responsibilities here; rather, it’s prioritizing as best it can, which means some things that everyone agrees are important won’t get put in place as quickly as some would hope.
For example, regarding the filter list: while it was an obvious failing in how the system was set up, any sort of brute-force filter list runs into problems over time. If you don’t want to deal with a “Scunthorpe” problem (where a blunt substring match blocks innocent names, like the town of Scunthorpe, because they happen to contain a banned term), you need a more sophisticated solution. And more sophisticated solutions require more time and thought, and we’re right back to the line-drawing exercise I mentioned above, where the long list of thousands of things you need to accomplish is at least one item longer (and more complex).
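To make that tradeoff concrete, here’s a minimal sketch in Python. It’s purely illustrative (this is not how Bluesky’s actual filter works, and the one-entry blocklist is just the canonical Scunthorpe example): a brute substring filter over-blocks innocent names, while even a slightly smarter word-boundary check opens up trivial evasions.

```python
import re

# Hypothetical, illustrative blocklist. The single entry is the canonical
# "Scunthorpe problem" example, not a real deployment list.
BLOCKLIST = ["cunt"]

def naive_filter(username: str) -> bool:
    """Brute substring match: flags any username containing a blocked term."""
    lowered = username.lower()
    return any(term in lowered for term in BLOCKLIST)

def boundary_filter(username: str) -> bool:
    """Only flags a blocked term when it appears as a standalone token,
    so embedded matches like 'scunthorpe' are allowed through."""
    lowered = username.lower()
    return any(
        re.search(rf"(^|[^a-z]){re.escape(term)}([^a-z]|$)", lowered)
        for term in BLOCKLIST
    )

print(naive_filter("scunthorpe-fan"))     # True  -- false positive
print(boundary_filter("scunthorpe-fan"))  # False -- Scunthorpe problem avoided...
print(boundary_filter("c.u.n.t"))         # False -- ...but trivially evaded
```

Every step toward handling the evasions (normalization, leetspeak mappings, allowlists for legitimate names) adds more edge cases and more maintenance, which is exactly the “more time and thought” problem.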
The real difference here seemed to be how much the communications problem exacerbated the more classic trust & safety failing.
And it made me start to think about how communications itself is a strategic trust & safety tool, though it’s rarely considered as such. Part of the communications issue was, as many people noted, the company’s failure to come out and say it was sorry for the errors. And, yeah, it seems like this is a case where company leadership should have done so. But sorry only goes so far. Mark Zuckerberg has to keep going on apology tours, and it’s not clear they really help.
Instead, I think the failure might be in the lack of clear communication about the larger roadmap from Bluesky. This applies to lots of other online communities as well, but I’m focused on Bluesky to make this point (although, arguably, it applies even more to others). Many people (I think, wrongly) focused on this one error with the username filter list, insisting that it showed the company “didn’t care.” That struck me as unlikely, given earlier statements and actions by the Bluesky team, which seemed to indicate not just that they cared about this, but that they cared deeply, to the point that they wanted more thoughtful, serious, and comprehensive approaches to dealing with it, rather than slapdash duct-tape fixes.
But, unless you’re paying close attention, you might miss all of that. And there’s no clearly laid-out roadmap that people could point to in order to alleviate others’ concerns.
Things might have gone a bit differently if Bluesky had a page with a roadmap covering its plans for federation, composable moderation, trust & safety tooling, trust & safety hiring, and the like. If such a roadmap existed, showing exactly how the team was thinking about these things and making clear that it was working toward them deliberately, including at least some public explanation of the tradeoffs of various approaches, it would be more difficult for users to fill the void with “they just don’t care.”
On top of that, it would give the team breathing room to keep working through that roadmap, rather than having to respond to every emergency (some emergencies will still require emergency reactions, but not every one needs to grind everything else to a halt). This isn’t the answer to everything, of course. Nothing is.
But having clear communications, especially regarding a project that is designed to be decentralized and is being designed for the public benefit, is a key element of building trust, which I guess would be somewhere around 50% of the point of building out trust & safety.
Creating such a roadmap is quite a process in and of itself. As far as I can tell, no one else has done it either. And I’ve already noted that the team likely has too much on its plate. But it does strike me that spending a bit more time on this at this early stage might help prevent some problems going forward, both by allowing users to point out areas where the roadmap may need to be adjusted, and by giving everyone a better understanding of not just where Bluesky is today, but where it’s heading.
I honestly think this understanding of the communications element of trust & safety could help many other communities as well. Many of the complaints and problems come from a mismatch between expectations and how a company actually makes decisions. And one way to deal with that is to better align those expectations. I think the last decade might have gone differently if Twitter, Facebook, and others had been more public and upfront with some of their internal trust & safety discussions, so this is hardly unique to Bluesky.
But, at the very least, I think it’s important to start considering the role of communications as a part of a trust & safety strategy.