The more general dynamic is that, for a community to remain healthy, it must be tolerant of everyone, with one very specific exception: those intolerant of others.
The consequence is that all communities are intolerant of
something. The most welcoming ones are those which do not tolerate the one thing that will ruin them; others, by choice or by chance, are not so lucky, and tolerate much less than that.
Moderation is a manual implementation of intolerance; the admins, the moderators, and the community reporting to them, in that order, make the choice of what content is not tolerated. Ranked forums are much more democratic on average, but moderators retain veto power to promote and remove content at will.
This forum happens to run on, more or less, the -- if not benevolent, then mostly indifferent -- dictator model. For the most part, the community is left to its own devices, and is surprisingly well behaved.
Contrast with an example like 4chan: there, trolls are not only tolerated but encouraged. With tolerance for every possible opinion, "anon" shows a preference for anything up to and including lynching and pedophilia. (Even their mods have to deal with that last one -- as a legal requirement... Also, give or take which age of 4chan you argue was the "freest", but that's beside the point.)
There aren't many trolls here; heh, well, seemingly so among the threads I'm interested in reading, anyway; but most of the time when one shows up, they get adequately shouted down. Spam is reported; blatant advertisers and shills are called out or reported; bad-faith arguments are called out (and if repeated often enough, the poster is eventually banned in some capacity).
This does leave open the meta-trolling exploit, the "any publicity is good publicity" troll that stirs up shit just to make people try to correct them or call them out or whatever, and that has indeed been exploited from time to time. No one can resist Someone Being Wrong On The Internet, after all.
I would love to see more personal recognition and responsibility in this regard -- that is, recognition that the best way to kill a troll is to starve it to death. Alas, I probably lived through the peak of that practice -- I'd guess it gathered perhaps a 20% following at the height of Usenet (say, early 2000s, or probably late 90s, but I wouldn't know). With unmoderated forums like Usenet mostly gone (or far fewer of them today, anyway), the share of people who understand and practice this method is disastrously low, perhaps a few percent. The popularity of "call-out" and "cancel culture" on Twitter today is the proof of that... hey, wait, what was that about unmoderated forums and trolls gathering negative attention?
See, good thing I don't Twitter.
tl;dr: this discussion may seem tangential to your questions/statements, but here is where it all comes together: by wishing to read everything, even content that is unfit to share, you're encouraging the tolerance of intolerance. (On the assumption that intolerance is what was moderated away -- but I also gave my assessment of that, rough though it may be, I admit.) By leaving that content up, others will read it, and with the aforementioned low restraint of the average person, hijinks ensue. Sure, threads can be locked and replies can be moved to quarantine rather than deleted; but that leaves open the option of starting a new thread to discuss the old thread, and the negative content persists.
Personally, I balance the issue of censorship with the cynical comfort that no one ever has much of anything important to say, least of all the typical troll that gets moderated.
This is absolutely an important problem in forums that aren't so... libertarian (I guess?) -- in forums where important, relevant content (and meta-content, e.g. discussing the moderation itself) is prohibited. Again, the type of intolerance is a strong indicator of the health of the community. I'm guessing that's what you're really after -- the easiest way to see the intolerance is to see what's been removed, but as I've noted, that carries a risk, too. (Or you're just looking for all the cringe those deleted posts contain; I would argue that is also unhealthy behavior -- see the earlier point, etc.)
Ed: also, one final point, just because
someone's going to say or think it (assuming anyone gives a shit, this deep into a... clearly masturbatory philosophication?
...): no, it is strictly NOT BETTER to, y'know,
simply encourage everyone to use their best judgement! That's laying a footgun, and you know full well every damn person is going to plant their foot firmly under that footgun and pull that damn trigger.
You've dereferenced unsafe pointers in C before, and you'll do it again. Classic example. It generalizes.
Fuck. That.
Responsible, safe system design is the only approach. Poka-yoke. We don't need to condemn users to a footgun just out of, what -- habit? revenge? -- because we ourselves were introduced through footgunning? Not only must we prevent users from footguns, we must guide them away from footguns as well. They must learn from our mistakes without having to be injured. We don't want, or need, to shield them from ugly things -- that only encourages the ugly thing, raises it upon a forbidden pedestal. The thing shouldn't be ugly or dangerous in the first place. It should be natural to exist alongside the thing and to understand its workings and risks. And once users understand why the system was set up that way, they can be allowed to take riskier actions, in the hope that they will be responsible with them. The expert knows not to dereference a freed pointer, and checks diligently for every apparent code path that could do so. Often, the expert has footgunned their way to that wisdom, but it doesn't need to be the case.
Perhaps we don't use a footgun; perhaps we use a targetgun, mounted to a fixture so that it can only point near a target. Perhaps we use a dartgun that won't kill what it hits. Perhaps we use a targetdart with both. Once the user is comfortable with the lesser responsibility, they can perhaps be trusted with the wrench to unbolt the gun from its fixture, and then they will understand that they should point it away from their foot when using it. Perhaps they will point it away from others' feet as well.
This is a direct argument in terms of product design and usability, UI and such; in terms of content and user psychology, the connection is cloudier, but I have no doubt that analogous practice exists in that sphere.
So um, tl;dr tl;dr: you seem to be asking for a footgun of sorts, and that may not be the healthiest thing for the community to ask for.
Tim