This post is related to these previous discussions we've had about moderation:
social.trom.tf/display/dbc8dc4…
social.trom.tf/display/dbc8dc4…
So in the posts above we shared some ideas on how moderation on the fediverse could be improved. Those posts looked at moderation from a technical point of view. But after seeing some of the responses I got to my last post, I feel like I should write about the social impacts of it, because for some people moderation is a sensitive topic and TBH I don't think many people understand how complex this topic actually is.
Types of moderation issues
Usually when people complain about bad moderation, it falls into one of two types:
- Moderation is not strict enough, and there is still a lot of bad content that needs to be removed.
- Moderation is too strict, and content that shouldn't have been banned is being banned, leading to censorship.
Types of social media users
Many people disagree on how strict moderation should be, and what I've noticed is that most of these disagreements come from 2 types of users who want different things from social media:
Type 1 users
These are people who don't want to see or interact with others who disagree with them. They might be doing this for 2 reasons:
- They want to create a safe space for themselves. For example, if someone's getting bullied online due to their political beliefs or race or whatever, then they might want to block out everyone who could be a potential threat. These include vulnerable people, like those going through depression and such, who might need stricter blocking than others.
- They don't want to hear from people who disagree with them. For example, someone might wanna block out all anti-mask people because they don't agree with them and don't want to argue about it.
Type 2 users
These are users who want to reach a lot of people via social media, including ones who disagree with them. They might be doing this for 2 reasons:
- They want others to hear them. For example, an environmentalist would want to be heard so that more people become aware of environmental issues.
- They want to hear what others are saying and change/improve their own views & knowledge about the world.
So Type 1 users generally want strict blocking while Type 2 users prefer less strict blocking. Keep this in mind.
What content should get blocked?
So in my previous post, some people objected to the idea of a "Soft block", and this is their reasoning:
If my friend is on a poorly moderated instance, I want to get them off of that instance, not make an exception to my block just for them. With your system, there is no reason for people to leave bad instances because they can still interact with their friends, and there is no pressure on admins to actually moderate their instances, either. You make blocks completely toothless by having these exceptions.
I actually thought blocks were used for creating a safe space for the users of an instance, but reading this makes me realize that some people also see them as a way to punish other instance admins for not moderating properly. There are of course a lot of issues with this mindset, but in order to understand them you have to know a few things.
What should an admin block in order to not become a "poorly moderated instance"? I'm simply asking which type of content should be blocked. The usual answer to this question is to block all of the socially unacceptable content. So if someone's being racist toward people, then that person should be blocked because it's not socially acceptable. So far so good. But if you look at the history of what was considered "socially acceptable", it gets really confusing. In the past it used to be socially unacceptable for women to vote, but at the same time slavery was considered okay! And even just in the past few decades, a lot has changed about what we consider socially acceptable. There are still some countries today where homosexuality is considered socially unacceptable, and people there are having to fight for their rights. So what people consider socially acceptable changes from time to time and from place to place. And I'm sure this will change in the future too, because there might be things that are considered normal today that we'll later realize we were wrong about. This is not a wild claim; we've been wrong many times in the past, and it'll happen again.
If the Fediverse existed in the past - social norms change everything
To explain this, we'd have to talk about the past and what people had to go through in order to change the norm, because if they hadn't changed it, we'd still be living in a society where sexism and homophobia are the norm. What would have happened if the fediverse existed at a time when sexism was the norm? Moderators would be blocking feminist instances from federating, because by their own sense of morality that seems like the right thing to do. But the interesting thing here is: how would the users of a feminist instance react to getting blocked? Type 1 users of the instance would say:
"If they're blocking us for being feminist then we don't need to talk to them anymore, they're a bunch of sexists anyway. And users who really do care about feminism will move away from that instance. In fact, our admin should have blocked that instance way before they blocked us, because some of our members have been harassed by the sexists on that instance"
So Type 1 users are not directly affected by the censorship, and they are criticizing their admin for not blocking the big instance sooner. But Type 2 users will be quite upset over this, because they actually want more people to read their posts explaining the discrimination they face in society and to spread feminist ideas as much as possible, and getting blocked by big instances means they lose a lot of their followers and also limits their exposure to others. So now if these Type 2 users want to get their followers back, there are 2 ways for them to do so:
- They can try to tell their followers to move to a different instance, maybe to the feminist instance they're currently on. And this seems like the right thing to do, because they were unfairly blocked. Now let me ask you this: if your instance blocked mine for unfair reasons like this, would you be willing to switch instances for me? TBH I don't think anyone would do that for me; maybe some of my very close friends would, but that's it. So yeah, switching instances is not as easy as people make it sound. If it was, this wouldn't be an issue.
- Type 2 users can't get all of their followers to move for them, so now their only option is to move themselves to a different instance. This is not a good choice, because it sends the message that what the other admin is doing is okay, which it isn't. So keep that in mind. Now if their aim is to get their followers back, then they'll have to move to an instance that federates with the big instance that blocked them. And by now we know that all of the feminist instances (if any exist) will already be blocked, so they'll have to go with a generic instance. But once this move happens, it puts the admin of that generic instance in a hard position. Because if blocks are used as a way to punish instance admins for not moderating properly, then this admin will of course be scared of getting blocked by big instances. Remember that feminism was considered socially unacceptable at the time, so if the admin doesn't kick out these feminists, then that instance will be considered a "bad instance" and it'll get blocked out of fedi.
Social structure
If you look at the above example, you'll see that Type 1 users don't have a problem with censorship, even when someone's blocking them. This is because they don't really want their posts reaching people who disagree with them; all they want is a safe space devoid of haters who harass them, and this is understandable when you're part of a marginalized group. Because of this, Type 1 users don't see any issue with strict blocking: they're not directly affected by any censorship, so they might be unaware of its long-term consequences for our society. On the other hand, Type 2 users such as activists want their posts to reach as many people as possible, they want their voice to be heard, but as a consequence they can also get a lot of harassment from haters, usually more than Type 1 users get, because of their extended reach. So in that sense, Type 2 users can actually empathize with Type 1 users too. And the moderation tools available to users are very helpful for both groups.
Type 2 users play an important role in changing social norms, because you can't create much of a social impact without reaching people who disagree with you. All activists are Type 2 users.
Now instead of focusing on the feminist instance, let's look at the social structure of the entire fediverse network:
- Type 1 users in support of feminism - These users will always be on feminist instances, and they encourage their admins to block out all the other instances that aren't in support of feminism.
- Type 2 users in support of feminism - These users don't want their admins to block out too many instances because it'll reduce their ability to reach more people and spread feminist ideas.
- Type 1 users not in support of feminism - These users will stay away from feminist instances and they encourage their admins to block out all the feminist instances.
- Type 2 users not in support of feminism - These users don't want their admins to block out too many instances because it'll reduce their ability to reach more people and spread sexist ideas.
So when Type 1 users ask for stricter blocking, they could be trying to create a safe space for feminists, but they could also be trying to censor feminist content and create an echo chamber for sexists. And when Type 2 users ask for less strict blocking, they could be trying to spread feminist ideas, but they could also be trying to spread sexist ideas. This is why moderation is such a complicated issue: there are 4 types of users who all want different things, and an admin will never be able to satisfy them all. In the case of that big instance that blocked the feminist instance, the decision was in favor of Type 1 sexist users; all of the other groups will be annoyed by this decision, except maybe the Type 1 feminist users, because they don't care about censorship. Long before that, when the admin of the feminist instance chose not to block the big instance, that decision was in favor of Type 2 feminist users, and the Type 1 feminist users on the instance were annoyed by it because they prefer strict blocking.
The real problem
Maybe you got confused by the complex social structure of it all, but don't let that distract you from the real problem here: if all of these users could choose for themselves what type of content they wanna block, then this entire issue wouldn't exist. You see, the real issue is that admins are making the decision for all users, and by doing that they are basically imposing their own sense of morality on others. Usually what ends up happening is that most admins go along with the social norm, because that's how they can satisfy the largest number of users. And so if sexism was the norm at the time, then feminism would be considered socially unacceptable and would get blocked.
All of the ideas I've shared before, like the concept of "Soft blocking" and the use of "Blocklists", were attempts at solving the real problem here: give the choice to the users themselves, let them choose what they wanna block. If soft blocking was the default, then those Type 2 feminist users would've been able to get their followers back without having to switch instances. And if blocklists were a thing, then those Type 1 feminists would've been able to protect themselves from harassment by subscribing to a blocklist that gets rid of all sexist users. Giving more choice to users is the sane way to solve this problem, because users know what they want better than their admins do.

Letting the users choose for themselves is also good for moderators/admins, because then they don't have to make decisions on behalf of everyone and worry about offending people. If some users on an instance want stricter blocking but others want less strict blocking, currently the only thing moderators can do is pick one and risk offending the others. But if blocklists were a thing, the admin could simply recommend that users apply a stricter or less strict blocklist depending on what they want. And if such a blocklist doesn't exist, the admin can make one themselves; remember that anyone can easily make a blocklist and publish it online for others to use.

If you look at how the blocklists for ad-blockers are made today, you'll see that they're maintained by many people and anyone can contribute to the list. And because most of these are managed using git repos, every decision made along the way is public and transparent. Can you expect the same level of quality blocking when a few admins or moderators are doing all the work for you? Blocklists are just better in so many ways.
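Just to make this more concrete, here is a minimal sketch of what "subscribing" to a published blocklist could look like for a single user today. It assumes a plain-text list with one domain per line (the blocklist URL and instance below are made up for illustration) and uses Mastodon's user-level domain block endpoint; other fediverse software would need its own equivalent, and a real "blocklist subscription" feature would ideally be built into the platform itself.

```python
# Minimal sketch: apply a published domain blocklist to your own account.
# Assumptions: the blocklist is plain text (one domain per line, '#' = comment),
# the instance runs Mastodon, and you have an access token with write permissions.
# The blocklist URL and instance below are placeholders, not real services.

import requests

INSTANCE = "https://example.social"                          # your instance
TOKEN = "YOUR_ACCESS_TOKEN"                                  # personal access token
BLOCKLIST_URL = "https://example.org/strict-blocklist.txt"   # hypothetical list

def fetch_blocklist(url: str) -> list[str]:
    """Download the list and return the domains it contains."""
    text = requests.get(url, timeout=30).text
    return [
        line.strip()
        for line in text.splitlines()
        if line.strip() and not line.strip().startswith("#")
    ]

def block_domain(domain: str) -> None:
    """Add a user-level domain block through the Mastodon API."""
    resp = requests.post(
        f"{INSTANCE}/api/v1/domain_blocks",
        headers={"Authorization": f"Bearer {TOKEN}"},
        data={"domain": domain},
        timeout=30,
    )
    resp.raise_for_status()

if __name__ == "__main__":
    for domain in fetch_blocklist(BLOCKLIST_URL):
        block_domain(domain)
        print(f"blocked {domain}")
```

And because such a list can live in a public git repo, anyone can inspect it, open issues, or fork it if they disagree with a decision, which is exactly the kind of transparency the ad-blocker lists already have.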
A real world example
I've talked a lot about the past, but now I want to give you a real-world example that's ongoing. @Cleo of Topless Topics is someone who has had to deal with a lot of censorship because she questions social norms. She is trying to normalize non-sexual nudity, and because nudity is considered "socially unacceptable" in our society, she gets censored everywhere. Speaking of that, did you know that men were once arrested for baring their chests on the beach? Here is an article that explains it all and how men fought for the right to go topless - washingtonpost.com/history/201…
Today if we go to the beach we see lots of topless men, and it's not considered socially unacceptable or looked at in a sexual way. But if a woman shows her bare chest, maybe even to feed her baby, then that is not considered okay. This is the issue that she's trying to address.
But of course, she gets blocked everywhere, so much so that she even has a page on her website to showcase screenshots of every time she got banned by different platforms - toplesstopics.org/banned/
Going through that page you'll understand that her content is being disproportionately censored compared to other nude content. And one of the reasons this happens is that her content gets falsely reported a lot by what she describes as her "Dedicated Cabal of Haters". You see, she's not only having to go through censorship, she also faces a lot of harassment from haters, all because she tried to question a social norm. And this is interesting, because the job of moderators is to stop people from harassing others, but in this case her haters are making use of the moderators to harass her. A perfect example of how moderation can be a double-edged sword.
She didn't want to upload her videos to porn sites because she isn't making porn or any sexual content. And if you look at federated platforms like PeerTube, the instances that allow nudity also allow all kinds of crap like pseudoscience, conspiracies, etc., so that's not a nice place to be. But finally she found @Tio, who had a well-maintained PeerTube instance devoid of all the crap, and he agreed to host her videos there. So here's her channel now - videos.trom.tf/c/toplesstopics
But now if we were to spread this mindset that we should punish other instance admins for not moderating properly, then this PeerTube instance might get blocked simply because there is one person there who questions social norms. So is it a "poorly moderated instance" just because it has nude videos? Now you see how subjective these things are, because different people will have different opinions about what type of content should get blocked, so allowing users to choose for themselves is the best we can do here.
And BTW, I also wanted to point out that content warnings are not a solution here: the whole point is to normalize non-sexual nudity, so putting content warnings on it works against that goal. It's like asking users to put content warnings when writing about homosexuality or something like that; it defeats the whole point. On her Mastodon account she sometimes puts content warnings like "heinous nonsexual female nipples" to show how ridiculous it is.
Addendum
After publishing this post, the follow-up conversations made me realize that I missed one important point. While I've made various suggestions to improve moderation, like the idea of "Soft blocking" and of "Blocklists", I forgot about a much simpler feature that already exists on most fediverse platforms. If you're worried about things like online bullying or just want stricter protection, then one thing you can do is change your post visibility to "followers only". This, along with manual approval of followers, means that no one can see or interact with your posts unless you manually approve them as a follower. You can find more info on how to do that here - mstdn.social/@feditips/1072085…
Platforms like Friendica also allow you to put people into different groups, and when you write a post you can make it visible only to the members of a group. This is very useful if you have a lot of followers but still want to restrict who can see some of your posts.
So instead of relying on moderators to filter out the bad guys from the entire network, you're just filtering in the good guys, and only they can see your posts. This is of course much stricter than relying on moderators or a blocklist, and it can be very useful for those who want that level of protection.
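If you prefer to script this instead of clicking through the settings page, here is a minimal sketch of how it could be done through Mastodon's API, assuming an account on a Mastodon instance and a personal access token; the instance URL and token below are placeholders, and other fediverse platforms expose these settings differently.

```python
# Minimal sketch: make new posts "followers only" by default and require
# manual approval of follow requests, via Mastodon's account settings API.
# INSTANCE and TOKEN are placeholders; adjust them for your own account.

import requests

INSTANCE = "https://example.social"   # your instance (placeholder)
TOKEN = "YOUR_ACCESS_TOKEN"           # token with write:accounts scope

resp = requests.patch(
    f"{INSTANCE}/api/v1/accounts/update_credentials",
    headers={"Authorization": f"Bearer {TOKEN}"},
    data={
        "locked": "true",              # follow requests need manual approval
        "source[privacy]": "private",  # new posts default to followers-only
    },
    timeout=30,
)
resp.raise_for_status()
print("Posts are now followers-only by default, with manual follower approval.")
```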
Currently this is an opt-in feature, but it would be nice if the admin of an instance could make this setting the default for all their users, so that all posts are private by default unless you opt out. If an admin feels their users need stricter protection, then having this option would be very helpful.