This article originally appeared on 60 Second Social and is part of the Be Safe. Be Social series.

Reddit has announced that it is updating its policies to curb harassment and trolling between users on its website. It is a significant announcement, as Reddit has come to be known as a place where pretty much anything goes (within reasonable limits). Up to this point there have been only a few basic ‘commandments’ in place on Reddit: thou shalt not spam, thou shalt not post personally identifiable information, and thou shalt not break the site.

In its blog post, Reddit says it is unhappy with harassing behaviour on the site, and that its survey data shows users are unhappy too. The company said it is updating site-wide policies to explicitly prohibit harassment, so that users can express themselves freely on Reddit without fear of retribution from a minority.

The move follows Twitter's decision to roll out new anti-harassment tools this year. Reddit users will now be able to report abuse to moderators, who will then be able to remove the content and ban the offending user. This will no doubt ruffle some feathers among long-time Reddit users who have enjoyed posting content free from interference from the company.

Reddit comprises thousands of micro-communities, covering everything from social media to Apple, Android and cute animals. Each community appoints its own moderators, who create and enforce that community's rules. This rule update, however, is system-wide and applies to every user on the platform.

As a result of having such freedom for nearly a decade, some communities have become breeding grounds for abuse and hatred, dark corners of the internet. Reddit says it is communities like these that are keeping users from speaking up, and potentially keeping Reddit's user base from expanding beyond 200 million regular visitors.

However, Reddit will not be actively policing communities. It will rely on users to report bad behaviour within the communities themselves, and will act only on the reports it receives.

It is a big step forward for Reddit in the fight against online trolling and cyberbullying. The freedom to say whatever one likes in these communities has, in fact, become restrictive of free speech itself: users are afraid to post something that could be picked up and used against them in an abusive way.

The community as it stands is stifling free expression. In a survey of 15,000 Reddit users, the top reason respondents would not recommend the site was not wanting to “expose others to offensive or hateful content.”

About The Author

Mark is the founder of 60 Second Social, where he provides social media news and digital marketing analysis. He has an Advanced Diploma in Psychology and a Diploma in Digital Marketing and Social Media. You can follow him on Twitter here, follow 60 Second Social on Twitter here, or drop Mark an email at [email protected]
