By @TheMarkDalton

Periscope, Twitter’s live streaming app, has introduced a new way to report abuse and spam in the comments section of a broadcast. It is an interesting approach that puts moderation power in the hands of the audience.

Live streaming is hard work: you have to focus on delivering engaging content in real time, try not to make a mess of the delivery, and engage with comments from viewers. Now you are expected to moderate the comments section too and keep it nice and clean?

It is a lot of work. If you have a team of people on the task it is not an issue, but most of us are streaming solo and everything is down to us. Periscope will now deploy a user-led approach in which the audience reviews messages and flags what they deem to be junk. The community’s decisions on those comments will result in actively abusive members being blocked.

So how does it work?

When you tune into a live stream you will see the usual flurry of comments on the screen. If something appears that you think is abusive or offensive, you can flag it. So far it’s the same as what we already have, right?

Here comes the nifty part that I think is really cool. Periscope will present what you have flagged to a selection of viewers in the form of a vote. If the majority votes that the comment is abuse or spam, the perpetrator will have their chatting function restricted for the remainder of the live stream. This all happens in real time as the stream is taking place.

Both broadcasters and viewers can opt out of this if they wish. The person hosting the live stream can set the broadcast to be unmoderated, and viewers can choose whether or not they want to be part of the jury when a vote comes up.
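
For anyone curious what that flow might look like behind the scenes, here is a rough sketch in Python. To be clear, this is purely my own illustration of the flag, jury vote, and mute steps described above; the class names, the jury size of five, and the voting logic are all assumptions, not Periscope’s actual implementation.

```python
import random
from dataclasses import dataclass, field

@dataclass
class Viewer:
    name: str
    willing_to_vote: bool = True   # viewers can opt out of being on the jury

    def vote(self, comment: "Comment") -> str:
        # Stand-in for a real person's judgement; here we just flag shouting.
        return "abuse_or_spam" if comment.text.isupper() else "looks_fine"

@dataclass
class Comment:
    author: str
    text: str

@dataclass
class Broadcast:
    moderation_enabled: bool = True               # broadcaster can opt out
    viewers: list = field(default_factory=list)
    muted_users: set = field(default_factory=set)

def flag_comment(broadcast: Broadcast, comment: Comment, jury_size: int = 5) -> None:
    """A flagged comment goes to a small jury; a majority vote mutes the author."""
    if not broadcast.moderation_enabled:
        return                                    # stream is unmoderated

    # Sample a small jury from viewers who have not opted out of voting.
    eligible = [v for v in broadcast.viewers if v.willing_to_vote]
    if not eligible:
        return

    jury = random.sample(eligible, min(jury_size, len(eligible)))
    votes = [juror.vote(comment) for juror in jury]

    # Majority says abuse/spam: the author loses chat for the rest of the stream.
    if votes.count("abuse_or_spam") > len(votes) / 2:
        broadcast.muted_users.add(comment.author)
```

The key design choice, as I understand it, is that the jury is sampled on the spot from viewers who have opted in, so the decision lands while the stream is still live rather than hours later.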

Personally, I absolutely love this, and I think it is about time a social media site or app came out with a system that gives users a role in moderation. We complain and talk about the growing abuse problem online, cyberbullying in particular.

Twitter has been repeatedly condemned for not dealing with trolls and bots on its main service. If Periscope’s moderation tool turns out to be a success, it could be adapted for other platforms in the future.

The pressure is on social media companies to provide safe environments for users, but at the scale these platforms are reaching they are hard to police. Assistance from the online community could give them a chance to stop online abuse much faster than they can at present.
