Moderating a community

It’s like being a parent, just to millions of people! -Ouch?


Moderation is an integral part of community management, ensuring that the content posted by users is both suitable and relevant to its audience. It keeps members of the community safe and free from abuse or attack by other members.

With the web open and free to everyone, different online spaces and communities appeal to different people, who vary in gender, age, interests and abilities.


Moderation Levels

Club Penguin is an online community where children can chat online and make friends. Privacy is at the forefront of its aims and priorities, ensuring that members are safe and secure. Without a strong stance on privacy, would parents let their children use the service? Probably not.

One way they have moderated their service is to allow only pre-moderated phrases to be sent in game. This means there is no chance for children to be targeted, abused or put at risk online. This is an automatic form of moderation set by the admins, and it maximises privacy.
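A whitelist-only chat like this is simple to sketch. The phrases below are made-up placeholders, not Club Penguin's actual list; the idea is just that a message can only be sent if it exactly matches something pre-approved:

```python
# Minimal sketch of whitelist-only ("pre-moderated phrase") chat.
# APPROVED_PHRASES is illustrative; a real service would load a large,
# curated list maintained by its moderation team.
APPROVED_PHRASES = {
    "Hello!",
    "Want to play a game?",
    "Good job!",
}

def can_send(message: str) -> bool:
    """Allow only messages that appear on the pre-moderated list."""
    return message.strip() in APPROVED_PHRASES
```

Because nothing a child types can reach other users unless it is on the list, there is no way to share personal details, which is what makes this approach so privacy-friendly.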

In contrast, 4Chan’s community is completely unmonitored and unregulated. This is because the users of 4Chan tend to be older, and the community has a demand for an unregulated space on the web, free to post whatever they wish. Users tend to be much more up for an argument or debate and have ‘thicker skin’ than other communities on the web.


Active vs Proactive Moderation

An example of active moderation is the BBC. When it posts an article, a user can submit a comment to give their opinion. Comments are actively moderated by a team hired by the BBC, who individually check every comment and accept or decline it. This is a form of pre-moderation, as it occurs before the comment appears on the site.

Facebook is an example of proactive moderation. It relies on its large and very active community to report unwanted content, such as sexual, abusive, promotional or spam material. Any reported content is then filtered and reviewed by a team.

As you can imagine, this process is very expensive in terms of human resources: people must be paid to do work that an automated moderation system could often have handled, such as blocking swear words.
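That kind of automatic filter is cheap to run. Here is a minimal sketch of one, assuming a banned-word list maintained by admins (the entries below are placeholders):

```python
import re

# Illustrative banned-word list; a real system would use a much larger,
# admin-maintained set.
BANNED_WORDS = {"badword", "swearword"}

def auto_flag(comment: str) -> bool:
    """Return True if the comment contains a banned word and should
    be blocked automatically, without a human moderator."""
    words = re.findall(r"[a-z']+", comment.lower())
    return any(word in BANNED_WORDS for word in words)
```

Anything this filter catches never needs a paid moderator's time; humans only see the reported content the filter can't judge.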


My Project’s Moderation

In the community I aim to build, it is important that moderation of posted content is done in a way that allows people’s opinions to be put across without being abusive or offensive to other users.

One way we can tackle this issue is to allow users with a level of trust to self-moderate as a community. Members can report other users’ comments, which are then fed back to our ‘trusted users’ (users who have been active in the community for a long period of time and are in contact with the website admin). The trusted users can then issue a warning under the three-strike escalation plan that YouTube follows for its community guidelines.

Trusted users will feel empowered that they are making changes to better the community they love, keeping it a respectful place without costing us money to employ community managers.

Tom.


Web Media Level 1. Ravensbourne.
WEB14104
Tom Sharman.