There’s one fact all successful forum owners will immediately agree upon: forums require moderation. A forum’s success rests on the quality of content the community contributes. Inappropriate or irrelevant content hurts user retention, and it produces poor search engine results, which hinders new user acquisition. Moderation is essential to the long-term health and growth of a forum.
What Exactly is Moderation?
The term moderation is often overloaded, so I’ll begin with a quick definition of how the word is used in this post:
Forum moderation is the process by which content, and the users who generate it, are monitored in an internet forum for adherence to the forum’s policies.
Moderators who take on the monitoring task can be aided by automated tools such as text and spam filters, and will employ various methods of reviewing content, including pre-moderation, post-moderation, and community moderation. By investing time in fine-tuning the workflow and putting the right tools in place, you can greatly improve both moderation efficiency and the user experience.
Clearly Understand what is Acceptable
First things first: without clear policies in place, your moderation practices will inevitably descend into confusion, and choosing a platform and tools will be difficult. Consider the following questions when defining your forum policies: Who is your target demographic? What topics are you trying to promote, and which are prohibited? What should happen when someone submits a post containing “I hate you all!!!”? When should content be approved by a moderator before being published? Once you have addressed these questions (and many others), what tools are available to enforce the policies?
Choosing the Right Tools
The right tools will greatly help moderators review high-risk content in a timely manner. First, your automated filtering solution should outright reject content that is obviously inappropriate (See: Filtering Forums). Next, prioritize high-risk content using the following criteria:
- Posts from new subscribers
- Posts from anonymous users (if anonymous posts are allowed)
- Posts that have been flagged by community members
- Posts that have been identified by a filter as possibly containing spam, profanities, or other prohibited content
- A combination of all of the above
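The criteria above can be combined into a simple scoring function that orders the review queue. This is a minimal sketch: the `Post` fields and the weights are hypothetical, not part of any particular forum platform.

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    author_post_count: int = 0   # posts by this author so far
    is_anonymous: bool = False   # no registered account
    flag_count: int = 0          # community flags received
    # filter categories that matched, e.g. ["spam", "profanity"]
    filter_hits: list = field(default_factory=list)

def review_priority(post: Post) -> int:
    """Higher score = review sooner. Weights are illustrative only."""
    score = 0
    if post.author_post_count < 5:        # new subscriber
        score += 1
    if post.is_anonymous:
        score += 2
    score += 3 * post.flag_count          # community flags weigh heavily
    score += 2 * len(post.filter_hits)    # each filter match adds risk
    return score

posts = [
    Post(author_post_count=10),
    Post(author_post_count=0, is_anonymous=True, filter_hits=["spam"]),
    Post(author_post_count=20, flag_count=2),
]
# Highest-risk posts come first in the moderator's queue
queue = sorted(posts, key=review_priority, reverse=True)
```

Because the combination is just an ordering, moderators always see the riskiest posts first without any post being silently dropped.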
Whether the review takes place before a post is displayed (pre-moderation) or after (post-moderation) depends on your forum policies. Community-flagged posts, for example, are by definition reviewed after publication.
Pre-Moderation Comes at a High Cost
Pre-moderating content is expensive in moderator time, and you risk losing users who are looking for real-time interaction. Community managers need to decide which content should be pre-moderated, and this policy should be reviewed and adjusted regularly.
For example, a forum owner may start by pre-moderating all video submissions. By keeping records of moderator and user activity, the owner may discover that the risk of an inappropriate video is essentially zero once a user has had three videos approved by moderators. Videos from such trusted users can then be published immediately and reviewed sometime afterward. This not only lowers moderation time, it also shows respect to community members, which encourages more engagement. The forum owner may even forgo the post-review process entirely once a community member reaches a kind of uber status.
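The trust rule from the example above can be sketched as a single check, assuming the forum tracks approved submissions per user; the threshold constant and function name are hypothetical.

```python
# Users with this many moderator-approved videos skip pre-moderation
APPROVED_VIDEOS_FOR_TRUST = 3

def requires_premoderation(approved_video_count: int) -> bool:
    """New contributors are pre-moderated; trusted ones publish instantly."""
    return approved_video_count < APPROVED_VIDEOS_FOR_TRUST
```

Keeping the threshold in one constant makes it easy to adjust as the activity records accumulate.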
The User’s Perspective
The vast majority of users who visit a forum do so with good intentions: seeking information, looking for a good conversation, and so on. Take the perspective of a well-intentioned user when applying filters, moderation policies, and other gates. For instance, to keep spam-bots out, shy away from captchas, which can frustrate users, in favor of other well-documented techniques (see footnotes below).
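One such well-documented, user-invisible technique (my example, not a prescription from this post) is a honeypot field: a form input hidden from humans via CSS that bots tend to fill in. A minimal sketch, where the field name `website` is arbitrary:

```python
def looks_like_bot(form_data: dict) -> bool:
    """Flag submissions that filled in the CSS-hidden honeypot field.

    Humans never see the hidden "website" input, so any non-blank
    value strongly suggests an automated submission.
    """
    return bool(form_data.get("website", "").strip())
```

Legitimate users pay no attention cost at all, which is exactly the point of taking the good-intentioned user’s perspective.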
Also consider adding what I call self-moderation: give users a chance to adjust and resubmit a forum post if a filter flags it as inappropriate. Community members will be very grateful for the chance to edit and resubmit a post that took them a while to write, rather than starting over or waiting for a moderator’s approval. This privilege can be abused, so community managers must first consider whether all users should have it or only trusted users.
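The self-moderation decision can be sketched as a small branch in the submission handler. The filter here is a deliberately crude placeholder, and restricting the resubmit privilege to trusted users reflects the abuse concern noted above; all names are illustrative.

```python
def contains_prohibited(text: str) -> bool:
    """Placeholder filter; a real one would use word lists or a service."""
    return "badword" in text.lower()

def submit_post(text: str, author_is_trusted: bool) -> str:
    """Route a submission: publish, return for self-moderation, or hold."""
    if contains_prohibited(text):
        if author_is_trusted:
            # Self-moderation: author may revise and resubmit immediately
            return "returned_for_edit"
        # Untrusted authors wait for a moderator instead
        return "held_for_moderator"
    return "published"
```

The key design choice is that the filter result is surfaced to trusted authors rather than silently queueing their post.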
Moderator Roles and Permissions
Moderators may have different skill sets, experience, and delegated tasks. Identify what your forum requires, and be sure to implement a forum platform and tools that allow roles and permissions to be assigned individually for each moderator. The ability to ban and take other punitive actions toward community members should be restricted to community managers.
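Per-moderator permissions like these are often modeled as composable flags. A minimal sketch, with illustrative role and permission names; as the text advises, the punitive `BAN` action belongs only to the community-manager role.

```python
from enum import Flag, auto

class Permission(Flag):
    APPROVE = auto()   # approve queued posts
    DELETE = auto()    # remove published posts
    EDIT = auto()      # edit other users' posts
    BAN = auto()       # punitive action: community managers only

# Roles are just unions of permissions
MODERATOR = Permission.APPROVE | Permission.DELETE
COMMUNITY_MANAGER = MODERATOR | Permission.EDIT | Permission.BAN

def can(role: Permission, action: Permission) -> bool:
    """Check whether a role includes a given permission."""
    return action in role
```

Because roles are unions, a new role (say, a junior moderator with approve-only rights) is one line, not a schema change.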
If you came across this post hoping to find all the details of how to successfully moderate your forum, I hope you’re not too disappointed. The simple fact is that no forum moderation policy is globally perfect, since every forum is truly unique (an endearing trait I love about them!). There are also a great number of considerations I did not touch upon in this post, such as legal concerns, moderating forums in conjunction with real-time chat, and many others (otherwise this post would be a book). Check out our other posts or reach out to us directly if you would like to delve deeper!