Content Moderation

Whether you run an online community or rely on content generated by your website’s visitors, you are exposed to the risk of inappropriate user-generated content being published on your platform. Inappropriate content includes, but is not limited to, profanity, racism, violence, nudity, false information, spam and outdated information.

Moderation is the review of user-generated content and the decision to publish, edit or delete it, or at times to engage with the online community.

The online content moderation services we offer to our clients fall into two types of moderation: pre-moderation and post-moderation.

Pre-moderation

Pre-moderation means that all user content is moderated before it appears online. Our professional pre-moderation services give our clients a high degree of control over what appears on their platforms.

We employ pre-moderation when it is crucial that every piece of content is reviewed before it is published. We can also pre-moderate non-text content, such as images, video and audio recordings.
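
At a high level, pre-moderation places a review step in front of publication. The sketch below is purely illustrative, assuming hypothetical is_appropriate() and publish() helpers rather than any specific tooling we use:

```python
from dataclasses import dataclass

@dataclass
class Submission:
    author: str
    text: str

def is_appropriate(text: str) -> bool:
    # Hypothetical review step: in practice a human moderator (possibly
    # assisted by automated filters) makes this decision, not a keyword list.
    banned_terms = {"spam", "profanity"}
    return not any(term in text.lower() for term in banned_terms)

def pre_moderate(submission: Submission, publish) -> bool:
    # The content is reviewed first and published only if it passes.
    if is_appropriate(submission.text):
        publish(submission)
        return True
    return False  # rejected content never appears on the platform
```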

We deliver pre-moderation services to our clients in German, French, Italian, Dutch and English. You can read a case study on moderating trilingual content for one of our clients here.

Post-moderation

Post-moderation means that content appears online as soon as it is generated and is only moderated afterwards. We employ post-moderation for projects where immediacy of content is vital, such as chat rooms, forums or gaming communities.

By using our post-moderation services, our clients ensure that content is delivered in real time, giving their communities the speed of interaction they expect on the web, while content is still filtered for inappropriate elements.
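
To illustrate the contrast, the sketch below (reusing the hypothetical Submission and is_appropriate() helpers from the pre-moderation example above) publishes content immediately and only takes it down if the later review flags it:

```python
def post_moderate(submission: Submission, publish, unpublish) -> None:
    # Content goes live immediately, so the community sees it in real time.
    publish(submission)
    # The review happens after publication; flagged content is taken down.
    if not is_appropriate(submission.text):
        unpublish(submission)
```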