Automated and Crowd-sourced Moderation
Telligent Community includes a fully automated moderation system that reviews and evaluates content and users to determine how likely they are to be spam or abusive. The automated moderation is built on a series of rules that can be tailored to your community’s specific needs. Content moved into the moderation queue, whether through automation or abuse flagging, goes through a workflow and appeals process. Moderated content also collects and displays information for moderators detailing why the moderation action was taken and who performed it.
Members of the community can flag content as abusive. Based on the number of abuse votes and the reputation of the voting members, published content is moved into the moderation workflow. Crowd-sourced abuse management ensures that large communities benefit from members identifying content that requires moderation or removal.
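The reputation-weighted flagging described above can be sketched as follows. This is an illustrative model only; the class, method names, weighting formula, and threshold are assumptions for the sketch, not Telligent Community’s actual API.

```python
from dataclasses import dataclass, field

@dataclass
class AbuseFlagger:
    """Hypothetical reputation-weighted abuse-flag tally (illustrative only)."""
    threshold: float = 3.0                      # weighted votes needed to queue content
    votes: dict = field(default_factory=dict)   # content_id -> accumulated weighted votes

    def flag(self, content_id: str, member_reputation: float) -> bool:
        """Record one abuse vote, weighted by the voter's reputation (0.0 to 1.0).

        Returns True once the content has enough weighted votes to enter
        the moderation workflow.
        """
        # Votes from higher-reputation members count more (weight 0.5 to 1.5).
        weight = 0.5 + member_reputation
        self.votes[content_id] = self.votes.get(content_id, 0.0) + weight
        return self.votes[content_id] >= self.threshold

flagger = AbuseFlagger(threshold=3.0)
flagger.flag("post-42", member_reputation=0.9)            # 1.4 weighted votes
flagger.flag("post-42", member_reputation=0.2)            # 2.1 weighted votes
queued = flagger.flag("post-42", member_reputation=1.0)   # 3.6 -> enters workflow
```

Weighting by reputation means a few trusted members can queue content quickly, while a brigade of new low-reputation accounts needs many more votes to trigger the same action.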
Users can also be moderated: ban them for abusive content manually, or automatically once they exceed configured abuse thresholds.
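The manual and threshold-based banning above can be modeled with a simple strike counter. Again, this is a hypothetical sketch; the names and the strike-counting rule are assumptions, not the product’s real implementation.

```python
from collections import Counter

class UserModeration:
    """Hypothetical user-ban tracking with manual and automatic bans (illustrative only)."""

    def __init__(self, ban_threshold: int = 3):
        self.ban_threshold = ban_threshold   # abusive-content strikes before auto-ban
        self.strikes = Counter()             # user_id -> confirmed abusive posts
        self.banned = set()

    def record_abusive_content(self, user_id: str) -> bool:
        """Count a confirmed abusive post; auto-ban once the threshold is reached.

        Returns True if the user is now banned.
        """
        self.strikes[user_id] += 1
        if self.strikes[user_id] >= self.ban_threshold:
            self.banned.add(user_id)
        return user_id in self.banned

    def ban(self, user_id: str) -> None:
        """Manual ban applied directly by a moderator."""
        self.banned.add(user_id)

mod = UserModeration(ban_threshold=2)
mod.record_abusive_content("user-a")                 # first strike, still active
auto_banned = mod.record_abusive_content("user-a")   # second strike -> auto-banned
mod.ban("user-b")                                    # moderator bans directly
```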