We offer a variety of tools to help you manage and moderate your members and maintain a healthy, positive community.
Generally, you will need to decide on community guidelines covering what is and is not allowed in your app, then share and agree on those guidelines with your team and anyone else who has access to your app's CMS. Disciple's Terms of Service cover the legal guidelines that all app members must follow; for details, see Section 4 here
Here are the moderation tools available on the Disciple platform:
- Members can block other users. The blocker will no longer see content from the members they have blocked.
- Members can report other members' posts. Reports are sent to the Hub, where you can review them and take action accordingly.
- App admins can shadowban members: a shadowbanned member can still post, but their content is hidden from all other members.
- App admins can disable any member's account, or disable it and unpublish their content, via the Disciple Hub. This action is reversible: you can re-enable the member's account once the issue has been resolved.
- Email verification is available for your app to prevent banned members from returning with a fake email address.
- Any app member (including app admins) can be designated a "trusted reporter", meaning that any post they report is automatically deleted. This is a great tool if you want to recruit some of your app members to help moderate the community.
- We have integrated Google's AI technology, which automatically rates published post content from 0 to 100, where a higher rating indicates a higher risk that the content is inappropriate. This is a reporting tool only: risky posts are not deleted unless an app admin takes action via the Hub.
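To illustrate how a 0–100 risk rating like the one above can drive a review queue, here is a minimal sketch. All names in it (`flag_risky_posts`, `risk_score`, the threshold of 70) are hypothetical and for illustration only; the Disciple Hub handles this for you, and this is not Disciple's API.

```python
# Hypothetical sketch: surface high-risk posts for admin review.
# A higher score (0-100) means a higher risk the content is inappropriate.

def flag_risky_posts(posts, threshold=70):
    """Return posts whose risk score meets the threshold, highest risk
    first, so an admin can review the most urgent items at the top."""
    flagged = [p for p in posts if p["risk_score"] >= threshold]
    return sorted(flagged, key=lambda p: p["risk_score"], reverse=True)

# Example: only posts 2 and 3 cross the (assumed) review threshold.
posts = [
    {"id": 1, "risk_score": 12},
    {"id": 2, "risk_score": 85},
    {"id": 3, "risk_score": 71},
]
print([p["id"] for p in flag_risky_posts(posts)])  # [2, 3]
```

The key design point this mirrors is that flagging is advisory: nothing is deleted automatically, and a human admin decides what to do with each flagged post.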
Whenever you ban or block a member or delete their posts, we advise getting in touch with the member to explain what they did wrong and what action has been taken.
Disciple does not provide moderation support, but we can recommend some suppliers.