Description
Toxic comments are becoming a major obstacle to meaningful online discussions.
This plugin uses a pre-trained toxicity classifier from TensorFlow to detect whether a comment is toxic. See more technical details on the quality of the model here.
Once a comment is flagged as toxic, it is blocked and the plugin alerts the author, asking them to modify the text before trying again.
On the default Settings->Discussion page you can enable the detection of toxic comments and set the confidence threshold for the prediction.
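For illustration, here is a minimal sketch (not the plugin's actual code) of how a comment can be classified in the browser with the pre-trained `@tensorflow-models/toxicity` model. The threshold value and the `isToxic` helper are hypothetical stand-ins for the setting on the Settings->Discussion page.

```typescript
// Minimal sketch, assuming the pre-trained @tensorflow-models/toxicity model.
import '@tensorflow/tfjs';
import * as toxicity from '@tensorflow-models/toxicity';

const threshold = 0.85; // hypothetical stand-in for the configurable confidence threshold

async function isToxic(comment: string): Promise<boolean> {
  // Labels whose probability stays below `threshold` are reported as not matched.
  const model = await toxicity.load(threshold, []); // empty array = all toxicity labels
  const predictions = await model.classify([comment]);
  // Flag the comment if any toxicity label matches with enough confidence.
  return predictions.some((p) => p.results[0].match === true);
}

isToxic('Example comment text').then((toxic) => {
  console.log(toxic ? 'Blocked: please rephrase the comment.' : 'Comment accepted.');
});
```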
Screenshots
Installation
Install and activate the plugin through the ‘Plugins’ menu in WordPress
Frequently Asked Questions
- How did you train the predictive model?
We didn’t. We use a pre-trained model provided by TensorFlow itself.
- Can I improve or personalize the prediction by manually training the toxicity classifier on my site?
No. The classifier is pre-trained, but you could build your own classifier based on the [code used to create and train](https://github.com/conversationai/conversationai-models/tree/master/experiments) this one.
- What external JavaScript scripts does the plugin import?
The plugin relies on tensorflow.js to analyze the comment in the browser. Therefore, it enqueues TensorFlow.js, the sentence encoder and the toxicity model.
However, the JS code that runs the actual comment classification is only added to single post pages with comments (and the toxicity setting) enabled.
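As a rough sketch of that browser-side flow (assumed wiring, not the plugin's actual code: the `commentform` and `comment` IDs are the standard WordPress comment form elements, and the warning text is illustrative), the check on a single post page could look like this:

```typescript
// Sketch: intercept the WordPress comment form, classify the text, and block if toxic.
import '@tensorflow/tfjs';
import * as toxicity from '@tensorflow-models/toxicity';

const threshold = 0.85; // stands in for the threshold from Settings->Discussion

document.addEventListener('DOMContentLoaded', () => {
  // IDs of the standard WordPress comment form and its textarea.
  const form = document.getElementById('commentform') as HTMLFormElement | null;
  const field = document.getElementById('comment') as HTMLTextAreaElement | null;
  if (!form || !field) {
    return; // not a single post page with comments enabled
  }

  form.addEventListener('submit', async (event) => {
    event.preventDefault();
    const model = await toxicity.load(threshold, []);
    const predictions = await model.classify([field.value]);
    const toxic = predictions.some((p) => p.results[0].match === true);
    if (toxic) {
      // Illustrative warning; since 1.1 the plugin lets you configure this message.
      alert('Your comment looks toxic. Please modify the text and try again.');
    } else {
      form.submit();
    }
  });
});
```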
Reviews
Contributors & Developers
“Serious Toxic Comments” is open source software. The following people have contributed to this plugin:
Translate “Serious Toxic Comments” into your language.
Interested in development?
Browse the code, check out the SVN repository, or subscribe to the development log by RSS.
Changelog
1.1.1
- Bug fix: avoids calling bbPress functions when bbPress is not present on the site
1.1
- Added support for bbPress
- Added the possibility to configure the warning message shown when a toxic comment is detected
1.0
- Initial release