Toxic comments are becoming a major obstacle to meaningful online discussions.
Once a comment is flagged as toxic, it is blocked and the plugin alerts the author, asking them to modify the text before trying again.
On the default Settings->Discussion page you can enable toxic comment detection and set the confidence threshold for the prediction.
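As an illustration of what the confidence threshold does, the sketch below mirrors the documented behaviour of the tensorflow.js toxicity model, where a label only "matches" when the model is at least `threshold` confident either way. The `verdict` function is hypothetical and not the plugin's actual code.

```javascript
// Turn the model's pair of probabilities (not-toxic, toxic) into a
// verdict, given a confidence threshold.
function verdict(probNotToxic, probToxic, threshold) {
  if (probToxic >= threshold) return true;     // confidently toxic
  if (probNotToxic >= threshold) return false; // confidently clean
  return null;                                 // not confident either way
}

console.log(verdict(0.05, 0.95, 0.9)); // true  -> comment is blocked
console.log(verdict(0.97, 0.03, 0.9)); // false -> comment goes through
console.log(verdict(0.40, 0.60, 0.9)); // null  -> below threshold
```

Raising the threshold therefore makes the plugin more conservative: fewer comments are flagged, but each flag carries higher confidence.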
Install and activate the plugin through the ‘Plugins’ menu in WordPress.
- How did you train the predictive model?
We didn’t. We use a pre-trained model provided by TensorFlow itself.
- Can I improve or personalize the prediction by manually training the toxicity classifier on my site?
No. The classifier is pre-trained, but you could build your own classifier based on the [code used to create and train this one](https://github.com/conversationai/conversationai-models/tree/master/experiments).
The plugin relies on tensorflow.js to analyze comments in the browser. It therefore enqueues TensorFlow.js, the sentence encoder, and the toxicity model.
However, the JS code that runs the actual comment classification is only added to single post pages with comments (and the toxicity settings) enabled.
Contributors & Developers
“Serious Toxic Comments” is open source software. The people who have contributed to the development of this plugin are listed below.
Interested in development?
Changelog
- Bug fix: avoid calling bbPress functions when bbPress is not installed on the site
- Added support for bbPress
- Added the option to configure the warning message shown when a toxic comment is detected
- Initial release