Warn about possible insensitive, inconsiderate language with retext.
```sh
npm install retext-equality
```
Say we have the following file, `example.txt`:

```txt
His network was set up with a master and slave.
```
…and our script, `example.js`, looks like this:

```js
var vfile = require('to-vfile')
var report = require('vfile-reporter')
var unified = require('unified')
var english = require('retext-english')
var stringify = require('retext-stringify')
var equality = require('retext-equality')

unified()
  .use(english)
  .use(equality)
  .use(stringify)
  .process(vfile.readSync('example.txt'), function (err, file) {
    console.error(report(err || file))
  })
```
Now, running `node example` yields:

```txt
example.txt
  1:1-1:4    warning  `His` may be insensitive, use `Their`, `Theirs`, `Them` instead            her-him       retext-equality
  1:31-1:37  warning  `master` / `slave` may be insensitive, use `primary` / `replica` instead   master-slave  retext-equality

⚠ 2 warnings
```
`retext().use(equality[, options])` — adds warnings for possible insensitive, inconsiderate language to the processed virtual files.
`options.ignore` (`Array.<string>`) — List of phrases *not* to warn about.
`options.noBinary` (`boolean`, default: `false`) — Do not allow binary references. By default `he` is warned about unless it’s followed by something like `or she` or `and she`. When `noBinary` is `true`, both cases are warned about.
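The `noBinary` behaviour described above can be illustrated with a small standalone sketch. This is a toy matcher written for this example, not retext-equality’s actual implementation:

```javascript
// Toy illustration of the `noBinary` option: a lone `he` or `she` is always
// warned about, while a binary pair like `he or she` is allowed by default
// and only warned about when `noBinary` is true.
function checkBinary(text, noBinary) {
  var warnings = []
  var pattern = /\b(he|she)\b(\s+(or|and)\s+(he|she)\b)?/gi
  var match
  while ((match = pattern.exec(text)) !== null) {
    var isBinaryPair = match[2] !== undefined
    if (!isBinaryPair || noBinary) {
      warnings.push('`' + match[0] + '` may be insensitive, use `they` instead')
    }
  }
  return warnings
}

console.log(checkBinary('He is here.', false).length)        // 1: lone `he` is warned about
console.log(checkBinary('He or she is here.', false).length) // 0: binary pair allowed by default
console.log(checkBinary('He or she is here.', true).length)  // 1: warned when `noBinary` is true
```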
See `rules.md` for a list of rules.
Thanks, contributions are greatly appreciated! 👍
See `contributing.md` in `retextjs/retext` for ways to get started. This organisation has a Code of Conduct. By interacting with this repository, organisation, or community you agree to abide by its terms.
To create new patterns, add them in the YAML files in the `script/` directory, and run `npm install` and then `npm test` to build everything. New rules will be automatically added to `rules.md`.
Please see the current patterns for inspiration.
Once you are happy with the new rule, add a test for it in
test.js and open a Pull Request.
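Conceptually, each pattern pairs one or more inconsiderate phrases with considerate alternatives. The sketch below shows that lookup with a hand-written rule table; the field names (`id`, `inconsiderate`, `considerate`) are illustrative and not the exact schema of the YAML files:

```javascript
// Toy rule table in the spirit of the YAML patterns: each entry pairs
// inconsiderate phrases with considerate alternatives. Illustrative only.
var rules = [
  {id: 'her-him', inconsiderate: ['his', 'him', 'her'], considerate: ['their', 'them']},
  {id: 'master-slave', inconsiderate: ['master', 'slave'], considerate: ['primary', 'replica']}
]

// Scan a sentence and report which rule each word matches, if any.
function findMatches(text) {
  return text
    .toLowerCase()
    .match(/\w+/g)
    .map(function (word) {
      var rule = rules.find(function (r) {
        return r.inconsiderate.indexOf(word) !== -1
      })
      return rule && {word: word, ruleId: rule.id, use: rule.considerate}
    })
    .filter(Boolean)
}

console.log(findMatches('His network was set up with a master and slave.'))
// Three matches: `his` (her-him), `master` and `slave` (master-slave)
```

Unlike this toy, the real plugin also tracks positional information so reporters can point at the exact offending span.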
- `alex` — Catch insensitive, inconsiderate writing
- `retext-passive` — Check passive voice
- `retext-profanities` — Check for profane and vulgar wording
- `retext-simplify` — Check phrases for simpler alternatives