Robots Generator
Produces a simple, valid robots.txt file for web crawlers to parse. Adheres to the specification provided by Google; however, it currently supports only one User-Agent rule. Requires Node 4+. Install it through npm with:
npm install robots-generator --save-dev
Simply require the module and execute it with an optional configuration object. The supported options are:
- User-Agent: A means of identifying a specific crawler or set of crawlers.
- Allow: An array of directories that a crawler is allowed to access.
- Disallow: An array of directories that a crawler is not allowed to access.
- Sitemap: Your website's sitemap URL.
var robots = require('robots-generator');

// Note: the option names below are assumed to be the lowercased forms of
// the options listed above; check the module's documentation for the exact keys.
robots({
    useragent: '*',
    allow: ['/folder1/', '/folder2/'],
    disallow: ['cgi-bin/'],
    sitemap: 'http://haydenbleasel.com/sitemap.xml'
});
If you need an ES5 build for legacy purposes, just require the ES5 file:
var robots = require('robots-generator/es5');
Outputs the following file:
User-agent: *
Allow: /folder1/
Allow: /folder2/
Disallow: cgi-bin/
Sitemap: http://haydenbleasel.com/sitemap.xml
To build the ES5 version:
npm install -g babel-cli
babel --presets es2015 index.js --out-file es5.js
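Note that the es2015 preset must also be resolvable by Babel for the command above to work. If you prefer to avoid global installs, a local dev-dependency setup (a suggested alternative, not from the module's docs) works as well:
npm install --save-dev babel-cli babel-preset-es2015
./node_modules/.bin/babel --presets es2015 index.js --out-file es5.js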