Detects bots, crawlers, and spiders via the user-agent string.
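User-agent bot detection typically boils down to matching the UA string against a list of known bot signatures. A minimal sketch, assuming a tiny hand-picked pattern list (the package's actual regex is far more comprehensive):

```javascript
// Hypothetical, abbreviated signature list -- real detectors match
// hundreds of patterns, often sourced from curated bot databases.
const BOT_PATTERN = /bot|crawl|spider|slurp|curl|wget/i;

// Returns true when the user-agent string looks like an automated client.
function isBot(userAgent) {
  return BOT_PATTERN.test(userAgent || "");
}

console.log(isBot("Googlebot/2.1 (+http://www.google.com/bot.html)")); // true
console.log(isBot("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"));       // false
```

In an Express-style server this would usually run as middleware on `req.headers["user-agent"]`.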
Parser for XML sitemaps, for use with robots.txt files and web crawlers.
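The core of sitemap parsing is extracting the page URLs from `<loc>` elements in the sitemap XML. A minimal regex-based sketch (hypothetical helper, not the package's actual API; a real parser would use a proper XML parser and follow `<sitemapindex>` entries recursively):

```javascript
// Pull every <loc> URL out of a sitemap XML string.
function extractSitemapUrls(xml) {
  return [...xml.matchAll(/<loc>(.*?)<\/loc>/g)].map((m) => m[1]);
}

const xml =
  "<urlset><url><loc>https://example.com/</loc></url>" +
  "<url><loc>https://example.com/about</loc></url></urlset>";

console.log(extractSitemapUrls(xml));
// [ 'https://example.com/', 'https://example.com/about' ]
```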
Parses robots directives within HTML meta tags and/or HTTP headers.
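Robots directives arrive as a comma-separated list, whether from `<meta name="robots" content="...">` or an `X-Robots-Tag` response header. A minimal sketch of normalizing that value (hypothetical helper, not the package's actual API):

```javascript
// Split a robots value like "noindex, NOFOLLOW" into normalized tokens.
function parseRobotsDirectives(value) {
  return value
    .split(",")
    .map((d) => d.trim().toLowerCase())
    .filter(Boolean);
}

const directives = parseRobotsDirectives("noindex, NOFOLLOW");
console.log(directives); // [ 'noindex', 'nofollow' ]
console.log(directives.includes("noindex")); // true
```

A fuller implementation would also handle valued directives such as `max-snippet:50` and user-agent-scoped headers like `X-Robots-Tag: googlebot: noindex`.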
Open-source crawler framework for Node.js.
Uses the user-agents.org XML file to detect bots.
Crawler made simple
A jQuery plugin that hides your email address on the page and prevents crawlers from harvesting it.
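The usual obfuscation idea behind plugins like this: store the address in pieces and assemble it only at runtime in the browser, so crawlers scanning the static HTML never see a complete `user@example.com`. A minimal sketch of the assembly step (hypothetical helper; the plugin's actual API and markup differ):

```javascript
// Join the pieces only at runtime; the static HTML contains no "@".
function assembleEmail(user, domain) {
  return [user, domain].join("@");
}

// In a page you would wire this up on DOM ready, e.g.:
//   $("a.email").attr("href", "mailto:" + assembleEmail("info", "example.com"));
console.log(assembleEmail("info", "example.com")); // info@example.com
```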