Search results
26 packages found
🤖/👨‍🦰 Recognise bots/crawlers/spiders using the user agent string.
Parser for XML Sitemaps to be used with Robots.txt and web crawlers
Simple Redis primitives to incr() and top() user agents.
A set of shared utilities that can be used by crawlers
Uses the user-agents.org XML file to detect bots.
Detects bots/crawlers/spiders via the user agent.
A jQuery plugin that helps you hide the email addresses on your page and prevent crawlers from harvesting them.
A straightforward sitemap generator written in TypeScript.
🤖 Detect bots/crawlers/spiders via the user agent.
Parse robot directives within HTML meta and/or HTTP headers.
Parser for XML Sitemaps to be used with Robots.txt and web crawlers. (Extended version by mastixmc)
Lightweight robots.txt parsing component without any external dependencies for Node.js.
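Several of the packages above detect bots, crawlers, and spiders from the user agent string. A minimal sketch of that technique, assuming a hand-picked pattern list (the patterns and the `isBot` name are illustrative, not any listed package's actual API — real detectors ship far larger, curated pattern lists):

```typescript
// Minimal user-agent bot detection sketch.
// The pattern list below is a small illustrative sample, not exhaustive.
const BOT_PATTERNS: RegExp[] = [
  /bot/i,      // Googlebot, Bingbot, ...
  /crawler/i,
  /spider/i,
  /slurp/i,    // Yahoo! Slurp
  /headless/i, // HeadlessChrome
];

function isBot(userAgent: string): boolean {
  return BOT_PATTERNS.some((pattern) => pattern.test(userAgent));
}

// Usage:
isBot("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"); // true
isBot("Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36");             // false
```

Pattern-list matching like this is fast and dependency-free, but it only catches crawlers that identify themselves honestly in the user agent header.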