Search results

26 packages found

🤖/👨‍🦰 Recognise bots/crawlers/spiders using the user agent string.
published 5.1.5, 2 days ago

Parser for XML Sitemaps to be used with Robots.txt and web crawlers
published 3.2.10, a year ago

Simple Redis primitives to incr() and top() user agents
published 1.2.3, 2 months ago

Parser for XML Sitemaps to be used with Robots.txt and web crawlers
published 1.1.5, 3 months ago

Parser for XML Sitemaps to be used with Robots.txt and web crawlers
published 3.2.9, 2 years ago

A set of shared utilities that can be used by crawlers
published 3.9.2, 10 days ago

Parser for XML Sitemaps to be used with Robots.txt and web crawlers
published 3.2.8, 5 months ago

Uses the user-agents.org XML file for detecting bots.
published 1.0.10, 9 years ago

Detects bots/crawlers/spiders via the user agent.
published 2.3.0, 6 years ago

Parser for XML Sitemaps to be used with Robots.txt and web crawlers
published 3.2.5, 2 years ago

A jQuery plugin that helps you hide your email on your page and prevent crawlers from getting it!
published 0.1.0, 9 years ago

A set of shared utilities that can be used by crawlers
published 3.3.0, a year ago

A straightforward sitemap generator written in TypeScript.
published 1.0.1, 3 years ago

🤖 Detect bots/crawlers/spiders via the user agent.
published 3.3.3, 3 years ago

Parser for XML Sitemaps to be used with Robots.txt and web crawlers
published 3.0.2, 6 years ago

Parse robot directives within HTML meta and/or HTTP headers.
published 0.4.0, 7 years ago

Parser for XML Sitemaps to be used with Robots.txt and web crawlers. (Extended version by mastixmc)
published 3.2.0, 5 years ago

Parser for XML Sitemaps to be used with Robots.txt and web crawlers
published 3.2.8, a year ago

crawlers
published 1.0.1, 2 years ago

Lightweight robots.txt parsing component without any external dependencies for Node.js.
published 0.0.5-dev, 2 years ago
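Several of the results above describe the same technique: classifying a request as a bot/crawler/spider by matching its User-Agent string against known patterns. A minimal sketch of that approach, with an illustrative (not exhaustive) pattern list that is not taken from any specific package:

```javascript
// Illustrative pattern covering a few common crawler markers.
// Real detectors ship far larger, regularly updated lists.
const BOT_PATTERN = /bot|crawler|spider|crawling|slurp|mediapartners/i;

// Returns true when the user agent looks like a known bot/crawler.
function isBot(userAgent) {
  return typeof userAgent === "string" && BOT_PATTERN.test(userAgent);
}

console.log(isBot("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)")); // true
console.log(isBot("Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36")); // false
```

A regex over the User-Agent header is cheap and dependency-free, but it only catches self-identifying crawlers; bots that spoof a browser user agent require behavioural detection instead.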