Search results
450 packages found
A specification compliant robots.txt parser with wildcard (*) matching support.
xvideos.com API implementation.
Crawler is a web spider written in Node.js. It gives you the full power of jQuery on the server to parse a large number of pages as they are downloaded, asynchronously.
A tiny Node module that quickly detects spiders/crawlers and comes with optional middleware for Express.js.
supercrawler (npm version, license, and GitHub issues badges omitted)
spankbang.com API implementation.
An npm Spider solitaire library.
A crawler (spider) that fetches a site's web pages by domain name.
A simple email extractor for obfuscated emails.
- email-address
- extractor
- obfuscated
- obfuscated-email
- obfuscated-emails
- parse
- regex
- obfuscator
- obfuscation
- harvester
- scraper
- crawler
Parses the wget spider output into an object
betterLoading library NOT
Simple WAF to integrate with Node.js web systems
- waf
- nodejs
- firewall
- blocker
- filtering
- bot
- spider
- robot
- crawler
- useragent
- user-agent
- detector
- detect
- detection
A lightweight robots.txt parser for Node.js with support for wildcards, caching and promises.
This is an ES6 adaptation of the original PHP library CrawlerDetect; it will help you detect bots/crawlers/spiders via the user agent.
Detect whether a user agent is a bot/spider/crawler.
har download: download all of a website's resources from a HAR file generated by Chrome.
Ananse is a lightweight Node.js framework with batteries included for building efficient, scalable, and maintainable USSD applications.
Lightweight crawler that works like a real browser