Search results

29 packages found

🤖/👨‍🦰 Recognise bots/crawlers/spiders using the user agent string.

published version 5.1.28, 18 days ago, 297 dependents, licensed under Unlicense
4,968,545 weekly downloads
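
The package name itself isn't shown in this listing, so the sketch below is illustrative only: a hand-rolled TypeScript check of the user-agent string against a small bot pattern, not this package's API (real detectors maintain much larger pattern lists).

```ts
// Illustrative only: a minimal user-agent check, not this package's API.
// Real detectors maintain far larger, regularly updated pattern lists.
const BOT_PATTERN =
  /bot|crawler|spider|crawling|slurp|mediapartners|headlesschrome/i;

export function looksLikeBot(userAgent: string | undefined | null): boolean {
  if (!userAgent) return true; // a missing UA is usually automated traffic
  return BOT_PATTERN.test(userAgent);
}

// Example:
console.log(looksLikeBot("Googlebot/2.1 (+http://www.google.com/bot.html)")); // true
console.log(looksLikeBot("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"));       // false
```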

A set of shared utilities that can be used by crawlers

published version 3.13.5, 10 days ago, 16 dependents, licensed under Apache-2.0
184,857 weekly downloads

Parser for XML Sitemaps to be used with Robots.txt and web crawlers

published version 4.0.2, 14 days ago, 66 dependents, licensed under MIT
130,759 weekly downloads
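
As an illustration of what a sitemap parser does (not this package's API), here is a minimal TypeScript sketch that fetches a sitemap and pulls out its `<loc>` entries; a real parser also handles sitemap indexes, gzip, and last-modified metadata.

```ts
// Illustrative only: fetch a sitemap and extract <loc> entries with a regex.
export async function fetchSitemapUrls(sitemapUrl: string): Promise<string[]> {
  const res = await fetch(sitemapUrl);
  if (!res.ok) throw new Error(`Failed to fetch ${sitemapUrl}: ${res.status}`);
  const xml = await res.text();
  const urls: string[] = [];
  for (const match of xml.matchAll(/<loc>\s*([^<]+?)\s*<\/loc>/g)) {
    urls.push(match[1]);
  }
  return urls;
}

// Usage (Node 18+ has a global fetch):
// fetchSitemapUrls("https://example.com/sitemap.xml").then(console.log);
```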

Parse robot directives within HTML meta and/or HTTP headers.

published version 0.4.0, 8 years ago, 5 dependents, licensed under MIT
108,416 weekly downloads
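
A minimal sketch of the underlying idea, assuming a directive string such as `noindex, nofollow` taken from a `<meta name="robots">` tag or an `X-Robots-Tag` header; the function and type names are illustrative, not this package's API.

```ts
// Illustrative only: split a robots directive string into index/follow flags.
export interface RobotDirectives {
  index: boolean;
  follow: boolean;
  raw: string[];
}

export function parseRobotDirectives(value: string): RobotDirectives {
  const tokens = value
    .toLowerCase()
    .split(",")
    .map((t) => t.trim())
    .filter(Boolean);
  return {
    index: !tokens.includes("noindex") && !tokens.includes("none"),
    follow: !tokens.includes("nofollow") && !tokens.includes("none"),
    raw: tokens,
  };
}

// parseRobotDirectives("noindex, nofollow") -> { index: false, follow: false, ... }
```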

Parser for XML Sitemaps to be used with Robots.txt and web crawlers

published version 3.2.9, 3 years ago, 0 dependents, licensed under MIT
478 weekly downloads

Simple Redis primitives to incr() and top() user agents

published version 1.2.5, 8 months ago, 0 dependents, licensed under MIT
352 weekly downloads
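
A sketch of the same idea using a Redis sorted set with the node-redis v4 client; the key name and function names here are assumptions, not this package's API.

```ts
// Illustrative only: count user agents in a Redis sorted set and read back
// the most frequent ones. Assumes the `redis` (node-redis v4+) client.
import { createClient } from "redis";

const client = createClient({ url: "redis://localhost:6379" });

export async function incrUserAgent(ua: string): Promise<void> {
  await client.zIncrBy("ua:counts", 1, ua);
}

export async function topUserAgents(n = 10) {
  // Highest scores first
  return client.zRangeWithScores("ua:counts", 0, n - 1, { REV: true });
}

async function main() {
  await client.connect();
  await incrUserAgent("Mozilla/5.0 ...");
  console.log(await topUserAgents(5));
  await client.quit();
}

main().catch(console.error);
```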

Parser for XML Sitemaps to be used with Robots.txt and web crawlers

published version 3.2.10, a year ago, 1 dependent, licensed under MIT
154 weekly downloads

Open-source crawler framework in Node.js

published version 2.0.10, 6 years ago, 0 dependents, licensed under ISC
137 weekly downloads
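
As a rough illustration of what such a crawler does (not this framework's API), here is a breadth-first fetch-and-extract-links loop with a depth limit; real crawlers add politeness delays, robots.txt checks, and concurrency control.

```ts
// Illustrative only: a tiny breadth-first crawler with a depth limit.
export async function crawl(start: string, maxDepth = 1): Promise<Set<string>> {
  const seen = new Set<string>([start]);
  let frontier = [start];

  for (let depth = 0; depth < maxDepth && frontier.length > 0; depth++) {
    const next: string[] = [];
    for (const url of frontier) {
      const html = await fetch(url).then((r) => r.text()).catch(() => "");
      for (const m of html.matchAll(/href="(https?:\/\/[^"]+)"/g)) {
        if (!seen.has(m[1])) {
          seen.add(m[1]);
          next.push(m[1]);
        }
      }
    }
    frontier = next;
  }
  return seen;
}

// crawl("https://example.com", 2).then((urls) => console.log(urls.size));
```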

It uses the user-agents.org XML file for detecting bots.

published version 1.0.10, 10 years ago, 0 dependents, licensed under ISC
66 weekly downloads

Lightweight robots.txt parsing component without any external dependencies for Node.js.

published version 0.0.5-dev, 3 years ago, 0 dependents, licensed under MIT
21 weekly downloads
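
A minimal sketch of robots.txt parsing under the simplifying assumption that only `User-agent: *` groups and `Disallow` prefix rules matter; the function names are illustrative, not this package's API. A real parser also handles `Allow` rules, wildcards, and per-agent groups.

```ts
// Illustrative only: extract Disallow rules for "User-agent: *" and test a
// path against them by prefix match.
export function disallowedPaths(robotsTxt: string): string[] {
  const rules: string[] = [];
  let inStarGroup = false;
  for (const line of robotsTxt.split(/\r?\n/)) {
    const [field, ...rest] = line.split(":");
    const value = rest.join(":").trim();
    const key = field.trim().toLowerCase();
    if (key === "user-agent") inStarGroup = value === "*";
    else if (key === "disallow" && inStarGroup && value) rules.push(value);
  }
  return rules;
}

export function isAllowed(path: string, robotsTxt: string): boolean {
  return !disallowedPaths(robotsTxt).some((rule) => path.startsWith(rule));
}

// isAllowed("/admin/users", "User-agent: *\nDisallow: /admin/") -> false
```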

A jQuery plugin that helps you hide your email on your page and prevents crawlers from getting it!

published version 0.1.0, 10 years ago, 0 dependents, licensed under ISC
22 weekly downloads
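
The usual technique, sketched here without jQuery: keep the address out of the static HTML and assemble it in the browser. The markup, class name, and data attributes below are assumptions, not this plugin's API.

```ts
// Illustrative only: the address is split into data attributes and only
// assembled client-side, so plain-text scrapers never see "user@example.com".
// Assumed markup:
// <a class="email" data-user="user" data-domain="example.com">contact</a>
document.querySelectorAll<HTMLAnchorElement>("a.email").forEach((link) => {
  const user = link.dataset.user ?? "";
  const domain = link.dataset.domain ?? "";
  const address = `${user}@${domain}`;
  link.href = `mailto:${address}`;
  link.textContent = address;
});
```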

🤖 Detect bots/crawlers/spiders via the user agent.

published version 3.3.3, 4 years ago, 0 dependents, licensed under Unlicense
20 weekly downloads

Parser for XML Sitemaps to be used with Robots.txt and web crawlers

published version 3.2.10, 2 years ago, 4 dependents, licensed under MIT
15 weekly downloads

UMD bundle of isbot

published version 1.0.4, 5 years ago, 0 dependents
18 weekly downloads

Parser for XML Sitemaps to be used with Robots.txt and web crawlers

published version 1.1.5, a year ago, 0 dependents, licensed under MIT
17 weekly downloads

Parser for XML Sitemaps to be used with Robots.txt and web crawlers

published version 2.1.19, 4 years ago, 0 dependents, licensed under MIT
17 weekly downloads

Parser for XML Sitemaps to be used with Robots.txt and web crawlers

published version 5.0.0, a year ago, 0 dependents, licensed under MIT
15 weekly downloads

Crawler made simple

published version 0.0.3, 10 years ago, 0 dependents
13 weekly downloads

Parser for XML Sitemaps to be used with Robots.txt and web crawlers

published version 3.2.5, 3 years ago, 1 dependent, licensed under MIT
13 weekly downloads

A straightforward sitemap generator written in TypeScript.

published version 1.0.1, 4 years ago, 0 dependents, licensed under ISC
12 weekly downloads
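
A minimal sketch of sitemap generation in TypeScript; the entry shape and function name are assumptions, not this package's API.

```ts
// Illustrative only: build a minimal <urlset> sitemap from a list of URLs.
export interface SitemapEntry {
  loc: string;
  lastmod?: string; // ISO date, e.g. "2024-01-31"
}

export function buildSitemap(entries: SitemapEntry[]): string {
  const urls = entries
    .map(
      (e) =>
        `  <url>\n    <loc>${e.loc}</loc>` +
        (e.lastmod ? `\n    <lastmod>${e.lastmod}</lastmod>` : "") +
        `\n  </url>`
    )
    .join("\n");
  return (
    `<?xml version="1.0" encoding="UTF-8"?>\n` +
    `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
    `${urls}\n</urlset>\n`
  );
}

// buildSitemap([{ loc: "https://example.com/", lastmod: "2024-01-31" }]);
```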