| Package | Description |
|---|---|
| crawlercommons.robots | The robots package contains all of the robots.txt rule inference, parsing, and utilities contained within Crawler Commons. |
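
To make the package's role concrete, here is a minimal parsing sketch. It assumes crawler-commons is on the classpath and uses SimpleRobotRulesParser, the concrete BaseRobotsParser implementation shipped in this package; the URL, robots.txt content, and robot name are illustrative placeholders.

```java
import java.nio.charset.StandardCharsets;

import crawlercommons.robots.BaseRobotRules;
import crawlercommons.robots.SimpleRobotRulesParser;

public class RobotsParseSketch {
    public static void main(String[] args) {
        // Raw robots.txt bytes, as they would arrive from an HTTP fetch.
        byte[] content = ("User-agent: *\n"
                + "Disallow: /private/\n"
                + "Crawl-delay: 5\n").getBytes(StandardCharsets.UTF_8);

        SimpleRobotRulesParser parser = new SimpleRobotRulesParser();
        BaseRobotRules rules = parser.parseContent(
                "http://example.com/robots.txt", // URL the file was fetched from
                content,
                "text/plain",                    // Content-Type of the fetch response
                "mycrawler");                    // robot name matched against User-agent lines

        // The parse result answers per-URL allow/disallow queries.
        System.out.println(rules.isAllowed("http://example.com/private/page.html")); // false
    }
}
```

parseContent returns a BaseRobotRules, described in the class summary below, which carries the parsed rule set and any crawl-delay.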
| Class | Description |
|---|---|
| BaseRobotRules | Result from parsing a single robots.txt file: a set of rules and a crawl-delay. |
| BaseRobotsParser | |
| SimpleRobotRules.RobotRule | A single rule that maps from a path prefix to an allow flag. |
| SimpleRobotRules.RobotRulesMode | |
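
The sketch below shows how these result classes fit together: a SimpleRobotRules built with a RobotRulesMode, RobotRule entries added as path-prefix/allow-flag pairs, and queries made through the abstract BaseRobotRules type. The class and URL names are placeholders, and the addRule call assumes SimpleRobotRules' prefix-and-flag signature.

```java
import crawlercommons.robots.BaseRobotRules;
import crawlercommons.robots.SimpleRobotRules;
import crawlercommons.robots.SimpleRobotRules.RobotRulesMode;

public class RobotRulesSketch {
    public static void main(String[] args) {
        // ALLOW_SOME defers to individual RobotRule entries; the other modes,
        // ALLOW_ALL and ALLOW_NONE, short-circuit every isAllowed() check.
        SimpleRobotRules rules = new SimpleRobotRules(RobotRulesMode.ALLOW_SOME);

        // Each rule maps a path prefix to an allow flag; URLs matching no
        // rule are allowed by default.
        rules.addRule("/private/", false);

        BaseRobotRules result = rules; // query through the abstract result type
        System.out.println(result.isAllowed("http://example.com/index.html"));     // true
        System.out.println(result.isAllowed("http://example.com/private/a.html")); // false

        // The crawl-delay travels with the result; UNSET_CRAWL_DELAY marks a
        // robots.txt that did not specify one.
        if (result.getCrawlDelay() != BaseRobotRules.UNSET_CRAWL_DELAY) {
            System.out.println("crawl-delay: " + result.getCrawlDelay());
        }
    }
}
```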