| Class | Description |
|---|---|
| BaseRobotRules | Result from parsing a single robots.txt file - which means we get a set of rules, and a crawl-delay. |
| BaseRobotsParser | |
| RobotUtils | |
| SimpleRobotRules | Result from parsing a single robots.txt file - which means we get a set of rules, and a crawl-delay. |
| SimpleRobotRules.RobotRule | Single rule that maps from a path prefix to an allow flag. |
| SimpleRobotRulesParser | This implementation of BaseRobotsParser retrieves a set of rules for an agent with the given name from the robots.txt file of a given domain. |
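
The typical flow through these classes is: feed the raw bytes of a robots.txt file to `SimpleRobotRulesParser.parseContent`, get back a `BaseRobotRules` instance, and query it per URL. A minimal sketch (the robots.txt content, URLs, and agent name `"mycrawler"` are made up for illustration; the API shown matches the 2016-era crawler-commons releases, where `parseContent` takes the agent name as a `String`):

```java
import java.nio.charset.StandardCharsets;

import crawlercommons.robots.BaseRobotRules;
import crawlercommons.robots.SimpleRobotRulesParser;

public class RobotsExample {
    public static void main(String[] args) {
        // Hypothetical robots.txt content for illustration.
        String robotsTxt = "User-agent: *\n"
                + "Disallow: /private/\n"
                + "Crawl-delay: 5\n";

        SimpleRobotRulesParser parser = new SimpleRobotRulesParser();
        // Parse the raw bytes; the URL identifies where the file came from,
        // and "mycrawler" is the agent name whose rules we want.
        BaseRobotRules rules = parser.parseContent(
                "http://example.com/robots.txt",
                robotsTxt.getBytes(StandardCharsets.UTF_8),
                "text/plain",
                "mycrawler");

        // Query the parsed rule set per URL.
        System.out.println(rules.isAllowed("http://example.com/private/page.html"));
        System.out.println(rules.isAllowed("http://example.com/index.html"));
    }
}
```

Note that `BaseRobotRules` also exposes the parsed crawl-delay via `getCrawlDelay()`, matching the "set of rules, and a crawl-delay" description above.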
| Enum | Description |
|---|---|
| SimpleRobotRules.RobotRulesMode | |

Copyright © 2009–2016 Crawler-Commons. All rights reserved.