| Package | Description |
|---|---|
| crawlercommons.robots | The robots package contains all of the robots.txt rule inference, parsing and utilities contained within Crawler Commons. |
| Modifier and Type | Class and Description |
|---|---|
| class | SimpleRobotRulesParser: This implementation of BaseRobotsParser retrieves a set of rules for an agent with the given name from the robots.txt file of a given domain. |
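As a minimal sketch of the class above, the following parses robots.txt content that has already been fetched, using the parseContent entry point from this era of the API. The example.com URL, the sample rules, and the mycrawler agent name are placeholder assumptions for illustration only:

```java
import crawlercommons.robots.BaseRobotRules;
import crawlercommons.robots.SimpleRobotRulesParser;

public class RobotRulesParseExample {
    public static void main(String[] args) {
        // Sample robots.txt content; real code would fetch this over HTTP.
        byte[] content = ("User-agent: *\n"
                        + "Disallow: /private/\n").getBytes();

        SimpleRobotRulesParser parser = new SimpleRobotRulesParser();
        BaseRobotRules rules = parser.parseContent(
                "https://example.com/robots.txt", // URL the content came from (placeholder)
                content,
                "text/plain",                     // Content-Type reported by the server
                "mycrawler");                     // agent name matched against User-agent lines

        System.out.println(rules.isAllowed("https://example.com/private/a.html")); // false
        System.out.println(rules.isAllowed("https://example.com/public/a.html"));  // true
    }
}
```

The returned BaseRobotRules object can then answer per-URL isAllowed queries without re-parsing the file.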
| Modifier and Type | Method and Description |
|---|---|
| static BaseRobotRules | RobotUtils.getRobotRules(BaseHttpFetcher fetcher, BaseRobotsParser parser, URL robotsUrl): Externally visible, static method for use in tools and for testing. |
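A minimal sketch of calling RobotUtils.getRobotRules to fetch and parse a live robots.txt. The UserAgent details and the example.com URL are placeholder assumptions, the package locations reflect the older crawler-commons releases that still bundled the fetcher module, and the use of RobotUtils.createFetcher as a convenience for building a BaseHttpFetcher is an assumption; any BaseHttpFetcher implementation would serve:

```java
import java.net.URL;

import crawlercommons.fetcher.http.BaseHttpFetcher;
import crawlercommons.fetcher.http.UserAgent;
import crawlercommons.robots.BaseRobotRules;
import crawlercommons.robots.RobotUtils;
import crawlercommons.robots.SimpleRobotRulesParser;

public class RobotRulesFetchExample {
    public static void main(String[] args) throws Exception {
        // Identify the crawler; name, email, and web address are placeholders.
        UserAgent userAgent = new UserAgent("mycrawler", "crawler@example.com",
                "https://example.com/bot.html");

        // Build a fetcher (createFetcher assumed here as a RobotUtils helper).
        BaseHttpFetcher fetcher = RobotUtils.createFetcher(userAgent, 1);

        // Fetch and parse the live robots.txt for the target domain.
        BaseRobotRules rules = RobotUtils.getRobotRules(fetcher,
                new SimpleRobotRulesParser(),
                new URL("https://example.com/robots.txt"));

        System.out.println("Crawl delay: " + rules.getCrawlDelay());
        System.out.println("Allowed? " + rules.isAllowed("https://example.com/page.html"));
    }
}
```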
Copyright © 2009–2016 Crawler-Commons. All rights reserved.