public class SimpleRobotRules extends BaseRobotRules
| Modifier and Type | Class and Description |
|---|---|
| static class | SimpleRobotRules.RobotRule: Single rule that maps from a path prefix to an allow flag. |
| static class | SimpleRobotRules.RobotRulesMode |
Fields inherited from class BaseRobotRules:
UNSET_CRAWL_DELAY

| Constructor and Description |
|---|
| SimpleRobotRules() |
| SimpleRobotRules(SimpleRobotRules.RobotRulesMode mode) |
| Modifier and Type | Method and Description |
|---|---|
| void | addRule(String prefix, boolean allow) |
| void | clearRules() |
| boolean | equals(Object obj) |
| int | hashCode() |
| boolean | isAllowAll(): Is our ruleset set up to allow all access? |
| boolean | isAllowed(String url) |
| boolean | isAllowNone(): Is our ruleset set up to disallow all access? |
| void | sortRules(): In order to match up with Google's convention, we want to match rules from longest to shortest. |
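As a sketch of typical use (assuming the crawlercommons.robots package name and that isAllowed(String) matches the URL's path against the registered prefixes), a rule set can be built by hand, sorted, and then queried:

```java
import crawlercommons.robots.SimpleRobotRules;

public class RobotRulesExample {
    public static void main(String[] args) {
        // Build a small rule set by hand.
        SimpleRobotRules rules = new SimpleRobotRules();
        rules.addRule("/private/", false); // disallow everything under /private/
        rules.addRule("/", true);          // allow everything else
        rules.sortRules();                 // longest prefixes are matched first

        // Expected (illustrative): true, then false.
        System.out.println(rules.isAllowed("http://example.com/index.html"));
        System.out.println(rules.isAllowed("http://example.com/private/a.html"));
    }
}
```

In a real crawler these rules are normally produced by a robots.txt parser rather than assembled manually; the hand-built rules above are only for illustration.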
Methods inherited from class BaseRobotRules:
addSitemap, getCrawlDelay, getSitemaps, isDeferVisits, setCrawlDelay, setDeferVisits

public SimpleRobotRules()
public SimpleRobotRules(SimpleRobotRules.RobotRulesMode mode)
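For rule sets with a fixed policy, the mode constructor can be used. A minimal sketch follows; the package name and the RobotRulesMode constant ALLOW_NONE are assumptions, since the enum's values are not listed on this page:

```java
import crawlercommons.robots.SimpleRobotRules;
import crawlercommons.robots.SimpleRobotRules.RobotRulesMode;

public class BlockAllExample {
    public static void main(String[] args) {
        // ALLOW_NONE is assumed to be one of the RobotRulesMode constants.
        SimpleRobotRules blockAll = new SimpleRobotRules(RobotRulesMode.ALLOW_NONE);

        // Expected (illustrative): true, then false.
        System.out.println(blockAll.isAllowNone());
        System.out.println(blockAll.isAllowed("http://example.com/page.html"));
    }
}
```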
public void clearRules()
public void addRule(String prefix, boolean allow)
public boolean isAllowed(String url)
Specified by: isAllowed in class BaseRobotRules

public void sortRules()
In order to match up with Google's convention, we want to match rules from longest to shortest.
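To illustrate the longest-prefix-first convention, the sketch below (package name assumed, outputs illustrative) registers a broad disallow and a narrower allow; after sortRules() the more specific prefix is checked first and wins:

```java
import crawlercommons.robots.SimpleRobotRules;

public class LongestMatchExample {
    public static void main(String[] args) {
        SimpleRobotRules rules = new SimpleRobotRules();
        // Roughly equivalent to:
        //   Disallow: /shop
        //   Allow: /shop/public
        rules.addRule("/shop", false);
        rules.addRule("/shop/public", true);
        // Order prefixes longest-first so "/shop/public" takes precedence over "/shop".
        rules.sortRules();

        // Expected (illustrative): false, then true.
        System.out.println(rules.isAllowed("http://example.com/shop/cart"));
        System.out.println(rules.isAllowed("http://example.com/shop/public/item"));
    }
}
```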
public boolean isAllowAll()
Is our ruleset set up to allow all access?
Specified by: isAllowAll in class BaseRobotRules

public boolean isAllowNone()
Is our ruleset set up to disallow all access?
Specified by: isAllowNone in class BaseRobotRules

public int hashCode()
Overrides: hashCode in class BaseRobotRules

public boolean equals(Object obj)
Overrides: equals in class BaseRobotRules

Copyright © 2009–2016 Crawler-Commons. All rights reserved.