public class SimpleRobotRules extends BaseRobotRules
Modifier and Type | Class and Description
---|---
static class | SimpleRobotRules.RobotRule: Single rule that maps from a path prefix to an allow flag.
static class | SimpleRobotRules.RobotRulesMode
Fields inherited from class BaseRobotRules: UNSET_CRAWL_DELAY
Constructor and Description |
---|
SimpleRobotRules() |
SimpleRobotRules(SimpleRobotRules.RobotRulesMode mode) |
Modifier and Type | Method and Description
---|---
void | addRule(String prefix, boolean allow)
void | clearRules()
boolean | equals(Object obj)
int | hashCode()
boolean | isAllowAll(): Is our ruleset set up to allow all access?
boolean | isAllowed(String url)
boolean | isAllowNone(): Is our ruleset set up to disallow all access?
void | sortRules(): In order to match up with Google's convention, we want to match rules from longest to shortest.
Methods inherited from class BaseRobotRules: addSitemap, getCrawlDelay, getSitemaps, isDeferVisits, setCrawlDelay, setDeferVisits
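The mutators and query methods above are enough to exercise a rule set by hand. A minimal sketch follows; the prefixes and URLs are illustrative, and the expected results assume the longest-prefix matching described for sortRules:

```java
import crawlercommons.robots.SimpleRobotRules;

public class SimpleRobotRulesExample {
    public static void main(String[] args) {
        // Build a small rule set using the methods summarized above.
        SimpleRobotRules rules = new SimpleRobotRules();
        rules.addRule("/private/", false); // disallow this path prefix
        rules.addRule("/", true);          // allow everything else
        rules.sortRules();                 // match longest prefixes first

        // Illustrative URLs; the rules apply to the path portion of each URL.
        System.out.println(rules.isAllowed("http://example.com/private/report.html")); // expected: false
        System.out.println(rules.isAllowed("http://example.com/index.html"));          // expected: true
    }
}
```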
public SimpleRobotRules()
public SimpleRobotRules(SimpleRobotRules.RobotRulesMode mode)
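The one-argument constructor can be read as a way to express a blanket policy. In the sketch below, ALLOW_NONE is assumed to be one of the SimpleRobotRules.RobotRulesMode constants; verify the name against your version:

```java
import crawlercommons.robots.SimpleRobotRules;
import crawlercommons.robots.SimpleRobotRules.RobotRulesMode;

public class RobotRulesModeExample {
    public static void main(String[] args) {
        // A rule set that rejects every URL (ALLOW_NONE is an assumed constant name).
        SimpleRobotRules blockAll = new SimpleRobotRules(RobotRulesMode.ALLOW_NONE);

        System.out.println(blockAll.isAllowNone());                            // expected: true
        System.out.println(blockAll.isAllowed("http://example.com/any/page")); // expected: false
    }
}
```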
public void clearRules()
public void addRule(String prefix, boolean allow)
public boolean isAllowed(String url)
Specified by: isAllowed in class BaseRobotRules
public void sortRules()
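sortRules orders the rules so the longest (most specific) prefix is consulted first, matching Google's convention. A short sketch of why that ordering matters, with illustrative prefixes:

```java
import crawlercommons.robots.SimpleRobotRules;

public class SortRulesExample {
    public static void main(String[] args) {
        SimpleRobotRules rules = new SimpleRobotRules();
        rules.addRule("/shop/", true);           // broad allow
        rules.addRule("/shop/checkout/", false); // narrower disallow
        rules.sortRules();                       // longest prefix is consulted first

        // The more specific /shop/checkout/ rule should decide this URL.
        System.out.println(rules.isAllowed("http://example.com/shop/checkout/start")); // expected: false
        System.out.println(rules.isAllowed("http://example.com/shop/shoes"));          // expected: true
    }
}
```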
public boolean isAllowAll()
Specified by: isAllowAll in class BaseRobotRules
public boolean isAllowNone()
Specified by: isAllowNone in class BaseRobotRules
public int hashCode()
Overrides: hashCode in class BaseRobotRules
public boolean equals(Object obj)
Overrides: equals in class BaseRobotRules