Class SimpleRobotRules

  • All Implemented Interfaces:
    Serializable

    public class SimpleRobotRules
    extends BaseRobotRules
    The result of parsing a single robots.txt file: a set of rules, an optional crawl-delay, and an optional sitemap URL. Google's extensions (the Allow directive and the '$'/'*' special characters) are supported, as is the more widely used Sitemap directive; a usage sketch follows below. See https://en.wikipedia.org/wiki/Robots_exclusion_standard and https://developers.google.com/search/reference/robots_txt
    See Also:
    Serialized Form
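
    The following is a minimal usage sketch, not part of the class documentation proper. It assumes the companion SimpleRobotRulesParser from the same package and a robots.txt body already fetched as bytes; the exact parseContent signature varies across crawler-commons versions (newer releases take a Collection<String> of robot names instead of a single String). The URL, robot name, and sample rules are illustrative only.

        import crawlercommons.robots.BaseRobotRules;
        import crawlercommons.robots.SimpleRobotRulesParser;

        import java.nio.charset.StandardCharsets;

        public class RobotsCheck {
            public static void main(String[] args) {
                // Illustrative robots.txt content; in practice this would be fetched over HTTP.
                byte[] robotsTxt = String.join("\n",
                        "User-agent: *",
                        "Disallow: /private/",
                        "Crawl-delay: 5",
                        "Sitemap: https://example.com/sitemap.xml"
                ).getBytes(StandardCharsets.UTF_8);

                SimpleRobotRulesParser parser = new SimpleRobotRulesParser();
                // Parse the robots.txt body; the returned rules object is a SimpleRobotRules instance.
                BaseRobotRules rules = parser.parseContent(
                        "https://example.com/robots.txt",  // URL the content was fetched from
                        robotsTxt,                         // raw robots.txt bytes
                        "text/plain",                      // content type reported by the server
                        "mycrawler");                      // robot name matched against User-agent lines

                System.out.println(rules.isAllowed("https://example.com/private/page.html")); // false
                System.out.println(rules.isAllowed("https://example.com/public/page.html"));  // true
                System.out.println(rules.getCrawlDelay());  // crawl-delay from the 5-second directive
                System.out.println(rules.getSitemaps());    // Sitemap URLs found in the file
            }
        }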