Uses of Package crawlercommons.robots
Packages that use crawlercommons.robots

  crawlercommons.robots: The robots package contains all of the robots.txt
  rule inference, parsing and utilities contained within Crawler Commons.
Classes in crawlercommons.robots used by crawlercommons.robots

  BaseRobotRules: Result from parsing a single robots.txt file, which means
      we get a set of rules and a crawl-delay.
  BaseRobotsParser
  SimpleRobotRules: Result from parsing a single robots.txt file, which means
      we get a set of rules, an optional crawl-delay, and an optional
      sitemap URL.
  SimpleRobotRules.RobotRule: Single rule that maps from a path prefix to
      an allow flag.
  SimpleRobotRules.RobotRulesMode
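To illustrate the data model described above (a rule set with a crawl-delay, built from single rules that each map a path prefix to an allow flag), here is a minimal self-contained sketch. The classes `SketchRobotRule` and `SketchRobotRules` are hypothetical stand-ins written for this example; they are not the actual crawler-commons implementations of SimpleRobotRules.RobotRule and BaseRobotRules, and the first-match-wins lookup is an assumption for demonstration only.

```java
import java.util.ArrayList;
import java.util.List;

/** Hypothetical stand-in: a single rule mapping a path prefix to an allow flag. */
class SketchRobotRule {
    final String prefix;
    final boolean allow;

    SketchRobotRule(String prefix, boolean allow) {
        this.prefix = prefix;
        this.allow = allow;
    }
}

/** Hypothetical stand-in: a parsed rule set plus an optional crawl-delay. */
class SketchRobotRules {
    private final List<SketchRobotRule> rules = new ArrayList<>();
    private long crawlDelayMillis = -1; // -1 means no crawl-delay was given

    void addRule(String prefix, boolean allow) {
        rules.add(new SketchRobotRule(prefix, allow));
    }

    void setCrawlDelay(long millis) {
        crawlDelayMillis = millis;
    }

    long getCrawlDelay() {
        return crawlDelayMillis;
    }

    /** First matching prefix wins; paths matched by no rule are allowed. */
    boolean isAllowed(String path) {
        for (SketchRobotRule rule : rules) {
            if (path.startsWith(rule.prefix)) {
                return rule.allow;
            }
        }
        return true;
    }
}
```

A rule set built this way behaves like the description suggests: a disallow rule for "/private" blocks every path under that prefix, while unmatched paths fall through to the default allow.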