Class CrawlerCommons


  • public class CrawlerCommons
    extends Object
    Crawler-Commons is a set of reusable Java components that implement functionality common to web crawlers, such as robots.txt parsing, sitemap parsing, and URL normalization.
    • Constructor Detail

      • CrawlerCommons

        public CrawlerCommons()
    • Method Detail

      • getVersion

        public static String getVersion()
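
        A minimal usage sketch, assuming the class lives in the top-level crawlercommons package and that getVersion() returns the library's release version string; the surrounding class name VersionCheck is illustrative only.

            import crawlercommons.CrawlerCommons;

            public class VersionCheck {
                public static void main(String[] args) {
                    // Query the crawler-commons release in use, e.g. for logging at crawler startup
                    String version = CrawlerCommons.getVersion();
                    System.out.println("Running with crawler-commons " + version);
                }
            }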