Packages 
Package Description
crawlercommons  
crawlercommons.domains
Classes contained within the domains package relate to the definition of top-level domains (TLDs), the various domain registrars, and the handling of effective top-level domains.
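As a sketch of how this package is typically used, the example below reduces a host name to its registered domain. It assumes the `EffectiveTldFinder` class and its static `getAssignedDomain` helper; check the Javadoc of your crawler-commons version, as the exact method names may differ.

```java
import crawlercommons.domains.EffectiveTldFinder;

public class DomainsExample {
    public static void main(String[] args) {
        // getAssignedDomain(...) reduces a host name to the domain registered
        // directly under its effective TLD (public suffix), using the
        // effective-TLD data bundled with the library.
        String domain = EffectiveTldFinder.getAssignedDomain("news.example.co.uk");
        System.out.println(domain);
    }
}
```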
crawlercommons.filters
The filters package contains code and resources for URL filtering.
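A minimal sketch of URL filtering, assuming the `BasicURLNormalizer` implementation in the `basic` subpackage; its `filter` method is assumed to return the normalized URL string, or `null` when the URL cannot be repaired.

```java
import crawlercommons.filters.basic.BasicURLNormalizer;

public class FilterExample {
    public static void main(String[] args) {
        BasicURLNormalizer normalizer = new BasicURLNormalizer();
        // Normalization typically lowercases the host, drops default ports,
        // and resolves relative path segments such as "..".
        String normalized = normalizer.filter("HTTP://Example.COM:80/a/../b");
        System.out.println(normalized);
    }
}
```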
crawlercommons.filters.basic  
crawlercommons.mimetypes  
crawlercommons.robots
The robots package contains all of the robots.txt rule inference, parsing, and utility code within Crawler Commons.
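The following sketch parses a robots.txt body and checks whether a URL may be fetched. It assumes the `SimpleRobotRulesParser` and `BaseRobotRules` classes; note that the `parseContent` signature (a single agent-name string versus a collection of names) has changed between library versions, so adjust to the version you depend on.

```java
import crawlercommons.robots.BaseRobotRules;
import crawlercommons.robots.SimpleRobotRulesParser;

import java.nio.charset.StandardCharsets;

public class RobotsExample {
    public static void main(String[] args) {
        String robotsTxt = "User-agent: *\nDisallow: /private/\n";
        SimpleRobotRulesParser parser = new SimpleRobotRulesParser();
        // parseContent takes the robots.txt URL, its raw bytes, the content
        // type, and the crawling agent's name, and returns the inferred rules.
        BaseRobotRules rules = parser.parseContent(
                "https://example.com/robots.txt",
                robotsTxt.getBytes(StandardCharsets.UTF_8),
                "text/plain",
                "mybot");
        System.out.println(rules.isAllowed("https://example.com/index.html"));
        System.out.println(rules.isAllowed("https://example.com/private/data"));
    }
}
```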
crawlercommons.sitemaps
The sitemaps package provides all classes relevant to sitemap parsing, URL definition, and processing.
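A short sketch of parsing a sitemap from raw bytes, assuming the `SiteMapParser`, `SiteMap`, and `SiteMapURL` classes in this package; method names such as `parseSiteMap` and `getSiteMapUrls` follow the published API but should be verified against your library version.

```java
import crawlercommons.sitemaps.AbstractSiteMap;
import crawlercommons.sitemaps.SiteMap;
import crawlercommons.sitemaps.SiteMapParser;
import crawlercommons.sitemaps.SiteMapURL;

import java.net.URL;
import java.nio.charset.StandardCharsets;

public class SitemapExample {
    public static void main(String[] args) throws Exception {
        String xml = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>"
                + "<urlset xmlns=\"http://www.sitemaps.org/schemas/sitemap/0.9\">"
                + "<url><loc>https://example.com/page1</loc></url>"
                + "</urlset>";
        SiteMapParser parser = new SiteMapParser();
        // parseSiteMap detects whether the content is a plain sitemap or a
        // sitemap index and returns the corresponding AbstractSiteMap subtype.
        AbstractSiteMap sm = parser.parseSiteMap(
                xml.getBytes(StandardCharsets.UTF_8),
                new URL("https://example.com/sitemap.xml"));
        if (sm instanceof SiteMap) {
            for (SiteMapURL u : ((SiteMap) sm).getSiteMapUrls()) {
                System.out.println(u.getUrl());
            }
        }
    }
}
```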
crawlercommons.sitemaps.extension  
crawlercommons.sitemaps.sax  
crawlercommons.sitemaps.sax.extension  
crawlercommons.utils