| Package | Description |
|---|---|
| crawlercommons.fetcher.http | Fetching of files over the HTTP protocol: extending BaseHttpFetcher (which itself extends BaseFetcher), SimpleHttpFetcher provides the Crawler Commons HTTP fetching implementation. |
| crawlercommons.robots | All of the robots.txt rule inference, parsing, and utilities contained within Crawler Commons. |
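To illustrate the kind of rule resolution the robots package performs, here is a self-contained sketch of robots.txt Allow/Disallow matching with the common longest-match rule. This is hypothetical illustrative code: the class, method, and rule names are invented for the example and are not the `crawlercommons.robots` API.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch (not the crawlercommons.robots API): resolve a URL path
// against the Allow/Disallow lines of a robots.txt user-agent block.
public class RobotsSketch {

    static final class Rule {
        final String prefix;
        final boolean allow;
        Rule(String prefix, boolean allow) { this.prefix = prefix; this.allow = allow; }
    }

    // Parse only the Allow/Disallow lines; other directives are ignored here.
    static List<Rule> parseRules(String robotsTxt) {
        List<Rule> rules = new ArrayList<>();
        for (String line : robotsTxt.split("\n")) {
            String trimmed = line.trim();
            String lower = trimmed.toLowerCase();
            if (lower.startsWith("allow:")) {
                rules.add(new Rule(trimmed.substring(6).trim(), true));
            } else if (lower.startsWith("disallow:")) {
                rules.add(new Rule(trimmed.substring(9).trim(), false));
            }
        }
        return rules;
    }

    // Longest matching prefix wins; no matching rule means the path is allowed.
    static boolean isAllowed(List<Rule> rules, String path) {
        int bestLen = -1;
        boolean allowed = true;
        for (Rule r : rules) {
            if (!r.prefix.isEmpty() && path.startsWith(r.prefix) && r.prefix.length() > bestLen) {
                bestLen = r.prefix.length();
                allowed = r.allow;
            }
        }
        return allowed;
    }

    public static void main(String[] args) {
        List<Rule> rules = parseRules(
            "User-agent: *\nDisallow: /private/\nAllow: /private/public/\n");
        System.out.println(isAllowed(rules, "/private/secret.html"));      // false
        System.out.println(isAllowed(rules, "/private/public/page.html")); // true
        System.out.println(isAllowed(rules, "/index.html"));               // true
    }
}
```

The library itself handles many more cases (wildcards, crawl-delay, agent selection); this sketch only shows the core prefix-matching idea.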
| Class and Description |
|---|
| BaseHttpFetcher: Deprecated as of release 0.6. We recommend directly using Apache HttpClient, async-http-client, or any other robust, industrial-strength HTTP client. |
| BaseHttpFetcher.RedirectMode: Deprecated. |
| UserAgent: Deprecated as of release 0.6. We recommend directly using Apache HttpClient, async-http-client, or any other robust, industrial-strength HTTP client. |
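Since the bundled fetcher classes are deprecated in favor of general-purpose HTTP clients, a replacement fetch can be sketched with the JDK's built-in `java.net.http.HttpClient` (available since Java 11). The user-agent string and the local test server below are illustrative assumptions for the example, not part of Crawler Commons.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class FetchSketch {

    // Fetch a URL and return the response body as a String.
    static String fetch(String url) throws Exception {
        HttpClient client = HttpClient.newBuilder()
                // Follow redirects, roughly what a crawler fetcher would do.
                .followRedirects(HttpClient.Redirect.NORMAL)
                .build();
        HttpRequest request = HttpRequest.newBuilder(URI.create(url))
                // Hypothetical crawler identification string.
                .header("User-Agent", "mycrawler/1.0 (+http://example.com/crawler)")
                .GET()
                .build();
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        return response.body();
    }

    public static void main(String[] args) throws Exception {
        // Serve a fixed page on an ephemeral local port so the example
        // is self-contained and needs no network access.
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        byte[] page = "hello, crawler".getBytes();
        server.createContext("/", exchange -> {
            exchange.sendResponseHeaders(200, page.length);
            try (OutputStream out = exchange.getResponseBody()) {
                out.write(page);
            }
        });
        server.start();
        try {
            String body = fetch("http://localhost:" + server.getAddress().getPort() + "/");
            System.out.println(body); // prints "hello, crawler"
        } finally {
            server.stop(0);
        }
    }
}
```

Apache HttpClient or async-http-client offer the same pattern with more tuning knobs (connection pooling, timeouts per route), which is why the deprecation notes point to them for production crawling.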
Copyright © 2009–2016 Crawler-Commons. All rights reserved.