CrawlingContext<Crawler, UserData>
Hierarchy
- RestrictedCrawlingContext<UserData>
  - CrawlingContext
 
Index
Properties
inherited addRequests
Add requests directly to the request queue.
Type declaration
Parameters
- requestsLike: readonly (string | ReadonlyObjectDeep<Partial<RequestOptions<Dictionary>> & { regex?: RegExp; requestsFromUrl?: string }> | ReadonlyObjectDeep<Request<Dictionary>>)[]
- optional options: ReadonlyObjectDeep<RequestQueueOperationOptions> - Options for the request queue operation.

Returns Promise<void>
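For example, a minimal sketch of enqueueing follow-up requests from inside a request handler; the URLs, the DETAIL label, and the forefront flag are illustrative assumptions, not part of this reference:

async requestHandler({ addRequests }) {
    // Plain URL strings and partial RequestOptions objects can be mixed.
    await addRequests([
        'https://www.example.com/page-2',
        { url: 'https://www.example.com/page-3', label: 'DETAIL' },
    ], { forefront: true }); // forefront: process these before other queued requests
},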
 
crawler
getKeyValueStore
Gets a key-value store with the given name or ID, or the default one for the crawler.
Type declaration
Parameters
- optional idOrName: string

Returns Promise<KeyValueStore>
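A sketch of reading and writing a named store from the handler; the store name and keys are assumptions:

async requestHandler({ getKeyValueStore }) {
    // Pass no argument to get the crawler's default store.
    const store = await getKeyValueStore('my-settings');
    const config = await store.getValue('config');
    await store.setValue('last-run', new Date().toISOString());
},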
 
inherited id
inherited log
A preconfigured logger for the request handler.
optional inherited proxyInfo
An object with information about the proxy currently used by the crawler, as configured by the ProxyConfiguration class.
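A short sketch of inspecting the active proxy inside a handler; note that proxyInfo is only set when a ProxyConfiguration is configured on the crawler:

async requestHandler({ proxyInfo, log }) {
    // proxyInfo is undefined unless a ProxyConfiguration is set.
    if (proxyInfo) {
        log.info(`Using proxy: ${proxyInfo.url}`);
    }
},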
inherited request
The original Request object.
optional inherited session
inherited useState
Returns the state, a piece of mutable persistent data shared across all request handler runs.
Type declaration
Parameters
- optional defaultValue: State

Returns Promise<State>
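A sketch of sharing a counter across handler runs; the state shape is an assumption:

async requestHandler({ useState }) {
    // The same object is returned on every call and persisted automatically.
    const state = await useState({ processedCount: 0 });
    state.processedCount += 1;
},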
 
Methods
enqueueLinks
This function automatically finds and enqueues links from the current page, adding them to the RequestQueue currently used by the crawler.

Optionally, the function allows you to filter the target links' URLs using an array of globs or regular expressions and override settings of the enqueued Request objects.

Check out the Crawl a website with relative links example for more details regarding its usage.

Example usage:

async requestHandler({ enqueueLinks }) {
 await enqueueLinks({
 globs: [
 'https://www.example.com/handbags/*',
 ],
 });
},

Parameters
- optional options: ReadonlyObjectDeep<Omit<EnqueueLinksOptions, 'requestQueue'>> & Pick<EnqueueLinksOptions, 'requestQueue'> - All enqueueLinks() parameters are passed via an options object.

Returns Promise<BatchAddRequestsResult> - Promise that resolves to a BatchAddRequestsResult object.
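Beyond globs, a hedged sketch of filtering with regular expressions while overriding the enqueued requests; the pattern, label, and userData values are illustrative:

async requestHandler({ enqueueLinks }) {
    await enqueueLinks({
        regexps: [/\/products\/\d+/],
        label: 'PRODUCT', // route matched requests to a different handler
        userData: { source: 'category-page' },
    });
},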
inherited pushData
This function allows you to push data to a Dataset specified by name, or the one currently used by the crawler.

Shortcut for crawler.pushData().

Parameters
- optional data: ReadonlyDeep<Dictionary | Dictionary[]> - Data to be pushed to the default dataset.
- optional datasetIdOrName: string

Returns Promise<void>
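A sketch of pushing a scraped record to the default dataset and to a named one; the field names and dataset name are assumptions:

async requestHandler({ pushData, request }) {
    // Push to the default dataset...
    await pushData({ url: request.url, scrapedAt: Date.now() });
    // ...or to a dataset specified by name.
    await pushData({ title: 'Example' }, 'my-named-dataset');
},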
sendRequest
Fires an HTTP request via got-scraping, allowing you to override the request options on the fly.

This is handy when you work with a browser crawler but want to execute some requests outside it (e.g. API requests). Check the Skipping navigations for certain requests example for a more detailed explanation of how to do that.

Example usage:

async requestHandler({ sendRequest }) {
 const { body } = await sendRequest({
 // override headers only
 headers: { ... },
 });
},

Parameters
- optional overrideOptions: Partial<OptionsInit>

Returns Promise<Response<Response>>
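Because the response body type can be supplied as a generic parameter, a sketch of fetching JSON from an API endpoint; the URL and response shape are assumptions:

async requestHandler({ sendRequest }) {
    const { body } = await sendRequest<{ items: string[] }>({
        url: 'https://api.example.com/items',
        responseType: 'json',
    });
    // body is typed as { items: string[] } via the generic parameter
},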