Crawling services

Using Crawling / Scraping Services / IP Rotation

External Importer supports multiple crawling services that automatically rotate IP addresses to prevent blocking and rate-limit issues. The currently supported services include ScraperAPI and Scrapingdog (see the provider parameter lists below).

You can view the crawler service used for the last request on the product import page, in the right-hand panel.

How to Route Requests Through a Scraping Service

1. Add API Keys

  1. Open External Importer → Settings → Extractor

  2. Enter your API keys for the providers you want to use

You may enable one or more providers at the same time.

2. Configure Routing Rules

Under the Routing rules table:

  1. Click Add rule

  2. Enter a domain or pattern

  3. Select the scraping provider

  4. (Optional) Specify additional parameters for the provider API

Pattern Examples

You can use simple domain names or advanced patterns:

  • example.com

  • *.example.com (any subdomain)

  • example.com/path/* (match only specific URL paths)
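As a rough illustration, patterns like these can be thought of as glob-style matches against the URL's host (and, when the pattern contains a slash, the host plus path). The sketch below uses Python's `fnmatch` and a hypothetical helper name, `url_matches_pattern`; the plugin's actual matching logic may differ in details.

```python
from fnmatch import fnmatch
from urllib.parse import urlparse

def url_matches_pattern(url: str, pattern: str) -> bool:
    """Match a URL against a routing pattern (illustrative sketch).

    Host-only patterns (e.g. 'example.com', '*.example.com') are checked
    against the domain alone; patterns containing '/' are checked against
    host + path (e.g. 'example.com/path/*').
    """
    parsed = urlparse(url)
    if "/" in pattern:
        # Pattern includes a path component: match host + path.
        return fnmatch(parsed.netloc + parsed.path, pattern)
    # Host-only pattern: match the domain alone.
    return fnmatch(parsed.netloc, pattern)

print(url_matches_pattern("https://example.com/item/1", "example.com"))        # exact domain
print(url_matches_pattern("https://shop.example.com/x", "*.example.com"))      # any subdomain
print(url_matches_pattern("https://example.com/path/a", "example.com/path/*")) # path prefix
```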

Additional Parameters

Additional parameters are appended to the provider API request. Each provider uses its own parameter names and accepted values.

ScraperAPI Parameters

  • country_code=us — Geotargeting

  • premium=true — Premium residential/mobile IPs

  • ultra_premium=true — Advanced bypass mechanism

  • render=true — JavaScript rendering

Scrapingdog Parameters

  • country=de — Geotargeting

  • premium=true — Premium residential proxies

  • dynamic=true — JavaScript rendering

Combining Parameters

Multiple parameters can be joined using &:

country_code=us&premium=true&render=true
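Conceptually, the combined string is just extra query parameters merged into the provider's API call. The sketch below shows this idea with a hypothetical helper (`build_provider_url`) and an assumed query-style endpoint taking `api_key` and `url`; consult your provider's API documentation for the exact parameter names.

```python
from urllib.parse import urlencode

def build_provider_url(api_base: str, api_key: str, target_url: str, extra_params: str) -> str:
    """Append the configured extra parameters to a provider API request (sketch)."""
    params = {"api_key": api_key, "url": target_url}
    if extra_params:
        # Split 'a=1&b=2' into key/value pairs and merge them in.
        params.update(dict(p.split("=", 1) for p in extra_params.split("&")))
    return api_base + "?" + urlencode(params)

print(build_provider_url(
    "https://api.scraperapi.com/",          # assumed endpoint, for illustration only
    "YOUR_API_KEY",
    "https://example.com/product/1",
    "country_code=us&premium=true&render=true",
))
```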

Rule Priority

Routing rules are evaluated from top to bottom. The first rule that matches a URL is selected, and the chosen provider will handle the request.
