How to set up a proxy in Scraper API

Scraper API is a powerful tool designed for scraping, i.e., extracting data from websites. It enables users worldwide to access website data while circumventing blocks and restrictions, and it improves the efficiency and anonymity of your requests. This article explains in detail how to set up a proxy in Scraper API so you can use it without complications or restrictions.

Step-by-step proxy setup in Scraper API

Setting up a proxy in Scraper API is a straightforward process that enhances your ability to efficiently scrape data from websites while bypassing blocks and restrictions. Here’s a detailed guide to get you started:

  1. Sign up on the Scraper API website. Once you are signed in, you'll receive an API key, which is crucial for authenticating your requests.


  2. Go to the “Dashboard” section in the main menu.


  3. Find the “Sample Proxy Code” section and copy the provided code. This is your starting point for configuring the proxy in the online web scraper.


  4. Modify the sample code. Scraper API supports HTTP, HTTPS, and SOCKS5 proxies. In the sample line curl -x "" -k "", replace:
    • “scraperapi” with your actual username;
    • “APIKEY” with your password;
    • the empty quotes after “-x” with your new IP address;
    • “8001” with your port number.

    After “-k”, specify the URL of the page from which you wish to scrape data.
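As a sketch of step 4, the filled-in proxy string (the part that goes after “-x”) can be assembled and sanity-checked in Python. The host proxy-server.scraperapi.com and port 8001 are the values ScraperAPI's sample proxy code commonly shows, but confirm the exact values in the “Sample Proxy Code” from your own dashboard:

```python
from urllib.parse import urlsplit

# Hypothetical values: "proxy-server.scraperapi.com" and port 8001 follow
# ScraperAPI's usual proxy mode; confirm them in your dashboard.
username = "scraperapi"
password = "YOUR_API_KEY"   # the API key issued after signing up
host = "proxy-server.scraperapi.com"
port = 8001

# This string is what replaces the empty quotes after -x in the curl command.
proxy_url = f"http://{username}:{password}@{host}:{port}"

# Sanity-check that each piece landed where curl expects it.
parts = urlsplit(proxy_url)
```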

Code examples for different languages:

  • For Node.js (this object shape matches the proxy option of an axios request config):

proxy: {
  host: 'your new IP address',
  port: 8001, // your port number
  auth: {
    username: 'your login',
    password: 'your password'
  },
  protocol: 'http'
}


  • For Python:

import requests

proxies = {
    "http": "http://your username:your password@your IP address:port number"
}

response = requests.get("the URL you wish to scrape", proxies=proxies)

You can set up several proxies by duplicating the relevant section of code with different addresses. Rotating between them diversifies your scraping requests, minimizes the risk of IP blocking, and enables access to geo-restricted resources.
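As a minimal sketch of that idea (the addresses below are placeholders, not real endpoints), a small pool of proxies can be rotated round-robin so each request can go out through a different IP:

```python
import itertools

# Placeholder proxy URLs -- substitute your real credentials, IPs, and ports.
proxy_pool = [
    "http://user:pass@203.0.113.10:8001",
    "http://user:pass@203.0.113.11:8001",
]

# Cycle through the pool so consecutive requests use different endpoints.
_rotation = itertools.cycle(proxy_pool)

def next_proxies():
    """Return a requests-style proxies dict using the next proxy in the pool."""
    proxy = next(_rotation)
    return {"http": proxy, "https": proxy}
```

Each call to next_proxies() can then be passed as the proxies argument of requests.get().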

By following these steps, you can effectively set up a proxy in Scraper API, ensuring efficient and unrestricted data collection from various online sources.