
How to set up Axios proxy: A step-by-step guide for Node.js

03 October 2025 | 26 min read

If you've ever tried to send requests through a proxy in Node.js, chances are you've searched for how to set up an Axios proxy. Whether you're scraping the web, checking geo-restricted content, or just hiding your real IP, proxies are a common part of the toolkit.

This guide walks through the essentials of using Axios with proxies:

  • setting up a basic proxy,
  • adding username/password authentication,
  • rotating proxies to avoid bans,
  • working with SOCKS5,
  • plus a few fixes for common errors.

We'll also cover where a service like ScrapingBee can save you time if you don't want to manage proxies yourself.

The focus here is practical and hands-on: real code you can copy and run with just enough explanation to make it click.


Quick answer (TL;DR)

The fastest way to set up an Axios proxy in Node.js is to pass a proxy object with host and port into your request. If your proxy requires authentication, just drop in username and password too.

Here's a minimal example that makes a request to ipify.org so you can check which IP your request is coming from (your own, or the proxy's):

import axios from 'axios';

const res = await axios.get('https://api.ipify.org', {
  params: { format: 'json' },
  proxy: {
    host: '123.45.67.89',
    port: 8080,
    auth: { username: 'user', password: 'pass' }, // optional
  },
});

console.log(res.data);

If you see your proxy's IP in the response, congrats: Axios is using the proxy correctly! If you see your own IP, double-check the host, port, and credentials.

And if you don't want to fiddle with proxy lists or rotation at all, services like ScrapingBee handle all the proxy headaches for you automatically.

Setting up Node.js and Axios for proxy use

Before we dive into Axios proxy configuration, let's make sure you've got a working Node.js project. No proxies yet, just the basics to get Axios running smoothly.

Install Node.js

First, grab Node.js from the official site, nodejs.org.

👉 Pro tip: pick the LTS (Long Term Support) version unless you need cutting-edge features. It's the stable release, and that's what you probably want when working with libraries like Axios.

After installation, confirm Node and npm are available:

node -v
npm -v

If you see version numbers (e.g. v24.7.0 and 11.6.0), you're ready to roll.

Create a new Node.js app

Pick a folder for your project and initialize it:

mkdir axios-proxy-demo
cd axios-proxy-demo
npm init -y

That last command creates a package.json file with default settings. We'll tweak it in a second.

Install Axios

Now add Axios to your project:

npm install axios

Create app.js

Inside your project folder, create a file named app.js. Drop in a simple Axios request (no proxy yet):

// app.js
import axios from 'axios';

async function testRequest() {
  try {
    const res = await axios.get('https://httpbin.org/get');
    console.log(res.data);
  } catch (err) {
    console.error(err.message);
  }
}

testRequest();

We're using import instead of require, so Node needs to know we're in ESM (ECMAScript Modules) mode. To do this, make sure your package.json includes a "type": "module" entry. Here's an example:

{
  "name": "axios-proxy-demo",
  "version": "1.0.0",
  "type": "module",
  "dependencies": {
    "axios": "^1.12.2"
  }
}

Without "type": "module", Node will complain about the import statement or force you to rename your file to .mjs. This tweak keeps everything nice and clean.

Verifying Axios installation with a test request

Now that everything is set up, we can run our test request by executing the following command:

node app.js

If all's good, you'll see JSON in the terminal:

{
  args: {},
  headers: {
    Accept: 'application/json, text/plain, */*',
    'Accept-Encoding': 'gzip, compress, deflate, br',
    Host: 'httpbin.org',
    'User-Agent': 'axios/1.12.2',
    'X-Amzn-Trace-Id': 'Root=1-68dd3ecd-77d17b962ee78f64683b0c70'
  },
  origin: '84.15.220.48',
  url: 'https://httpbin.org/get'
}

Axios is alive and well in your project! Next step: make it talk through a proxy.

Basic Axios proxy configuration example

The easiest way to use a proxy with Axios is by passing a proxy object into the request options. Think of it as telling Axios: "Hey, instead of going straight to the website, please talk to this middleman server first."

Let's start with the simplest setup.

Axios proxy config with host and port

Suppose your proxy is at IP 123.45.67.89 and running on port 8080. Here's how you wire that into Axios:

// app.js
import axios from 'axios';

async function testProxy() {
  try {
    const res = await axios.get('https://httpbin.org/ip', {
      proxy: {
        host: '123.45.67.89',
        port: 8080,
      },
      timeout: 10000, // 10 seconds, good practice when testing proxies
    });
    console.log(res.data);
  } catch (err) {
    console.error('Proxy request failed:', err.message);
  }
}

testProxy();

What's happening here:

  • https://httpbin.org/ip is a handy test URL that just echoes back your current IP address.
  • If the proxy is working, the printed IP will be the proxy's IP, not your own.
  • Adding a timeout is smart because proxies can sometimes hang forever if the server is overloaded or offline.

Using ScrapingBee proxies with Axios

Manually juggling proxies can get frustrating: they expire, they rotate, some websites block them, others need authentication. Services like ScrapingBee make life easier because they handle all that behind the scenes.

Here's how it works:

  • Sign up for a free trial (you get 1000 credits for free; each request costs ~5 credits).
  • Go to the HTML Request Builder in your ScrapingBee dashboard.
  • Pick a proxy type: classic, premium, or stealth (the latter is especially useful for sites with strong anti-bot measures).
  • Select Node.js Axios from the dropdown.
  • ScrapingBee will generate working boilerplate code for you.

You can run it directly on their site, or copy it into app.js.

Here's the clean example:

// app.js
import axios from 'axios';

async function testScrapingBee() {
  try {
    const res = await axios.get('https://app.scrapingbee.com/api/v1', {
      params: {
        api_key: 'YOUR_SCRAPINGBEE_API_KEY',
        url: 'https://api.ipify.org',
      },
      timeout: 10000,
    });
    console.log(res.data);
  } catch (err) {
    console.error('ScrapingBee request failed:', err.message);
  }
}

testScrapingBee();

Replace YOUR_SCRAPINGBEE_API_KEY with a real API key.

The nice part: ScrapingBee automatically rotates proxies, manages headers, and deals with common anti-bot blocks — so you can focus on writing your scraping logic instead of babysitting proxy servers.

Testing the proxy

Once you've added the code, run:

node app.js

If everything's set up correctly:

  • With your own proxy, you'll see the proxy's IP address in the output.
  • With ScrapingBee, you'll see whatever IP was assigned from the ScrapingBee proxy pool.

Either way, the key point is: the IP should no longer match your local machine.

Using authenticated proxies in Node.js Axios

Some proxies require a username and password. In Axios, you handle this with the proxy.auth field.

If the credentials are missing or wrong, the proxy will block you with a 407 Proxy Authentication Required error. Don't worry: it's common and fixable. Let's break it down.

Setting username and password in proxy.auth

You just add an auth object inside proxy. Here's the simplest version:

async function testAuthProxy() {
  try {
    const res = await axios.get('https://api.ipify.org', {
      params: { format: 'json' },
      proxy: {
        host: 'proxy.example.com',
        port: 8080,
        auth: {
          username: 'YOUR_USER',
          password: 'YOUR_PASS',
        },
      },
      timeout: 12000,
    });
    console.log('IP via proxy ->', res.data);
  } catch (err) {
    console.error('Request failed:', err.message);
  }
}

If the proxy accepts your credentials, you'll see the proxy's IP in the response. If not, expect a 407.

Axios proxy configuration example with credentials

Hardcoding secrets (like passwords) in your code is risky. A better way is to load them from environment variables. That way, your API key or password never sits in plain text in your source files.

const {
  PROXY_HOST = 'proxy.example.com',
  PROXY_PORT = '8080',
  PROXY_USER = 'user1',
  PROXY_PASS = 's3cr3t',
} = process.env;

async function fetchIpThroughAuthProxy() {
  try {
    const res = await axios.get('https://httpbin.org/ip', {
      proxy: {
        host: PROXY_HOST,
        port: Number(PROXY_PORT),
        auth: {
          username: PROXY_USER,
          password: PROXY_PASS,
        },
      },
      timeout: 12000,
    });
    console.log(res.data);
  } catch (err) {
    console.error('Proxy call failed:', err.message);
  }
}

Run like this:

PROXY_HOST=your-proxy-host \
PROXY_PORT=8080 \
PROXY_USER=your-user \
PROXY_PASS=your-pass \
node app.js

That keeps your credentials separate from your codebase — safer and easier to rotate later.

Handling 407 proxy authentication required error

A 407 is the proxy's way of saying: "Hey, your credentials didn't check out." Here's how you can catch and handle it more gracefully:

async function guardedRequest() {
  try {
    const res = await axios.get('https://api.ipify.org', {
      params: { format: 'json' },
      proxy: {
        host: 'proxy.example.com',
        port: 8080,
        auth: { username: 'YOUR_USER', password: 'YOUR_PASS' },
      },
      timeout: 12000,
    });
    console.log(res.data);
  } catch (err) {
    if (err.response?.status === 407) {
      console.error('407: Proxy auth failed.');
      console.error('- Wrong or missing username/password');
      console.error('- API key not whitelisted or expired (provider-side policy)');
      console.error('- Proxy type mismatch (e.g., trying HTTP proxy settings on a SOCKS proxy)');
    } else {
      console.error('Request failed:', err.message);
    }
  }
}

Why 407 happens (and fixes):

  • Wrong or missing username or password. Fix: double-check proxy.auth.username and proxy.auth.password. Watch for hidden spaces or special characters.
  • API key not whitelisted or expired (some providers gate access by IP allowlists or active subscription). Fix: verify the key is active, your source IP is allowed, and your balance/plan is valid.
  • Proxy type mismatch (HTTP vs SOCKS vs HTTPS tunneling). Fix: use the correct proxy type for your provider. Axios expects an HTTP proxy that supports HTTPS tunneling. If your provider only offers SOCKS5, you'll need a different agent (covered later).

Common mistakes in basic proxy setup

Proxies can be finicky. If your request fails, it's usually one of these:

  • HTTP vs HTTPS mismatch. Some proxies only support http://. If you're trying to tunnel HTTPS traffic through a proxy that doesn't support it, the request will fail. Double-check whether your proxy handles HTTPS (CONNECT).
  • Typos in host or port. One wrong digit in the proxy IP or port, and the connection won't go anywhere. Always confirm the exact details from your proxy provider.
  • Blocked outbound ports. Some networks (especially corporate or cloud servers) block outbound traffic on ports like 8080 or 3128. If requests hang or time out, try another port or test from a different network.
  • Missing or incorrect authentication. If your proxy requires a username/password and you don't include it in the auth field, you'll get errors. Here's an example:

proxy: {
  host: 'proxy.example.com',
  port: 8080,
  auth: {
    username: 'myuser',
    password: 'mypassword'
  }
}

  • ScrapingBee 401 error (special case). If you're using ScrapingBee and see 401 Unauthorized, it usually means you forgot to add your API key or typed it incorrectly. Check the params: { api_key: '...' } line and make sure the key is valid.

Rotating proxies with Axios in Node.js

If you send dozens of requests to the same site from one IP, you'll quickly hit rate limits or outright blocks. That's where proxy rotation comes in.

Rotating proxies means you spread requests across multiple IPs. To the target site, it looks like different users are visiting instead of one suspicious script hammering it. This won't make you 100% invisible, but it dramatically lowers the chance of bans, CAPTCHAs, or throttling.

With Axios, rotation is simple: just swap out the proxy object for each request. No big frameworks needed.

Creating a proxy list array

Start with a plain list of proxies. Each one is just { host, port, auth }. Save it in a separate file like proxies.js:

// proxies.js
export const PROXIES = [
  { host: '198.51.100.10', port: 8080 },
  { host: '203.0.113.22', port: 8080, auth: { username: 'u1', password: 'p1' } },
  { host: '192.0.2.33', port: 3128 },
];

If your provider gives you a long list, you can load it from a JSON file or environment variables instead of hardcoding it.
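For instance, here's one way to pull the pool from an environment variable instead. This is a sketch: PROXY_LIST is a made-up variable name holding a JSON array, not something Axios or any provider defines.

```javascript
// proxies.js — alternative pool loader (sketch; PROXY_LIST is a hypothetical env var)
function loadProxiesFromEnv() {
  const raw = process.env.PROXY_LIST;
  if (!raw) return [];
  try {
    const list = JSON.parse(raw);
    return Array.isArray(list) ? list : [];
  } catch {
    console.warn('PROXY_LIST is not valid JSON; using an empty pool');
    return [];
  }
}
// If you keep this in proxies.js, add `export` in front of the function.
```

Run it like `PROXY_LIST='[{"host":"198.51.100.10","port":8080}]' node app.js`, and the pool stays out of your codebase entirely.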

Picking proxies: random vs round-robin

How do you decide which proxy to use on each request? Two easy strategies:

Random:

// proxies.js

export function pickRandomProxy(list) {
  return list[Math.floor(Math.random() * list.length)];
}

Round-robin (take turns):

let idx = 0;

export function pickRoundRobin(list) {
  const p = list[idx % list.length];
  idx += 1;
  return p;
}

  • Random spreads requests unpredictably (good enough in simple scenarios).
  • Round-robin is predictable and ensures all proxies get used evenly.
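A quick sanity check, no network needed, shows the round-robin picker cycling through the pool and wrapping around (the picker is inlined here so the snippet runs standalone):

```javascript
// Same round-robin picker as above, inlined for a standalone check
let idx = 0;
function pickRoundRobin(list) {
  const p = list[idx % list.length];
  idx += 1;
  return p;
}

const pool = ['A', 'B', 'C'];
const picks = [1, 2, 3, 4].map(() => pickRoundRobin(pool));
console.log(picks); // [ 'A', 'B', 'C', 'A' ] — wraps back to the start
```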

Using rotating proxies for web scraping

Here's a simple function that rotates proxies and retries once if a proxy fails:

// app.js
import axios from 'axios';
import { PROXIES, pickRoundRobin } from './proxies.js';

async function fetchWithRotation(url) {
  const attempt = async () => {
    const proxy = pickRoundRobin(PROXIES);
    return axios.get(url, {
      params: { format: 'json' },      // for api.ipify.org or httpbin
      proxy,
      timeout: 10000,
      validateStatus: s => s >= 200 && s < 500, // don't throw on 4xx; we check the status ourselves
    });
  };

  try {
    const res = await attempt();
    if (res.status >= 400) throw new Error(`HTTP ${res.status}`);
    return res.data;
  } catch (e) {
    // retry once with a different proxy
    const res2 = await attempt();
    if (res2.status >= 400) throw new Error(`HTTP ${res2.status}`);
    return res2.data;
  }
}

// Example: confirm rotating IPs
const target = 'https://api.ipify.org';
const main = async () => {
  const out = await fetchWithRotation(target);
  console.log('Result:', out);
};

main().catch(err => console.error('Failed:', err.message));

This code:

  • Picks a proxy from the pool (round-robin).
  • Sends the request with a timeout.
  • Retries once with another proxy if the first one fails.
  • Returns the result (like the proxy's IP address).

Use this pattern when you start seeing:

  • HTTP 429 (Too Many Requests)
  • CAPTCHAs popping up often
  • Uneven latency (some proxies are slower than others)

Rotating proxies with the "power of two choices"

Random rotation is fine, but sometimes one proxy ends up overloaded while others sit idle. A smarter trick is the power of two choices:

  • Pick two random proxies.
  • Use the one with fewer active requests.
  • If it fails, immediately try the backup.

This balances load much better, avoids hammering one node, and makes failures less painful.
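The heart of the trick is just a comparison between two in-flight counters. Stripped of the randomness and the proxies themselves, it looks like this:

```javascript
// Given per-proxy in-flight counts and two candidate indices,
// pick the less-loaded candidate (ties go to the first).
function chooseLessLoaded(loads, i, j) {
  return loads[i] <= loads[j] ? i : j;
}

const loads = [4, 1, 7]; // proxy 1 is nearly idle
console.log(chooseLessLoaded(loads, 0, 1)); // 1
console.log(chooseLessLoaded(loads, 1, 2)); // 1
```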

The example below shows a self-contained runner with:

  • One Axios client per proxy (so connections are reused)
  • A mini "circuit breaker" that temporarily disables flaky proxies
  • Backup retries.

// app.js
import axios from 'axios';

// ---- 1) Define your proxy pool (HTTP proxies; add auth if needed) ----
const PROXIES = [
  { host: '198.51.100.10', port: 8080 },
  { host: '203.0.113.22', port: 8080, auth: { username: 'u1', password: 'p1' } },
  { host: '192.0.2.33',  port: 3128 },
];

// ---- 2) Targets to fetch (repeat to simulate load) ----
const URLS = [
  'https://httpbin.org/ip',
  'https://api.ipify.org?format=json',
].flatMap(x => [x, x, x, x, x]); // 10 requests total

// ---- 3) One Axios client per proxy (own pool + headers + timeout) ----
function makeClient(proxy) {
  return axios.create({
    proxy,
    timeout: 10000,
    headers: { 'User-Agent': 'simple-rotator/1.1' },
    // Let 4xx bubble up without throwing automatically
    validateStatus: s => s >= 200 && s < 500,
  });
}
const clients = PROXIES.map(makeClient);

// ---- 4) Load + health tracking (circuit breaker) ----
const loads       = Array(PROXIES.length).fill(0); // in-flight per proxy
const failStreak  = Array(PROXIES.length).fill(0);
const cooldown    = Array(PROXIES.length).fill(0);

const OPEN = 0, HALF_OPEN = 1, CLOSED = 2; // heads-up: CLOSED means "benched" here, the inverse of classic circuit-breaker naming
const state = Array(PROXIES.length).fill(OPEN);

const MAX_FAILS = 3; // consecutive failures before closing
const COOLDOWN_TICKS = 5; // requests to wait before half-open test

function stepCooldowns() {
  for (let k = 0; k < PROXIES.length; k++) {
    if (cooldown[k] > 0 && --cooldown[k] === 0) {
      state[k] = HALF_OPEN;      // let it try again
      failStreak[k] = 0;
    }
  }
}

function randInt(n) { return Math.floor(Math.random() * n); }
function pickTwoIndices() {
  const alive = state
    .map((st, i) => (st !== CLOSED ? i : -1))
    .filter(i => i !== -1);

  if (alive.length >= 2) {
    const i = alive[randInt(alive.length)];
    let j;
    do { j = alive[randInt(alive.length)]; } while (j === i);
    return [i, j];
  }
  // fallback if almost everything is CLOSED
  let i = randInt(PROXIES.length), j;
  do { j = randInt(PROXIES.length); } while (j === i);
  return [i, j];
}

async function tryFetch(k, url) {
  if (state[k] === CLOSED) return null;

  loads[k] += 1;
  try {
    const res = await clients[k].get(url);
    if (res.status >= 400) throw new Error(`HTTP ${res.status}`);
    // success: heal half-open proxies
    if (state[k] === HALF_OPEN) state[k] = OPEN;
    failStreak[k] = 0;
    return res;
  } catch {
    failStreak[k] += 1;
    if (failStreak[k] >= MAX_FAILS) {
      state[k] = CLOSED;
      cooldown[k] = COOLDOWN_TICKS;
    }
    return null;
  } finally {
    loads[k] -= 1;
  }
}

// ---- 5) Main loop: "power of two choices" + backup retry ----
async function run() {
  for (const url of URLS) {
    stepCooldowns();

    const [i, j] = pickTwoIndices();
    const primary = loads[i] <= loads[j] ? i : j;
    const backup  = primary === i ? j : i;

    let r = await tryFetch(primary, url);
    if (r) {
      console.log(`[OK  p${primary}] ${url} -> ${JSON.stringify(r.data).slice(0, 80)}`);
      continue;
    }

    r = await tryFetch(backup, url);
    if (r) {
      console.log(`[OK  p${backup}] ${url} -> ${JSON.stringify(r.data).slice(0, 80)}`);
    } else {
      console.log(`[FAIL  -- ] ${url} -> both choices failed`);
    }
  }

  console.log('In-flight:', loads);
  console.log('States   :', state); // 0 OPEN, 1 HALF_OPEN, 2 CLOSED
}

run().catch(e => console.error('Runner crashed:', e.message));

Why this works

  • Per-proxy clients reuse connections for speed.
  • Two random choices balance the load better than random alone.
  • Circuit breaker benches flaky proxies instead of wasting retries.
  • Backup retry avoids request stalls when one proxy is dead.

The "power of two choices" balances traffic far better than pure randomness, retries keep throughput steady, and circuit breakers ensure your pool stays healthy. For small to medium scraping tasks, this pattern is enough.

For larger jobs or stricter targets (CAPTCHAs, geo-blocks, JavaScript-heavy pages), a managed service like ScrapingBee can take over rotation, ban handling, and geolocation for you — you just provide your target URL and API key.

Advanced proxy types and environment config

Not all proxies are created equal. Some need extra setup, others are easiest to swap via environment variables. Below: practical, copy-paste stuff so your Axios proxy setup doesn't turn into a dumpster fire.

Setting http_proxy and https_proxy environment variables

Lots of CLI tools respect HTTP_PROXY / HTTPS_PROXY, and recent Axios versions pick them up automatically in Node.js too. Even so, parsing the env vars yourself and passing an explicit proxy object keeps the behavior predictable and lets you validate or log exactly what's being used. Storing creds in env vars keeps secrets out of your repo and makes switching proxies painless.

Example (POSIX shells):

export HTTP_PROXY=http://user:pass@host:8080
export HTTPS_PROXY=http://user:pass@host:8080

Simple helper to parse the env var and return an Axios proxy object:

// env-proxy.js
export function getAxiosProxyFromEnv() {
  const raw = process.env.HTTPS_PROXY || process.env.HTTP_PROXY;
  if (!raw) return null;

  try {
    const u = new URL(raw);
    return {
      host: u.hostname,
      port: Number(u.port || (u.protocol === 'https:' ? 443 : 80)),
      auth: u.username ? { username: decodeURIComponent(u.username), password: decodeURIComponent(u.password) } : undefined,
    };
  } catch (err) {
    console.warn('Unable to parse HTTP(S)_PROXY:', err.message);
    return null;
  }
}

Then use it:

import axios from 'axios';
import { getAxiosProxyFromEnv } from './env-proxy.js';

const proxy = getAxiosProxyFromEnv(); // null if not set
const res = await axios.get('https://httpbin.org/ip', {
  proxy: proxy ?? undefined,
  timeout: 10000,
});

This pattern makes it easier to swap proxies without editing code. You can also use other environment variables like NO_PROXY to exclude certain hosts:

export NO_PROXY=localhost,127.0.0.1,.mycorp.local
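Keep in mind that if you pass Axios an explicit proxy object, NO_PROXY won't be consulted for you — you'd need your own bypass check. A minimal sketch (NO_PROXY semantics vary slightly between tools; this one handles exact hostnames and leading-dot domain suffixes only):

```javascript
// Return true if `host` should skip the proxy according to a NO_PROXY-style list.
function shouldBypassProxy(host, noProxy = process.env.NO_PROXY || '') {
  const h = host.toLowerCase();
  return noProxy
    .split(',')
    .map(s => s.trim().toLowerCase())
    .filter(Boolean)
    .some(pattern =>
      pattern.startsWith('.')
        ? h.endsWith(pattern) // '.mycorp.local' matches any subdomain
        : h === pattern       // exact hostname match
    );
}
```

Call it before building the request config, and pass `proxy: false` for hosts where it returns true.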

You can also take advantage of the ScrapingBee curl converter tool, which turns a curl command into code in your favorite programming language.

Using SOCKS5 proxies with socks-proxy-agent

Axios's built-in proxy option expects an HTTP proxy. But for SOCKS (SOCKS4/5), you need an agent. Yeah, I always chuckle when I see the name "SOCKS". To make it funnier, there's also a Shadowsocks project.

Anyway, install the socks-proxy-agent package with npm:

npm install socks-proxy-agent

Then in your code:

import axios from 'axios';
import { SocksProxyAgent } from 'socks-proxy-agent';

const agent = new SocksProxyAgent('socks5://user:pass@127.0.0.1:1080');

const res = await axios.get('https://httpbin.org/ip', {
  httpAgent: agent,
  httpsAgent: agent,
  proxy: false, // So that Axios does not pick any additional proxies from ENV
  timeout: 10000,
});
console.log(res.data);

Key points:

  • Use httpAgent and httpsAgent so Axios sends requests through the SOCKS tunnel.
  • Works with SOCKS4 and SOCKS5, with or without auth.
  • If you try to use Axios's proxy object for a SOCKS endpoint, it won't work — use an agent.

Parsing proxy URLs securely (special chars, encoding)

When your username or password contains @, :, or other special chars, they need URL encoding. Example: the password pa$$@word should appear as pa%24%24%40word in the URL. Make sure to encode raw credentials with encodeURIComponent when constructing the URL; note that the URL getters return the still-encoded values, so anything reading them (like the SOCKS agent) decodes them before use.

Here's a quick Node.js snippet that shows how to build it safely:

import { SocksProxyAgent } from 'socks-proxy-agent';
import axios from 'axios';

const username = 'user';
const password = 'pa$$@word'; // raw password with special chars

// encodeURIComponent makes sure special chars are safe in the URL
const proxyUrl = `socks5://${encodeURIComponent(username)}:${encodeURIComponent(password)}@127.0.0.1:1080`;

const agent = new SocksProxyAgent(proxyUrl);

const res = await axios.get('https://httpbin.org/ip', {
  httpAgent: agent,
  httpsAgent: agent,
});
console.log(res.data);

encodeURIComponent turns pa$$@word into pa%24%24%40word. The socks-proxy-agent then decodes the credentials parsed from the URL before authenticating, so the proxy receives the real password intact.
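You can verify the encoding round-trip without touching a proxy at all. Note that the WHATWG URL getters keep credentials percent-encoded; decodeURIComponent (or the agent internally) recovers the raw value:

```javascript
const password = 'pa$$@word';
const proxyUrl = `socks5://user:${encodeURIComponent(password)}@127.0.0.1:1080`;

const u = new URL(proxyUrl);
console.log(u.password);                     // 'pa%24%24%40word' — still encoded
console.log(decodeURIComponent(u.password)); // 'pa$$@word' — the original
```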

Residential proxies vs datacenter proxies

When choosing proxy types, here's the tradeoff:

Datacenter proxies:

  • Fast, cheap, plentiful.
  • IPs come from cloud providers — easy for sites to detect and block at scale.
  • Good for high-volume scraping where stealth isn't critical.

Residential proxies:

  • IPs belong to real consumer ISPs — they look like normal users.
  • Harder to block, better for sites with strict anti-bot measures.
  • Slower and more expensive.

Short rule: use datacenter for scale and low risk; use residential when the target is picky (lots of CAPTCHAs / geo checks).

Useful tips and observations

Performance tips when using Axios with proxies

Once your proxy setup works, you might notice things feel slower or less reliable than direct requests. That's normal — you're adding an extra server hop. But you can still squeeze better performance out of your setup with a few simple tricks:

1. Set a timeout

Always add a timeout to your Axios requests. Some proxies just hang instead of failing. Without a timeout, your script might stall forever.

2. Add retry logic

A flaky proxy might fail once but work on the second try. Wrapping your request in a retry mechanism saves headaches. You can use libraries like axios-retry or write a small loop to retry a few times before giving up.
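Here's what such a loop can look like. This is a generic sketch, not axios-retry's API; it works with any promise-returning function:

```javascript
// Retry an async function up to `retries` extra times, pausing between attempts.
async function withRetry(fn, retries = 2, delayMs = 250) {
  let lastErr;
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastErr = err;
      if (attempt < retries) {
        await new Promise(resolve => setTimeout(resolve, delayMs));
      }
    }
  }
  throw lastErr; // all attempts failed
}

// With Axios it would be used like:
//   const res = await withRetry(() => axios.get(url, { proxy, timeout: 10000 }));
```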

3. Keep requests lean

Every byte you send goes through the proxy. Avoid dumping huge payloads unless you really need to. For scraping, fetch only what's necessary (e.g. don't load images if you just need HTML).

4. Use connection pooling / keep-alive

By default, every Axios request opens a fresh TCP connection. That adds overhead. Setting up an agent with keepAlive: true can make repeated requests much faster.
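A sketch of that setup with Node's built-in agents (the axios.create call is shown in a comment so the snippet stands alone):

```javascript
import http from 'node:http';
import https from 'node:https';

// Reuse TCP connections instead of opening a fresh one per request.
const httpAgent = new http.Agent({ keepAlive: true, maxSockets: 50 });
const httpsAgent = new https.Agent({ keepAlive: true, maxSockets: 50 });

// Hand them to Axios once; every request made with this client reuses pooled sockets:
//   const client = axios.create({ httpAgent, httpsAgent, timeout: 10000 });
```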

5. Don't overload a single proxy

One proxy can only take so much. If you're sending hundreds of requests per minute, spread them across multiple proxies or rotate them. Otherwise, you'll get blocked or throttled.

Security tips when using Axios with proxies

Before you start piping all your traffic through a random proxy, let's slow down a bit. Proxies can be super helpful, but they also come with risks if you're not careful.

👉 TL;DR: proxies are a tool, not a magic shield. Use reputable sources, encrypt whenever possible, and handle your credentials like they're cash.

Here are a few things to keep in mind:

1. Avoid "free" proxy lists

Yeah, free sounds tempting. But unfortunately many free proxies are either dead, painfully slow, or flat-out shady. Some log every request, inject ads, or worse: steal credentials. If you care about reliability or security, stick with a trusted provider or a managed service like ScrapingBee.

For example, a friend of mine once used a "free proxy" he found on a random blog to scrape a site. Looked fine at first, until his app started sending weird traffic in the background. Turned out the proxy was silently injecting extra requests into his traffic, basically turning his server into a bot. He only caught it because his cloud bill suddenly spiked.

2. Always prefer HTTPS proxies

If your proxy only supports plain http://, everything you send (including credentials or cookies) is visible to the proxy owner. With https:// proxies, your traffic is at least encrypted on the way to the destination site.

3. Don't hardcode secrets

Never paste usernames, passwords, or API keys directly into your code. Instead, store them in environment variables or a config file that's not committed to Git. Example:

export PROXY_USER=myuser
export PROXY_PASS=mypassword

4. Watch for leaks

Sometimes a site will still see your original IP through headers like X-Forwarded-For. To avoid this, either strip suspicious headers or use a proxy service that takes care of it.

5. Test before going big

Not all proxies work with all sites. Run a quick test (like hitting httpbin.org/ip) before firing off 10,000 requests through a new proxy pool. Saves you time and surprise bans.

When not to use a proxy

Proxies are powerful, but they're not always the right tool. In some cases, adding a proxy just slows things down or adds extra complexity for no real gain. Here are a few situations where you probably don't need one:

1. Public APIs you're allowed to use

If you're calling an API that's open and designed for developers (think GitHub API, weather APIs, payment gateways), a proxy won't add any value. In fact, it may just add latency. Use the official endpoints directly.

2. You already run through a VPN

If your machine or server is behind a VPN, your requests are already masked with the VPN's IP. Adding a proxy on top is usually redundant unless you specifically need rotation or a different geo-location.

3. No geo restrictions

If the site you're accessing isn't blocked in your region and doesn't care about your IP, using a proxy won't magically improve things. Sometimes "direct is best."

4. Internal or private services

For APIs or services running on your own network (e.g. a company intranet or local Docker services), sending traffic through an external proxy just complicates the setup and can expose sensitive data.

Troubleshooting common Axios proxy issues

Even with a clean Axios proxy setup, things can go wrong. Error messages often look scary, but most issues boil down to connectivity, headers, or conflicts with other software. Here are the most common ones and how to fix them.

ECONNREFUSED and timeout errors

What it means: Axios tried to connect to the proxy but couldn't. Either the proxy isn't reachable or something is blocking the connection.

Common causes:

  • Wrong host or port (typo in the config).
  • The proxy is offline or overloaded.
  • Your network (corporate firewall, VPS, cloud host) blocks outbound ports like 8080 or 3128.

Fixes:

  • Double-check the host/port directly from your provider.
  • Test with curl --proxy http://host:port https://httpbin.org/ip to confirm the proxy works.
  • Try from a different network if you suspect firewall rules.
  • Always set a timeout (timeout: 10000) so bad proxies fail fast instead of hanging forever.

User-Agent header configuration

Some sites filter out requests that look "bot-like." If you send the default Node.js or Axios headers, you might get blocked or flagged.

Easy fix: spoof a browser User-Agent.

const res = await axios.get('https://httpbin.org/headers', {
  headers: { 'User-Agent': 'Mozilla/5.0 demo-axios-proxy' },
  proxy: { host: 'proxy.example.com', port: 8080 },
});

This small tweak often bypasses simple filters. You can also rotate User-Agents along with proxies for better results.
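A simple way to do that is to cycle User-Agents the same way you cycle proxies. The strings below are placeholders; in practice, use real, current browser UA strings:

```javascript
// Placeholder UA strings — swap in real browser User-Agents for actual scraping.
const USER_AGENTS = [
  'Mozilla/5.0 (Windows NT 10.0; Win64; x64) demo-ua-1',
  'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) demo-ua-2',
  'Mozilla/5.0 (X11; Linux x86_64) demo-ua-3',
];

let uaIdx = 0;
function nextUserAgent() {
  const ua = USER_AGENTS[uaIdx % USER_AGENTS.length];
  uaIdx += 1;
  return ua;
}

// Pair a fresh UA with whatever proxy your rotation picked:
//   axios.get(url, { proxy, headers: { 'User-Agent': nextUserAgent() } });
```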

Proxy server status and VPN conflicts

Sometimes the problem isn't in your code at all:

  • If your proxy provider is down, nothing you do locally will fix it. Check their status page or dashboard.
  • Running a VPN while also using a proxy can cause conflicts. Your system might route traffic weirdly, causing DNS errors or bypassing the proxy.

Quick checks:

  • Temporarily disable your VPN and retry.
  • Test the proxy directly with curl or Postman to confirm it works outside your code.
  • If stability is critical, use a managed proxy API like ScrapingBee.

Get results faster with ScrapingBee

Managing your own proxy setup works fine for small experiments. But when you scale, the overhead piles up: rotation logic, retries, headers, bans, VPN conflicts... it's a lot.

With Web Scraping API, you just send a request with your API key and target URL. ScrapingBee takes care of:

  • Proxy rotation and session handling.
  • Geolocation (choose where requests appear from).
  • Ban and CAPTCHA avoidance.
  • Consistent uptime and reliability.

The free trial gives you 1,000 credits, enough to try it out on real targets. Generate an Axios snippet from your dashboard, paste it into your project, and you're off.

Conclusion

Setting up an Axios proxy in Node.js isn't rocket science once you know the patterns:

  • Basic host / port configs.
  • Adding authentication with proxy.auth.
  • Rotating proxies to avoid bans.
  • Using advanced types like SOCKS5 with an agent.
  • Handling common errors with timeouts and headers.

For small projects, this guide gives you everything you need to test, rotate, and debug proxies with Axios. When you need reliability at scale, managed services like ScrapingBee save you the headache by handling rotation, retries, and uptime automatically.

You now have both the DIY toolkit and the "skip the headaches" option. Time to pick the one that fits your project and start sending requests.

Frequently asked questions

What is Axios and why is proxy configuration important?

Axios is one of the most popular HTTP clients for Node.js and browsers. It makes HTTP requests easy with simple syntax and built-in promises.

Configuring a proxy in Axios matters because it lets you:

  • Mask your real IP address.
  • Bypass geo-restricted content.
  • Avoid hitting rate limits too quickly.
  • Test how your app behaves in different network conditions.

How do I set up a basic proxy with Axios?

You add a proxy object with host and port to your request config. Here's a runnable example that fetches your public IP through a proxy:

import axios from 'axios';

async function testBasicProxy() {
  try {
    const res = await axios.get('https://api.ipify.org?format=json', {
      proxy: {
        host: '123.45.67.89',
        port: 8080,
      },
      timeout: 10000,
    });
    console.log('Your proxy IP is:', res.data.ip);
  } catch (err) {
    console.error('Proxy request failed:', err.message);
  }
}

testBasicProxy();

If everything works, the IP printed will be the proxy's IP, not your own.

Can I use authenticated proxies with Axios?

Yes. Add auth: { username, password } inside the proxy object. If the credentials are wrong or missing, the proxy will return a 407 Proxy Authentication Required error.

Example:

import axios from 'axios';

async function testAuthProxy() {
  try {
    const res = await axios.get('https://httpbin.org/ip', {
      proxy: {
        host: 'proxy.example.com',
        port: 8080,
        auth: {
          username: 'myuser',
          password: 'mypassword',
        },
      },
      timeout: 12000,
    });
    console.log('IP via proxy ->', res.data.origin);
  } catch (err) {
    if (err.response?.status === 407) {
      console.error('Proxy auth failed. Check your username/password.');
    } else {
      console.error('Request failed:', err.message);
    }
  }
}

testAuthProxy();

What are rotating proxies and why are they useful?

Rotating proxies automatically switch which IP is used for each request. Instead of hammering one proxy, you spread requests across many.

This is useful for:

  • Avoiding bans or CAPTCHAs when scraping at scale.
  • Reducing the chance of rate limits.
  • Keeping your session alive longer.

Here's a simple round-robin rotation example:

import axios from 'axios';

const PROXIES = [
  { host: '198.51.100.10', port: 8080 },
  { host: '203.0.113.22', port: 8080 },
  { host: '192.0.2.33', port: 3128 },
];

let index = 0;
function pickProxy() {
  const proxy = PROXIES[index % PROXIES.length];
  index++;
  return proxy;
}

async function fetchIpWithRotation() {
  const proxy = pickProxy();
  try {
    const res = await axios.get('https://api.ipify.org?format=json', {
      proxy,
      timeout: 10000,
    });
    console.log(`Proxy ${proxy.host}:${proxy.port} ->`, res.data.ip);
  } catch (err) {
    console.error(`Failed with proxy ${proxy.host}:${proxy.port}`, err.message);
  }
}

// Run several times to see the IPs change
for (let i = 0; i < 3; i++) {
  await fetchIpWithRotation();
}

You'll see different IPs show up if your proxies are working: proof that rotation is happening.

Can I use SOCKS5 proxies with Axios?

Yes, but not directly with the built-in proxy option. Axios's proxy field only supports HTTP proxies. If you try to drop in a SOCKS5 endpoint, it won't work, as Axios doesn't know how to speak SOCKS on its own.

The fix is to use an agent, specifically the socks-proxy-agent:

npm install socks-proxy-agent

Example of using a SOCKS5 proxy with Axios:

import axios from 'axios';
import { SocksProxyAgent } from 'socks-proxy-agent';

// Your SOCKS5 proxy credentials
const username = 'user1';
const password = 's3cr3t!';

// Make sure special characters in credentials are encoded safely
const proxyUrl = `socks5://${encodeURIComponent(username)}:${encodeURIComponent(password)}@127.0.0.1:1080`;

// Create a SOCKS proxy agent
const agent = new SocksProxyAgent(proxyUrl);

async function fetchWithSocks() {
  try {
    const res = await axios.get('https://httpbin.org/ip', {
      httpAgent: agent,
      httpsAgent: agent,
      timeout: 10000,
    });
    console.log('Your SOCKS proxy IP is:', res.data.origin);
  } catch (err) {
    console.error('SOCKS proxy request failed:', err.message);
  }
}

fetchWithSocks();
Yasoob Khalid

Yasoob is a renowned author, blogger and a tech speaker. He has authored the Intermediate Python and Practical Python Projects books and writes regularly. He is currently working on Azure at Microsoft.