
How to Build a VBA Web Scraper in Excel: 2026 Step-by-Step Guide

13 January 2026 | 11 min read

Looking for how to build a VBA web scraper in Excel? If you're just starting out, there are quite a few things you need to learn. The process has evolved significantly in 2026, especially with Internet Explorer’s complete deprecation.

In this guide, I’ll walk you through creating a modern, reliable Excel VBA scraper that leverages an application programming interface (API) instead of brittle browser automation. This method will save you maintenance headaches and let you perform web scraping directly from an Excel workbook.

We’ll stick to the VBA programming language inside the VBA editor to write code that scrapes and extracts data for data analysis in a familiar Microsoft Office and Windows environment. Let's start!

Quick Answer (TL;DR)

VBA is Microsoft’s built-in automation language used to script actions inside many desktop apps. People use it to turn repetitive, manual steps into web scraping scripts. It works by sending HTTP requests (often to an API), parsing the returned HTML/JSON to extract data, and writing the results directly into a Microsoft Excel spreadsheet.

You can use the following VBA code to perform Excel web scraping with ScrapingBee’s API:

Sub ScrapingBeeExtractData()
    ' API key from ScrapingBee
    Dim apiKey As String
    apiKey = "YOUR_API_KEY"
    
    ' Target URL to scrape
    Dim targetUrl As String
    targetUrl = "https://quotes.toscrape.com/"
    
    ' Extraction rules: a list of .quote elements, each yielding text and author
    Dim extractRules As String
    extractRules = "{""quotes"":{""selector"":"".quote"",""type"":""list""," & _
                   """output"":{""text"":{""selector"":"".text""},""author"":{""selector"":"".author""}}}}"
    
    ' Create HTTP request
    Dim http As Object
    Set http = CreateObject("WinHttp.WinHttpRequest.5.1")
    
    ' Build API request URL; both the target URL and the rules must be URL-encoded
    Dim requestUrl As String
    requestUrl = "https://app.scrapingbee.com/api/v1/?api_key=" & apiKey & _
                 "&url=" & WorksheetFunction.EncodeURL(targetUrl) & _
                 "&extract_rules=" & WorksheetFunction.EncodeURL(extractRules)
    
    ' Send the request synchronously
    http.Open "GET", requestUrl, False
    http.Send
    
    ' Parse JSON response (requires the VBA-JSON JsonConverter module; see the setup section below)
    Dim jsonResponse As Object
    Set jsonResponse = JsonConverter.ParseJson(http.ResponseText)
    
    ' Write headers
    Cells(1, 1).Value = "Quote"
    Cells(1, 2).Value = "Author"
    
    ' Write data to sheet (VBA-JSON collections are 1-indexed)
    Dim i As Long
    For i = 1 To jsonResponse("quotes").Count
        Cells(i + 1, 1).Value = jsonResponse("quotes")(i)("text")
        Cells(i + 1, 2).Value = jsonResponse("quotes")(i)("author")
    Next i
    
    MsgBox "Data extraction complete!", vbInformation
End Sub

To be more specific, the above code uses ScrapingBee’s Data Extraction API to reliably get structured website data without dealing with browser automation issues. It’s ideal when you want reusable code and simple custom functions that teammates can run from the Developer tab.

This code should be enough to get you started, but if you want to learn more about web scraping with VBA, continue reading.

Understanding VBA Web Scraping in Excel

VBA web scraping combines Excel’s familiar interface with automated data extraction from websites. Instead of manually copying HTML content from a web page, your VBA code issues HTTP requests, receives JSON, and writes the desired data into cells. This keeps the scraping process simple and maintainable, and avoids brittle document object model (DOM) clicks.

A modern and highly reliable way to do this is by using ScrapingBee’s API, which handles the complexities of browser rendering and JavaScript execution on its servers. Your VBA script just makes a straightforward HTTP request and gets clean, structured data in return, perfect for inserting into Excel without relying on outdated browser automation.

However, you don't always have to write VBA code to gather data in Excel. If you’re interested in exploring different approaches, check out the "How to scrape data from a website to Excel" article, which covers multiple methods beyond VBA.

What Is VBA and How It Works in Excel

Visual Basic for Applications (VBA) is the programming language developed by Microsoft that lives inside Microsoft Office applications. In Excel, it runs as a VBA script inside a VBA project (insert a new module / VBA module) and can interact with ranges, files, and the network.

With the Microsoft HTML Object Library, VBA can reason about HTML elements, HTML tags, and even the structure of an entire HTML document when needed, although modern API-first methods rarely require deep DOM parsing.

How VBA Enables Web Scraping

At its core, VBA web scrapers typically follow this workflow:

  1. Send an HTTP request from VBA, typically to a scraping API.

  2. Receive structured JSON derived from HTML source code or rendered dynamic content.

  3. Parse and extract data into your spreadsheet to manipulate data further.

This is more reliable than driving a legacy browser. If you must touch live pages, tools like developer tools/inspect element help locate selectors in the document object model (DOM), and some web scraping APIs can execute JavaScript server-side for JS-heavy sites.
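To make step 1 concrete, here's a minimal sketch using the same WinHTTP object the TL;DR macro relies on (the target URL is just an example):

Sub MinimalHttpGet()
    ' Create the WinHTTP request object that ships with Windows
    Dim http As Object
    Set http = CreateObject("WinHttp.WinHttpRequest.5.1")
    
    ' Step 1: send a synchronous GET request
    http.Open "GET", "https://quotes.toscrape.com/", False
    http.Send
    
    ' Steps 2 and 3 start here: the response body is now a plain string,
    ' ready to be parsed and written into cells
    Debug.Print http.Status, Left(http.ResponseText, 200)
End Sub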

When To Use VBA Over Other Scraping Tools

In my experience, you should automate tasks with VBA when:

  1. Your team is already comfortable with Microsoft Office tools and VBA

  2. The workflow begins and ends in Excel

  3. You need to integrate web data into existing Excel-based processes

  4. Non-technical colleagues need to run the scraper without installing additional software

The point is that you should focus on making the most of Excel when that’s what your situation requires. After all, if you outgrow it, you can always hand off to other programming languages while keeping Excel as the reporting layer.

Setting Up Excel for VBA Web Scraping

Now that we have the basics covered, it's time to set up your environment and start coding.

Enable Developer Tab in Excel

First, you’ll need to enable the Developer tab, which is hidden by default:

  1. Right-click anywhere on the Excel ribbon and select Customize the Ribbon

  2. In the right column, check the box next to Developer

  3. Click “OK” to save changes

[Screenshot: Developer tab enabled in the Excel ribbon]

The Developer tab will now appear in your Excel ribbon. As a result, you can click Visual Basic to open the editor, record a VBA macro, and manage references.

Open Visual Basic Editor

There are two easy ways to access the Visual Basic Editor:

  • Press Alt+F11 on your keyboard, or

  • Click the Visual Basic button in the Developer tab of the ribbon

[Screenshot: the Visual Basic Editor]

This opens the VBE, where you’ll insert a new module and keep your custom scripts.

For a more detailed walkthrough, check out our guide on Web Scraping with Visual Basic.

Insert a New Module for Your Code

Within the Visual Basic Editor:

  1. Right-click on your project in the Project Explorer panel

  2. Select Insert, then Module

  3. A new module will appear where you can write your VBA code

[Screenshot: inserting a new module in the Project Explorer]

This module is where all your web scraping code will live.

Add Microsoft HTML Object Library and Internet Controls

For modern HTTP requests, you’ll need to add references to the necessary libraries:

  1. In the Visual Basic Editor, go to Tools and then References

  2. Check the box for Microsoft WinHTTP Services, version 5.1 (for HTTP requests)

  3. If you plan to parse JSON, also add the Microsoft Scripting Runtime

  4. Keep the Microsoft HTML Object Library checked if you still need to parse the DOM directly, though the API-first approach rarely requires it

  5. Click “OK” to save

If you plan to parse JSON responses, you also need to import a JSON parser. The option I use is the free JsonConverter.bas module from VBA-JSON. Download the .bas file and import it via File, then Import File in the Visual Basic Editor. This makes the JsonConverter.ParseJson function available to your VBA scraper.
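Once imported, parsing is a one-liner. Here's a minimal sketch (the JSON string is a made-up example):

Sub ParseJsonExample()
    ' Requires JsonConverter.bas plus the Microsoft Scripting Runtime reference
    Dim json As Object
    Set json = JsonConverter.ParseJson("{""quotes"":[{""text"":""Hello"",""author"":""Jane""}]}")
    
    ' JSON objects become Scripting.Dictionary, arrays become Collection (1-indexed)
    Debug.Print json("quotes")(1)("text")    ' -> Hello
    Debug.Print json("quotes")(1)("author")  ' -> Jane
End Sub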

Writing a Basic VBA Web Scraper Using Internet Explorer

Historically, Excel VBA scrapers relied on automating Internet Explorer or on a Selenium-type library. Old flows created an InternetExplorer object, navigated to a page, waited for its ReadyState, then inspected the DOM. While you can still scrape websites this way in theory, the method is fragile, slow, and frequently blocked by modern sites.

Even if you swap to Microsoft Edge (or the Microsoft Edge browser), you’re still juggling drivers, timing, and dynamic websites that change frequently.

While I’ll briefly explain this approach for context, it’s important to understand that it should no longer be used: Internet Explorer has been fully deprecated by Microsoft (the desktop app was officially retired in June 2022).

Create InternetExplorer Object in VBA

The old method started with code like this:

Dim IE As Object
Set IE = CreateObject("InternetExplorer.Application")
IE.Visible = True  ' Make the browser visible

This method required a visible or hidden browser window controlled by VBA through declarations such as Dim Browser As Object or Dim HTML As HTMLDocument. These are no longer necessary when using a modern scraping API, which handles browser execution in the cloud and returns clean JSON responses.

The next step in the old approach was navigating to a website:

IE.navigate "https://example.com"

This method frequently breaks now because modern websites use JavaScript frameworks that are incompatible with Internet Explorer or explicitly block IE user agents.

Today, instead of using IE.navigate, you simply declare a targetUrl variable and build an API request string. That string is then sent to ScrapingBee's cloud service, which fetches and processes the page on your behalf, as the fragment below shows.
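In practice, the old navigation step collapses into plain string handling. This fragment is adapted from the TL;DR macro (it assumes apiKey already holds your ScrapingBee key):

' Instead of IE.navigate, describe what you want as a URL...
Dim targetUrl As String
targetUrl = "https://example.com"

Dim requestUrl As String
requestUrl = "https://app.scrapingbee.com/api/v1/?api_key=" & apiKey & _
             "&url=" & WorksheetFunction.EncodeURL(targetUrl)
' ...then hand requestUrl to WinHTTP exactly as shown earlier.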

Wait for Page To Load Using a Do While Loop

To handle asynchronous loading, the old method used a blocking loop:

Do While IE.readyState <> 4 Or IE.Busy
    Application.Wait DateAdd("s", 1, Now)
Loop

This was inefficient and often timed out on complex pages, causing scripts to hang indefinitely.
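If you're stuck maintaining legacy IE code you can't retire yet, at least bound the wait with a deadline so the macro can't hang forever. A sketch of that defensive pattern:

' Hypothetical safety net for legacy code: give up after 30 seconds
Dim deadline As Date
deadline = DateAdd("s", 30, Now)

Do While (IE.readyState <> 4 Or IE.Busy) And Now < deadline
    DoEvents  ' keeps Excel responsive, unlike Application.Wait
Loop

If IE.readyState <> 4 Then MsgBox "Page load timed out.", vbExclamation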

Extract HTML Content Using getElementsByClassName()

Finally, data extraction used DOM manipulation:

Set elements = IE.document.getElementsByClassName("some-class")

This approach is fragile because it relies on a specific HTML structure that can change without notice, breaking your scraper.

For real projects in 2026, use the API approach from the TL;DR section of this article. It is faster, more stable, and avoids browser-automation pitfalls. Don't forget to check out our Data Extraction docs for more useful information.

Saving Scraped Data to an Excel Sheet

Now that we understand the limitations of the old approach, let’s focus on the modern API-based method. Once you’ve received JSON data from ScrapingBee’s API, saving it to Excel is straightforward.

The following code snippets assume you’ve already received a JSON response from the API:

Use Cells(row, column).Value To Write Data

Writing data to specific cells is simple:

' Write a single value to cell A1
Cells(1, 1).Value = jsonData("some_field")

' Write to a specific named cell
Range("ProductName").Value = jsonData("name")

This direct approach gives you precise control over where each piece of data lands in your spreadsheet.

Loop Through HTML Elements To Extract Text

When dealing with arrays of data (like multiple products), use a loop to populate rows:

' VBA-JSON collections are 1-indexed, so loop from 1 to Count
For i = 1 To jsonData("products").Count
    ' Write each product to a new row (row 1 holds the headers)
    Cells(i + 1, 1).Value = jsonData("products")(i)("name")
    Cells(i + 1, 2).Value = jsonData("products")(i)("price")
    Cells(i + 1, 3).Value = jsonData("products")(i)("description")
Next i

This pattern works well for any list of items you need to extract from a webpage.
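For larger result sets, writing cell by cell gets slow because each assignment touches the sheet. A common optimization, sketched here under the same jsonData assumption, is to fill a Variant array in memory and write it in a single assignment:

Dim n As Long
n = jsonData("products").Count

' Build the whole block in memory first
Dim buffer() As Variant
ReDim buffer(1 To n, 1 To 2)

Dim i As Long
For i = 1 To n
    buffer(i, 1) = jsonData("products")(i)("name")
    buffer(i, 2) = jsonData("products")(i)("price")
Next i

' One sheet write instead of one write per cell
Range(Cells(2, 1), Cells(n + 1, 2)).Value = buffer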

Close Browser With browser.Quit

With the ScrapingBee API, there’s no local browser to close; the browsing happens on the API's servers. This eliminates a common source of memory leaks and resource issues that plagued the old IE-based approach.

The API requires less maintenance and avoids browser crashes that might otherwise interrupt your data extraction tasks.
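One habit worth keeping even without a browser: check the HTTP status before parsing, so a failed request produces a clear message instead of a cryptic JSON error. A minimal sketch, reusing the names from the TL;DR macro:

http.Open "GET", requestUrl, False
http.Send

' WinHTTP exposes the response status code and text directly
If http.Status <> 200 Then
    MsgBox "Request failed: " & http.Status & " " & http.StatusText, vbExclamation
    Exit Sub
End If

Set jsonResponse = JsonConverter.ParseJson(http.ResponseText)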

Limitations and Alternatives to VBA Scraping

While our modern API-based VBA scraping approach solves many problems, it’s important to understand its limitations and alternatives.

Internet Explorer Deprecation Issues

Internet Explorer is completely deprecated as of 2022, making any code that relies on the InternetExplorer object unreliable at best and non-functional at worst. Microsoft has been clear that IE automation is no longer supported, and websites increasingly refuse to work properly with IE.

SeleniumBasic as a Modern Alternative

Some developers have turned to SeleniumBasic, a VBA library that automates Chrome or Firefox instead of IE. While this works, it comes with significant drawbacks:

  1. It requires installing browser drivers and keeping them updated

  2. The code is more complex and prone to timing issues

  3. Browser updates can break your automation unexpectedly

For most use cases, I’ve found the API-first approach with ScrapingBee offers fewer moving parts and greater reliability, especially for non-technical teams.

Cross-Platform Limitations of VBA

It’s important to acknowledge that VBA is primarily Windows-focused. While Excel for Mac does support VBA, certain objects (including many web-related ones) are unavailable or function differently on macOS.

When To Switch to Python or APIs

As your scraping needs grow in scale or complexity, you might outgrow VBA. Consider switching to Python or a dedicated API approach when:

  1. You need to scrape hundreds of pages regularly

  2. Complex authentication or anti-bot measures are involved

  3. You require advanced data processing beyond Excel’s capabilities

Even then, Excel can remain your data destination. Python can write to Excel files or APIs can deliver data directly to your spreadsheet. ScrapingBee’s Data Extraction API supports both CSS and XPath selectors for precise data targeting, and even offers AI-powered extraction for complex scenarios.
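For instance, switching a rule set from CSS to XPath is just a different selector string. A sketch (check the Data Extraction docs for the exact rule syntax):

' Hypothetical rule set using an XPath selector instead of a CSS one
Dim extractRules As String
extractRules = "{""title"":{""selector"":""//h1"",""type"":""item""}}"
' URL-encode it and append as &extract_rules=... exactly as in the TL;DR macro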

Frequently Asked Questions (FAQs)

Is VBA still relevant for web scraping in 2026?

Yes, VBA remains relevant for Excel-centric workflows, especially when paired with modern APIs like ScrapingBee. While traditional IE-based scraping is obsolete, the API approach allows VBA to focus on what it does best: manipulating Excel data, while outsourcing complex browser interactions to specialized services.

How can I create a basic web scraper in Excel using VBA?

The most reliable approach is using WinHTTP to make API calls to a service like ScrapingBee, which handles the actual web scraping. You’ll need to enable the Developer tab, create a VBA module, reference the WinHTTP library, and write code that sends requests and processes the JSON responses.

What are the limitations of VBA web scraping?

VBA web scraping is limited by Excel’s memory constraints, Windows-centric nature, and lack of native support for modern web technologies. Direct browser automation via VBA is now unreliable due to Internet Explorer’s deprecation. However, these limitations can be mitigated by using external APIs.

Are there alternatives to VBA for web scraping in Excel?

Yes. You can use Excel’s built-in Power Query (Get & Transform) feature, which has better web connectivity. For more complex needs, Python with libraries like Pandas can scrape data and write directly to Excel files. There are also specialized add-ins available for connecting Excel to web data.

How do I save scraped data to an Excel sheet using VBA?

After receiving data via an API, use the Cells(row, column).Value syntax to write to specific cells, or Range("named_range").Value for named ranges. For arrays of data, use a For loop to iterate through the items, writing each to a new row. Remember to format cells appropriately for dates and numbers.

image description
Kevin Sahin

Kevin worked in the web scraping industry for 10 years before co-founding ScrapingBee. He is also the author of the Java Web Scraping Handbook.