How to Extract Web Data using Node.js?

In this blog, we’ll learn how to use Node.js and its packages to perform fast and efficient data extraction from single-page applications. This helps us collect and use data that isn’t always accessible through APIs. Let’s go through it.

Tip: Sharing and Reusing JS Modules using bit.dev

Use Bit to encapsulate components or modules with all their setup and dependencies. Share them via Bit’s cloud, collaborate with your team, and use them anywhere.

What is Web Data Extraction?

Web data extraction is a method of scraping data from websites with a script. Data scraping automates the laborious task of copying data from different websites.

Generally, web scraping is performed when the desired website doesn’t expose an API for fetching the data. Some common data scraping scenarios include:

  • Extracting emails from different websites for sales leads.
  • Extracting news headlines from news websites.
  • Extracting product data from e-commerce sites.

Why do we need web scraping when e-commerce sites expose APIs (Product Advertising APIs) for fetching product data?

E-commerce sites expose only some of their product data through APIs, so web scraping is a more effective way to collect the maximum amount of product data.

Product comparison websites typically rely on data scraping. Even Google uses scraping and crawling to index search results.

What Will We Need?

Getting started with data scraping is easy, and it breaks down into two simple parts:

  • Fetching data by making an HTTP request
  • Extracting the important data by parsing the HTML DOM

We will be using Node.js for data scraping, along with two open-source npm modules:

  • Axios — a promise-based HTTP client for the browser and Node.js.
  • Cheerio — a lean implementation of core jQuery that makes it easy to select, edit, and view DOM elements.

You can learn more by comparing popular HTTP request libraries.

Tip: Don’t duplicate common code. Use tools like Bit to organize, share, and discover components across apps so you can build faster. Take a look.

Setup

The setup is very simple. We create a new folder and run a command inside it to create a package.json file. Let’s put together a recipe to make a delicious dish.
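The exact command isn’t shown above; a common way to create a package.json with default values is:

```shell
# Create a package.json with default values (-y skips the interactive prompts).
npm init -y
```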

Before we start cooking, let’s gather the ingredients for our recipe. Add Cheerio and Axios from npm as our dependencies.

npm install axios cheerio

Then, require them in the `index.js` file:

const axios = require('axios');
const cheerio = require('cheerio');

With all the ingredients gathered, let’s start cooking. We are extracting data from the Hacker News site, so we have to make an HTTP request to get the website’s content. That’s where Axios comes into play.

The response contains the raw HTML of the page.

We receive the same HTML content that a browser like Chrome gets when making the request. Next, we can use Chrome Developer Tools to search through the page’s HTML and pick out the data we need.

We need to extract the news headlines and their associated links. You can view a webpage’s HTML by right-clicking anywhere on the page and selecting “Inspect”.

Parsing the HTML using Cheerio.js

Cheerio is like jQuery for Node.js: we use selectors to pick out tags of the HTML document. The selector syntax is borrowed from jQuery. Using Chrome DevTools, we need to find the selectors for the news headlines and their links. Let’s add a few spices to the dish.

First, we have to load the HTML. This step is implicit in jQuery, since jQuery operates on the single, built-in DOM; with Cheerio, we need to pass in the HTML document explicitly. After loading the HTML, we iterate over all occurrences of the table rows to scrape every news item on the page.

The result is a list of news headlines and their corresponding links.

Originally published at https://www.3idatascraping.com.


Data Scraping Services and Data Extraction

3i Data Scraping is an experienced web scraping service provider in the USA. We offer a complete range of data extraction services from websites and online sources.