Professional JavaScript

By: Hugo Di Francesco, Siyuan Gao, Vinicius Isola, Philip Kirkbride

Overview of this book

In-depth knowledge of JavaScript makes it easier to learn a variety of other frameworks, including React, Angular, and related tools and libraries. This book is designed to help you cover the core JavaScript concepts you need to build modern applications. You'll start by learning how to represent an HTML document in the Document Object Model (DOM). Then, you'll combine your knowledge of the DOM and Node.js to create a web scraper for practical situations. As you read through further lessons, you'll create a Node.js-based RESTful API using the Express library for Node.js. You'll also understand how modular designs can be used for better reusability and collaboration with multiple developers on a single project. Later lessons will guide you through building unit tests, which ensure that the core functionality of your program is not affected over time. The book will also demonstrate how constructors, async/await, and events can load your applications quickly and efficiently. Finally, you'll gain useful insights into functional programming concepts such as immutability, pure functions, and higher-order functions. By the end of this book, you'll have the skills you need to tackle any real-world JavaScript development problem using a modern JavaScript approach, both for the client and server sides.

What is Scraping?

For the remainder of this chapter, we will be talking about web scraping. But what exactly is web scraping? It's the process of downloading a page and processing its content to execute some repetitive automated tasks that would otherwise take too long to do manually.

For example, if you want to get car insurance, you need to go to each insurance company's website and get a quote. That process normally takes hours, since on each website you have to fill in a form, submit it, and wait for the company to email you a quote, and then compare all the prices and pick the one you want:

Figure 3.14: The user downloads content, types data in, submits it, and then waits for the results

So why not make a program that can do that for you? That's what web scraping is all about. A program downloads a page as if it were a human, scrapes information from it, makes decisions based on some algorithm, and submits the necessary data back to the website.

When you're...