Web Scraping Using BeautifulSoup + Pushing Data to MySQL Database

Create your own dataset by regularly scraping and logging data from websites.

Ammar Chalifah
CodeX
5 min read · Sep 12, 2021


In this article, we will learn how to scrape data from a website’s HTML content with BeautifulSoup, a popular Python package for web scraping. Then, we will learn how to push that data into a table in a MySQL database.

Scrape HTML data with BeautifulSoup

BeautifulSoup is a Python package for parsing HTML and XML documents. Every website accessed via a web browser consists of HTML documents, which can be seen by inspecting the page.

Book Depository’s new release page and HTML. — Image by author.

Let’s say you are a bookworm. You want to keep yourself updated on new books from Book Depository, and you want to update your list regularly. You could check the New Releases page on Book Depository’s website every now and then (by visiting https://www.bookdepository.com/top-new-releases), but you think it would be far easier to process that information if you could scrape the data and save it in a tabular format.

After inspecting the elements of that page, you discover that every book element is stored under a div tag with the "book-item" class. From each book-item element, you can scrape various pieces of information, such as the ISBN, title, author name, publication date, book format, and price. You want to record all of this, plus one additional field: the scraping date. So, now you have a rough plan for your web scraping journey: (1) scrape the information from the HTML and convert it to a tabular format, and (2) save the table to an external location. For the sake of simplicity, we will use the requests package to send an HTTP request and get the HTML response, use BeautifulSoup to parse the HTML, use pandas to convert the information to a DataFrame, and lastly push all the data to a MySQL database using the sqlalchemy package.

HTML tags of book items in New Release page of Book Depository — Image by author.

Install BeautifulSoup using a Python package manager. If you use pip, install it with pip install beautifulsoup4. Then, import the packages and fetch the HTML via an HTTP request.
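A minimal sketch of this step (html.parser is the built-in parser backend; lxml is a faster alternative if you have it installed):

```python
import requests
from bs4 import BeautifulSoup

URL = "https://www.bookdepository.com/top-new-releases"

# Send a GET request and parse the returned HTML
page = requests.get(URL)
soup = BeautifulSoup(page.content, "html.parser")

# Collect every book element on the page
books = soup.find_all("div", class_="book-item")
```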

In the above code snippet, a GET request is sent to the URL. The response is stored in the page variable, which is then converted to a parsed data object using BeautifulSoup. Then, we take all HTML elements with the book-item class and store them in the books list. The books variable contains hundreds of book elements, each of which looks something like this:
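Here is an abbreviated illustration of that structure (the attribute values below are made up for clarity; the real markup carries many more tags and attributes):

```html
<div class="book-item">
  <meta itemprop="name" content="Example Book Title">
  <p class="author">Jane Doe</p>
  <p class="format">Hardback</p>
  <p class="published">07 Sep 2021</p>
  <a class="btn btn-sm btn-primary add-to-basket"
     data-isbn="9780000000000" data-price="19.99" data-currency="USD">
    Add to basket
  </a>
</div>
```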

Based on the element’s structure, we can determine where to scrape each piece of information from. For instance, the ISBN, price, and currency can be scraped from the a tag with the "btn btn-sm btn-primary add-to-basket" class. The title can be scraped from the meta tag whose itemprop attribute is "name", while the author name, book format, and publication date can be scraped from the p tags with the "author", "format", and "published" classes, respectively.

For the first book element, each piece of information can be scraped with these lines of code:
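A sketch of that extraction, assuming the ISBN, price, and currency are exposed as data-* attributes on the add-to-basket button (the exact attribute names are assumptions):

```python
# Take the first book element as an example
book = books[0]

# ISBN, price, and currency sit on the "add to basket" button
basket = book.find("a", class_="btn btn-sm btn-primary add-to-basket")
isbn = basket.get("data-isbn")
price = basket.get("data-price")
currency = basket.get("data-currency")

# Title is stored in a <meta> tag with itemprop="name"
title = book.find("meta", attrs={"itemprop": "name"}).get("content")

# Author, format, and publication date are plain text inside <p> tags
author = book.find("p", class_="author").get_text(strip=True)
book_format = book.find("p", class_="format").get_text(strip=True)
published = book.find("p", class_="published").get_text(strip=True)
```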

To scrape information from all book elements, create a dictionary where each key corresponds to a column (e.g. title, author, or isbn) and each value contains a list of values for the respective column. Then, iterate over the books variable and append to each list using the same scraping method shown in the previous code snippet.
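A sketch of that loop, with the scraping date stamped onto every row as the extra field (the column names and date format are illustrative choices):

```python
import datetime

# One key per column; each value is a list that will become that column
book_data = {
    "isbn": [], "title": [], "author": [], "format": [],
    "published": [], "price": [], "currency": [], "scrape_date": [],
}

scrape_date = datetime.date.today().isoformat()

for book in books:
    basket = book.find("a", class_="btn btn-sm btn-primary add-to-basket")
    book_data["isbn"].append(basket.get("data-isbn"))
    book_data["price"].append(basket.get("data-price"))
    book_data["currency"].append(basket.get("data-currency"))
    book_data["title"].append(
        book.find("meta", attrs={"itemprop": "name"}).get("content")
    )
    book_data["author"].append(book.find("p", class_="author").get_text(strip=True))
    book_data["format"].append(book.find("p", class_="format").get_text(strip=True))
    book_data["published"].append(book.find("p", class_="published").get_text(strip=True))
    book_data["scrape_date"].append(scrape_date)
```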

Finally, convert the dictionary to a pandas DataFrame.
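Since book_data is already a dictionary of equal-length lists, the conversion is a one-liner:

```python
import pandas as pd

df = pd.DataFrame(book_data)
```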

Scraped data in DataFrame format — Image by author

Pushing DataFrame to MySQL Database

Assume you have a MySQL database with a table named "book-depo-new-releases". Along with that table, you also have the username and password for your MySQL database, the host, and the database name. To push the DataFrame to MySQL, create an engine with sqlalchemy, then push the data directly using a pandas method.
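A sketch of that step, using the pymysql driver (one common choice; the connection-string placeholders must be replaced with your own credentials):

```python
from sqlalchemy import create_engine

# Placeholder credentials; substitute your own values
engine = create_engine("mysql+pymysql://<username>:<password>@<host>/<database>")

df.to_sql(
    "book-depo-new-releases",  # target table
    con=engine,
    if_exists="append",  # append new rows instead of replacing the table
    index=False,         # don't write the DataFrame index as a column
)
```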

By using the append option, you won’t overwrite previously written rows in that table. Moreover, with index=False, the DataFrame’s index won’t be written to the MySQL table.
