Geek Culture

A new tech publication by Start it up (https://medium.com/swlh).


An Easy Way to Build Web Crawlers Using Laravel


Photo by Patti Black on Unsplash

As web developers, we sometimes want to monitor our websites to make sure they don’t contain dead links or pages that fail to return a successful response. Of course, there are plenty of tools out there that can do this for us, but can we build one ourselves?

We can build our own crawler project using the Spatie crawler package, which, as the name suggests, will automatically crawl every link our website has. The package also provides callbacks that fire whenever a crawl fails or succeeds, which makes it very convenient for building our own crawler project. And that’s not all: Spatie also has a package called Browsershot that can capture or convert HTML to an image, a PDF, or a string. So in this article, I’ll show you how to create a crawler project that can monitor any website, taking a screenshot of every page we crawl.
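As a rough sketch of how these two packages fit together (assuming `spatie/crawler` and `spatie/browsershot` are installed via Composer; the observer namespace and method signatures vary slightly between package versions, so treat this as illustrative rather than copy-paste ready):

```php
<?php

use Spatie\Crawler\Crawler;
use Spatie\Crawler\CrawlObservers\CrawlObserver;
use Spatie\Browsershot\Browsershot;
use GuzzleHttp\Exception\RequestException;
use Psr\Http\Message\ResponseInterface;
use Psr\Http\Message\UriInterface;

class ScreenshotObserver extends CrawlObserver
{
    // Called for every page the crawler fetched successfully.
    public function crawled(
        UriInterface $url,
        ResponseInterface $response,
        ?UriInterface $foundOnUrl = null
    ): void {
        if ($response->getStatusCode() === 200) {
            // Capture the page as a PNG; the storage path is just an example.
            Browsershot::url((string) $url)
                ->save(storage_path('screenshots/' . md5((string) $url) . '.png'));
        }
    }

    // Called when a request fails — e.g. a dead link.
    public function crawlFailed(
        UriInterface $url,
        RequestException $requestException,
        ?UriInterface $foundOnUrl = null
    ): void {
        logger()->warning("Failed to crawl: {$url}");
    }
}

Crawler::create()
    ->setCrawlObserver(new ScreenshotObserver())
    ->startCrawling('https://example.com');
```

The key idea is that the crawler does the link discovery while the observer decides what to do with each result, so the screenshot logic stays in one place.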

Content Overview

  • Goals
  • Migration & Installations
  • Implementation

#1 Goals

By the end of the article, we’ll have a running crawler that captures a screenshot of every page that returns a 200 status code.
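To get there, the two Spatie packages would typically be pulled in with Composer, along with Puppeteer, which Browsershot drives under the hood (exact commands depend on your environment and package versions):

```shell
# Pull in the crawler and the HTML-to-image package
composer require spatie/crawler
composer require spatie/browsershot

# Browsershot needs Puppeteer (headless Chrome) available on the machine
npm install puppeteer
```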


Written by Cerwyn Cahyono

PHP/Backend Engineer at Undercurrent Capital Pte Ltd — Data Science Enthusiast
