I recently launched an alpha-stage side project (Officeless.dev) on LinkedIn to get some insight from my network. It went well: with only 144 connections, I managed to reach 1,100+ people in 5 days. If you’d like to check it out, the post is here.

Officeless.dev is a remote-first job board that curates postings from popular sources. I’m actively working to add additional features but wanted to gain some insight into my target audience by leveraging my small network of friends, colleagues, and early-career professionals.

This article will explain how I debuted the project on LinkedIn, my thoughts on why I chose a small unveiling, and what the outcomes were. …

This is part 3 of building a web scraping tool with Python. We’ll be expanding on our scheduled web scraper by integrating it into a Django web app.

Part 1, Building an RSS feed scraper with Python, illustrated how we can use Requests and Beautiful Soup.

In part 2 of this series, Automated web scraping with Python and Celery, I demonstrated how to schedule web scraping tasks with Celery, a task queue.

Photo by Christopher Gower on Unsplash

Background:

Previously, I created a simple RSS feed reader that scrapes information from HackerNews using Requests and BeautifulSoup (it’s available on my GitHub). After creating the basic scraping script, I illustrated a way to integrate Celery into the application to act as a task management system. …
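For the Django piece, the end result can be sketched roughly as below: a model to hold scraped posts and a list view to display them. The app name ("scraping"), field names, and template path are illustrative assumptions, not necessarily what the repository uses.

# scraping/models.py
from django.db import models

class Article(models.Model):
    title = models.CharField(max_length=300)
    link = models.URLField()
    published = models.DateTimeField(null=True, blank=True)
    created = models.DateTimeField(auto_now_add=True)

    def __str__(self):
        return self.title

# scraping/views.py
from django.views.generic import ListView
from .models import Article

class ArticleListView(ListView):
    # newest scraped items first; the Celery task from part 2 fills this table
    model = Article
    ordering = ["-created"]
    template_name = "scraping/article_list.html"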

This is part 2 of building a web scraping tool with Python. We’ll be integrating Celery, a task management system, into our web scraping project.

Part 1, Building an RSS feed scraper with Python, illustrated how we can use Requests and Beautiful Soup.

In part 3 of this series, Making a web scraping application with Python, Celery, and Django, I will be demonstrating how to integrate a web scraping tool into web applications.

Photo by Adi Goldstein on Unsplash

Background:

In a previous article, I created a simple RSS feed reader that scrapes information from HackerNews using Requests and BeautifulSoup (see the code on GitHub). …
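At its core, that integration looks something like the sketch below: a tasks.py wired to Celery beat. The Redis broker URL and the 15-minute interval are illustrative assumptions rather than the repository's exact configuration.

# tasks.py
import requests
from celery import Celery
from celery.schedules import crontab

app = Celery("scraper", broker="redis://localhost:6379/0")

@app.task
def scrape_hackernews():
    # in the series this calls the Requests/BeautifulSoup scraper from part 1;
    # here it just fetches the raw feed to keep the sketch self-contained
    resp = requests.get("https://news.ycombinator.com/rss", timeout=10)
    resp.raise_for_status()
    return resp.text

app.conf.beat_schedule = {
    "scrape-hackernews": {
        "task": scrape_hackernews.name,
        "schedule": crontab(minute="*/15"),  # run every 15 minutes
    },
}

Running celery -A tasks worker -B then starts a worker with an embedded beat scheduler that fires the task on that schedule.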

This is part 1 of building a web scraping tool with Python. We’re using Requests and BeautifulSoup. In parts 2 and 3 of this series, I’ll be illustrating how to schedule scraping tasks with Celery and how to integrate the scraper into a web application with Django.

Photo by Fabian Grohs on Unsplash

Background:

I’ve utilized web scraping in different capacities for my projects, whether collecting data for analysis, creating notifications for myself when sites change, or building web applications. This code is available publicly on my GitHub under web_scraping_example.

This guide will walk through a quick RSS feed scraper for HackerNews. The RSS feed itself is located here; it publishes new posts and site activity at regular intervals. …
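Boiled down, the scraper amounts to something like the sketch below: fetch the feed with Requests, parse the items with BeautifulSoup (the "xml" parser assumes lxml is installed), and collect each post's title, link, and publish date.

import requests
from bs4 import BeautifulSoup

def scrape_hackernews_rss(url="https://news.ycombinator.com/rss"):
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "xml")  # the "xml" parser requires lxml

    articles = []
    for item in soup.find_all("item"):
        articles.append({
            "title": item.title.text,
            "link": item.link.text,
            "published": item.pubDate.text,
        })
    return articles

if __name__ == "__main__":
    for article in scrape_hackernews_rss():
        print(article["title"], "->", article["link"])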

Most of the Stripe tutorials that I’ve seen revolve around using dj-stripe or similar packages. Here I’m creating a Stripe API call using one of Django’s class-based views (CBVs). Find this on GitHub here.

Photo by Markus Spiske on Unsplash

Background:

I have found that the Stripe integration tutorials for Django typically either pre-date the widespread use of the Stripe API or are tailored toward using a widely known package like dj-stripe.

This example focuses on integrating the stripe-python API to handle a form submission within a class-based view.
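As a rough sketch of that pattern (not the repository's exact code), a FormView can call the stripe library inside form_valid(). The form fields, the STRIPE_SECRET_KEY setting name, the amount, and the success URL below are all illustrative assumptions.

import stripe
from django import forms
from django.conf import settings
from django.views.generic.edit import FormView

stripe.api_key = settings.STRIPE_SECRET_KEY  # assumed settings name

class PaymentForm(forms.Form):
    email = forms.EmailField()
    token = forms.CharField(widget=forms.HiddenInput)  # card token from Stripe.js

class PaymentView(FormView):
    template_name = "payments/payment_form.html"
    form_class = PaymentForm
    success_url = "/thanks/"

    def form_valid(self, form):
        # create the charge with the token posted by the front end
        stripe.Charge.create(
            amount=5000,  # amount in cents
            currency="usd",
            source=form.cleaned_data["token"],
            receipt_email=form.cleaned_data["email"],
        )
        return super().form_valid(form)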

If you’d like to check out the code directly it’s available on GitHub under cookiecutter_example.

Recently, I deployed an application that didn’t use user accounts at all. I’ve noticed this is somewhat uncommon in the Django world, especially in the cookiecutter-django space.

Photo by Alvaro Reyes on Unsplash

Background:

I’ll be publishing this on my GitHub under the cookiecutter_example repository here. This repository is subject to change, but each example article is available in its own standalone branch for posterity.

In a recent deployment, I scrapped the typical use of Django’s user models and the out-of-the-box functionality of cookiecutter-django’s abstracted user model.

Use Case:

The application that I created required no user login; instead, a single payment form would handle all client interactions via email once submitted. …
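A minimal sketch of that flow, with no user model involved, might look like the view below; the form fields, template path, and email addresses are purely illustrative.

from django import forms
from django.core.mail import send_mail
from django.views.generic.edit import FormView

class ClientRequestForm(forms.Form):
    name = forms.CharField(max_length=100)
    email = forms.EmailField()
    details = forms.CharField(widget=forms.Textarea)

class ClientRequestView(FormView):
    template_name = "requests/client_request.html"
    form_class = ClientRequestForm
    success_url = "/submitted/"

    def form_valid(self, form):
        # no login required; all follow-up happens over email
        send_mail(
            subject=f"New request from {form.cleaned_data['name']}",
            message=form.cleaned_data["details"],
            from_email="noreply@example.com",
            recipient_list=[form.cleaned_data["email"]],
        )
        return super().form_valid(form)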

I struggled through my first deployment, so I wrote a guide for my future self and decided to share it.

Photo by Maxwell Nelson on Unsplash

Background:

Doing anything for the first time can be a puzzling experience. Often we can feel overwhelmed and may even find ourselves tempted to give up. I struggled with my first deployment but I learned a lot and have put those lessons to good use. Originally I created this guide for myself and then decided to publish it in the hopes that it might help others. This article is a collection of links that serve as references to tutorials, general notes for deployment, and a step-by-step overview of the process I take to deploy an application. …

About

Matthew Wimberly

SQL DBA and constant tinkerer. https://mattdood.com
