The Creation of JobBots | Full-Stack Web App

Ernesto Gonzalez
Published in Strategio
4 min read · Feb 13, 2023

I have been learning constantly for the last three years. When I started learning web development, all the projects I built were front-end oriented. I enjoyed styling my sites and making them look as cool as possible. APIs, databases, and back ends were always a barrier, that is, until 2021. That year, I committed myself to learning back-end technologies and becoming a full-stack developer.

The Beginning of a New Stage

(GetyarnIO “The beginning of a new era, the dawning of a new age…”)

JobBots was built by a team of four developers (including myself), and it was both my first full-stack project and my first team project. We were given one month to do everything from storyboarding to design, development, and deployment. When I started working on this project, I was very new to concepts such as REST APIs and front-end technologies such as React.

JobBots is an application that simplifies the job application process for beginners by fetching or scraping job postings from websites such as Glassdoor and Indeed and displaying the available options in the user dashboard. The main idea was that, after uploading a resume, the user should be able to apply to different job postings in one click.

JobBots was built with ReactJS and Bootstrap on the front end, and Python on the back end with Flask, SQLAlchemy, and Flask-JWT-Extended.

My Role in the Development of JobBots

As mentioned, I had committed myself to becoming a full-stack developer before JobBots was even conceived. I worked on both the front end and back end of JobBots, focusing mainly on the user dashboard, account settings, and the login and signup pages. I also did the initial work on the bot that would scrape the job postings.

Time to Rock N Roll

(LOLBrary “time to work”)

On the front end, I created the login and signup forms as floating modals to minimize the number of times users switch between pages. Creating these functionalities on the back end was a fun experience.

I had to create a route that allows the user to change their password by sending an authorization code to the user's email. I used the Mailgun API to send emails from a custom domain. The front end received the authorization code from the back end as an API response; even so, when the user submits the code on the front end, its validity is rechecked against the code stored on the back end.
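Stripped of the Flask and Mailgun specifics, the server-side half of that flow can be sketched in plain Python. The function names and the in-memory table below are made up for illustration, not taken from the JobBots codebase:

```python
import secrets
import hashlib
import hmac

# In-memory stand-in for the database table that stored reset codes;
# the real app would persist these (with an expiry) via SQLAlchemy.
reset_codes = {}

def issue_reset_code(email):
    """Generate a six-digit authorization code and store only its hash."""
    code = f"{secrets.randbelow(10**6):06d}"
    reset_codes[email] = hashlib.sha256(code.encode()).hexdigest()
    # In JobBots, the code was then emailed via the Mailgun API.
    return code

def verify_reset_code(email, submitted):
    """Re-check the submitted code server-side before allowing a password change."""
    stored = reset_codes.get(email)
    if stored is None:
        return False
    digest = hashlib.sha256(submitted.encode()).hexdigest()
    return hmac.compare_digest(stored, digest)
```

The key design point is the last function: whatever the front end shows the user, the back end makes the final call by comparing against its own stored copy.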

The login and signup routes were straightforward to build on the back end: every time a user submits either form, we check the database to make sure the information received is correct. The user-settings routes work much the same way. The front end makes a request, the back end checks the database, and the necessary information is retrieved or updated.
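The check-with-the-database step can be sketched without Flask or SQLAlchemy. Here a dict stands in for the users table, and the names are hypothetical; in the real app, a successful login would also return a token issued via Flask-JWT-Extended:

```python
import hashlib
import hmac
import os

def hash_password(password, salt):
    """Derive a salted password hash (never store plain-text passwords)."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

# Stand-in for the users table; one pre-registered account for the demo.
_salt = os.urandom(16)
users = {"ada@example.com": (_salt, hash_password("s3cret", _salt))}

def login(email, password):
    """Return True only if the email exists and the password matches."""
    record = users.get(email)
    if record is None:
        return False
    salt, stored = record
    return hmac.compare_digest(stored, hash_password(password, salt))
```

The signup route is the mirror image: hash the new password, insert the row, and reject the request if the email already exists.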

Starting out with the bot was the most complicated part; I had never done any kind of web scraping before. After looking for libraries online, I decided to go with Selenium and Beautiful Soup. Selenium let me drive a browser and interact with every part of a website, while Beautiful Soup let me extract text from the desired portions of a page. For example, to extract job posting titles from Glassdoor, I would use Selenium to navigate to and through the site and then Beautiful Soup to pull out every job title on the page.
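In the real bot, the page source came from a Selenium browser session and Beautiful Soup did the parsing. To keep this sketch dependency-free, the extraction half is shown with the standard library's HTMLParser instead, against a made-up snippet of HTML (the "job-title" class name is invented):

```python
from html.parser import HTMLParser

class JobTitleParser(HTMLParser):
    """Collect the text of every <a class="job-title"> element."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.titles = []

    def handle_starttag(self, tag, attrs):
        if tag == "a" and ("class", "job-title") in attrs:
            self.in_title = True

    def handle_data(self, data):
        if self.in_title:
            self.titles.append(data.strip())

    def handle_endtag(self, tag):
        if tag == "a":
            self.in_title = False

# A tiny stand-in for the HTML a Selenium session would return.
page = ('<div><a class="job-title">Junior Developer</a>'
        '<a class="job-title">Data Analyst</a></div>')
parser = JobTitleParser()
parser.feed(page)
# parser.titles is now ["Junior Developer", "Data Analyst"]
```

With Beautiful Soup, the same extraction collapses to a one-line `find_all` over the tag and class you are targeting, which is why it was the better fit for the actual bot.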

Result

Thankfully, I was able to get most of the functionality working. The web scraper, however, gave me a lot of trouble. Since we were developing in a virtual environment without a graphical interface, we had to run Selenium with a headless browser, which caused the scraper to crash or fail to find many critical HTML elements (header tags, p tags, anchor tags, etc.) on a site.

Outside of the scraper, I managed to get all the components I worked on to flow smoothly together. For instance, the user is redirected to the dashboard right after logging in, and can change all of their data in a single submission instead of updating each piece of mutable user data individually. Small tweaks like these make the user experience feel more fluid and responsive.

What I Learned

Designing the website's structure and look before building it is essential. I know it sounds like common sense, but this was a mistake we made as a team: we had the user stories down but no feasible design, so we designed the website as we built it, which left it looking unappealing in the end. Nonetheless, communication within the team was solid, and it was a great first experience working on a team. Additionally, thanks to working on both the front end and back end, I got much more comfortable using and creating APIs.

JobBots was a fantastic project with a lot of potential. Even though it was not as great as it could have been, the experience I gained from it was a huge step forward.

Check out JobBots here!

Thanks for making it all the way through. Comment and clap if you liked it! Suggestions and corrections are always welcome!

(EETimes Robot Waving)
