How to Scrape Job Postings From Indeed.com

Scrape/extract job postings in bulk and save them to Excel in under 2 minutes, without any coding

Jay M. Patel
Aug 8, 2020 · 2 min read

Indeed.com is an extremely powerful job search engine, and it’s a perfect source for scraping job postings for a particular city, state, or zip code.

In this post, we will use absolutely no coding to extract job titles, company names, locations, salaries, summaries, and URLs from job postings and save them as a CSV file, in just four steps.

Step 1: The Indeed Job Scraper API is a great option if you want to get extracted information for free. You’ll have to sign up with Algorithmia, but it’s free (no credit card required) and you get 10,000 free credits, which is more than enough for thousands of API calls a month.

Step 2: Once you are signed in and in the console, simply change the search term (presently set to “python programmer”) to something you want, edit the location (set to “Atlanta, GA”), and click on Run Example (circled in red). You will get around 15 results per query; simply change the page number to fetch additional results for the same query. Once the result appears in the right pane, click on “Copy” above it to copy the JSON output.

{"search_terms":"python programmer", "location":"Atlanta, GA", "page":"1"}

Step 3: Next, we need to convert the JSON we copied in the step above into a CSV file. Just go to a JSON to CSV converter and paste the contents into the text box shown. You’ll be able to preview the CSV file, and once it appears there, click on the Download CSV button.
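If you’d rather skip the online converter, a few lines of Python can do the same conversion. This is only a sketch: it assumes the copied JSON is a flat list of job objects, and the field names below are guesses based on the columns described in this post, so adjust them to match the keys actually present in your JSON.

```python
import csv
import json

# Paste the JSON copied from the console into results.json first.
with open("results.json", "r", encoding="utf-8") as f:
    jobs = json.load(f)

# Assumed key names (job title, company, location, salary, summary, URL);
# rename them to whatever the API actually returns.
fields = ["job_title", "company_name", "location", "salary", "summary", "url"]

with open("jobs.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=fields, extrasaction="ignore")
    writer.writeheader()
    for job in jobs:
        writer.writerow({k: job.get(k, "") for k in fields})
```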


Step 4: Once you have downloaded the CSV, just open it in Excel, OpenOffice, or another spreadsheet viewer.


Note: You’ve probably noticed that you can fetch about 15 results per API call. Simply change the page number in the input (step 2) to get additional results. Alternatively, contact us at Patel.jay@specrom.com and we can modify the API to suit your requirements.
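If you need more than a handful of pages, the same idea works as a small loop from Python. As before, the algorithm path and the field names are assumptions carried over from the earlier sketches; the loop simply increments the page value from step 2 and writes everything to one CSV.

```python
import csv
import Algorithmia

client = Algorithmia.client("YOUR_API_KEY")
algo = client.algo("specrom/IndeedJobScraper/1.0.0")  # placeholder path, see step 2

fields = ["job_title", "company_name", "location", "salary", "summary", "url"]
all_jobs = []

# Each call returns roughly 15 results, so loop over page numbers to collect more.
for page in range(1, 6):  # first 5 pages
    query = {"search_terms": "python programmer",
             "location": "Atlanta, GA",
             "page": str(page)}
    all_jobs.extend(algo.pipe(query).result)

with open("jobs_all_pages.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=fields, extrasaction="ignore")
    writer.writeheader()
    writer.writerows({k: job.get(k, "") for k in fields} for job in all_jobs)
```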

Web Data Extraction

We cover web scraping/crawling and natural language processing to extract structured data.

Written by Jay M. Patel

Cofounder/principal data scientist at Specrom Analytics (specrom.com), natural language processing and web crawling/scraping expert. Personal site: JayMPatel.com
