How I have used Python over the last five years
It is very common to hear that Python “can be used for anything”, and, generally speaking, I agree. Python has a very rich ecosystem of libraries, and its simple syntax makes code easier to read and write.
I have been using Python for almost five years now (with some breaks here and there), so I decided to compile a list of the use cases I’ve had for this wonderful language. I believe this will be most valuable to junior coders, since it may give you some reasons to pursue Python over other languages early on. That said, there might be a use case in this list that convinces a senior programmer to pick Python for their next personal project :-)
Learning to Code
Yes, my first “Hello world” was written in Python. I went on to learn general programming concepts in Python, and to this day I still default to Python as much as possible when something/someone requires me to write code.
I learned all about conditional clauses, loops, data structures, etc. in Python. The idea is that, having learned the concepts in Python, I now “simply” look up the syntax for those concepts and operations when working with other languages. It is perfectly fine to have a preference for a language/framework, but always remember that you are learning how to do something; the language is just how you explain what you’re trying to do.
Web Scraping
I have done a fair bit of web scraping with Python over the years. This was one area that attracted me right away, and I came up with plenty of excuses along the way to practice this skill. At one point I built a so-called “daily news scraper” that went through a handful of news sources I followed at the time. Some had APIs available (e.g. Reddit, Hacker News), while for others I had to scrape the web pages myself (e.g. Webtoons, Wccftech). The end result was a basic HTML+CSS page generated by the Python script. It wasn’t hosted anywhere; I simply ran the script manually every day to get my news in a single place.
I should also mention there is some variation in web scraping. Sometimes you interact with web pages statically, i.e. you load the page, read the information by pointing to the HTML elements, and that’s it; other times you need to interact with the page to reveal the information you want, e.g. click a button to display a text box, or fill in user credentials to log in and get authenticated access. In practical terms, I have used BeautifulSoup4 for static web scraping and Selenium for dynamic web scraping.
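The static case boils down to pointing at HTML elements and pulling out text and attributes. A minimal sketch with BeautifulSoup4 — the HTML here is an inline stand-in for what `requests.get(url).text` would normally fetch, so the example is self-contained:

```python
from bs4 import BeautifulSoup

# Stand-in for HTML fetched over the network, e.g. requests.get(url).text
html = """
<div class="news">
  <article><h2><a href="/a1">Headline one</a></h2></article>
  <article><h2><a href="/a2">Headline two</a></h2></article>
</div>
"""

soup = BeautifulSoup(html, "html.parser")
# Point at the elements you care about with a CSS selector.
headlines = [(a.get_text(strip=True), a["href"]) for a in soup.select("article h2 a")]
print(headlines)  # [('Headline one', '/a1'), ('Headline two', '/a2')]
```

For the dynamic case, Selenium drives a real browser, so the code instead clicks and types before reading the page — but the element-selection step looks much the same.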
Working with APIs
This is a fairly generic topic, but I really like to make API calls in Python. requests is a wonderful library to send the common GET and POST requests, and its functions include some very useful parameters to handle, for instance, different authentication scenarios.
Even if, for example, I am trying out an API in Postman, I can copy-paste the generated code to integrate the calls into my Python scripts.
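The auth handling is a good example of those useful parameters. A sketch of the request-building side of requests — the endpoint URL is hypothetical, and the request is prepared without being sent, so you can see the headers requests would put on the wire:

```python
import requests
from requests.auth import HTTPBasicAuth

req = requests.Request(
    "GET",
    "https://api.example.com/items",       # hypothetical endpoint
    params={"page": 1},
    auth=HTTPBasicAuth("user", "secret"),  # requests builds the auth header for you
)
prepared = req.prepare()

print(prepared.url)                         # query string is encoded for you
print("Authorization" in prepared.headers)  # True: Basic auth header was added
# To actually send it: requests.Session().send(prepared)
```

In everyday scripts you would just call `requests.get(url, params=..., auth=...)` directly; preparing the request is only split out here to show what the library does for you.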
Building Web Apps
I really want to shout out Streamlit, my latest Python obsession. It is a low-code-ish library for building web apps, and it really brings home how valuable a graphical interface can be for users interacting with your code. The team positions Streamlit more towards showcasing Data Science projects, e.g. letting the user tweak the parameters of a Machine Learning model and see the results right away. In my experience, I have used Streamlit whenever I can as an easy way to create a front-end that I can deploy to Heroku with a few clicks. My most recent applications have been wallpaper generators for two games I have played.
Data Science (Data Engineering, Data Analysis and ML)
Yeah, this is definitely the big one for me. I started out, as many people do, with pandas, NumPy and Matplotlib to transform and visualise data, but over time I branched out to using Python to connect to databases (MySQL and Postgres), and eventually for Machine Learning.
I started with data analysis to learn the fundamentals of tabular data (pandas and NumPy) and of visualising data through code (i.e. not using Excel, at the time). I learned quite a lot about reshaping data into the form a visualisation needs, which helps me to this day when working with SQL and Power BI.
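A typical example of that reshaping step, with toy numbers: long-format records pivoted into a table with one row per month and one column per city, which is the shape a grouped plot wants.

```python
import pandas as pd

# Long format: one row per (month, city) observation. Values are made up.
long_df = pd.DataFrame({
    "month": ["Jan", "Jan", "Feb", "Feb"],
    "city":  ["Lisbon", "Porto", "Lisbon", "Porto"],
    "temp":  [14.8, 13.9, 15.2, 14.1],
})

# Wide format: months as rows, cities as columns.
wide = long_df.pivot(index="month", columns="city", values="temp")
print(wide)
# wide.plot(kind="bar") would hand this straight to Matplotlib
```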
I have done far less data engineering in Python, but I have learned how to connect to databases programmatically to create/read/update/delete data. At one point I took the Stack Overflow Developer Survey datasets, processed the data and stored it in a Postgres database. I then consumed that data via SQL queries in Python using the psycopg2 library and visualised it with Plotly (an alternative to Matplotlib).
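psycopg2 follows Python’s standard DB-API, so the connect/execute/fetch pattern looks the same across drivers. A sketch of that pattern using the stdlib sqlite3 module as a stand-in (no Postgres server needed to run it) — with psycopg2 you would swap the connect call for `psycopg2.connect(...)` and use `%s` placeholders instead of `?`. The table and numbers are made up for illustration:

```python
import sqlite3

# In-memory database; with psycopg2 this would be
# conn = psycopg2.connect(host=..., dbname=..., user=..., password=...)
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.execute("CREATE TABLE responses (year INTEGER, language TEXT, count INTEGER)")
rows = [(2023, "Python", 49), (2023, "SQL", 48), (2022, "Python", 43)]
cur.executemany("INSERT INTO responses VALUES (?, ?, ?)", rows)
conn.commit()

# Read it back with a plain SQL query, exactly as you would before plotting.
cur.execute(
    "SELECT language, SUM(count) FROM responses GROUP BY language ORDER BY language"
)
totals = cur.fetchall()
print(totals)  # [('Python', 92), ('SQL', 48)]
conn.close()
```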
Recently I have also started to dive into PySpark (the Python API for Spark) via Databricks, but it is much more than “pandas at large scale on the cloud”, so I still have a lot to learn, especially about writing optimised PySpark code.
Lastly, ML. I still consider myself a beginner, as most of my experience is with simpler classification and regression models. However, I have also used Python in the context of Azure Machine Learning Studio to develop, deploy and test ML models. In the future I intend to build more complex, or at least more valuable, models, and at some point add computer vision to my skill set.
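By “simpler classification models” I mean workflows like this scikit-learn sketch: fit a model on a labelled dataset, then score it on held-out data (the dataset and model choice here are just illustrative).

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# A classic toy dataset: 150 labelled flower measurements, 3 classes.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# Score on data the model has never seen.
accuracy = model.score(X_test, y_test)
print(f"test accuracy: {accuracy:.2f}")
```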
Image Generation
Surprisingly, this one has been pretty big for me in the last 1–2 years. As I got more into posting on social media, I started writing Python to automate the images I was posting (e.g. tweet screenshots and quotes), and at some point I started building wallpaper generators for games I played.
Mobile anime games (and yes, I mean gachas) usually have fan wikis full of information and character profiles, so I have been scraping those for the character art. I then use a couple of Python scripts to load the images and apply transformations such as resizing, adding shadows and adding custom backgrounds, to generate mobile and desktop wallpapers.
Pillow has been a fantastic library for processing and transforming images programmatically, and these wallpaper-generation personal projects have been a culmination of sorts of all the other areas I mention in this article: I write the image-processing scripts, scrape the source character art, and bring the generators to life as a web app via Streamlit.
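The core of that pipeline is a few Pillow calls. A sketch — the “character art” and “background” are generated in memory here so the example runs anywhere, but normally they would come from `Image.open(...)` on the scraped files:

```python
from PIL import Image

# Stand-ins for the scraped character art and the custom background;
# normally these would be Image.open("character.png") etc.
background = Image.new("RGBA", (1080, 1920), (20, 20, 40, 255))  # phone-sized
art = Image.new("RGBA", (800, 800), (240, 200, 60, 255))

# Resize the art, then centre it horizontally on the background.
art = art.resize((540, 540))
x = (background.width - art.width) // 2
background.paste(art, (x, 600), mask=art)  # RGBA mask keeps transparency

print(background.size)  # (1080, 1920)
# background.save("wallpaper.png") would write the finished wallpaper out
```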
Automation
This is another generic topic. Because Python has so many libraries, it is easy to find that someone else has already written code to automate a boring task you have. And don’t disregard the built-in libraries; they also provide plenty of functions and modules for your automation needs.
For example, I consider that daily news scraper automation to some degree, because it removed the need to open multiple websites: I simply ran the script and looked at a single web page.
Other examples include tasks like renaming dozens of files in a folder and sending emails.
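The file-renaming kind of task needs nothing beyond the standard library. A sketch that gives every .txt file in a folder a zero-padded numeric prefix — it works on a temporary folder with dummy files, so it is safe to run anywhere:

```python
import tempfile
from pathlib import Path

# Create a throwaway folder with some dummy files to rename.
folder = Path(tempfile.mkdtemp())
for name in ["notes.txt", "ideas.txt", "todo.txt"]:
    (folder / name).touch()

# Rename in alphabetical order: 001_..., 002_..., 003_...
for i, path in enumerate(sorted(folder.glob("*.txt")), start=1):
    path.rename(path.with_name(f"{i:03d}_{path.name}"))

print(sorted(p.name for p in folder.iterdir()))
# ['001_ideas.txt', '002_notes.txt', '003_todo.txt']
```

Point `folder` at a real directory and the same loop renames dozens of files at once.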
(Basic) Video Game Development
And to end on a more relaxed note, I have also used Python to make video games. I didn’t build a super impressive 3D open-world game; it was actually something as simple as blackjack (the card game).
I used Pygame for this, and it was a pretty fun month of trying out the library. I came across Pygame early in my Python/programming journey, so I was very impressed to finally have some kind of graphical output. I did some random experiments, like drawing randomly-coloured rupees on the screen (the currency from The Legend of Zelda), but all in all the blackjack game was my most complex bit of work in Pygame. I created the assets (i.e. the cards) and then built the logic and the graphical interface for this player-vs-CPU game.
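The Pygame side doesn’t fit in a snippet, but the trickiest bit of the game logic is plain Python anyway: valuing a blackjack hand, where each ace counts as 11 unless that would bust the hand. A sketch of that rule (hand representation is my own, not from the original project):

```python
def hand_value(cards: list[str]) -> int:
    """Blackjack hand value; aces drop from 11 to 1 one at a time to avoid busting."""
    values = {str(n): n for n in range(2, 11)} | {"J": 10, "Q": 10, "K": 10, "A": 11}
    total = sum(values[c] for c in cards)
    aces = cards.count("A")
    while total > 21 and aces:
        total -= 10  # demote one ace from 11 to 1
        aces -= 1
    return total

print(hand_value(["A", "K"]))       # 21 — blackjack
print(hand_value(["A", "A", "9"]))  # 21 (11 + 1 + 9)
print(hand_value(["K", "Q", "5"]))  # 25 — bust
```

With the rules isolated like this, the Pygame layer only has to draw cards and call `hand_value` to decide the outcome.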
And that’s it for my experience with Python. I think I have had a good variety of applications with this wonderful language, and so I wanted to share them in case you’re looking for an excuse to learn Python or looking for ideas to use in a new personal project. I wish you happy days of Python coding :-)