#13 Week of the Year

Florian Dahlitz
Published in Coding experiences
Apr 1, 2018


Building my personal portfolio page. Integrating Medium articles into my personal blog. Fetching GitHub stats and animating them. Working with Tkinter to create GUIs.

Personal Portfolio Page

In last week's summary I already mentioned that I'm currently building my personal portfolio page, where I not only want to share blog posts with others, but also present projects I created or contributed to. Over the past week I focused on this major project to get it up and running as fast as possible.

The home page is nearly finished; only the content of one section has to be adjusted. The blog page is already done and will be discussed in more detail in the following section of this article. Another major achievement is the integrated “GitHub Facts” section, which I will describe a bit further in the third section.

All in all, I'm quite confident that I'll finish this project and have it up and running online by mid-April.

Medium articles in personal blog

All of the articles about coding and tech-related topics I've written so far are published on Medium. However, one feature of my new portfolio page should be a blog where people can see and read my posts. Because I didn't want to copy and paste everything from Medium to my own blog, I decided to find a way to automatically fetch the latest articles published on Medium and display them on my blog with a link to the original Medium post. Currently, Medium's API doesn't support fetching the articles of a certain user, so I had to get familiar with Medium's RSS feed, which seemed to be the only way to get the needed information.

I opened an interactive Python shell and started hacking together the information. The first thing I noticed is that the feedparser package I used was not able to fetch the whole content of the RSS feed on its own. So I needed to get the RSS feed through a normal request and pass its content to the parser.

import requests
import feedparser

# Fetch the raw feed first, then hand the response body to feedparser,
# because parsing the URL directly did not return the full feed content.
page = requests.get("https://medium.com/feed/@DahlitzF")
rss = feedparser.parse(page.content)

With the code above I was able to access the latest posts from my Medium account. rss.entries is a list of all the posts, so I only needed to iterate over the list and access the properties of each post, e.g. title, summary, link, date of publishing and the image URL. I put the code inside my Flask application and created a dynamic HTML template. For further information you can have a look at the commits made and at the whole project itself.
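As a rough sketch of that iteration (the exact fields my template uses may differ, and extracting the image URL from the post's HTML is left out here), the loop can look like this:

import requests
import feedparser

rss = feedparser.parse(requests.get("https://medium.com/feed/@DahlitzF").content)

posts = []
for entry in rss.entries:
    # feedparser exposes the common RSS fields as attributes on each entry
    posts.append({
        "title": entry.title,
        "summary": entry.summary,
        "link": entry.link,
        "published": entry.published,
    })

The list of dictionaries can then be handed to the template that renders the blog page.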

The thumbnail of this article shows a small snapshot of the finished blog page.

GitHub Facts

On the home page I wanted to share some GitHub stats about my contributions to open source. However, information such as the total number of commits, issues and pull requests is not available, or at least not without iterating over all repositories; I didn't find another way. So I made use of the information that is accessible: the number of public repositories, public gists, followers and accounts I'm following. All of this information can be accessed via:

https://api.github.com/users/<username>

Once again I set up a request, fetched the data from the API and wrote it to the template. Here's a small code snippet showing how to get the data:

import requests

url = "https://api.github.com/users/DahlitzFlorian"
github_user = requests.get(url).json()

context = {
    "name": "home",
    "repos": github_user["public_repos"],
    "gists": github_user["public_gists"],
    "followers": github_user["followers"],
    "following": github_user["following"],
    "contributions": contributions,
}

The result is shown in the GIF below.

If you wonder why you can see GitHub's contributions calendar below the plain numbers as well, you don't need to be scared that you missed something. I made an additional request to get the calendar in SVG format and added it to a div container below.

Have a look at the fairly simple code:

import requests
from bs4 import BeautifulSoup

# Fetch the contributions page and extract the calendar's SVG element.
url = "https://github.com/users/DahlitzFlorian/contributions"
page = requests.get(url)
soup = BeautifulSoup(page.content, "html.parser")
contributions = soup.find("svg")

The variable contributions was already added to the context dictionary in the previous code snippet.
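To give an idea of how the pieces fit together, here is a minimal sketch of a Flask view, assuming a hypothetical route and template name ("home.html"); my actual project may structure this differently. In the template, the SVG can be rendered unescaped with Jinja's safe filter:

import requests
from bs4 import BeautifulSoup
from flask import Flask, render_template

app = Flask(__name__)

@app.route("/")
def home():
    # Fetch the public user stats from the GitHub API
    github_user = requests.get("https://api.github.com/users/DahlitzFlorian").json()

    # Scrape the contributions calendar and keep the SVG element
    page = requests.get("https://github.com/users/DahlitzFlorian/contributions")
    contributions = BeautifulSoup(page.content, "html.parser").find("svg")

    context = {
        "name": "home",
        "repos": github_user["public_repos"],
        "gists": github_user["public_gists"],
        "followers": github_user["followers"],
        "following": github_user["following"],
        "contributions": contributions,
    }
    # In home.html the calendar can be embedded via {{ contributions | safe }}
    return render_template("home.html", **context)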

That’s all on how I hacked together a GitHub Facts section. If you want to see the full code, please have a look at my overall project. Feel free to ask questions or to contact me via Twitter for further help on that topic.

Further stuff and information

In addition to the things already mentioned, I continued solving algorithm challenges on HackerRank using Python. In my opinion, it's a great resource to train your understanding of a certain programming language (or even multiple ones).

Furthermore, I completely finished my Python color-changer, added it to Travis CI for continuous testing and published it to PyPI. You can now install it via

pip install color-changer

For more information have a look at the GitHub repo.

On Friday I became a patron of the “Talk Python To Me” podcast by Michael Kennedy, which I really enjoy listening to once a week. You should definitely check it out. I wish you a pleasant week, stay curious and keep coding!

Articles and podcast episodes

As usual, I share with you a list of articles I've read and episodes I've listened to that are worth sharing.

Articles

Episodes

Florian Dahlitz

Student, Developer, IBMer. Member of the RealPython.com team. Coding and sports are my passion. Python | C/C++ | Java