Demo Day and my experience architecting my first app

Wow, where do I begin? I am happy to say that I have completed the four months of craziness that was the Iron Yard program. It is impressive how far the cohort has come: we went from, for the most part, nobody knowing much of anything to building full-fledged applications that solve some pretty fun problems. My application, for instance, helps me spend less time managing my fantasy football team. I had a blast building it, and you can find it here: http://whodoipick.azurewebsites.net/.

Demo day was a rewarding experience. The last demo day helped me make up my mind to join cohort 8, “The Ocho,” and this one was just as awesome. The overwhelming support from folks in the community reinforces that I made the right decision. The most tiring part of the day was the wait between getting my space set up and the start of demos. It’s sad that this was the last day for the current Iron Yard. Hopefully, the incredible staff will be able to bring immersive code education back to the area sooner rather than later. Until then, there is a lot of blossoming talent, myself included, eager to use their skills to positively impact companies in the area.

So let me begin by talking about the process of building my application. First, it was harder than I expected to find a reliable API to pull data from. I eventually found https://www.fantasyfootballnerd.com/fantasy-football-api. The information it provided wasn’t 100 percent accurate, but at least it supplied player names, positions, and the teams the players played for. I then wanted information on each player’s points per game and overall season stats. However, I found out that NFL.com and ESPN no longer support third-party developers. I’m not sure what the reasoning is, but it created a hurdle to get over, since I wanted to minimize the legwork users had to do to import their own teams.

Anyway, once I had that data, I wrote a JavaScript GET function to fetch information from the site and a JavaScript POST function to send that information back to a page, allowing users to add players to their team. Once I could create a collection of players, I needed to figure out what the information page should feature. Since I couldn’t get that information through a traditional API, I decided to use a web scraper instead. Enter the HTML Agility Pack library. I used this tool to pull data from both the FantasyPros and Football Outsiders websites. The web scraper worked really well, and I would recommend it to any C# developer looking for a scraping tool.
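To make the GET/POST pair above concrete, here is a minimal JavaScript sketch. The Fantasy Football Nerd endpoint shape and the local `/api/team` route are illustrative assumptions, not the app's actual routes.

```javascript
// Sketch of the GET/POST pattern described above.
// The external endpoint shape and the /api/team route are
// assumptions for illustration.

// Build the request URL for the external players feed.
function buildPlayersUrl(apiKey) {
  return `https://www.fantasyfootballnerd.com/service/players/json/${apiKey}/`;
}

// GET: fetch the player list from the external API.
async function getPlayers(apiKey) {
  const response = await fetch(buildPlayersUrl(apiKey));
  return response.json();
}

// POST: send a chosen player back to the app so it can be
// added to the user's team.
async function addPlayerToTeam(player) {
  const response = await fetch('/api/team', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(player),
  });
  return response.ok;
}
```

Keeping the URL construction in its own function makes the fetch logic easy to test without hitting the network.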

The next trick was getting the web scraper and the JavaScript API to play nicely together. I decided to use the URL to pass data into the C# controller, which then tells the web scraper where to find information on the respective pages. That is basically the gist of how my website works. As for the next few iterations: I need to upgrade the overall UI so that you can add multiple players to a team before navigating away from the add-players page. I would like to beef up the player info page with even more data from various websites. I hope to make this a tool where you can click a player’s name to see trade value and trending points scored each week, with more data overall. I would also like to add a dictionary page of some sort that provides tips and suggestions for negotiating trades, explains what different stats mean, and pulls in useful fantasy football articles so readers can read up.
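The URL-passing trick above can be sketched like this. The `/Player/Stats` route and the parameter names are hypothetical stand-ins for the app's real controller action.

```javascript
// Sketch of passing data to the server through the URL, as
// described above. The /Player/Stats route and the "name" /
// "source" parameter names are hypothetical.

// Build a URL telling the C# controller which player to
// scrape and from which source site.
function buildStatsUrl(playerName, source) {
  const params = new URLSearchParams({ name: playerName, source });
  return `/Player/Stats?${params.toString()}`;
}
```

On the server side, an ASP.NET MVC action whose parameters share those names (e.g. `Stats(string name, string source)`) would receive the values automatically through model binding, and could then hand them to the scraper.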

Like what you read? Give Robby Bourne a round of applause.
