I spent the better part of the last two working days of this week figuring out how to write a Spark dataframe from my Azure Databricks Python notebook to an Azure blob storage container. One thought kept me pushing through the frustration: once I figured it out, I'd be the first to share an article on the solution with the world. That is why this article exists. Now let's get right into it.
My Databricks notebook does three things:
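The list of steps is truncated above, but for the final write step described in the introduction, a minimal sketch might look like the following. The account name, container, and secret scope are placeholders of my own, and the `wasbs_path` helper is an illustration, not part of the original notebook:

```python
def wasbs_path(container: str, account: str, path: str) -> str:
    """Build a wasbs:// URI for an Azure blob location (illustrative helper)."""
    return f"wasbs://{container}@{account}.blob.core.windows.net/{path.lstrip('/')}"

# In a Databricks notebook, the write itself would look roughly like this
# (commented out here because it needs a live cluster and real credentials):
#
# spark.conf.set(
#     "fs.azure.account.key.myaccount.blob.core.windows.net",   # placeholder account
#     dbutils.secrets.get(scope="my-scope", key="storage-key"),  # placeholder secret
# )
# df.write.mode("overwrite").parquet(
#     wasbs_path("mycontainer", "myaccount", "output/df")
# )

print(wasbs_path("mycontainer", "myaccount", "/output/df"))
# → wasbs://mycontainer@myaccount.blob.core.windows.net/output/df
```

The key detail is that the storage account key must be set on the Spark configuration before the write, so that the `wasbs://` filesystem can authenticate.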
I love writing elegant code. I could go on a rant about that here, but I'll stop myself so we can get right into the meat of this article.
Recently, I had to build a REST API that serves USSD requests, its client being a telecoms operator (telco). The input from the telco is a JSON object of the following form:
msisdn is the phone number of the end user who initiated the request by dialing a USSD code on their phone. sessionNumber is a…
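The payload itself is elided above, but based on the two fields just described, a hypothetical request could be parsed like this. The values, and the assumption that these are the only keys, are my own placeholders, not the telco's actual schema:

```python
import json

# Hypothetical USSD request payload; values are placeholders, and the real
# telco schema (elided in the original) may carry additional fields.
raw = '{"msisdn": "2348012345678", "sessionNumber": "1234567890"}'

request = json.loads(raw)
msisdn = request["msisdn"]                  # phone number of the end user who dialed the USSD code
session_number = request["sessionNumber"]   # identifies the ongoing USSD session

print(msisdn, session_number)
```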
This is my first Medium post. I already have a blog at http://www.hafiz.com.ng, which raises the question of why I'm starting to write on Medium as well. Simple: my intention at this time is to keep posting personal and non-professional content on my current blog, while my Medium posts will contain only technical and career-related content. That said, let's get straight to the business of this article.
This week at work, I had to integrate a Groovy client with Google's nascent DLP (Data Loss Prevention) API. And I have to admit that it's one of the most difficult things…