JSON Processing Pipelines with gron

Using Linux Tools to Parse and Manipulate JSON Transformed with gron

Jimmy Ray
Capital One Tech
5 min read · Oct 15, 2018


As a polyglot programmer, I strive to always employ the simplest approach and the best tools for the job. I have parsed JSON in Java, Python, and Go, but I think too many times we ignore UNIX/Linux tools such as sed, awk, and cut. Too many programmers write hulking data parsers that are just overkill. With gron transformations, I find it easier to use these powerful UNIX/Linux text editing, manipulation, and filtering tools.

While jq is powerful at parsing known JSON structures, its major shortcoming is that it requires one to know the JSON structure being parsed. gron is less restrictive and can be combined easily with the above Linux tools to build very powerful parsing pipelines, without having to know exactly where to expect a particular structure or value.

Installing gron

Instructions for installing gron can be found in the project's README at github.com/tomnomnom/gron. I used brew install gron, and then, for reasons that will become apparent later, I added the following alias:

alias norg="gron --ungron"

Make JSON greppable

Obviously, being text-based, JSON is already “greppable”. However, the strength of gron comes from its ability to split JSON into lines of what is referred to as “discrete assignments”.

Given the JSON snippet below (from an AWS EC2 CLI call):
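
(The IDs and Availability Zone in this trimmed example are placeholders.)

{
  "Reservations": [
    {
      "Instances": [
        {
          "ImageId": "ami-0abcdef1234567890",
          "InstanceId": "i-0a1b2c3d4e5f67890",
          "Placement": {
            "AvailabilityZone": "us-east-1a"
          }
        }
      ]
    }
  ]
}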

gron will parse (cat ~/ec2.json | gron) and convert the JSON into lines of discrete assignments:
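
$ cat ~/ec2.json | gron
json = {};
json.Reservations = [];
json.Reservations[0] = {};
json.Reservations[0].Instances = [];
json.Reservations[0].Instances[0] = {};
json.Reservations[0].Instances[0].ImageId = "ami-0abcdef1234567890";
json.Reservations[0].Instances[0].InstanceId = "i-0a1b2c3d4e5f67890";
json.Reservations[0].Instances[0].Placement = {};
json.Reservations[0].Instances[0].Placement.AvailabilityZone = "us-east-1a";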

Munging gron Output Through Command Line Pipelining

JSON is more compact than the gron output and better suited for structuring data for transport and integration. While more verbose, the gron output is a more usable format for text searching, filtering, and manipulation via Linux text manipulation and filtering tools such as grep and cut, or even sed and awk. For example, consider the following commands:
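
Using the sample ~/ec2.json file from above:

$ cat ~/ec2.json | gron | grep "AvailabilityZone"
json.Reservations[0].Instances[0].Placement.AvailabilityZone = "us-east-1a";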

The above command "pipeline" searches the gronned JSON for the text "AvailabilityZone" and returns the matching discrete assignment line.
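
Appending the cut command and splitting each matching line on the double-quote character:

$ cat ~/ec2.json | gron | grep "AvailabilityZone" | cut -d '"' -f 2
us-east-1a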

The above pipeline extracts the AvailabilityZone value via the Linux cut command.
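
The same pattern works for any field; grepping for InstanceId instead (a file describing multiple instances would return one line, and one ID, per instance):

$ cat ~/ec2.json | gron | grep "InstanceId" | cut -d '"' -f 2
i-0a1b2c3d4e5f67890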

The above pipeline pulls all the EC2 instance IDs from the AWS EC2 CLI output and creates a list of IDs.

Transforming JSON with gron and ungron (a.k.a. norg)

Earlier, I referenced the norg alias that points to the gron --ungron command. With this command, gron transforms lines of discrete assignments back into JSON. Consider the commands below:
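
$ gron ~/ec2.json | grep "InstanceId" | norg
{
  "Reservations": [
    {
      "Instances": [
        {
          "InstanceId": "i-0a1b2c3d4e5f67890"
        }
      ]
    }
  ]
}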

Note: cat was removed and gron was called directly.

The above pipeline grons the JSON, greps for the InstanceId field, and then converts the lines of discrete assignments from the grepped gron output back into usable and simplified JSON.
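
Widening the grep filter to an egrep alternation:

$ gron ~/ec2.json | egrep "(InstanceId|ImageId)" | norg
{
  "Reservations": [
    {
      "Instances": [
        {
          "ImageId": "ami-0abcdef1234567890",
          "InstanceId": "i-0a1b2c3d4e5f67890"
        }
      ]
    }
  ]
}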

The above pipeline adds ImageId to the transformed JSON using egrep. (Yes, I know GNU has deprecated egrep in favor of grep -E.)

sed

sed is a powerful stream editor and is handy for executing find/replace operations on text files.
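
For example, the key names in the discrete assignments can be rewritten before converting back to JSON; the renames below are arbitrary ones, chosen just to show multiple -e expressions:

$ gron ~/ec2.json | grep "InstanceId" | sed -e 's/Reservations/reservations/' -e 's/InstanceId/instance_id/' | norg
{
  "reservations": [
    {
      "Instances": [
        {
          "instance_id": "i-0a1b2c3d4e5f67890"
        }
      ]
    }
  ]
}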

The above pipeline adds stream editing with sed to perform multiple inline string replacements.
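
The transformed JSON can also be collapsed onto a single line; stripping every space is safe here only because none of the string values contain whitespace:

$ gron ~/ec2.json | grep "InstanceId" | norg | tr -d '\n' | sed -e 's/[[:space:]]//g'
{"Reservations":[{"Instances":[{"InstanceId":"i-0a1b2c3d4e5f67890"}]}]}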

The above pipeline adds the translate command, tr, to remove newline characters, and then another sed command to remove the remaining whitespace. This is handy for minifying JSON files.

Summary

gron converts structured JSON into lines of discrete assignments. That transformation makes it possible to pipe the text through native tools like grep and sed to perform powerful text manipulation. Once manipulated, the discrete assignments can be transformed back into JSON via the gron --ungron (or gron -u) command. This makes gron a complement to existing tools like grep and sed for munging (a.k.a. manipulating) JSON data.

DISCLOSURE STATEMENT: These opinions are those of the author. Unless noted otherwise in this post, Capital One is not affiliated with, nor is it endorsed by, any of the companies mentioned. All trademarks and other intellectual property used or displayed are the ownership of their respective owners. This article is © Capital One 2018.
