Week 06: Looking into JSON

emily leung · code3100
4 min read · Apr 7, 2017

Ways of collecting and using data

Starting on Friday 31st March 2017, I thought I’d look into understanding JSON and how it can be used effectively and efficiently through Flux.

It seemed worth the time to explore the advantages of JSON and how to use it to its full capacity, since it is the “primary data structure used by Flux to transport your data from one Flux plugin to another” [https://community.flux.io/articles/2124/what-is-json-data-and-how-do-i-work-with-it.html] — “What is JSON data and how do I work with it?”

JSON [http://www.json.org/] — “Introducing JSON” — is, in simple terms, a minimal, human-readable format for structured data. Its primary use is to transmit data between a server and a web application, and it can be seen as an alternative to XML.
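As a quick illustration (not specific to Flux, and with field names invented for the example), here is how a minimal JSON document round-trips through Python’s standard `json` module:

```python
import json

# A minimal, human-readable JSON document describing a line segment.
# The field names ("type", "start", "end") are made up for illustration.
text = '{"type": "line", "start": [0, 0, 0], "end": [5, 0, 0]}'

# Parse the JSON text into native Python objects (dict, list, str, float).
data = json.loads(text)
print(data["type"])   # -> line
print(data["start"])  # -> [0, 0, 0]

# Serialising back to JSON produces the same compact, readable text,
# which is what makes JSON convenient for server/web-app transmission.
print(json.dumps(data))
```

The same structure could just as easily be written in XML, but the JSON version stays closer to the native data types of the receiving application.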

The video by Flux.io [https://www.youtube.com/watch?v=tri1KQoTSts] — “Flux Snippets — Inspecting and creating data in the Flux Flow” — explains how to use the Flux Flow to:

  • Inspect Data
  • View the underlying geometry, which can be downloaded as a JSON file
  • Create your own data input (numbers, text, arrays and JSON objects)

But this doesn’t have to stop at small elements (e.g. a single piece of geometry such as a line). The real power comes when data flows continuously through Flux.

After watching this video [https://www.youtube.com/watch?v=UoDp7leljrY] — “Flux Snippets — Enhancing interoperability using the Flux Flow” — in which Flux explains how data is passed between an Excel sheet, Grasshopper/Rhino and Dynamo/Revit, it’s clear we can speed up and automate the process of creating levels in our workflows.

This is only possible through understanding the way in which JSON code is read and what we want to achieve with the data.

Even the basics play a huge role in determining data transformations. For example:

The index ALWAYS starts at [0]

This wasn’t initially picked up in that video. Alexis’ tutorial on scripting the Serpentine Pavilion provided many insights into the sorts of nodes/logic that are applicable to many situations:

After doing this research, there’s one big question that I believe needs an answer:

How can using JSON code/data be brought into the Pavilion Project?

As I continued to watch more videos, I stumbled upon another from Flux: [https://www.youtube.com/watch?v=UJ-0UMC7Rt8] — “Flux Labs — File Uploader: dxf files direct into Revit with Flux”

This video neatly explains the scenario of sending a DXF file to Flux via the Flux uploader, then extracting the right information through the Flux Flow node system so it can be passed cleanly through to Revit with the correct family attached to the linework. This was an eye-opener for me: I’d imagined needing to go through Dynamo, but this process bypasses that, though it does require some groundwork in understanding the JSON data and therefore the Flux Flow workflow.

This could lead on to an alternative VR solution for seeing the design become a reality: [https://www.youtube.com/watch?v=iG5Kr2H53Zo] — “Flux Labs — SketchUp Model in VR using Unity — Under the hood”

It could also potentially lead to developing our own workflows/components (Grasshopper clusters):

So the last image of scribbles basically summarises a concept I’m confident will work, based on what I’ve gathered about using JSON and the Flux Flow workflow.

What I imagine is that the structure (grid system) for the pavilion will require a series of members that may or may not contain solids (for storage), determined by true/false Boolean values. By generating a model and assigning each member a true/false, integer or alphanumeric value, this data can be sent to the Flux Flow to extract the right information. The length/width of the structure could also draw information from other sources, which is easily adjustable in Dynamo/Grasshopper and can be exported and updated to an Excel spreadsheet.
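A hedged sketch of how that idea might be encoded as JSON. The field names here (`length`, `width`, member `id`s, a `solid` Boolean) are all assumptions of mine for illustration; the real schema would depend on what the Flux Flow and the downstream plugins expect:

```python
import json

# Hypothetical description of the pavilion grid: each member either
# carries a storage solid (true) or is left open (false).
pavilion = {
    "length": 12.0,  # overall dimensions, units assumed (metres)
    "width": 8.0,
    "members": [
        {"id": "A1", "solid": True},
        {"id": "A2", "solid": False},
        {"id": "B1", "solid": True},
    ],
}

# Serialise for sending to a service such as Flux, or on to a spreadsheet.
payload = json.dumps(pavilion, indent=2)

# Downstream, the Booleans decide which members get storage cuboids.
solids = [m["id"] for m in json.loads(payload)["members"] if m["solid"]]
print(solids)  # -> ['A1', 'B1']
```

The same filtering step is what a Flux Flow node graph would perform when extracting only the solid members for Revit.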

This model in Flux would then be inspected and, using the right nodes, I’d be able to extract the geometry and flatten it into one large structure, rather than individual solid and void cuboids. The information can then be sent to a dashboard for viewing, or to other places such as Revit for documenting the heights, families, etc.

This process would ultimately require custom cluster components in Grasshopper or Dynamo to determine labelling for the structure.
