Where Whales and Buildings Meet

Dan Edleson
Published in Paper Architecture
11 min read · May 19, 2020

Digitized Data in the Realms of Paleontology and Architecture

Source: Paper Architecture Illustration

“Then the world was the whale’s; and, king of creation, he left his wake along the present lines of the Andes and the Himalayas. Who can show a pedigree like Leviathan? Ahab’s harpoon had shed older blood than the Pharaoh’s. Methuselah seems a school-boy.”

- Herman Melville, Moby-Dick; or, The Whale

In early 2005 I walked across an ocean floor that had housed some of the largest creatures in the history of planet Earth. I was accompanied by two other college-age explorers, a pair of Brits I had bestowed with the thoroughly American nicknames Ty Cobb and Cy Young. Covered in desert dust, we habitually downed swigs from the water-filled canteens draped across our shoulders, trekking through cavernous passes and taking refuge in the shadows. We never saw a single whale.

Sometime between six and nine million years ago the waters of the Atacama receded, trapping the marine life that had flourished there. The region sits today in a two-sided “rain shadow,” with mountains blocking moisture-bearing winds from both directions. The once proud ocean has become the driest place on earth, so dry that its soil has been found to resemble that of Mars, and NASA frequently uses the area to test its rovers. Long after the last whale, the Atacameño arrived; mere millennia later the land was claimed by the independent nation of Bolivia before being usurped by Chile. Since the early 1990s the population has grown over 75%, primarily due to a booming tourism industry built on numerous opportunities for trekking and climbing across what were once deep waters. With people have come roads. Many miles of desert land have been excavated, in the process uncovering some of the most astounding whale skeletons on earth.

The road to San Pedro de Atacama is littered with the bones of whales / Source: www.lifegate.it

I learned about these magnificent bones while listening to the book Spying on Whales by Nick Pyenson, a leading paleontologist at the Smithsonian. It was fascinating to learn that whales once walked on land (at one point they looked like giant dogs, and then, for reasons unknown, adapted back to life in the water), but as an architectural professional who uses 3D laser scans for building documentation, the most intriguing detail was that Pyenson and his crew employed 3D laser scanning and photogrammetry to record the site layout and catalog the bones. I quickly became curious whether the original scan data was the final step, or whether further parsing was done to apply metadata and structure to the billions of digital points.

I see this often in my work with Building Information Modeling (BIM) for architecture and structural engineering: a laser scan gives you fantastically accurate information, but it is merely a hologram. You can look at it, but you can’t really interact with it in a coherent way until some form of translation is done. My typical process is to bring the raw scan data into a BIM program (I use Autodesk’s Revit, an industry standard) and start translating the points into objects the software understands, such as windows, walls, and doors. Once thoroughly translated, they become tangible “things” I can attach metadata to: the manufacturer specs of a door, the material composition of a wall, or anything custom I wish to add, such as the cost per unit of a piece of furniture.
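Conceptually, the translation step pairs each recovered geometry with a bag of named parameters. A minimal sketch of that idea in plain Python (the categories, parameter names, and values here are hypothetical, not Revit’s actual schema):

```python
from dataclasses import dataclass, field

@dataclass
class BimElement:
    """A translated scan object: a named category plus attached metadata."""
    category: str                       # e.g. "Door", "Wall", "Furniture"
    params: dict = field(default_factory=dict)

    def set_param(self, name, value):
        self.params[name] = value

# A door recovered from the point cloud, with metadata attached after translation
door = BimElement("Door")
door.set_param("Manufacturer", "Acme Door Co.")   # hypothetical spec
door.set_param("FireRating", "45 min")

wall = BimElement("Wall")
wall.set_param("Composition", "brick over CMU")   # hypothetical assembly
```

The point is simply that once a scan point becomes a categorized object, arbitrary data can hang off it; the same move is what a Whale Information Model would need.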

Source: Smithsonian Digitization Program

Through initial research I learned a bit about the paleontological process but didn’t find anything that took the data beyond a 3D model. That is to say, the scan data had been translated into 3D models of the skeletons through programs such as Blender and Maya, but the embedded data (the transformation from a merely visual 3D representation into a vast 3D database) did not seem to have been thoroughly developed.

Soon the Spying on Whales audiobook came to life as I found myself on the phone with Dr. Nick Pyenson of the Smithsonian. I pitched my idea for an all-encompassing Whale Information Model, one that would contain all the metadata in the entire world on whales. Dr. Pyenson had a more pragmatic outlook: “The question is why? Metadata is the function of what the question is you want to ask.” Once we moved past my initial assumptions, location began to emerge as one of the most relevant pieces of information to be catalogued. “It’s not so much every single bone but the rock sequence it’s found in that gives you context,” Pyenson said. “A lot is determined by what is found in the rock. Skeletons are incomplete. Maybe it’s a limb or a jaw bone, but if you remove it you lose a lot of context.” Going further into location data, it became clear that Pyenson was not just talking about the context of where the bones were found, but where they were stored as well. “I want to know where that thing is physically in the museum. At the Smithsonian we are dealing with a collection built over one hundred and fifty years.”

I was beginning to get an idea of the data that could be of use, but was still curious about software and using a model to store information. The short answer is that Dr. Pyenson and his colleagues are using any software they can get their hands on and making the best of it. Unlike the construction industry, paleontology is not a lucrative field with disruptors in search of commercial success constantly producing innovations. Furthermore, the Smithsonian is government funded and subject to the same financial constraints as any other government agency. This leads to budget-conscious methods: free options like the 3D modeling software Blender, or grants for more expensive programs that seem to offer something comparable to BIM, such as Amira. What really matters for the team at the Smithsonian is that the software they use will be relevant for a long time. “Final file type matters,” Pyenson says. “Have you ever tried to open a word processor document from the 1990s? We want to future proof things as much as possible…I wonder about a lot of the intermediate rendering software twenty years from now. I want to invest in workflows that are relevant now and for longevity.”

Digitized Whale Skeleton / Source: Smithsonian Institute

Talking with Dr. Pyenson gave me the information I needed to imagine some possible ways to turn the raw scan data into an Information Model. Focusing on location, I mocked up a scenario in Revit where a model shifts location across different phases, with location data that can be very specific yet also cover a vast amount of space. One can imagine a fully rendered whale in its original habitat, then as a fossil located exactly where the bones were placed on the Atacama ocean floor, then excavated and assembled in full skeletal form for an exhibit at the Smithsonian, then stored in specific drawers in the Smithsonian archives once the exhibit was over. I mocked this up in Revit because it is the software I know best and am most capable of hacking. It worked pretty well on a conceptual level, but it also made clear that Revit is not the right program for the job.
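The underlying data model of that mockup is simple: a specimen carries a location record per named phase, from seafloor to archive drawer. A sketch, with a hypothetical catalog number and made-up drawer and coordinate values:

```python
# A specimen whose recorded location changes across named phases,
# from the Atacama seafloor to an archive drawer at the Smithsonian.
class Specimen:
    def __init__(self, specimen_id):
        self.specimen_id = specimen_id
        self.history = {}  # phase name -> location record

    def place(self, phase, location):
        self.history[phase] = location

    def location_at(self, phase):
        return self.history.get(phase, "unknown")

jaw = Specimen("ATA-W07-jaw")  # hypothetical catalog number
jaw.place("In Situ", {"lat": -27.05, "lon": -70.80})   # illustrative coordinates
jaw.place("On Exhibit", {"building": "NMNH", "hall": "Ocean Hall"})
jaw.place("Archived", {"building": "NMNH", "drawer": "D-112"})
```

Note that the location records need not share a scale: one phase is a point in a desert, another a drawer in a building, which is exactly the specific-yet-vast range Pyenson’s questions call for.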

There are a number of reasons Revit is not suited for whale paleontology. The main one is that if we are talking about 4D modeling (time being the fourth dimension) in the context of objects physically moving, Revit will not do it. You can demolish an object or create a new one in different phases, but you cannot physically move the thing. Another drawback is longevity. Revit is strict about the file types it will accept and is not backwards compatible in any way: an older version cannot open files saved in a newer one. I can imagine poor Dr. Pyenson’s frustration after implementing a Revit-based process for this type of system, only to find that another museum running Revit 2017 cannot open the files he just made in Revit 2019. While IFC (Industry Foundation Classes) files exported from Revit are a possibility, they would still require a level of file uniformity that paleontologists do not yet have the infrastructure for, and may never have. That said, once you get beyond designations for industry-specific objects such as walls (IfcWall), much like Revit the logic of IFC breaks down into broader categories that could be more accessible to a variety of disciplines.

To overcome Revit’s limitations with time and movement, as well as the rigidity of its file type, I considered another Autodesk product, Navisworks. Navisworks functions primarily as a clash detector, merging models from the many disciplines on a construction site and locating issues where the models “clash,” such as a water pipe modeled so that it runs directly through a steel beam. Combining models and running clash detection saves significant time in the construction industry. More useful for this type of operation, Navisworks is tailored to 4D: its timelines not only help produce Gantt charts of the construction process, but also let you create a visual representation of objects moving over time (think of a construction site and the process from excavating the site to erecting the last wall, with all the logistics in between).

A conceptual mockup of a whale skeleton in Revit / Source: Paper Architecture Illustration

Beyond 4D timelines, Navisworks emphasizes its ability to open as many file types as possible, among them Revit, the IFC format, and relatively cheap 3D modeling programs like Rhino and SketchUp. While there is no guarantee Navisworks itself will be around twenty years from now, it at least offers more potential for interoperability than something like Revit. Navisworks is also great for visual queries, letting you search for specific objects based on ID or other metadata you have assigned. I am a novice Navisworks user myself, so I reached out to John Niles, a Navisworks expert with HITT Contracting, who was generally positive about the overall concept when I ran it by him, with a caveat. “You’d have to come up with some sort of data sets that you could populate to,” Niles said. “The fields need to be created before you come into Navisworks.” Which is to say, you would need a secondary software before using Navisworks.
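This is not the Navisworks API, but the kind of property-based query its search sets express can be illustrated in plain Python. The fields here (element, skeleton) are hypothetical and, per Niles’s caveat, would have to be authored upstream before the data ever reached Navisworks:

```python
# A tiny catalog of bone records with pre-populated metadata fields.
bones = [
    {"id": "V-01", "element": "vertebra", "skeleton": "W-07"},
    {"id": "R-14", "element": "rib",      "skeleton": "W-07"},
    {"id": "V-02", "element": "vertebra", "skeleton": "W-12"},
]

def search_set(objects, **criteria):
    """Return every object whose metadata matches all the given criteria."""
    return [o for o in objects
            if all(o.get(k) == v for k, v in criteria.items())]

vertebrae = search_set(bones, element="vertebra")   # all vertebrae, any skeleton
whale_07 = search_set(bones, skeleton="W-07")       # everything from one skeleton
```

The query itself is trivial; the hard part, as Niles points out, is that someone must define and fill those fields first.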

I had reached a dead end, much like those hapless prehistoric whales in the Atacama so long ago. Revit was great for metadata but poor for interoperability and 4D simulations, while Navisworks did well with timelines and varied files but could not produce significant metadata. The combination of the two programs seemed a costly and clumsy solution to a relatively simple problem. I realized the issue I was running up against was not so much a problem of whale paleontology as a problem of the building industry, and perhaps of all industries that utilize modeling and data: there is no such thing as an all-encompassing model. It’s something I’ve encountered frequently in my field since I was an eager intern encouraging middle-management designers to ditch SketchUp and use Revit. Different programs do different things well, and they also do different things badly.

What is needed are unifying interfaces, much in the same way that AOL turned an esoteric connection of computers into something your grandma could use. In the building industry we are starting to see the next step towards breaking down these data silos, and it is likely going to come in the form of apps that link these varied sources of data in a clear and accessible way. When you order a ride with Uber you are doing this, leveraging GPS and Google Maps to visualize where and when your driver will arrive; it takes the wealth of mapping data and funnels it into a specific use. Autodesk has been making rumblings about this with its vision for Project Quantum, and terms like the Internet of Things and Digital Twin seem to be on everyone’s lips these days. It’s coming soon, but for most it’s still just out of reach.

Source: Magic Leap

All of this makes me think that perhaps, for now, Dr. Pyenson should just pair each bone with a low-cost wireless key finder and wait it out for a decade or two. Like he said, metadata is the function of what you want to ask, and he just wants to know which drawer the thoracic vertebrae are in. Whale bones are not as dynamic as a building, if only because they are the remains of something that is dead, while an occupied building is very much alive. Not to mention that buying a couple of $20 geo-locating tags online seems a lot more cost-effective than modeling everything in a software that costs $2,500 a year to license. (Writer’s Note: In a follow-up e-mail I suggested such a seemingly simple scheme, but Pyenson shot down the idea. Whales have two hundred or so bones, and his team found portions of forty skeletons in the Atacama, which would have put the total cost of geo-tagging every bone at around $165,000, more than the original budget. He also mentioned that using such a strategy for the Smithsonian’s 145 million specimens would cost around $2.9 billion.)
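Those figures can be sanity-checked with back-of-envelope arithmetic; since the bone count is only “two hundred or so,” the round-number result lands near, not exactly on, the quoted budget:

```python
# Back-of-envelope check of the tagging costs quoted above.
TAG_COST = 20                    # dollars per geo-locating tag
BONES_PER_WHALE = 200            # "two hundred or so" bones per whale
SKELETONS = 40                   # partial skeletons found in the Atacama
SMITHSONIAN_SPECIMENS = 145_000_000

atacama_cost = TAG_COST * BONES_PER_WHALE * SKELETONS
smithsonian_cost = TAG_COST * SMITHSONIAN_SPECIMENS

print(atacama_cost)      # 160000 -- in the ballpark of the quoted $165,000
print(smithsonian_cost)  # 2900000000 -- the $2.9 billion figure
```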

Whale Information Modeling, or as I call it WIM, may have to wait a decade or so. While a true digital standard that categorizes metadata may be far off, in the shoestring spirit of the paleontology profession there are potential software solutions from the AEC industry that could be adapted to move toward a unified approach. HBIM (Historic or Heritage Building Information Modeling, depending on who you ask) could serve as an example of how AEC software can be altered and used for other purposes. In historic preservation the building elements are often more complex and unique than in modern construction, and their imperfections can sometimes play a role in the desired analysis. HBIM is also just starting to develop, yet it seems to be a few steps ahead of the metadata being used by paleontologists. Conversely, the questions and processes Dr. Pyenson has identified as crucial to WIM could inform the evolution of HBIM. The idea of time playing a role in a BIM model (objects that change location, or imperfections that arise in different phases without being demolished or built) is a compelling issue that has not yet been satisfactorily resolved in HBIM but is certainly worth considering. Furthermore, in subsequent correspondence Pyenson made clear a crucial benefit of digital representations: the whale skeletons he cataloged in the Atacama never actually left the country, due to Chilean laws forbidding their export, which was the reason he turned to digitization in the first place. At first glance, a software designed for use in the context of whale bones can sound far-fetched. Someday, though, it will surely be realized, just as the far-fetched idea of laser scanning whale bones has now come to fruition.

Dan Edleson is the Principal of STEREO, a Building Information Modeling firm focused on digitizing existing building conditions. www.stereobim.com
