The Fallacies of Carbon Modelling in Construction
For 7 years, I’ve experimented with bridging the gap between Life-cycle Assessments (LCA) and Building Information Modelling (BIM). My aim was to automate carbon calculations in the construction industry (AEC).
In this article, I’ve listed the 4 most fundamental errors that corrupt carbon calculations in AEC. I’ll share the lessons learned regarding each type of error.
My aim is to answer these questions:
- What types of design errors in BIM models corrupt carbon calculations and to what degree of error?
- How trustworthy are generic emission factors and environmental product declarations in 2024?
This article discusses the errors and fallacies that corrupt carbon modelling in the construction sector. With it in mind, you will gain an understanding of how to improve carbon and cost modelling in your respective context.
The Premise
Throughout my career, I’ve worked at two large and prestigious consultancies within urban development. Later, I worked for a large manufacturer of construction components. As a BIM coordinator, I was responsible for the design quality of an airport terminal, a hospital building, and commercial centres. I’ve also worked with design validation of metro stations, industrial buildings and a school. Thus, I supported architects and all sorts of engineers with feedback on the erroneous nature of their designs.
I had ample opportunity to observe environmental consultants and LCA analysts in different kinds of settings. In the context of preparing an embedded carbon calculation, I noticed environmental consultants tend to process bills of quantities using a handful of recurring methods.
At first look, these methods for transferring information to LCA tools hold logical merit. However, the methods share a hypothesis that quantities and material descriptions are in good enough shape to serve as the basis for an LCA. Since BIM models are substantially flawed while still in design tools, one wonders how a lifecycle assessment of a building could then hold true.
There is one group of people who would know more about this issue than the designers themselves — the BIM coordinators. Have BIM coordinators and environmental consultants been working together in a systematic and synergistic fashion? No.
Have BIM coordinators understood that their skillset is key to enabling decent carbon calculations? No. They are most often asked to do 3D coordination. Some developers have recognised their importance for metadata control to enable facility management in the future. However, I do not know a single BIM coordinator who has been approached to analyse the embedded carbon of a building.
I had to do something about the status quo. In 2016, I started my experiments with integration of environmental product declarations (EPDs) and generic emission factors (GEFs) with BIM objects. I used Solibri Office for this purpose. My agenda was to help LCA analysts to scale their knowhow to more projects.
Years have passed. I have learnt many lessons — about data, technologies, and how different people use them. I learnt far too well how non-technical staff react to the word “automation” in private companies where the per-hour-spent business model dominates.
This article will discuss carbon calculations only within the context of lifecycle stages A1:A3 (EN 15804). These lifecycle stages cover raw material extraction, transportation of raw material, and manufacturing of the final product. These lifecycle stages have already become the subject of mandatory climate declarations in some Nordic countries.
Do We Trust Carbon Calculations? Can We Prove Them Anyhow?
An AEC project organisation can neither formulate a mathematical proof that an embedded carbon calculation holds true, nor can it determine the degree of error of the calculated results. There are too many known unknowns in both BIM models and emission factors today.
Hence, any tender with carbon performance as an evaluation parameter is potentially subject to an appeal.
There are expectedly unknown unknowns present as well.
The astronomer Carl Sagan once said: “Extraordinary claims require extraordinary evidence”. Indeed, I would not sacrifice my free evenings to jot down this article if I had not assembled such evidence in the first place.
Why can’t we formulate a proof for a carbon calculation to hold true?
To calculate embedded carbon emissions, we require an exact bill of quantities of the built-in components with perfect material descriptions.
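To make the arithmetic concrete, here is a minimal sketch of that multiplication in Python. The material names and emission factors below are placeholders invented for illustration, not values from any real EPD or climate database; the point is only that both inputs must be right for the total to mean anything.

```python
# Minimal sketch of an A1-A3 embodied/embedded carbon calculation:
# total = sum over components of (quantity x emission factor).
# Material names and factors are illustrative placeholders, not values
# from any real EPD or climate database.

bill_of_quantities = [
    {"material": "ready-mix concrete", "quantity_kg": 120_000},
    {"material": "structural steel",   "quantity_kg": 8_000},
]

emission_factors_kgco2e_per_kg = {
    "ready-mix concrete": 0.12,  # placeholder
    "structural steel":   1.9,   # placeholder
}

total_kgco2e = sum(
    row["quantity_kg"] * emission_factors_kgco2e_per_kg[row["material"]]
    for row in bill_of_quantities
)
print(f"A1-A3 total: {total_kgco2e:,.0f} kg CO2e")
```

Both terms of the multiplication come from somewhere: the quantities from the BIM model, the factors from a GEF database or an EPD. The rest of this article is about how shaky both sources are.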
Designers shape the future building in their design tools on CAD, BIM or GIS platforms. Designers are, thus, the source of the geometries and metadata describing products and materials.
Any stakeholder with access to the models can export a bill of quantities on their own. However, most environmental consultants do not know how to do that, or they often lack relevant software licenses. Hence, an environmental consultant usually asks the designer to export their bill of quantities in a spreadsheet format.
How precise are designers’ BIM models from a geometric and metadata point of view in 2024? Are bills of quantities good enough for environmental consultants to conduct a carbon calculation of an entire building?
Data Gaps, Erroneous Data and Logical Fallacies of Carbon Modelling in the Construction Industry
1) Material Descriptions in BIM Models Contain too Much Discrepancy
At least 15% of unique material descriptions in BIM models in the Buildings sector in Sweden are erroneous. In the erroneous cases, material descriptions were either generic or misleading. A misleading description could present an aluminium object as wood, or a steel object as concrete, for example. Some architectural models contained up to 40% erroneous material descriptions in early design stages.
Hence, in those erroneous cases, any environmental consultant without BIM validation competence will consequently select a wrong emission factor as the basis for their calculation.
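To illustrate the magnitude, here is a deliberately naive sketch. The two emission factors are rough placeholders, not values from any database; the point is only that a single misleading description can shift the result for an object by an order of magnitude or more.

```python
# Illustrative only: how one misleading material description shifts the result.
# Emission factors are placeholders, not from any real database.
factors = {"wood": 0.1, "aluminium": 8.0}  # kg CO2e per kg, placeholders

mass_kg = 500               # an object of this mass
declared_as = "wood"        # what the BIM model says
actually_is = "aluminium"   # what gets built

calculated = mass_kg * factors[declared_as]   #   50 kg CO2e
reality    = mass_kg * factors[actually_is]   # 4000 kg CO2e
print(f"Understated by a factor of {reality / calculated:.0f}x for this object")
```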
To support my claim, I analysed 8 Swedish commercial and industrial buildings’ BIM models between 2019 and 2023. I studied the mix of architectural, structural, HVAC, and plumbing models. The models were produced by respectable consulting brands for respectable private and public developers.
I have yet to see a BIM model that would meet my needs from a metadata point of view, so that I could select an emission factor for a group of BIM objects without a doubt.
Ask yourself: would you post a package if you knew there was a 15% to 40% chance it would end up in the wrong hands? Should we tolerate such error-prone work methods in contexts where public funding is involved? And where does this leave environmental certification schemes like BREEAM or LEED?
These findings were the reason why I developed a new BIM validation method for scrutinising BIM models from material point of view. Additionally, my company Illuminum paired up with Solibri to develop new tooling to support this methodology.
2) Quantities in BIM Models Are Substantially Erroneous
It is no surprise that designers deliver geometrically erroneous BIM and CAD models. That is the main reason why BIM coordination arose as a self-standing discipline. 20 years later, this issue remains unsolved, but we have become better at detecting design errors early on.
Known issues with BIM models and their quantities that corrupt climate calculations are:
- Geometries in BIM models that ought to represent individual physical built-in objects overlap, sometimes quite dramatically. The cause is either the designer’s poor modelling technique or a lack of time to model the geometries properly.
- There are many objects and shapes that never got modelled, and never will.
Hence, bills of quantities simultaneously contain data gaps for some materials and inflated quantities of other materials.
- There can be duplicated BIM objects in the same 3D space. This error cannot be seen in 3D views or on 2D drawings. Duplicates corrupt the bill of quantities with volumes and weights that would not occur in reality.
Once I stumbled upon 7 identical copies of the same structural steel beam in the same 3D space.
- There exist group-objects like “Assemblies” that can make 1 physical object appear several times on a bill of quantities. This sounds like a duplicate. However, here the “duplication” arises from including both the hierarchically superior group-object, and the individual subordinate members of that group.
Usually, only advanced BIM validators know how to detect Assemblies and distinguish them from their individual members on a bill of quantities.
- Hollow objects get modelled as solids. Imagine a round ventilation duct made of a 1-millimetre-thick sheet of steel, yet modelled as a solid volume of steel without any cavity. In general, objects modelled quick-and-dirty without cavities most often involve furniture, plumbing and ducting. In such cases, quantities get corrupted by a factor of up to 100x at the object level (see the sketch after this list).
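A quick back-of-the-envelope check of that factor, with assumed dimensions (a 400 mm duct with a 1 mm steel wall); the thin-wall approximation and the dimensions are mine, chosen for illustration, not taken from any project.

```python
import math

# Back-of-the-envelope check of the "up to 100x" claim for a hollow duct
# modelled as a solid. Dimensions are assumed for illustration.
diameter_m = 0.40   # 400 mm round duct (assumed)
wall_m     = 0.001  # 1 mm steel sheet
length_m   = 1.0

solid_volume = math.pi * (diameter_m / 2) ** 2 * length_m       # modelled as solid
shell_volume = math.pi * diameter_m * wall_m * length_m         # thin-wall approximation

print(f"Overstatement factor: {solid_volume / shell_volume:.0f}x")  # ~100x
```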
Think about what geometrical overlaps, duplicates, assemblies, hollows modelled as solids, and non-modelled objects can do to your carbon calculation if you rely on consultants who lack the BIM validation skills to scrutinise the design.
Note that it is possible to detect collisions and overlaps. Yet, it is not possible to detect missing objects with confidence.
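Here is a naive sketch of why that asymmetry exists: an exact duplicate leaves a countable trace (the same type, profile and placement twice), while an object that was never modelled leaves nothing to count. The simplified, IFC-flavoured records below are my own illustration, not a real BIM API.

```python
from collections import Counter

# Naive sketch: exact duplicates share a type and a placement, so they can be
# counted. A missing object leaves no trace to count. The dictionaries stand in
# for simplified IFC-like records, not a real BIM or IFC API.
objects = [
    {"type": "IfcBeam", "profile": "HEA300", "placement": (10.0, 4.0, 3.2)},
    {"type": "IfcBeam", "profile": "HEA300", "placement": (10.0, 4.0, 3.2)},  # duplicate
    {"type": "IfcWall", "profile": None,     "placement": (0.0, 0.0, 0.0)},
]

signatures = Counter((o["type"], o["profile"], o["placement"]) for o in objects)
duplicates = {sig: n for sig, n in signatures.items() if n > 1}
print(duplicates)  # the duplicated beam shows up; nothing flags what was never modelled
```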
For these reasons, many senior professionals still prefer to quantify construction components manually from 2D PDF drawings today. It’s a shame. If AEC projects prioritised properly modelled geometries with exact metadata, cost calculators, environmental consultants, and facility managers could rely on BIM models as a single source of information.
Design errors regarding quantities and material descriptions arrive at an unknown and unpredictable rate in every AEC design process. Consequently, they corrupt cost calculations, climate calculations, and whole-life lifecycle assessments of entire buildings.
Today, it is mainstream practice to task environmental consultants and LCA analysts with multiplying quantities by emission factors. The industry does not task experienced BIM coordinators with this activity.
I have observed that environmental consultants primarily ask the discipline owner, such as the architect or the structural engineer, to export bills of quantities in spreadsheet format.
Secondarily, environmental consultants turn to the cost calculators for bills of quantities.
Never have I observed an environmental consultant ask the project’s BIM coordinator for the bill of quantities.
Hence, BIM coordinators have a job to do — to sell themselves better to cost calculators, production planners, and environmental consultants. In turn, design managers and project managers have to update their know-how so that these unique skillsets are used in the right way.
The bottom line is this — an environmental consultant who lacks the BIM validation skills to scrutinise both quantities and erroneous material descriptions is exceptionally unlikely to produce a reliable calculation of embedded carbon for an entire building.
What’s worse, such a person cannot even model the degree of error of the calculated results at any level of confidence. Yet, the entire AEC project organisation usually swallows calculated results as they come from an environmental consultant.
It matters little whether you have a licence for an advanced LCA tool in the project. The data inputs regarding both quantities and material descriptions are substantially corrupted before they get loaded into an LCA tool.
Your job does not end when you’ve managed to produce numeric results with kg CO2e at their end. Try to prove that your calculations hold true. Only then will you discover how deep the rabbit hole goes.
Our inability to formulate a mathematical proof for our carbon calculations would not be such an issue on its own. Indeed, there are many truths for which we will never be able to formulate a mathematical proof. The issue that matters here is that we cannot determine any degree of error at any level of confidence. We sense the numbers are fluffy. The disaster is that we do not know how fluffy they are once a design is transposed into reality.
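For the sake of argument, here is a sketch of what stating a degree of error would even require: probability distributions on the inputs, propagated to the result (here with a crude Monte Carlo run). Every distribution below is invented for illustration; in practice, those distributions are exactly the unknowns we do not have.

```python
import random

# Sketch of what a "degree of error" would require: distributions on the inputs,
# propagated to the result by Monte Carlo sampling. Every distribution below is
# invented for illustration; in practice these are exactly the unknowns.
random.seed(1)

def one_run() -> float:
    quantity = random.gauss(100_000, 15_000)  # kg, uncertainty from the BIM model (assumed)
    factor   = random.gauss(0.12, 0.03)       # kg CO2e/kg, uncertainty on the GEF (assumed)
    return max(quantity, 0) * max(factor, 0)

results = sorted(one_run() for _ in range(10_000))
p05, p50, p95 = results[500], results[5_000], results[9_500]
print(f"median {p50:,.0f} kg CO2e, 90% interval [{p05:,.0f}, {p95:,.0f}]")
```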
3) Environmental Product Declarations Are Overrated
The following steps take place to prepare a lifecycle assessment (LCA) of a construction product and its environmental product declaration (EPD):
- data collection about the product, the factory, its logistics, its energy and medium use, upstream and downstream material flows, and more,
- data processing by a 3rd party environmental consultant who makes an LCA,
- data review by a 3rd party EPD validator,
- and data conversion, restructuring and distribution online by yet another 3rd party publisher.
Let’s imagine a few particular stakeholders involved in a data flow that ought to give birth to an EPD of a construction product. Let us then evaluate how trustworthy an EPD can be.
Tommy, the HSE — The Data Source
Since he was a boy, Tommy has been good at repairing anything mechanical. Tommy joined a manufacturing company 17 years ago and worked his way up. Today, Tommy knows how to operate all the machines in the factory and how to assure the quality of the manufacturing process. He has been promoted to Health, Safety and Environment (HSE) responsible for 3 factories of the Manufacturer who employs him.
Suddenly, Tommy’s management realises that their company is losing market share because they do not have EPDs of their products like their competition does. Tommy is suddenly asked to collect data about energy use, water use, and waste from the internal information systems of a company whose digitisation level is close to zero.
The only data he is confident about are the manufactured product volumes, as those determine sales. He also knows the electricity consumption, but only because the energy company sends him a bill to pay every month.
Never before has Tommy had to perform such a deep internal information audit. Expectedly, there are no routines for it. But the Manufacturer does not have anyone else to spare on the task. The Manufacturer cannot suddenly create data and routines out of thin air; they did not budget for such a task last year. However, the Manufacturer needs to get competitive again, and fast! Everybody believes Tommy will do his best.
The only tool for data collection, storage and manipulation Tommy has learnt is MS Excel. He kindly collects all the requested data in a spreadsheet as best he can, and mails it to Lisa.
Lisa, an Environmental Consultant — The Data Processor
Lisa is deeply concerned for the environment. Hence, she works with LCA to be able to nudge industrial stakeholders towards better environmental performance. Lisa is highly educated in forest and marine ecology, hydrogeology and ecotoxicity. She understands the physics of global warming. Later in her professional career, she took a series of courses on LCA methodology. Lisa has 20 years of working experience at various respectable 3rd party environmental consultancies behind her.
She collects the data from Tommy. She follows an LCA methodology she has studied and applied in practice many times over. Eventually, she produces an LCA of the production process and EPDs of Tommy’s company’s products.
Lisa’s only data collection, data storage and data processing skill is also in MS Excel. At first, she had a hard time figuring out what Tommy had typed down for values. The data was scattered a bit chaotically here and there across several spreadsheets without clear descriptions. There were plenty of abbreviations for materials, processes, and products in Tommy’s spreadsheet which she had never come across before. She has never been to the factory, as it’s too far away to travel. However, she called Tommy a couple of times to clarify and became confident she had figured it out. Even though she has never worked with this particular manufacturing process before, she is now fairly confident she understands what is going on in the factory.
She knows she could even visualise and analyse Tommy’s data with Power BI if she needed to, but she still struggles to use the tool productively. Lisa knows there are pretty nerdy IT colleagues at the sister division, but they are so weird to talk to. Besides, her budget never counted on involving external help with data validation and post-processing. She has a university degree and she’s done this type of assignment many times over, remember?
Peter, a LCA Expert — The Data Validator
Then we have a 3rd party body to validate whether Lisa has done a good job within the given Product Category Rules (PCRs) which Tommy’s company’s products fall under.
Let us call our validator Peter. He has a Doctorate in LCA methodology with a focus on construction materials, and a Master’s in organic chemistry. As a student, Peter belonged to one of the first cohorts to study the effects of anthropogenic climate change. Peter pioneered the development of the PCR covering Tommy’s company’s product types.
Today, Peter is a senior, internationally recognised LCA expert. He has over 100 lifecycle assessments of extremely complex supply chains behind him. Peter’s organisation was responsible for defining the PCR for a few construction product types. Hence, Peter gets involved in the final EPD evaluation process to double-check Lisa’s work method, data, and results.
Lisa’s work looks good. Just to be sure, Peter cross-references Lisa’s results with competing products within the same PCR portfolio. All looks and feels reasonable.
Peter sticks to his work methods and does his calculations sometimes with pen and paper, sometimes in MS Excel. Peter writes his reports in MS Word. He retypes his paper or Excel based calculations to a table in MS Word. Afterwards, he exports the document to PDF and sends it to his branch colleagues for distribution and further packaging into an electronic EPD format.
Peter prints out the EPD of Lisa’s he has just validated and stores it next to the 100+ LCAs in his personal archive made of banana boxes. He knows that letting his banana boxes lie on the office floor is not optimal. Yet, he considers himself relatively organised, as he keeps a separate box for his LCAs. Another box is for 3rd party EPD validations.
Peter’s banana boxes full of LCAs and EPDs are famous around the country among LCA analysts who work for governmental bodies. A few times, Peter’s banana boxes have served as a data source for compiling national GEFs.
John, the BIM Enabled Designer — The Catalyst
John is a designer eager to constantly improve his work methods through automation, digitisation, and new tooling. He mastered basics of BIM validation, pulls off beautiful photorealistic visualisations in gaming engines, and develops intricate design automation scripts with visual programming.
Excited to learn about a plugin for his BIM design tool that can prepare LCAs, he persuades his boss to purchase the software licence. The boss buys into the idea that his team could expand the service portfolio and offer carbon calculations in BIM models to their customers.
Unaware of how product data was collected, processed, and transformed before it became EPDs in his newly acquired plugin, he trusts that the LCA analysts did their job. John has no reason to doubt the correctness of the LCAs and EPDs, and lives in the belief that the data was validated using rigorous data-scientific methods.
John produces a bill of quantities of the project from his BIM model, and multiplies the EPDs’ emission factors with the quantities.
John is aware of a few design issues in his models, yet he remains unaware of many issues with quantities and the wrong application of material libraries. John has not attended Illuminum’s course in Carbon Checker. He has no idea about the erroneous and fallacious nature of environmental impact constants. Hence, John blindly assigns EPDs to material libraries in his model, and multiplies the constants from the EPD database with his quantities.
Unaware of the consequences of his actions, John has become the catalyst for distribution of all present errors. He has increased the entropy of the data flow exponentially.
Beyond which point in time would it become impossible to validate such a data flow and to determine whether the final EPD results are true?
Let us get a few issues out of the way — spreadsheets contain man-made mistakes, as the research of Prof. Raymond R. Panko shows.
See:
- a brief summary of Panko’s research, made popular by Salesforce and Forbes, that the vast majority of spreadsheets are erroneous,
- Panko’s research papers on the erroneous nature of spreadsheets.
For me, it is no wonder that humanity is so prone to corrupting data in spreadsheets. Consider that one user interface element allows you to
- manually enter data,
- process data with equations,
- and return results.
In such an interface, all you need is one mistype or one misclick to add, wipe out, or drag something out of the ordinary. You don’t even notice it happened. The entire construction industry is heavily Excel-dependent due to its low level of digital maturity.
It’s not an issue of Microsoft or Google shipping bad software. It’s an issue of the missing separation between data inputs, data outputs, and the processing logic which sits between them. The responsibility for that lies with the user.
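As a sketch of what that separation can look like outside a spreadsheet, consider the structure below. The file names and fields are hypothetical; the point is that inputs stay read-only, the processing logic is a pure function, and the output is written somewhere else entirely.

```python
# Sketch of separating inputs, processing logic, and outputs. This is the
# discipline a spreadsheet does not enforce. File names and fields are hypothetical.
import csv
import json

def load_inputs(path: str) -> list[dict]:
    # Inputs are read-only source data; nothing below ever writes to this file.
    with open(path, newline="") as f:
        return [
            {"material": r["material"], "quantity_kg": float(r["quantity_kg"])}
            for r in csv.DictReader(f)
        ]

def total_emissions(rows: list[dict], factors: dict[str, float]) -> float:
    # Pure processing logic: the same inputs always give the same output.
    return sum(r["quantity_kg"] * factors[r["material"]] for r in rows)

def write_output(path: str, value: float) -> None:
    # Outputs live in their own file, never overwriting the inputs.
    with open(path, "w") as f:
        json.dump({"a1_a3_kgco2e": value}, f)
```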
Why should embedded carbon calculations, LCAs and EPDs hold any validity just because we managed to multiply GEFs or EPDs with quantities from unvalidated BIM models?
Data corruption happens in spreadsheet environments every day. It happens to IT professionals too. It happens to me too. That’s why I stopped using spreadsheets in commercial contexts above 40 hours of work: I could not prove my work to be correct. Hence, I migrated to modern data science tools and programming languages dedicated specifically to gaining control over data and calculations.
What do you think? How did this team do?
Research suggests that the vast majority of spreadsheets contain man-made errors. What are the odds that Tommy, Lisa and Peter have detected all the errors they made themselves? What are the odds they detected the errors of their peers up the data stream?
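Rough arithmetic, with illustrative probabilities only: whatever chance a single hand-built spreadsheet has of being error-free, three chained spreadsheets compound that chance multiplicatively.

```python
# Rough arithmetic, illustrative numbers only: if each hand-built spreadsheet in
# the chain has some probability p of being error-free, the whole chain
# (Tommy -> Lisa -> Peter) is error-free with probability p ** 3.
for p_error_free in (0.5, 0.2, 0.1):
    chain = p_error_free ** 3
    print(f"p(single spreadsheet error-free) = {p_error_free:.0%} "
          f"-> p(whole chain error-free) = {chain:.1%}")
```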
What are the odds that Tommy understood all requirements that Lisa needed of him? Did Tommy manage to collect all data without errors?
What are the odds Lisa did not kid herself, and truly understood the factory, its processes and its KPIs through Tommy’s eyes, so that she would be able to spot any errors in Tommy’s data? Was Tommy’s own understanding of the factory and its processes good enough in the first place? And what are the odds Tommy withheld some information on purpose?
What about Peter’s PCR? Does it actually reflect the reality on the floor of dozens of different manufacturers with unique processes that manufacture the same type of product?
Note that no stakeholder in this data flow has communicated a possible degree of error behind any of the constants. They all just passed on one-dimensional numbers, as if they were the truth: an absolute, unequivocal, undeniable truth.
I speculate that this phenomenon is a cultural residual of patriarchy—construction has historically been a prerogative of overconfident men. So is the style of communication.
As a work culture, the AEC industry does not communicate any degree of possible error to stakeholders further down the data stream. Additionally, the professionals come to believe their numbers hold true, and present them that way.
Intentional Censoring
An LCA contains all the details about the product and the intricacies of the manufacturing process. An EPD, however, is a derivation and a data degradation of an LCA. An EPD presents only the final results, so as not to disclose the manufacturer’s unique know-how and secrets. An EPD is an intentionally censored piece of information, degraded by professionals whose only data manipulation skill usually lies within MS Excel, the same software which Panko’s research suggests we are exceptionally bad at managing.
Hence, from where I stand, I look at the EPD validation stamp pressed in by Peter, the EPD validator, and I wonder: am I supposed to trust that Peter was sure that Lisa did her job, who in turn was sure that Tommy did his?
I Have a Doubt.
Because I’ve worked next to dozens of environmental consultants like Lisa at respectable technical consultancies, because I’ve had the chance to get to know the Tommys working for various manufacturers and recyclers, and because I’ve had the pleasure of debating the contents of banana boxes with an internationally recognised LCA expert like Peter, I have serious doubts about this particular data flow.
However, my biggest doubt stems from the fact that I myself was in Tommy’s and Lisa’s position once upon a time. I know how doubtful I was about the relevance of my own work results, because nobody ever told me which numbers were solid, and which were fluffy.
I imagine the process of preparing an EPD like an assembly line with many workstations. At every workstation, a wrong instruction has been given to the labourer. The labourer does his job and passes the product down the line. At the end, out comes a product that should never have existed. Yet, the CSO, Sales and Top Management feel ready to start a marketing campaign that they finally know the environmental impact of their product, and that they are a few percent better than their competition. It’s a bad joke.
It’s critical to mention that all of these stakeholders meant well, and worked to the best of their ability, honour, and professional integrity. All of these stakeholders only wish what’s best for the environment. They follow their methods and the relevant codes of conduct. They think, talk and act transparently, and strive genuinely to achieve fair competition from an environmental point of view. However, they still participate in the quackery of carbon modelling and lifecycle assessment in construction, as they did not bother to question the probability that the calculated results hold true. Too hastily did they pass the Statistics course at university, or they skipped it altogether. Now, their house of cards is shaking.
Closing Notes on EPDs and LCAs
EPDs and LCAs are both a great idea! Their methodologies and dedicated PCRs are fairly well developed, reflecting many realities within the most intricate of industries. Yet, from my standpoint, many work methods within LCA as a discipline must be reformed if I am to believe that the numbers are more likely true than false.
As long as we tolerate data corruption prone work methods, EPDs won’t gain data scientific rigour. Hence, as of 2024, EPDs are overrated.
In this context, human operators corrupt the data flow to an unknown degree of error an unknown number of times. They corrupt it unknowingly and unintentionally. Let us therefore support LCA analysts with the best that data science can offer, so that we get more reliable EPDs out on the market one day.
Lacking Skills among BIM Coordinators
BIM coordinators have an important role to play in the context of embedded carbon calculation of entire buildings. However, they will have to learn a few new skills.
- Scrutinising erroneous material descriptions is crucial, yet not applied in practice. There are neither good methods for this type of control, nor reliable algorithms that would detect an issue with material descriptions. I have developed an imperfect solution to detect lies among material descriptions and taught it to a handful of people so far, but that does not help the industry until we scale it and the method gets a few supporting algorithms of its own.
- BIM coordinators must learn the language, needs, and reasoning of LCA analysts. They will have to learn what role climate calculations play in BREEAM and LEED certifications, the EU Taxonomy, GRI, green loans, etc.
- There are still plenty of BIM coordinators who struggle to deliver 3D coordination and validated bills of quantities with confidence. We need to close that knowledge gap among the BIM staff, so we can rely on the quality of design, and know its imperfections with confidence.
Further on, no matter how experienced and skilled a BIM coordinator is, nobody can guarantee the absolute correctness of the geometries and materials presented, nor pinpoint all the mistakes. I, too, will sometimes pass unknown unknowns down the data stream.
I often ask myself — where am I making a logical error in my reasoning? All of this cannot be true, or can it? I cannot be the only one who sees these errors. But it gets still worse. Keep reading.
4) Climate Databases Contain Expired and Fallacious Bits of Information
There are Nordic authorities that have developed climate databases with Generic Emission Factors (GEFs). Great!
Unfortunately, these databases contain a few logical fallacies. Additionally, there are critical data gaps. Let’s take a look at them:
🙅 GEFs lack the base count of measurements taken to produce the mean value which the GEF should reflect. We do not know, if a generic emission factor reflects just 1 case specific measurement, or 20, or 200. The climate database itself does not tell.
🙅 Authorities do not even describe whether the GEF values in their climate databases are weighted arithmetic means, plain averages, maximums, minimums or just some random numbers. We are left to figure that out ourselves. Perhaps it is written in some document somewhere. Perhaps it was said at a public meeting. Yet, the column descriptions and the documentation of the climate database won’t tell.
Every time I ask the authorities’ staff who develop the respective climate databases about the nature of their GEFs, I tend to receive this type of reply:
“Oh, well, we know the values are not perfect. They are ‘just generic’ emission factors.”
Yes, I am aware. But do you present weighted means or plain averages? And from how many unique data points, measurements or EPDs have you assembled the emission factor value?
To this day, I do not know which basic statistical measures are used to communicate GEFs in the Nordics. The AEC industry operates as if statistics did not exist and as if a phenomenon could not be uncertain.
I live in the wishful thinking that the authorities serve us weighted arithmetic mean values reflecting the market shares of the known competitors within a material supply chain. Deep down, I fear we are getting mere averages of an incomplete and misunderstood picture of each competitor’s market share.
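A worked example of that distinction, with invented numbers for three hypothetical suppliers of the same material:

```python
# Invented numbers: three hypothetical suppliers of the same material, their
# emission factors and market shares. The plain average and the market-share-
# weighted mean differ, and a climate database that does not say which one it
# publishes hides that difference.
suppliers = [
    {"ef": 0.10, "market_share": 0.70},
    {"ef": 0.18, "market_share": 0.20},
    {"ef": 0.30, "market_share": 0.10},
]

plain_average = sum(s["ef"] for s in suppliers) / len(suppliers)     # ~0.193
weighted_mean = sum(s["ef"] * s["market_share"] for s in suppliers)  # ~0.136
print(f"plain average {plain_average:.3f} vs weighted mean {weighted_mean:.3f} kg CO2e/kg")
```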
🙅 Multiple material resources in the Swedish Transportation Administration’s climate database (Trafikverket’s Klimatkalkyl) have broken references to the studies behind their GEFs. Links to the original sources of information, the studies that ought to have produced the GEF, do not always work, and search engines won’t find the cited sources.
Try to look up the sources behind “jord” (soil), “krossmaterial” (crushed rock), and other resources with broken references. You won’t succeed. Gravel and soil are kind of important for earthworks and for laying roads and railroads.
Any academic journal would reject a study without transparent sources and citations, or unique reasoning and data produced within that paper. Yet, Trafikverket’s Klimatkalkyl publishes constants for an entire industry which lack both scientific rigour and transparency.
🙅 Some GEFs have long expired. For example, the emission factor for “betong, anläggning” (civil engineering concrete) in Trafikverket’s Klimatkalkyl is 669 days past its validity date as of 2024-01-25, and counting. Is not concrete one of the most used materials by volume in the infrastructure sector?
🙅 Both Finnish Environmental Institute (SYKE) and Swedish National Board of Housing, Building and Planning (Boverket) seem to oversimplify the management of standard deviations in their respective climate databases by the way they apply their “Conservative” emission factors.
Today, SYKE applies a Conservative Factor of +20% on top of the “Typical” emission factor for all physical materials in their climate database. LCA analysts report numeric results for the Typical emission factors only. The Conservative emission factor is thus formulated as 1.2 × the Typical value.
Boverket applies a Conservative Factor of +25% on top of 201 out of 216 resources in their climate database.
Both climate databases apply their respective Conservative emission factors to virtually all material supply chains. The databases do not distinguish whether a material resource is of virgin or recycled nature. The Conservative Factor for plastics, metals, concrete, and wooden resources is one and the same.
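Reduced to code, the rule described above amounts to a single uniform markup. The typical values below are placeholders; the +20% and +25% are the factors cited above.

```python
# The uniform markup described above, reduced to the one-line rule it amounts to.
# Typical values are placeholders; the +20% / +25% factors are those cited above.
typical = {"concrete": 0.12, "virgin steel": 1.9, "recycled aluminium": 2.0, "timber": 0.10}

syke_conservative     = {m: round(v * 1.20, 3) for m, v in typical.items()}  # SYKE: +20%
boverket_conservative = {m: round(v * 1.25, 3) for m, v in typical.items()}  # Boverket: +25% (most resources)

print(boverket_conservative)  # the same relative uncertainty for every supply chain
```

If the true spread of a supply chain is wider or narrower than that markup, the database has no way of saying so.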
🤔 What are the odds that the standard deviation of climate performance of manufacturers within hundreds of fundamentally different material supply chains is always the same?
Imagine the supply chain of concrete with its consequent limestone and sand extraction, production of crushed minerals and cement.
Then imagine the journey of aluminium with bauxite extraction and who knows how complex post-processing to get aluminium out of the ore. Imagine the journey of recycled aluminium.
Then picture the journey of plastics from crude oil or bio-based streams refined to pellets, and their subsequent extrusion.
Then do the same exercise for the journey of wooden products, different types of glass, explosives, epoxy, steel…
Do all these fundamentally different material supply chains, with their respective manufacturers, recyclers, unique logistics, unique manufacturing processes, and unique energy mixes, vary by the same amount across the competitors within each supply chain? Probably not. I’d wager “exceptionally unlikely” would be the right term here.
Do we manage unique deviations of hundreds of fundamentally different material supply chains the same in our climate databases? Yes, we do. Exactly the same.
To finish off the point about the fixed value of the Conservative Factor, let’s nuance the supply chain of steel. The HYBRIT project of Vattenfall, SSAB, and LKAB strives to produce carbon-neutral steel using hydrogen. Normally, coke or an electric arc furnace running on the local electricity mix would stand behind the reduction and melting of iron ore. Hence, on one hand you have steel with virtually zero climate impact within the system boundaries. On the other hand, you have a substantial impact with an emission factor that is, relatively speaking, infinitely larger. Try to squeeze that infinity into a Conservative Factor of +20% or +25%, respectively.
I coined a new term for the logical fallacy behind the fixed value of the Conservative Factor — numeric socialism. I speculate that the nature of this fallacy may be caused by the authorities’ obligatory compliance with the Law of Open Tenders, or by the Nordic heritage of socialist and equality policies.
Authorities and by extension their staff are not allowed to favour certain suppliers over others. This principle should preserve an open competition for public services.
The mindset of non-favouritism might have carried over into the climate databases as a single constant called the Conservative Factor. The constant is exactly the same so as not to favour one material supply chain over another; thus, it equalises material supply chains onto the same terms of competition.
Beware of proliferation of cultural or political values into our datasets and maths.
📢 At Illuminum, we do not use Conservative GEFs. By default, we use the Typical values.
The introduction of Conservative Factors is, in theory, a welcome improvement. Yet, I am convinced that we must research individually how standard deviations occur for each resource, and understand those respective material supply chains separately.
I heard an argument from staff of one of the authorities: “We motivate AEC projects to use the product specific EPDs by implementing the Conservative Factor”.
Huh? Then make it a law to use EPDs. Regulate, but don’t introduce a bias that lacks scientific rigour into your datasets. You will only corrupt your own statistics about what is going on with carbon in construction.
I speculate that SYKE was the first to introduce this logical fallacy into their otherwise front-running and ambitious climate database. Boverket mostly copied SYKE throughout the process of establishing the Swedish climate database. In Sweden, there tends to be a need “to be extra sure and extra safe”. Hence, the Swedes placed themselves 5 percentage points above the Finns in their perception of uncertainty.
The real trouble is that Boverket serves the Conservative emission factors as the first-choice constants on their webpage, not the Typical ones which were actually calculated by LCA analysts. Hence, the Swedish construction industry is nudged to work with unreasonably high emission factors that are 1.25 times larger than they need to be.
In comparison, SYKE does not do any nudging between Typical versus Conservative values on their official channels. They just publish a JSON file and an API with all the data, and learn from how the industry reacts.
GEFs and authorities’ climate databases are free of charge to the public. To save money, most AEC projects will likely base carbon calculations on GEFs in early design stages.
If Conservative GEFs remain Boverket’s first choice in early design stages, then every EPD-based calculation would likely show a decreased level of carbon emissions in the As-Built design phase. Swedish designers, contractors, and developers could then heroically pat themselves on the back for how much carbon they reduced throughout the design process. Of course, this sense of achievement would be illusory. In reality, they wouldn’t know whether they had reduced anything or made things worse.
Qualitative Comparison of Nordic GEFs
I have worked only with the Swedish and Finnish authorities’ climate databases. From my subjective stance, the best-described, best-researched and most-developed climate database comes from SYKE. Finnish knowledge of the materials used and of construction processes also seems years ahead of the Swedish.
Trafikverket’s climate database lacks clear descriptions of variables in Klimatkalkyl. Some constants are expired. Many links to references are broken. The authority clearly fails to maintain constants in its own product. This makes me wonder about the quality of the recipes used to model construction processes which are also part of Trafikverket’s Klimatkalkyl.
Infrastructure projects are expected to use Trafikverket’s Klimatkalkyl when the investment goes above €5 million (50 million SEK). Yet, there is no support for BIM validation of the quantities and material descriptions in Klimatkalkyl. Additionally, the authority struggles to develop and execute basic BIM and CAD requirements in its projects. Hence, we cannot hope to get any reliable carbon calculations within the Swedish infrastructure context anytime soon.
Trafikverket’s annual budget is roughly €5 to 6 billion. Assembling decent digital tooling to serve all projects costs categorically less than laying a single road or raising a bridge. Thus, there is an issue of poorly structured priorities in Sweden, despite the fact that the country has promised to be climate neutral by 2045. Without getting Trafikverket’s infrastructure projects on board, which cannot possibly happen without decent tools for carbon modelling, the climate neutrality promise will fail.
Some Nordic countries have pledged to become carbon neutral within a decade or two. Climate declarations have been obligatory for most new buildings in Sweden since 2023, and are about to roll out in Finland from 2025. Yet, the tooling the industry gets to work with lacks the rigour of data science.
The status of data quality reflects upon how well the society understands the logic it ought to apply to solve a problem.
I weep in silence knowing we have yet to grasp the logic of carbon modelling in construction. However, a part of me remains hopeful still. Understanding the core of a problem usually takes longer than solving the problem. Help me, therefore, spread this article to your colleagues, so they can apply their skills to solving the problems.
Essentially, the construction industry struggles with an algebraic problem of merging constants and variables in a reliable fashion.
I look forward to an academic debate with fellow BIM coordinators and LCA analysts that would improve the rigour of carbon modelling in the future.
Closing Words
In the 19th century, the medical doctor Ignaz Semmelweis proposed that surgeons’ poor hand hygiene correlated with the mortality rate of mothers giving birth. Indeed, his hospital and its two departments were notorious in Vienna. In the 1st department, only male doctors — who also performed autopsies — would attend childbirths. In the 2nd department, only midwives would attend childbirths.
The high mortality of mothers in the 1st department was known to the public, and expectant mothers would rather give birth on the dirty streets than in the hospital.
Semmelweis clearly noticed the pattern in the statistics between the two departments. He enforced handwashing for the doctors of the 1st department.
The notion was preposterous. The doctors revolted against it afterwards, and ceased to wash their hands.
Ignaz could not provide a satisfying theory or evidence. All he had to show was the statistics of dead mothers.
After his colleague and friend cut himself during an autopsy and died of an infection similar to the mothers’ at childbirth, Ignaz theorised about an invisible “substance” living on the skin which causes fever and death.
Semmelweis was mocked by his medical colleagues, lost his position as Prime Doctor, and suffered a mental breakdown. Since he would not stop raving about an invisible substance causing fever and death, and insisted on doctors’ hand hygiene, he was committed to a mental asylum.
I wonder what seemed more preposterous to the Viennese at the time— the fact that Semmelweis conceived of an invisible substance causing symptoms and death, or the fact that he implied that a male doctor was worse at the same task than a female midwife?
Upon his arrival at the mental asylum, Semmelweis was beaten by the guards. He succumbed to a gangrenous wound, possibly inflicted by the beating. An invisible substance festered in it and concluded his life.
Two decades later, Robert Koch proved that bacteria cause disease, work for which he would later receive the Nobel Prize in Physiology or Medicine.
Today, we consider the presence of invisible life-forms and see-through substances common knowledge. We even wash our hands on a regular basis and judge those who don’t. Medical doctors wash their hands a dozen times while on duty.
To conceive of new facts that cause inconvenience for a large group of people is no easy position to be in. To imply the need for further education and reforms, and to dissect stakeholders’ fallacies, is not an act of arrogance. It takes passion, new skills, and integrity to move the masses towards modernity.
When will this stop?
I have reached out directly to LCA and BIM specialists working for governmental bodies at Statens vegvesen (NO), Trafikverket (SE), Boverket (SE), SYKE (FI), Väylävirasto (FI), and Vejdirektoratet (DK), and warned them about The Fallacies of Carbon Modelling in Construction. Swedish consultants at SWECO and Ramböll, and private developers like Vasakronan and Jernhusen, were informed. I reached out to LCA analysts and environmental consultants who have been in business for over 20 years.
Silence. The public sector leaves it (perhaps intentionally) to the market. Yet, the market does not even ask the right questions, and will not work on the issues unless developers allocate budgets for it.
Who takes responsibility and for what, I do not know. Who will feel the impacts of poor carbon modelling, that I do know.
I believe regulation of the quality of design is needed as a precondition to mitigate carbon emissions from construction. Until now, the market has failed to deliver reliable quantities, material descriptions, GEFs, and EPDs. I cannot see these numeric and algebraic issues fixing themselves from within the market this decade. Regulation is needed.
With my colleagues at Illuminum, we will gladly support the authorities to improve the quality of their GEFs, and climate databases. We know how. We know what needs to be done:
- remove known logical fallacies,
- fill in important data gaps,
- introduce statistics and probabilistic thinking,
- and enforce BIM requirements with focus on the needs of LCA analysts.
We’ll gladly support developers with introduction of new requirements on quality of BIM models. We’ll reshuffle the organisation of work and distribution of tasks and responsibilities in an AEC project to get the right type of competence for the right job.
After all, technical consultancies have no easy quality gap to close if their services are to become more relevant. The organisational model of separated know-how and market silos serves them poorly in this case.
If we managed even a few of these improvements, it would be a huge victory for the climate mitigation agenda. Maybe, one day, we might even formulate proofs for some parts of our carbon calculations.
So, Can We Automate Carbon Calculations?
For 7 years I probed whether carbon calculations can be automated. Today, I know it can be done, but it would be a terrible idea at this point in history. We would have no means to guarantee the correctness of the results.
Please, do not attempt to automate carbon calculations in construction. As long as human designers draw buildings manually, there will be a unique and unknown amount of errors in both quantities and material descriptions in every project.
Hence, the idea of doing LCAs of entire buildings on one click, no matter how tempting it may be, is a dangerous notion.
One day the society might have access to a fully parametric and generative design process. Human designers might step aside and let algorithms render geometries and metadata for all technical disciplines.
That is when automation of carbon calculations and whole life LCAs would make sense. Until then, BIM validated carbon calculations should be delivered as a service from someone who understands the erroneous nature of BIM models.
Even though a BIM coordinator could calculate a more accurate numeric result for embedded carbon than an environmental consultant, BIM coordinators do not know how to interpret the results. Nor would BIM coordinators know how to experiment with other material choices or solutions. Hence, pairing environmental consultants and BIM coordinators to work synergistically is a crucial task for any design manager to solve.
Can We Solve Some of the Fallacies Somehow Today?
Yes, we can.
Before you fetch the answers, please pause. If you learnt anything new that might help you in your role, please give this article some traction. Click the Clap button, share it with your colleagues, and comment your experiences with carbon modelling below.
If you’d like to prevent these errors from happening in your project, request our services on https://illuminum.se/contact