Sperm meets egg and your genetic destiny is sealed. For most people it’s just that simple. But genetics is anything but simple. While our parents getting together for a proverbial roll in the hay was certainly a necessary event in our individual creation story, it was hardly the only act in our life’s genetic drama. Long after our parents did their part to bring us into the world, life continues to mold the way our DNA expresses itself in ways we have only begun to fathom.
Genes don’t exist in a vacuum. Given the right circumstances they can turn on or off. This capacity to respond quickly to environmental stress can have significant consequences for generations to come.
While it’s often thought that mutations are where all the action is, the interaction between our genes and the larger world goes on in real time beneath the radar, frequently with health consequences that can be as severe as cancer even if they are less perceptible to those suffering from them. Of course, this capacity for genes to express themselves, or not, in response to cues from the environment can also have positive impacts on our well-being. This phenomenon, known as epigenetics, is now well documented.
Epigenetics has to do with gene expression, as opposed to gene alteration. Much of our DNA spends its time silently replicating without actually doing anything remarkable, or even much of anything at all. It’s somewhat analogous to a program that’s been downloaded onto a laptop that the owner has forgotten about. It’s only when someone asks “what’s this?” and double-clicks on the icon that we discover what the software does and whether its activation comes with any compatibility issues that might cause us to regret our curiosity later.
Sadly, the event that turns on a formerly silent portion of the genetic code, or turns off a formerly active one, is often a traumatic one of some sort. Indeed, it was a famine that gave epigenetics its first real moment in the scientific spotlight.
During the winter of 1945, the Nazis cut off food supplies to the Netherlands to punish the Dutch for a railway strike launched with the intent of interrupting the flow of reinforcements and supplies to the front. By the time the war ended in May of that same year, approximately 20,000 people in the Netherlands had starved to death as a result.
In 2013, while reviewing the medical records of 408,015 Dutch males born between 1944 and 1947 and subsequently examined for possible military service at the age of 18, a team of researchers from Columbia University found that men whose mothers had been pregnant with them during the famine of early 1945 were far more likely to suffer a variety of health problems as adults. As a result, these men experienced a far higher mortality rate than those conceived and born either before the famine or afterwards.
The epidemiologists at Columbia University were only the latest to document significantly greater occurrences of health issues among those in utero during the months known as the Dutch Hunger Winter. Prenatal exposure to malnutrition during that period had already been linked to higher rates of cardiovascular disease, diabetes, obesity, pulmonary disease, high blood pressure, and kidney disease, to name just a few. Overall, if your mother was pregnant with you in the Netherlands during the winter of 1945, your mortality rate after the age of 68 was 10% greater than that of those born before or conceived after those horrible months.
Saying that a gene ‘decides’ when it is transcribed is like saying that a recipe decides when a cake is baked.
Thus transcription factors regulate genes. What regulates transcription factors? The answer devastates the concept of genetic determinism: the environment…genes don’t make sense outside the context of environment. Promoters and [DNA] transcription factors introduce if/then clauses: ‘If you smell your baby, then activate the oxytocin gene.’ ~ Robert M. Sapolsky
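Sapolsky’s “if/then clause” framing is explicitly computational, and it can be sketched as a toy program. This is only an illustrative model of the logic he describes, not actual molecular biology; the function names and the “smell of baby” cue are taken from the quote, while everything else is an invented simplification.

```python
# Toy model of the "if/then" view of gene regulation quoted above:
# a gene is transcribed only when its transcription factor has been
# activated by an environmental cue. All names are illustrative.

def transcription_factor_active(cues: set, required_cue: str) -> bool:
    """The environment regulates the transcription factor:
    it switches on only when its required cue is present."""
    return required_cue in cues

def express_gene(gene: str, cues: set, required_cue: str) -> str:
    """The transcription factor regulates the gene:
    the 'recipe' is only 'baked' when the if-clause is satisfied."""
    if transcription_factor_active(cues, required_cue):
        return f"{gene}: transcribed"
    return f"{gene}: silent"

# 'If you smell your baby, then activate the oxytocin gene.'
print(express_gene("oxytocin", {"smell_of_baby"}, "smell_of_baby"))  # oxytocin: transcribed
print(express_gene("oxytocin", set(), "smell_of_baby"))              # oxytocin: silent
```

The point of the sketch is simply that the gene never “decides” anything: the branch that determines expression is conditioned entirely on inputs arriving from outside the genome.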
While I was studying anthropology at university, an elephant was often present in the classroom. Sometimes it was gingerly acknowledged. Other times it was simply ignored, or dismissed as self-evidently false with an offhand remark. That elephant was determinism.
Determinism is, in my experience, a label that critics like to attach to other ideas they want to undermine more than it is a doctrine people typically claim to strictly adhere to. I’m not sure I’ve ever met a true absolute determinist. That said, there seemed to be a reasonable amount of tolerance for certain varieties of determinism within many of the same academic circles often hurling the charge in the direction of the physical sciences.
For example, culturally determined (or constructed) things were considered less toxic to concepts like free will or humanity’s supposed elite status than genetically or environmentally determined aspects of our existence. The reason cultural determinism is more likely to get a pass has little to do with the evidence and a great deal to do with human psychology. It’s much more comforting to believe that humans serve as the fundamental unit of change than it is to have genes or random forces of nature play this role to an equal or greater degree.
The field of anthropology long ago split itself into physical and cultural branches, in part, I think, to accommodate those uncomfortable with the demotion humanity had been dealt by evolution. This bifurcation never felt quite right to me. It seemed that once again humans were removing themselves from nature by treating culture as an entirely separate force acting independently rather than as a product of natural forces, with all the connections and limitations that entails. That any major environmental change, whether one we intentionally initiated or one thrust upon us, would have physical and psychological impacts on the individuals and groups experiencing it seems obvious. The only question is how these impacts will manifest themselves at the personal level, and what the consequences will be if these effects are scaled up to involve large populations.
Epigenetics is quickly erasing much of what’s left of the artificial line many in fields like cultural anthropology would prefer to keep between humanity and the physical environment it inhabits. Scientists are now finding it’s not just sudden traumatic shifts in a person’s environment, like a war-induced famine, that shape gene expression, often for generations to come. Day-to-day culture itself influences gene expression, creating self-reinforcing feedback loops between the genes being switched on or off and the behaviors influencing their expression.
In a study just published in the journal eLife, researchers report that as much as one quarter of gene expression is likely the result of cultural differences between populations rather than their genetic ancestry. The researchers were examining two distinct populations of Latino children, one in Puerto Rico and the other in Mexico.
That 25% of the gene expression in these two populations might “reflect a biological stamp made by the different experiences, practices, and environmental exposures of the two subgroups” has profound implications for how we view both our culture and our biology. As Noah Zaitlen, one of the co-authors of the study put it, “These data suggest that the interplay between race and ethnicity as social constructs and genetic ancestry as a biological construct is more complex than we had realized.”
Culture is a broad term that touches upon virtually every aspect of human activity. Simply sitting on the couch eating a bag of potato chips technically qualifies as a cultural activity given both the couch and the chips, to say nothing of the eater, are products of a particular cultural milieu. Most of the time we are swimming through culture the way fish swim through water: without either much awareness or effort.
But if our culture is both shaping and being shaped by our environment, providing feedback that influences gene expression as it does so, then we owe it to ourselves and to the future generations whose genetic story we are continuously writing and rewriting to take a more deliberate stance. Public policy, diet, and even time spent in front of a screen could very well be triggering changes both subtle and profound that we are not even aware of. Some of these changes may be positive, others harmful, and still others merely temporary. Regardless, that a more reflective, intentional approach to culture would likely have a net positive effect, from our genome to whole ecosystems, appears increasingly difficult to seriously dispute.
Other recent stories by Craig Axford: Are You Getting Enough Awe in Your Experiential Diet? & Objectivity vs. Subjectivity: An Incongruity That Isn’t Really