A History of Anesthesia
A Terror That Surpasses All Description
In 1844, Gardner Quincy Colton — semi-professional dentist, salesman, and huckster — came to Connecticut to demonstrate the exhilarating effects of nitrous oxide. It was not a medical demonstration, and Colton was just one of many gas-hawking entrepreneurs traveling the country staging laughing-gas exhibitions and “ether frolics.” Attendees paid for a few sniffs, or just to watch other spectators get wrecked and embarrass themselves. The events were, of course, quite popular.
At Colton’s 1844 show was Horace Wells, then 29 years old and trying to make it as a dentist. After watching a man inhale the gas, trip, and laugh through a painful leg injury, Wells thought perhaps this pain-relieving effect could be used for dental surgery. Given the poor state of dental hygiene at the time, tooth extractions were all too common. And, given that there was no such thing as general anesthesia — the word itself didn’t even exist — any kind of surgery was a brutal, trying, and often deadly experience that was accurately described as “a terror that surpasses all description, and the most torturing pain.”
In the grand tradition of self-experimentation, Wells inhaled some nitrous oxide and let an associate pull out one of his teeth. He experienced no pain. Within weeks, he had organized a public demonstration of what he expected to be a revolutionary wonder drug. The patient moaned throughout the procedure, and though he later claimed to have felt little pain, the assembled doctors, soured on the gas, called it a “humbug.”
Around the same time, Wells’s one-time classmate and dental practice partner William T.G. Morton was self-experimenting with ether for anesthetic purposes. Whereas nitrous was fast-acting and brief, ether could completely incapacitate patients, making it perhaps more suitable for major surgeries. So, in October 1846 — a year and a half after Wells’s failed demonstration — Morton came to the same surgical theatre at Massachusetts General — now called the Ether Dome — and “etherized” a patient, whose neck tumor was then painlessly excised (some recent historical digging suggests that Wells’s demonstration may not have actually occurred there). This time, the surgeon, John Collins Warren, remarked, “Gentlemen, this is no humbug!”
And from there, anesthesia was immediately accepted as standard medical practice. No, not exactly. We shall return to the bizarre story of Wells and Morton, but first let’s go further back.
Before Ether
Morton and Wells, it turns out, weren’t the first to use inhalants for general anesthesia. Crawford Long was a surgeon from Georgia who had been using ether as an anesthetic in his surgical practice since 1842. He hadn’t bothered to publish his results, or, you know, tell anyone. Sort of a missed opportunity for him.
But then go back a little further to the late 1700s, when famed chemist Humphry Davy was performing experiments with nitrous oxide (first isolated by Joseph Priestley, who called it “dephlogisticated nitrous air,” a nod to the then-current phlogiston theory of chemistry). Again in the tradition of self-experimentation, Davy quickly discovered the gas’s euphoric effects, even experiencing a classic drug-induced revelation: “Nothing exists but thoughts!” he wrote, the dictates of scientific formality forcing him to leave off the “Duuuuude” that surely prefaced it. Nearly as quickly, he became addicted, turning himself into a textbook example of what we would now call Pavlovian conditioning: “the desire to breathe the gas is awakened in me by the sight of a person breathing, or even by that of an air-bag or air-holder.” Good thing for him everyone in the world isn’t always breathing.
Davy’s high-society acquaintances were next, including Coleridge (prior to his career as an opium eater). In a scene ripe with Freudian symbolism, Davy sat his friends down, held up a big silk balloon full of laughing gas, and allowed them to suck down the precious mother’s milk from that gaseous teat. He recorded their observations, noting just how inadequate language was to describe the experience: one user said they felt “like the sound of a harp,” another described “a display of sky-rockets,” and another, the poet Robert Southey, might have been the first to use the phrase “turned on.” Davy said the gas was “inconceivably pleasurable,” and also described a more megalomaniacal/Incredible Hulk-like reaction: “I seemed to be a sublime being, newly created and superior to other mortals … the thrilling increased, the sense of muscular power became greater …” (the 18th-century equivalent of “here, hold my beer”).
Though a fabulous chemist and a deliverer of a first-rate gas-bag, Davy wasn’t a surgeon, and he never used nitrous oxide as an anesthetic. But in his 1800 monograph on the gas, he suggested that its stimulating properties “may probably be used with advantage during surgical operations.”
It’s a fascinating thing, when you look at the history of an idea, to see a person or group come right up to the edge of a major innovation, yet never quite take that final little step. Why did it take four decades for anyone in the Western world to go from Davy’s recognition that nitrous oxide was capable of “destroying physical pain” to actually using it in surgery?
The Culture of Pain
The answer, or at least part of it, is that the Western world was beholden to a very strange assumption: that pain was good. As Stephanie Snow describes in Blessed Days of Anaesthesia, pain was thought to have both physiological and moral value. On a larger scale, Christian beliefs dictated that pain was God’s will, punishment for sin, and a necessary and unavoidable feature of the human condition. Doctors — perhaps partially wrapped up in those same beliefs — held pain to be a vital and beneficial part of medical treatment.
Pain was seen as especially important to surgery, as it was thought to stimulate the body to deal with the stress of the operation: the “smart of the knife is a powerful stimulant, and it is much better to hear a man bawl lustily than to see him sink silently into the grave.” In fact, Davy’s belief in the utility of nitrous oxide for surgery was based not on its analgesic qualities, but on its stimulating effects. In a way, this belief made a certain kind of sense: you were more likely to die if you passed out during an operation, so staying awake was a good thing (thus, even soporific analgesics like alcohol were to be avoided). It is minor but perhaps meaningful to note that the word “anesthesia” did not even exist until after Morton’s ether demonstration — the term was suggested to Morton shortly afterward by Oliver Wendell Holmes.
I also think that surgeons were ignoring the concept of anesthesia because of a narrowness of focus akin to functional fixedness. Speed was the most prized ability of scalpel wielders at the time, since longer operations were more dangerous. It seems likely to me that doctors became so fixated on improving speed and efficiency that they weren’t able to reorient their thinking to ponder solutions that entirely obviated the need for quickness. That, and doctors were an immensely egotistical lot — when Semmelweis suggested doctors wash their hands to prevent childbed fever, they ignored him because “gentlemen’s hands are clean.”
As anesthesia slowly wended its way into the practice of Western medicine starting in the late 1840s, it revealed cultural beliefs even more ghastly and stifling than the simple idea that pain was good. For example, many doctors were reluctant to anesthetize women — out of concern that their delicate frames could not recover from the soporific effect, or because it suggested impropriety to be around an unconscious woman, or because labor pains were a punishment for Eve’s treachery (also, weirdly, at least one doctor worried that under anesthesia, women in labor would feel “the sensations of coitus” rather than the “pangs of travail”).
And still more. “Heroic manly fortitude” (actual quote) was said to make most men insensitive to pain, thus rendering anesthesia unnecessary (one man who suffered through an operation for an anal fistula without anesthetic was said to have “borne the operation heroically and only complained of smarting in the parts”). Anesthesia was deemed unnecessary for anyone who wasn’t white, since the “lesser” races were supposedly insensitive to pain. And, perhaps most horrifyingly of all: infants were not routinely given general anesthesia until 1985, on the assumption that they could not feel, or remember, pain. For many doctors, any excuse not to give anesthesia would do.
As it grew more accepted through the 19th century, anesthesia made life much easier for patients. But, says Snow, perhaps a broader consequence was that it changed Western cultural assumptions about pain. Pain was no longer a necessity of the human condition. It is no coincidence, she argues, “that from the 1860s onwards public executions became private events, legislation was introduced to reduce cruelty to animals in scientific experiments, and ideas of pain in Christian doctrine were reworked.” By the 1880s, even death didn’t have to be painful: there was already a book extolling the virtues of euthanasia.
Everyone Else
That’s all very Eurocentric, so it’s worth taking a side trip to discuss what everyone else was doing. Around the same time Davy was passing around his gas-bags and realizing that nothing is real but thoughts, man, Japanese surgeon Hanaoka Seishū was developing and testing a compound called tsūsensan. It was an herbal concoction of nearly a dozen ingredients, whose active compounds included anticholinergic deliriants like scopolamine and atropine. Special credit, too, goes to his wife (name unknown, but there’s a movie about her), who tested the creations and was blinded by a bad batch. In 1804, Seishū performed a partial mastectomy on a patient anesthetized with tsūsensan. It’s considered by most to be the first documented use of general anesthesia; Seishū later performed more than 100 other surgeries and was widely renowned in Japan (his illustrated casebook survives).
One can go further back still, where things become much more speculative. Hua Tuo was a second-century AD Chinese surgeon who supposedly crafted a mixture of wine and herbs called mafeisan, which he used as a general anesthetic. Modern attempts to recreate it have not been successful, in part because Hua Tuo burned all his manuscripts shortly before he died (the collected works were referred to as The Book of the Black Bag, which is great), and possibly because the brew may have been secondary to anesthetic effects induced by acupuncture. Hua Tuo was also the last Chinese surgeon for quite a long time, since Confucianism held surgery to be a taboo form of bodily mutilation — an interesting parallel to how Christian conceptions of pain kept Western surgeons from taking up Davy’s ideas about nitrous oxide centuries later.
Step back six more centuries — that is, 2500 years ago — and you have Bian Que. He was said to have employed an “intoxicating wine” that made patients “feign death” for up to three days. That’s plausible, at least, but the rest of the story is that he used it to perform a double heart transplantation, so I suspect it’s mythical.
There are bits and pieces of attempts at anesthesia in various other times and places. In the 7th century BCE, doctors in India were using cannabis vapors for sedation (vaping: nearly three millennia old). A few centuries later, the Assyrians and Egyptians were using carotid compression — basically a choke hold — to induce brief periods of unconsciousness. The Romans either bled patients into unconsciousness or mixed mandrake root into wine to induce it. Lastly, this isn’t really anesthesia, but the priestesses at Delphi were said to report on their visions after inhaling gases emanating from geologic faults; historians have suggested they may have been inhaling ethylene, a general anesthetic — so the oracles were just tripping balls on magic gas spewing out of ruptures in the earth’s crust.
By contrast, general anesthesia was known and perhaps even common in the Arabic world more than a millennium ago. By 1000 AD, the first texts devoted to surgery had been published by Abu al-Qasim al-Zahrawi, and they included brief descriptions of general anesthesia. Twenty years later, a medical canon published by Ibn Sina described the use of “soporific sponges.” The sponge was infused with a variety of compounds, but the primary actor was probably opium. A piece of cloth would be soaked in the solution and then placed over the patient’s face to induce unconsciousness, allowing surgery to be performed.
What I would love to do is describe the transition from these older techniques to modern anesthetic practice, especially in relation to what was happening in Europe. But so far, that information has eluded me.
Morton, Wells, and Credit
Let’s return to Horace Wells and William Morton. By 1848, Morton was pitching his gas and a newly designed inhaler to doctors, all while billing himself as the inventor and “revealer” of anesthesia. That the gas — which he called “Letheon” — was actually ether was essentially an open secret by then. Wells, meanwhile, had left medicine altogether and become an itinerant salesman (one source claims he was selling canaries at one point, which can’t be right). He still held out hope that “his” gas — nitrous oxide — would find a toehold.
In early 1848, Wells began to self-experiment with the anesthetic properties of ether and chloroform, which had already come into some use. After days of constant huffing he was badly delusional and completely unhinged, and threw a vial of sulfuric acid at two women in the street. They were unhurt, but Wells was sent to New York’s Tombs prison. Upon regaining lucidity after his hallucinatory episode, Wells became so distraught over his actions that he asked the guards to retrieve his shaving kit, took a sniff of chloroform to dull the pain, and cut his femoral artery. He bled to death, leaving behind a wife and child — and a tragic reminder that self-experimentation only sounds cool when it works out.
Morton’s story is less sad but far stranger. He was largely despised in the medical community for his attempts to patent and cash in on “an agent capable of mitigating human suffering.” Beyond that, he was mired in a debate over who deserved credit for the “discovery”: one of Morton’s med school lecturers, Charles Jackson, claimed that he had demonstrated ether while teaching Morton. The pair feuded publicly over credit and, more importantly, compensation; neither was particularly likeable. Morton spent years petitioning Congress to award him an honorarium for the discovery. Jackson, meanwhile, had a long history of claiming dubious co-credit for inventions and ideas: he did the same with guncotton, the telegraph, modern theories of digestion, and the copper deposits of Lake Superior.
The Jackson-Morton rivalry extended to one of the 19th century’s “trials of the century”: the George Parkman murder. Parkman, a scion of one of Boston’s richest families, was murdered, dismembered, and cremated in 1849 by John Webster, a Harvard lecturer who owed him money. The trial was among the first cases to make use of forensic evidence, and it was an absolute tabloid bonanza. Since all that remained of the body were some bone fragments and loose teeth, the entire case turned on whether the remains — particularly the teeth — could be positively identified as Parkman’s. Morton testified for the defense; Jackson, for the prosecution. Webster was convicted and eventually confessed. He was executed; the Boston Brahmins sent out engraved invitations to the hanging.
Morton died in 1868 and Jackson in 1880, but you can keep going down the rabbit hole, and of course I’m going to. Morton’s son — also William Morton, also a doctor — was a pioneer in “electrotherapeutics” and developed a device to deliver electricity to x-ray machines. In 1912, he and an accomplice were indicted and convicted of mail fraud for selling some 3.5 million shares of nonexistent stock in an Ontario mining concern. He was later pardoned by the president.
His accomplice was Julian Hawthorne, who had a long career as a journalist and essayist; he was the son of Nathaniel Hawthorne and Sophia Peabody (side note: when Julian was born, Nathaniel sent a telegram to his sister: “A small troglodyte made his appearance here at ten minutes to six o’clock.”). Unlike Morton, Julian was not pardoned, and he spent a year in prison. After his release, he wrote a book called The Subterranean Brotherhood, which advocated an end to incarceration. Both Hawthorne and Morton maintained their innocence.
There ends the story of Morton, Wells, and Jackson, but one other person deserves some time: John Snow. Snow is best known for his crackerjack epidemiological work identifying the source of a London cholera outbreak in 1854; he’s considered one of the fathers of epidemiology (his work in that epidemic really was amazing, by the by). But he should be better known for his role in standardizing anesthetic practice.
Wells, Morton, and/or Jackson may have “discovered” anesthesia, but it might never have taken hold were it not for Snow’s dedication and investigative work. Morton had built a simple breathing apparatus that improved on soaked rags or sponges held over a patient’s mouth. But Snow worked at refining a breathing apparatus that would administer a constant, controllable flow of gas, and he kept copious notes while testing the device and the effects of anesthesia on several thousand patients (all of those notes survive in his archives, which is really neat).
If any single event can be said to have spurred the adoption of general anesthesia, it was Queen Victoria being given chloroform during the 1853 birth of Prince Leopold, administered by John Snow himself. Victoria described the experience as “soothing, quieting, delightful beyond measure.”
As the 19th century wore on, nitrous oxide came to be used more and more in dentistry, where its fast but mild and transient action was a benefit. Ether and chloroform remained the general anesthetics of choice, each with its own risks: ether smelled like paint thinner and was highly flammable (surgeries were done by candlelight at the time); chloroform gave a deeper unconsciousness but carried a much higher chance of sudden, unexpected death. A big step forward came in the 1940s with the introduction of paralytics like curare, which ensured that patients remained absolutely still; indeed, it wasn’t until the mid-1900s that modern anesthetics were developed and deployed.
A Final Trivium & Additional Reading
Another option for surgical pain reduction in the early 1800s was hypnotism, then known as mesmerism. The British surgeon Robert Liston, upon completing his first operation on an etherized patient, proclaimed, “this Yankee dodge beats mesmerism hollow!”
For further reading on anesthesia generally, I recommend Stephanie Snow’s Blessed Days of Anaesthesia.
Originally published at iheartliterati.wordpress.com on October 30, 2015.