Moore’s and Eroom’s Law in a Graph: Skyrocketing Pharma R&D Costs Despite Quantum Leaps in Technology

Bálint Botz
Sep 13, 2016 · 10 min read

Moore’s law, coined by Intel co-founder Gordon Moore in the mid-1960s, describes the exponential growth observed in semiconductor technology and the entire IT sector over the past decades. Briefly, it states that the number of transistors per integrated circuit doubles approximately every two years, a prediction that held true for almost half a century (though its future appears less certain).

Conversely, Eroom’s law (which, as you may have noticed, is the same name spelled backwards) states that the cost of developing a new drug doubles approximately every nine years. It has a much shorter history than its more famous cousin, having been coined only in 2012, yet it has quickly become a symbol of the troubled present of pharmaceutical research.

I made this graph to put this into perspective and to demonstrate the trends described by both laws:

The cost of developing a new drug entity compared to the price of a megabyte (MB) of Random Access Memory (RAM) across the decades. Note that the horizontal axis has been set to a logarithmic scale to accommodate the magnitude of the drop in RAM prices.

The price of RAM is used as a benchmark of technological advance in the IT industry, since its impressive change over the years is well documented. Drug development costs have, according to the latest reports, reached a whopping $2.558 billion per approved new entity. Data about earlier periods are somewhat conflicting, hence I used those given by this report for a gross visualisation of the trend. What is clear, however, is that while the price of RAM decreased at an astonishing rate throughout the years, the cost of developing new pharmaceuticals moved in the entirely opposite direction. Looking at the way scientific methods and biotechnology have advanced, this seems surprising at first. Advanced techniques have greatly reduced the effort needed to synthesize and screen new chemicals, and an arsenal of computational tools now aids the design of new drugs, compared to the former, more or less “brute force” approach. Nevertheless, the R&D costs of the pharmaceutical industry increased nearly 100-fold between 1950 and 2010.
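To give a rough sense of scale, the two doubling/halving periods alone are enough to reproduce this divergence. Below is a minimal sketch, assuming for illustration a constant nine-year doubling time for drug development costs and a roughly two-year halving time for RAM prices; these are stylized figures, not the exact data series behind the chart above.

```python
# Back-of-the-envelope sketch: implied fold-changes over 1950-2010, assuming
# constant doubling/halving periods (illustrative, not the charted data).
import numpy as np
import matplotlib.pyplot as plt

years = np.arange(1950, 2011)
drug_cost = 2.0 ** ((years - 1950) / 9)   # doubles every ~9 years (Eroom's law)
ram_price = 0.5 ** ((years - 1950) / 2)   # halves every ~2 years (Moore-style proxy)

print(f"Implied drug R&D cost increase, 1950-2010: ~{drug_cost[-1]:.0f}x")   # ~100x
print(f"Implied RAM price drop, 1950-2010: ~{1 / ram_price[-1]:.1e}x")       # ~1e9x

# Only a logarithmic scale can show both trends on the same axes
plt.semilogy(years, drug_cost, label="Drug R&D cost (relative)")
plt.semilogy(years, ram_price, label="RAM price per MB (relative)")
plt.xlabel("Year")
plt.ylabel("Relative value (log scale)")
plt.legend()
plt.show()
```

A nine-year doubling time alone implies roughly a hundredfold cost increase over those six decades, which is consistent with the trend described above.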

In this short opinion piece I’d like to address both the underlying causes and the potential solutions of this phenomenon. First, let us discuss which factors are responsible for the booming R&D costs:

Trying to outdo the Beatles

The causes of stalling drug development are manifold, but one key factor has been termed the “better than the Beatles” problem. It is beyond doubt that the successful past hits of the pharma industry have raised the bar for novel entities, which now have to be better than not just a placebo but also a plethora of highly effective pharmaceuticals, all with excellent track records and, thanks to mostly expired patents, low prices. Clearly, many indications are now saturated with well-tolerated and safe drugs, which makes superseding them increasingly challenging.

As renowned Pharma blogger Derek Lowe put it:

We are in the position of someone trying to come up with a better comb.

Superstar drugs going off-patent

Another aspect of the problem is the relatively short commercial lifespan of new entities. Due to the prolonged preclinical and clinical phases of drug development (the average development time of a new drug is now 14 years), the average new drug now has about 10 years of exclusivity before its patent expires. That poses a significant risk, since the exclusivity granted by patents is one of the key incentives for pharmaceutical R&D. There is a very real risk of running out of time, which can lead to a new drug never returning its initial investment and generating a loss for its developer.

Currently the industry spends about $135 billion a year on drug development, while the market value of the entire drug pipeline is estimated at a mere $293 billion. A concerning aspect is that as former superstar drugs such as sildenafil (Viagra) gradually lose their privileged status and have to face generic competition, the small number of new entities cannot compensate for the losses. The model of a steady supply of new “blockbuster” pharmaceuticals worked well in earlier periods, but as drug discovery struggles, the financial model of research-oriented pharmaceutical companies is being jeopardized.

Are the low-hanging fruits all gone?

Another often emphasized factor is that decades of pharmaceutical research have cleared out the opportunities that could easily be approached with the brute-force “find a target, modify an effect” strategy, and the remaining options are either too niche to be financially rewarding, or too difficult, requiring a conceptual paradigm shift. Notably, neuroscience and cancer research are often mentioned as fields where traditional small-molecule drug development strategies have proven less successful. The development of novel antibiotics is also badly needed in the face of an ever-increasing number of resistant pathogens, yet very few new entities have been introduced in the last two decades, which seriously jeopardizes the achievements of modern healthcare.

To be clear, the ‘low-hanging fruit’ problem argues that the easy-to-pick fruit has gone, whereas the ‘better than the Beatles’ problem argues that the fruit that has been picked reduces the value of the fruit that is left in the tree.

But are those fruits really gone? Most experts, including the authors of Eroom’s law, dispute this. Staying with the metaphor, the only thing we know is how many fruits have been picked; we have no idea how many remain. Around the time the human genome was sequenced in the early 2000s, exactly 482 targets had been successfully drugged, while contemporary estimates already put the number of potential drug targets at around 8,000, of which about 5,000 could realistically be targeted by small-molecule compounds alone.

Diminished output despite improved methods

It is beyond doubt that R&D technology has evolved rapidly in recent decades, and major breakthroughs in biotechnology, informatics, and molecular biology now enable solving tasks previously deemed unimaginably complex. Despite this, the output of the pharma industry (i.e. the number of new drugs launched per year) did not keep pace, and has actually diminished compared to its peak years in the 1970s and 80s. This happened despite steadily increasing R&D spending by the pharma industry. These years also marked a paradigm shift in the research methods for new entities: attention shifted from natural products tested on complex systems towards a molecular mechanism-based approach. The novel, streamlined, mechanism-based reverse pharmacology promised clean hits and superior target selectivity compared to the earlier “hit a target, get an effect” approach; however, many drug candidates discovered this way have failed in clinical trials. This seems contradictory, perhaps even enigmatic, at first. It turned out that overemphasizing “clean” effects mediated by a single molecular pathway was less fruitful than initially thought, for reasons we will discuss later.

Regulatory barriers that nevertheless keep proving their worth

The necessity to regulate drug development has a history of more than a century, preceded by the “dark ages” of patent medicine, when chemical mixtures of questionable or bona fide unknown composition were boldly marketed as universal cures against a plethora of ailments. Nevertheless, laws concerning drug development remained relatively lax until well into the 20th century. With a swiftness now unimaginable, chlorpromazine, one of the first antipsychotics, discovered on December 11, 1951, was administered to its first actual patient less than a month later, on January 19, 1952. In November 1952 it entered the market, only 11 months after being synthesised.

The real change in regulation took place in the 1960s, when the disastrous birth defects caused by the newly launched sedative thalidomide sent a shockwave through the public. Subsequent investigations found that the tragedy was a consequence of rudimentary safety testing: no effort was made (nor was any mandated) to rule out possible teratogenic effects prior to the approval of the drug. The thalidomide disaster represents a tipping point, after which the pharmaceutical industry became one of the least trusted industries and a frequent target of conspiracy theorists.

In the subsequent decades, laws regarding the testing and licensing of drugs gradually became tougher; however, occasional tragic events still occur. More recently, rofecoxib (Vioxx) gained notoriety: this novel nonsteroidal anti-inflammatory drug had gained significant popularity before being withdrawn in 2004 due to increased cardiovascular risk. Just this year, another tragic event highlighted the importance of meticulous clinical trials. BIA 10–2474, a promising new drug candidate targeting the endocannabinoid system, with potential applications ranging from pain to obesity, induced disastrous adverse effects during its Phase I trial, leading to the death of one participant.

In summary, regulatory barriers have proven their merits, and tragic examples remind us from time to time of the immensely perilous nature of translating pharmaceuticals from the preclinical phase to the clinic.

Growing disparity between basic research output and results translated into the clinic

The recent decades have seen an astonishing growth in basic biomedical research productivity, reflected for example by the number of articles published. This, however, did not result in an accompanying increase in the outcomes of applied pharmaceutical research, or in the general health and life expectancy of the population. Quantum leaps occurred all across the biomedical field, from sequencing the human genome and advanced computer modeling to high-throughput screening, yet the hard endpoint, namely the number of new drugs approved, showed a modest increase at best. In the meantime, detrimental trends have also gained momentum: the increasing specialization of basic scientists made complex task-solving more difficult, and the growing number of cases of scientific misconduct eroded the quality of the literature. The variability of reagents (e.g. antibodies) and commonly occurring cell line contaminations have also resulted in altogether poorer reproducibility of basic research findings.

Since basic research is the foundation upon which pharmaceutical research builds, it is critical to repair the broken scientific publishing process, increase transparency, and remove the perverse incentives that push authors to publish dubious results or engage in outright fraud. Encouragingly, there recently seems to be a willingness to do this, facilitated largely by scientific publishing watchdogs such as Retraction Watch and post-publication peer-review sites such as PubPeer. Increasing the reliability of basic research findings can pave the way to improved success rates in pharmaceutical research.

Ways to break out of the vicious cycle

Eroom’s law was coined four years ago, which gives us just enough time to assess it from a semi-historical standpoint. More than five hundred citations, a dedicated Wikipedia article, and a myriad of reflections in the online media signify its importance, while R&D costs have kept rising as predicted. Moore’s law held up for decades, but eventually it succumbed to the very limits of the technology that enabled it in the first place. But what will be the fate of its distant cousin?

Eroom’s law, as we have seen, has many facets, from technological limitations and “good enough” medications to regulatory challenges. Some of the greatest results of pharmaceutical research stem from the “brute force” methods discussed earlier, while the paradigm shift towards more sophisticated tools in the 1990s paradoxically coincided with plummeting efficiency. It is very likely that over-reliance on presumably more streamlined cell-based systems, compared to the extensive in vivo screening prevalent in the heyday of drug research, together with an overemphasis on “clean” effects mediated by a single target, is one aspect of this issue.

The target-based approach seeking superselective drugs has dominated the field since the 1990s; however, increasingly vocal opinions propose a return to polypharmacology. Since most diseases are multigenic, multitarget drugs modulating a network of related targets could be the key to better outcomes. Advanced in silico technologies such as computational models, aided by machine learning methods, show promise for reinvigorating small-molecule drug discovery. Moving beyond sheer computational power, deep learning neural networks offer new ways to analyze data and search for patterns, and will likely have a tremendous impact on drug development and healthcare in general by facilitating biomarker discovery, screening, and clinical trial assessment. Artificial intelligence-based drug screening may appear futuristic, but several startups such as Atomwise or Numerate are convinced they can bring multi-omics data analysis and high-throughput screening together this way. Such systems could prove to be a superior way to handle the information explosion in biomedicine, potentially spotting complex patterns that would otherwise go unnoticed. AI could also be useful for finding shortcuts to optimize chemical synthesis, along with a myriad of other uses we cannot even anticipate yet.
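To make the multitarget, machine-learning idea slightly more concrete, here is a minimal sketch of how one might score compounds against several related targets at once. Everything in it is hypothetical: the random bit vectors stand in for real molecular fingerprints, the three “targets” and their activity labels are made up, and this is in no way the published approach of Atomwise, Numerate, or anyone else.

```python
# Minimal polypharmacology-style screening sketch (hypothetical data throughout).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.multioutput import MultiOutputClassifier

rng = np.random.default_rng(42)
n_compounds, n_bits, n_targets = 1000, 512, 3

# Stand-ins for precomputed molecular fingerprints (each bit = a substructure)
X = rng.integers(0, 2, size=(n_compounds, n_bits))
# Hypothetical activity labels against three related targets
y = rng.integers(0, 2, size=(n_compounds, n_targets))

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# One random forest per target, wrapped as a single multi-target model
model = MultiOutputClassifier(RandomForestClassifier(n_estimators=200, random_state=0))
model.fit(X_train, y_train)

# Rank held-out compounds by their joint predicted probability of hitting all targets
proba = np.stack([p[:, 1] for p in model.predict_proba(X_test)], axis=1)
multi_target_score = proba.prod(axis=1)
top_candidates = np.argsort(multi_target_score)[::-1][:10]
print("Top candidate indices:", top_candidates)
```

In practice the fingerprints would come from a cheminformatics toolkit and the labels from assay data, and far more sophisticated models are used, but the basic pattern of learning from known activities and ranking untested compounds against a network of targets is the same.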

When it comes to preclinical testing of drug candidates, organ-on-a-chip systems offer entirely new ways to screen and evaluate effects in a system that retains the complexity of an in vivo milieu while keeping costs at bay. In a similar vein, some authors have proposed open-access models for early-phase drug discovery, which would distribute risk among numerous stakeholders, while downstream patenting would keep it profitable for all inventors involved. Such an environment would counteract pharma mergers and acquisitions and benefit smaller companies, altogether resulting in increased collaboration. Finally, increased risk-taking could result in more serendipitous “happy accident” discoveries.

In the last few years there appears to be a modest trend towards an increasing number of drug approvals per year. As others have pointed out, this could be partially due to improved methods already taking hold, but also due to successes with diseases caused by well-identified genetic risk factors.

However, some key drug categories do not fare so well. Antibiotics transformed healthcare and made a key contribution to the increased life expectancy of the post-war period. It is now hard to imagine how defenseless humanity was against bacteria in the pre-penicillin era. As late as 1924, the son of US president Calvin Coolidge succumbed to a generalized bacterial infection that started as a simple sore on his toe following a tennis match. Such a scenario would be impossible these days, one would say; but is that truly so? Decades of widespread and sometimes irresponsible use of antibiotics are jeopardizing our defensive arsenal against microbes, as multiresistant strains outpace researchers. Despite this, there has been very little recent progress in antibacterial agents. The reasons are manifold and go beyond the scope of this opinion piece; however, one key factor is certainly the relatively small market. Specialized super-antibiotics are badly needed in hospital settings, but overall demand is low compared to, say, antidiabetic or antihypertensive drugs, yet R&D costs are in a similar range. Hence, researching new classes of antibiotics is less tempting, due to increased risk and limited returns. This means that strong incentives and governmental aid are badly needed to reduce the risk aversion of company executives, and to maintain the necessary level of drug development effort in less tempting therapeutic areas.

In conclusion, Eroom’s law can only be overcome by the joint effect of technological and conceptual paradigm shifts, and by acknowledging that the road to discovery starts, as often as not, with one giant leap into the unknown.

As a final remark: Moore’s law held up surprisingly long, but eventually it gave way. We can only hope that Eroom’s law will not last half a century.


Bálint Botz

MD/PhD, Diagnostic radiology resident, Researcher mainly focusing on neuroinflammation. Eclectic range of interests.