Diagnostics: why are they now getting the attention they deserve?

Dr Adam M Hill
5 min read · May 11, 2020


‘Unlike some countries, we didn’t go into this crisis with a huge diagnostics industry. We have the best scientific labs in the world, but we did not have the scale’, said the UK Secretary of State for Health & Social Care, Matt Hancock, at the beginning of April. He was responding to mounting criticism from health workers struggling to know whether they had the disease or were safe to return to the frontline.

The German Health Secretary could call on a hundred test labs, he added; ‘we have had to build from a lower base’.

A little punchy, maybe accusatory… or objective, with a modicum of truth. Whatever your view of the Secretary of State’s statements at the time, the UK diagnostics industry was late to engage with the COVID-19 challenge.

If there’s one good thing to have come out of the virus crisis, it’s the value that governments, policymakers and investors now place on the role of the diagnostics industry in public health management. But why not before now? Has the industry been too slow to stand up for itself? Before getting into the challenges, let’s start at the beginning by understanding what a diagnostic is.

What is a diagnostic?

Diagnostic tests are clinical investigations performed on samples taken from the body, and they are used in a broad range of applications. Often called in vitro diagnostics (or IVDs for short), these tests can be performed at home (e.g. a pregnancy test), at work, at your bedside in a hospital, in your physician’s clinic, or in a central laboratory. A routine part of clinical practice, these little inconveniences when we seek medical advice for our ailments can in fact significantly influence the clinical decisions your doctor makes, and therefore your clinical outcomes.

To payers, diagnostics are often the least expensive part of the health care pathway, and arguably the most cost-effective; the NHS spends less than 4% of its budget on diagnostics, yet over 70% of health care decisions depend upon them.

Diagnostic tests provide objective information about a person’s health. Some are used for risk assessment purposes — to determine the likelihood that a medical condition is, or will become, present — whereas others are used to monitor the course of a disease or to assess a patient’s response to treatments.

Increasingly, diagnostic tests are even being used to guide the selection of further tests; a diagnostic to determine the use of another diagnostic is not as tautological as it might sound! Clinical knowledge is expanding rapidly (the number of medical publications is doubling roughly every 8 years) and medicine has become correspondingly complex. With that complexity comes the need for diagnostic tests to support critical decisions.

There are thousands of diagnostic tests, and they can be classified many ways. Physicians often group them according to the way they gather information or the type of technology they employ:

- Chemistry (as it sounds);

- Immunochemistry (immune system chemistry);

- Haematology or Cytology (cells of the body including those in the blood);

- Microbiology (bacteria and the like) and

- Molecular (fragments of cells and the machinery that makes them function).

Diagnostics use reagents; these substances are essential because they react with the cell, blood or tissue sample taken from your body. It is the availability of reagents which has, in no small part, constrained the availability of COVID-19 tests during this pandemic.

Each diagnostic test can have a number of applications:

- Early disease detection (screening);

- Diagnosis (identifying the disease itself);

- Disease staging (predicting the outcome of a disease);

- Therapy selection (not all drugs work for all people) and

- Disease monitoring (understanding the course of a disease and whether a treatment is working).

Whether simple or complex, diagnostics for use in medicine are always regulated by a competent authority: the industry regulator that lays out the rules so that no one gets hurt.

Innovation and advances

There has been rapid innovation both in the range and complexity of diagnostic tests, and in laboratory test methods and techniques; and the regulators have been instrumental in this development.

The evolution of the Clinical Laboratory Improvement Act in the US, from 1967 to the present day, has seen lawmakers tread the fine line between over-regulation and wilful disregard, and has allowed innovation to flourish. As the US remains a substantial healthcare market, this regulatory system has had a disproportionate bearing on the development of diagnostics over the last 50 years.

Advances in diagnostic products have made it possible to detect diseases early (when they are often most responsive to treatment), have made lab tests easier to use, and have made them less prone to user error. These advances have led to faster, more precise and more consistent results, and have helped transform medical practice.

But with innovation comes change, and change in medical practice is notoriously slow.

Why the uptake of new diagnostics is slow

There are a number of challenges to driving the adoption of innovative, new technologies into clinical practice.

Reimbursement of diagnostic tests as a low-margin commodity, as is common in the NHS, has historically reduced the likelihood that any economic return from development of an innovative test will justify the required investment.

Furthermore, low margins make it particularly difficult to fund the elaborate clinical trial programmes, demonstrating both efficacy and effectiveness, that are taken for granted in the development of new pharmaceuticals. Without this evidence, the effect size of a diagnostic on an outcome has often eluded investigators, further compounding the devaluing of the sector.

Finally, education of physicians on the proper use and limitations of novel diagnostic tests, as is common in both the drug and device sectors, is key to the expansion of this field. It is not unusual for clinicians to struggle to evaluate a diagnostic because, unlike a drug or device, a diagnostic is characterised by its statistical performance; you need access to a lot of data, and a grasp of statistics, to judge its utility. There’s certainly more work to be done in this area.
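To make that statistical point concrete, here is a minimal sketch, in Python and with entirely hypothetical numbers, of how a test’s sensitivity and specificity combine with disease prevalence to determine how much a positive result actually tells you:

```python
# Illustrative only: how sensitivity, specificity and prevalence combine.
# The figures below are hypothetical and do not describe any real test.

def positive_predictive_value(sensitivity: float, specificity: float, prevalence: float) -> float:
    """Probability that a person with a positive result truly has the disease."""
    true_positives = sensitivity * prevalence
    false_positives = (1 - specificity) * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

# A test that is 95% sensitive and 98% specific, used in a population
# where only 1% of those tested actually have the disease:
print(positive_predictive_value(0.95, 0.98, 0.01))  # ~0.32 — roughly two in three positives are false
```

The same test can be highly informative in a symptomatic hospital population and close to useless as a population-wide screen; it is exactly this kind of evaluation that clinicians are rarely given the data, or the training, to carry out.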

Diagnostics are vital tools in determining the right treatment for each individual patient; whilst they come in all shapes and sizes, they can have a wide variety of applications, not just in diagnosing disease. The UK has been at the forefront of scientific innovation in diagnostics for some time — yet there remain substantial challenges to develop a market for these products.

However, the COVID-19 pandemic has changed everything. Testing for diagnosis is now firmly in the spotlight. According to commentators analysing why some countries are faring better than others, getting out of lockdown depends on widespread, effective virus testing as well as contact tracing.

At the end of April, Severin Schwan, the CEO of Roche and in many ways the veritable captain of the global diagnostics industry, commented on the COVID-19 response: ‘You have other countries who underinvested in healthcare infrastructure, but now it shows.’ He added: ‘There are certain regions in the UK where there is no single high throughput [diagnostics] platform… In Switzerland, there are 20.’

Now clearly a topic of discussion in the court of public opinion, and a matter of concern for Matt Hancock and his colleagues, perhaps the long under-recognised and under-valued UK diagnostics industry will receive the attention it so very much deserves. And those of us who rely upon our fabulous NHS can eventually benefit from the UK’s world-class innovation in accurately diagnosing medical conditions.


Dr Adam M Hill

CEO @ONC.L, Professor @IGHI, Board Member @ICHP and @MyRecovery, Dad. Disrupting the diagnostics industry. In pursuit of scalable impact. linkedin.com/in/hillam