Machine Learning in Pharma: what’s happening now and what’s next?

Edward ffrench
Published in Systems AI
5 min read · Oct 15, 2019

Written by Edward ffrench and Nnamdi Affia

The adoption of Artificial Intelligence (AI) in research-intensive organisations has grown steadily in recent years. Pharmaceutical, CRO, university and healthcare organisations are beginning to shift away from theoretical plans towards genuine practical applications of machine intelligence. But what problems are these institutions trying to solve? Dr. Lester Russell, Associate Partner, Clinical Digital Innovation at IBM and a part-time General Practitioner (GP) in the South of England, provides his perspective on the use of Artificial Intelligence within the life sciences industries.

What is the case for AI in life sciences?

The cost of and the demand for provisions such as healthcare are rising exponentially. The industry therefore needs to increase the efficiency of delivery, whether on the front line of healthcare or in costly drug discovery research, and applying AI is one of the methods by which efficiency can be boosted.

How are research-intensive organisations applying AI today?

To get a glimpse of AI in pharmaceuticals, biotechnology and healthcare, you need to examine how researchers and clinicians are applying machine learning today. One of the areas seeing marked progress is digital pathology. AI is augmenting the traditional process within pathology of manually examining glass slides through a microscope. Lester, a practising GP, recognises an enormous demand for these types of skilled laboratory-based services. The shortage of practising pathologists means that there is a demand for assistance in the form of machines that are capable of visual diagnosis. These machines cannot legally give a patient a diagnosis, but they can, for example, replace manual triaging processes and act as an automated scheduling tool for laboratory workloads, allowing a pathologist’s time to be used more effectively.
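
To make the triaging idea concrete, here is a minimal sketch (not from the interview) of how the output of a hypothetical visual-diagnosis model might be used to prioritise a pathologist’s worklist. The SlideResult fields, the scores and the ordering rule are illustrative assumptions, not a description of any specific product.

```python
from dataclasses import dataclass

@dataclass
class SlideResult:
    slide_id: str
    abnormality_score: float  # model-estimated probability the slide needs urgent review

def build_worklist(results: list[SlideResult]) -> list[SlideResult]:
    """Order slides so the most suspicious cases reach a pathologist first."""
    return sorted(results, key=lambda r: r.abnormality_score, reverse=True)

# Hypothetical scores produced by an upstream image-classification model
scores = [
    SlideResult("slide-001", 0.12),
    SlideResult("slide-002", 0.87),
    SlideResult("slide-003", 0.45),
]

for item in build_worklist(scores):
    print(item.slide_id, item.abnormality_score)
```

The point of such a tool is scheduling, not diagnosis: the model only reorders the queue, and every slide still ends up in front of a qualified pathologist.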

Advancements may reach the stage where the technology can recognise entirely new disease entities, but for now it should at least be applied to repetitive jobs. Tasks that are mundane but nonetheless important, such as counting tissue cells per slide, can be performed accurately by automation tools. Machines can execute iterative tasks with greater reliability than humans because they never tire. This is a way of augmenting capacity as well as the ability to make new discoveries.
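
As a hedged illustration of that kind of repetitive task, the sketch below counts candidate cells in a slide image by thresholding and labelling connected bright regions. It assumes NumPy and scikit-image are available, and the threshold choice and minimum-area filter are illustrative, not drawn from any workflow described in the interview.

```python
import numpy as np
from skimage import filters, measure, morphology

def count_cells(grayscale_image: np.ndarray, min_cell_area: int = 30) -> int:
    """Count connected bright regions (candidate cells) in a grayscale slide image."""
    threshold = filters.threshold_otsu(grayscale_image)    # global Otsu threshold
    binary = grayscale_image > threshold                    # foreground mask
    cleaned = morphology.remove_small_objects(binary, min_size=min_cell_area)
    labelled = measure.label(cleaned)                       # label connected components
    return int(labelled.max())                              # number of distinct regions

# Example with a synthetic image containing two bright blobs
image = np.zeros((100, 100))
image[20:30, 20:30] = 1.0
image[60:75, 60:75] = 1.0
print(count_cells(image))  # -> 2
```

A production tool would of course need stain-specific preprocessing and validation against pathologist counts, but the shape of the task, tedious, repeatable and easy to specify, is exactly what makes it a good candidate for automation.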

How will Artificial Intelligence be used in the future?

Looking to the future, there will be much greater adoption of AI, especially in professions like pathology and radiology. Lester theorises that artificial intelligence will not replace pathologists, but rather pathologists who use AI in their day-to-day work will replace the pathologists who do not. Medical professionals will become newly skilled in analysing patient data with the assistance of machine learning algorithms.

Is it fair to describe this rise of AI as an augmentation of human capability rather than its replacement?

Yes, human general intelligence is so vast that artificial intelligence cannot replace humans in every conceivable task. There are widespread misconceptions, but no single machine entity has risen to this level of advanced general intelligence… yet.

For now, AI sits in a middle ground within the medical field, and machine learning is used to assist thinly-stretched human efforts (as it should be). Essentially this has moved us from a Human-versus-Machine to a Human-plus-Machine environment. This is a very important move for society, and it poses the question: how can artificial intelligence concepts used in conventional environments (like production lines in manufacturing) be applied to critical medical settings, which concern the intricate nature of human health? Following these innovations, we are about to enter an exciting competitive market landscape in which industries will pit Human-plus-Machine against Human-plus-Machine. This will be the ultimate test of the quality and effectiveness of one AI compared to another.

What barriers are affecting the full exploitation of AI today?

There are three major challenges preventing mankind from achieving the full benefits of AI in the healthcare and life sciences industries:

  1. Healthcare data silos
  2. The clinical-technical skills gap
  3. Misconceptions of the safety/ethics of AI and effective regulation

The first and most significant barrier is the fragmentation of medical databases. The McKinsey Global Institute estimates that the industry’s inability to tap into healthcare big data accounts for $300 billion in lost value annually in the US alone [1]. However, even if data engineers were given access to all of these protected data sources and managed to aggregate the different formats and data types into a single source of truth, the clinical-technical skills gap would still block progress. Clinicians are disconnected from data scientists, yet skills from both parties are required to build and deploy production-ready machine learning models. This leads to the final barrier: effective regulation. Making sure that technology is safe is undoubtedly a good thing and should not be perceived as an inhibitor of innovation. Care must be taken when pushing boundaries, and regulation is therefore imperative.
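
To give a flavour of what “aggregating different formats into a single source of truth” involves, here is a deliberately simplified sketch. The record fields and the shared schema are entirely hypothetical; real harmonisation also has to deal with consent, coding systems and data quality, none of which is shown here.

```python
# Illustrative only: mapping two hypothetical record formats onto one shared schema.
def normalise_ehr_record(record: dict) -> dict:
    """Map a hospital-style EHR export to the shared schema."""
    return {
        "patient_id": record["mrn"],
        "diagnosis_code": record["icd10"],
        "recorded_at": record["visit_date"],
    }

def normalise_trial_record(record: dict) -> dict:
    """Map a clinical-trial data capture row to the same shared schema."""
    return {
        "patient_id": record["subject_id"],
        "diagnosis_code": record["condition_code"],
        "recorded_at": record["assessment_date"],
    }

combined = [
    normalise_ehr_record({"mrn": "A123", "icd10": "E11.9", "visit_date": "2019-09-01"}),
    normalise_trial_record({"subject_id": "S-77", "condition_code": "E11.9", "assessment_date": "2019-09-15"}),
]
print(combined)
```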

The overall challenge is in taking AI from “the bench to the bedside” or in other words from theoretical research environments to a platform where it is delivering lifesaving benefits to patients. The key will be to involve medical professionals in the process of legitimising the technology. Active collaboration and knowledge sharing between data scientists and clinicians will fortify the technology and help to build trust in AI’s ability to achieve the next step.

What skills do these organisations require to take full advantage of AI?

The skills required are not limited to technical specialists; on the contrary, they are multidisciplinary. Lester describes the significance of a combined workforce of technical professionals, pharmaceutical researchers, front-line clinicians and regulators all working collaboratively. All of these skilled individuals must contribute if the technology is to innovate as safely and as quickly as possible.

References

  1. https://www.mckinsey.com/industries/healthcare-systems-and-services/our-insights/the-big-data-revolution-in-us-health-care
