Published in The Fresh Writes

Better Future with Robot Radiologists

Deep-learning algorithms are peering into MRIs and X-rays with unmatched vision, but who is to blame when they make a mistake?

How would you feel if an algorithm could inform you that you had cancer based on your mammography exam or CT scan? It is highly likely that in the future, the creative work of radiologists will still be necessary to solve challenging problems and to oversee diagnostic procedures. AI will absolutely become part of their routine in diagnosing basic cases and assisting with repetitive jobs.

So, instead of feeling threatened by AI, radiologists need to become familiar with how it could help them in their daily lives for the better.

Radiologists who use AI will replace those who don’t

There is a lot of fear surrounding AI and its future impact on medicine, and many indicators suggest that AI will completely revolutionise the world of healthcare. With the advancements in deep-learning algorithms and narrow AI, there is a particular buzz around medical imaging, and it has sent many radiologists into a panic.

Curtis Langlotz, Professor of Radiology at Stanford, recently presented at the GPU Technology Conference in San Jose. He mentioned that one of his students had emailed to say they were considering going into radiology but weren't sure it was still a viable career. That impression is completely wrong: radiology isn't a dying profession. In fact, it's far from it.

There is a lot of hype suggesting that deep learning, machine learning, and AI in general are going to replace radiologists, and that perhaps all radiologists will end up doing is looking at images. It's simply not true. As a comparison, consider a plane's autopilot. That innovation certainly didn't replace pilots; it assisted with their tasks. When a plane is flying a very long route, it's great to be able to switch on the autopilot, but autopilot is not very useful when rapid judgment is required. The combination of technology and human is a winning one, and it is going to be the same in healthcare too.

Larger range of tools and better precision

Around half a century after the X-ray was discovered, another innovation joined the medical imaging field: ultrasound. With Clarius Mobile Health introducing the very first pocket-sized handheld ultrasound scanner complete with a smartphone app, physicians can carry it around with them to undertake fast exams and to guide quick procedures like targeted injections and nerve blocks.

Let’s talk about body scanners. In 1971, the very first CT scanners, single-detector machines designed for brain studies, were developed under the leadership of Godfrey Hounsfield, an electrical engineer at EMI (Electric and Musical Industries, Ltd). In the 1970s, Raymond Damadian built the very first MRI scanner by hand, with the help of students at New York’s Downstate Medical Center. The first MRI scan of a human body was completed in 1977, and the first scan of a human with cancer followed in 1978. By the early 2000s, medical imaging such as fetal imaging, body MRI, cardiac MRI, and functional MR imaging was routine in many centers.

Is it possible for AI to predict when you might die?

Scientists at the University of Adelaide have carried out experiments in which an AI system attempts to predict when a person might die. Their deep-learning algorithms analysed the CT scans of 48 patients to predict whether each might die within the next five years, and so far the study has been 69% accurate, a similar outcome to that of human diagnosticians and an impressive achievement. The deep-learning machine was trained to recognise signs of disease in the organs using a series of 16,000 images. The aim of the research is to check and measure overall health, rather than to identify a single disease.

This is just the tip of the iceberg, however, as there is a lot of research being carried out to teach algorithms about different diseases and how to detect them. An algorithm launched by IBM, called Medical Sieve, has been able to assist clinical decision-making in cardiology and radiology. The system can look at radiology images and detect problems faster and more reliably. Watson, another IBM AI analytics platform, is also used in the radiology field.

All of this research doesn’t necessarily mean that we are currently ready to have patients face their life expectancy based on their medical images, however.

What are the challenges in introducing AI to the radiology department?

The algorithm is fed many images and data points that allow it to learn to detect differences in tissue, just as computers can learn to recognise images of cats and dogs. If the algorithm makes an error, a researcher spots it and makes an adjustment. It is therefore a lengthy process, and tons of data are needed.
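At toy scale, that supervised feedback loop can be sketched with a simple logistic-regression classifier. Everything here is an illustrative stand-in: the two "features" per scan, the labels, and the values are entirely synthetic, not real imaging data, but the mechanism is the same one deep networks use on millions of annotated images: compare the prediction with the ground-truth label, and let the error nudge the model's weights.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(features, labels, lr=0.5, epochs=500):
    """Fit a tiny logistic-regression classifier.

    On each labelled example, the difference between the prediction
    and the true label (the error) adjusts the weights slightly:
    the same feedback loop, at toy scale, used to train deep networks.
    """
    n = len(features[0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(features, labels):
            pred = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = pred - y  # error signal from the labelled example
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    return 1 if sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) >= 0.5 else 0

# Toy "scans": two summary features each (say, mean intensity and a
# texture score). Label 1 marks abnormal tissue, 0 normal; all synthetic.
healthy = [[0.2, 0.1], [0.3, 0.2], [0.25, 0.15]]
abnormal = [[0.8, 0.9], [0.7, 0.85], [0.9, 0.8]]
w, b = train(healthy + abnormal, [0, 0, 0, 1, 1, 1])
print(predict(w, b, [0.85, 0.9]))  # → 1 (flagged as abnormal)
```

The "tons of data" point falls straight out of this sketch: with only six examples the boundary is fragile, which is why real systems need thousands of annotated images per finding.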

It is believed that the end result will look something like this: radiologists will conduct the high-level exam, and the algorithm will likely create a minable, structured, preliminary report. The algorithm will therefore do the quantification that most humans don’t enjoy doing, and it will do it very well.

Other experts in precision medicine believe that building these analytical platforms involves many challenges: acquiring and inputting the data, annotating it effectively, deciding on a storage strategy, handling regulation, policy, and governance throughout the process, and choosing the types of analysis the platform will enable.

The largest challenge is annotating the data while allowing various views of it and enabling its discovery across the many connected data sets in the platform.

Furthermore, there is some convincing to be done to show hospitals that AI algorithms actually work. Experts suggest that there will be a process that takes advantage of external and internal “crowdsourcing” of appropriately anonymised data.

For instance, a user could have established data-science algorithms based on anonymised datasets from their hospital network. A new hospital could then take the algorithm and further refine it on its own anonymised local datasets to customize it for its needs. Once hospitals see a “win” scenario, they may be encouraged to let the systems use further datasets and so contribute to the users’ solution. It’s a little like easing into cool water on a hot summer day: first you see other people doing it, then you see that it’s safe, and so you get involved too, perhaps dipping your toes before fully committing.
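Technically, that local refinement step amounts to fine-tuning: continuing training from the shipped weights on the hospital's own anonymised data, at a low learning rate so the local data adjusts the model rather than overwriting it. A toy sketch, in which the "pretrained" weights, the hospital's data, and the scanner-brightness scenario are all hypothetical:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def sgd_pass(w, b, X, y, lr):
    """One pass of logistic-regression updates (a stand-in for a network)."""
    for x, t in zip(X, y):
        err = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) - t
        w = [wi - lr * err * xi for wi, xi in zip(w, x)]
        b -= lr * err
    return w, b

# Hypothetical "pretrained" weights shipped by the network operator.
w, b = [3.0, 3.0], -2.5

# The new hospital's small anonymised dataset: imagine its scanners skew
# brighter, so the decision boundary needs a local nudge (synthetic values).
local_X = [[0.5, 0.4], [0.6, 0.5], [0.95, 0.9], [0.9, 0.95]]
local_y = [0, 0, 1, 1]

# Short fine-tune at a low learning rate, starting from the shipped weights.
for _ in range(200):
    w, b = sgd_pass(w, b, local_X, local_y, lr=0.05)

def classify(x):
    return int(sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) >= 0.5)

print(classify([0.55, 0.45]), classify([0.92, 0.93]))  # → 0 1
```

The design choice mirrors the "dip your toes" adoption story: the hospital contributes only a small local dataset at first, and the shipped model does most of the work.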

When will we get to have AI analyzing our CT scans?

Every day, we move closer to clinical use. The 2017 Data Science Bowl aimed at detecting lung cancer by running smart algorithms on more than 1,000 anonymised lung scans provided by the US National Cancer Institute. More than 18,000 unique algorithms were created during the challenge. The main goal was to find a path for delivering such algorithms to systems that can be used in clinical care, so that bodies like the FDA and the American College of Radiology can connect them to imaging-system users and the radiologists who would use them.

In 2017, the FDA approved the first cloud-based deep-learning algorithm, developed by Arterys for cardiac imaging. Slowly, we are getting there. Experts suggest that within the next three years we should see many machine-learning algorithms used in clinical pilot schemes and in approved use. Within the same time frame, low-dose CT lung cancer deep-learning algorithms may join the radiologist’s toolkit, able to assess an individual’s risk of lung cancer.

There are, however, no concrete estimates, and it is plausible that this will be a step-by-step process in which some sub-fields develop more quickly than others. Mammography, for example, is likely to see AI adopted sooner than CT scanning. There is indeed potential for a quicker approach that could see preliminary reports within the next 10 years; in some fields, this is a distinct possibility.

The future of radiology is with AI

At the end of the day, experts and research trends show just how AI will revolutionise radiology in the future. So, rather than feeling threatened by it or neglecting it, the medical world should welcome it with open arms.

Rather than feeling pushed out by machine intelligence, radiologists should engage with it, learn it, and promote it. After all, it is something that will help patients. We expect huge changes in the radiology field in the coming years. It is a field that needs to be kept at the forefront, and what matters most is taking care of the patients. Let us all nurture that thought and make the future of radiology with AI a good one.

Thanks for reading. Happy learning 😄

Do support our publication by following it
