Artificial Intelligence in Exponential Health

To enable exponential health improvements we need to understand what artificial intelligence is, what we can do with it, and how to do it.

What is it?

Artificial Intelligence (AI) is the theory and development of computer systems able to perform tasks normally requiring human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages. This is the force multiplier that sits on top of all the low cost sensors and patient data inputs.

How can we think of it?

When looking at A.I applications for health transformation, we can think of D.A.S.H.

Direct. Augmented. Sentinel. Helper.

But let’s explore the fields of A.I first:

Subfields of Artificial Intelligence include:

Neural Networks – systems modelled on the brain and nervous system. These are often used in deep learning and applied to character recognition, time series prediction, expert systems and classification.

Evolutionary computing – systems modelled on evolutionary programming, evolution strategies and genetic algorithms, used to solve complex real-world problems, e.g. in populations or swarms.

Computer Vision – systems enabling object recognition, image understanding and augmented reality, used to automate vision-based problem solving.

Robotics – systems enabling machines to interact with the real world, such as intelligent control, autonomous exploration and dexterity in object manipulation.

Expert systems – systems that model the decision-making ability of a human expert. They are designed to solve complex problems using a series of if/then rules and to provide decision support, such as in medical diagnosis.

Speech processing – systems that enable speech-to-text transcription, speech recognition, voice-based automation and identification, and speech production.

Natural language processing – systems that simulate the human ability to read and understand language, such as machine translation, question answering and chatbots.

Planning – systems that increase their autonomy and flexibility by constructing sequences of actions to achieve their objectives, such as scheduling, navigation and game playing.

Machine learning – systems with the ability to learn without being explicitly programmed. Machine learning focuses on the development of computer programs that change when exposed to new data: the computer learns to perform a task better and better based on its experience and data from the past. Deep learning is a class of machine learning techniques that allows computers to learn from real-world experience in terms of a hierarchy of concepts (feature hierarchies), with each concept defined in relation to simpler concepts. This allows the computer to build complex concepts out of simpler ones. Deep learning is often based on neural networks, and it allows computers to complete tasks previously regarded as subjective and intuitive: simple for humans but impossible for computers.
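To make "learning without being explicitly programmed" concrete, here is a minimal sketch (a teaching toy, not any production system) of a perceptron, one of the simplest neural-network-style learners. It is never told the rule; it discovers it from labelled examples by adjusting its weights whenever it makes a mistake.

```python
def train_perceptron(samples, epochs=20, lr=1):
    """samples: list of (features, label) pairs with label 0 or 1."""
    n = len(samples[0][0])
    weights = [0] * n
    bias = 0
    for _ in range(epochs):
        for features, label in samples:
            activation = sum(w * x for w, x in zip(weights, features)) + bias
            prediction = 1 if activation > 0 else 0
            error = label - prediction  # learn from the mistake, if any
            weights = [w + lr * error * x for w, x in zip(weights, features)]
            bias += lr * error
    return weights, bias

def predict(weights, bias, features):
    return 1 if sum(w * x for w, x in zip(weights, features)) + bias > 0 else 0

# Toy "risk flag": flag only when both indicators are elevated. The model
# discovers this AND pattern from the data instead of being told the rule.
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train_perceptron(data)
```

After training, the learned weights reproduce the pattern on all four cases; feeding it more or different examples would change what it learns, which is the essential point.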

What can A.I do to improve health outcomes?

Decentralised intelligence enables healthcare to be delivered in completely new locations and in completely new ways.

Optimise Medical Diagnosis

Medical diagnosis is the ultimate data-based problem to solve. IBM’s Watson, the AI system that beat all human competitors on the US quiz show Jeopardy!, has now been applied to oncology, specifically cancer diagnosis. For a human doctor, making a diagnosis means collecting large amounts of patient data, keeping up to date with global clinical trials and research papers, and taking the patient’s genetic code into account. Staying on the cutting edge is an impossible task: it is estimated that 160 hours a week would be needed just to read all the latest scientific papers, and that does not account for the important task of remembering and cross-referencing it all for relevance. The processing power of AI-based systems is incredible: IBM’s Watson can read 200 million pages of text in 3 seconds. It would take 1,000 doctors reading 547 pages of scientific work per day, every day for a year, to match what Watson can read and analyse in 3 seconds.

A single individual’s genome takes up about 100 gigabytes of storage on its own. Analysing this information alongside health records, journals, studies and textbooks would take countless hours that doctors could instead spend treating their patients. Watson, on the other hand, can complete this task in a matter of minutes, producing a visualisation of the patient’s case and suggesting potentially useful drugs based on the patient’s DNA profile.

IBM’s Watson really comes into its own in this field: it can search through millions of patient records, learn from previous diagnoses and build a more detailed map of the links between symptoms and conditions. This has given Watson an accuracy rate of 90% in lung cancer diagnosis, where human doctors managed only 50%. So who do you want diagnosing your cancer?

Medical Examples:

From the Sense column in the EX Framework – we know the inputs that can be taken.

D.A.S.H: Direct. Augmented. Sentinel. Helper.

Direct – Complete assessment and diagnosis from medical input data by A.I, with no human input at all.

Augmented – An A.I decision support layer on screens and instruments to enhance human performance, e.g. tiny cameras such as otoscopes, robotic surgery instruments and endoscopes with automated lesion identification, as well as surgical telestration systems powered by computer vision that assist surgeon performance through attention guidance and even automated operative control.

Sentinel – Background systems continuously monitoring for abnormalities and exceptions against relevant population baselines or expected behavioural routines. These can be macro, e.g. whole hospital wards, automatically monitoring vitals and reporting exceptions, or micro, focused on an individual patient’s mood, mental state or vital signs through monitoring of behaviour, speech and eye gaze, alerting to signs of deterioration.

Helper – An intelligent layer for direct communication with patients, available on any device or platform, e.g. a chatbot interface or an Alexa speech interface. This enables 24/7 guided data collection from patients for fully remote monitoring, triage and timely professional advice that takes the patient’s situation into account, replacing the inconsistency and inaccuracy of Dr Google and Wikipedia, which have no contextual awareness.

These 4 A.I layers enable a suite of brand new applications and locations for improved health outcomes. Let’s explore some examples:

Direct – Medical fields that are completely digitised and involve the assessment of images can already be performed better by A.I than by domain experts: X-rays, MRIs, pathology, cancer diagnosis. Just as we use image recognition to transcribe text in documents automatically, we will also use direct analysis for most laboratory and imaging tests. Where outsourced radiologists currently provide this support, a direct A.I service would be faster and cheaper, and could also capture outcome data to refine its diagnostic capabilities. This creates a situation where the world’s best diagnostic abilities are available by API for pennies, and the value saved is transferred to patient-specific interventions, enabling more accurate health outcomes for less money.

Augmented – Just as most camera phones draw a square over a face when you take a photograph, context-specific tools will provide procedurally relevant image recognition and support services. When performing an endoscopy, for example, relevant tissue regions could be highlighted in real time, such as polyps, ulcers, and even pre-lesion tissue that is undetectable to our eyes because of their limited spectral range. This layer can create super surgeons and doctors who intervene at the earliest possible stage with options previously impossible, such as stem cells, gene therapy, medications and other less invasive interventions.

Sentinel – This provides a monitoring hand that directs scarce human resources, such as nurse, doctor and even surgeon attention, to exactly where they are needed. The A.I maintains a baseline of the rhythm and flow of normal situations and can then alert professionals where and when to intervene. This enables just-in-time care, similar to the Kanban system in Toyota’s lean production system. Such systems are already possible, but they present data protection nightmares for hospitals and clinics as the owners of the data. This could be resolved by changing the model from hospital-enabled care to patient-enabled care: a blockchain-style patient record could let patients opt into these A.I systems while in hospital, making their data live in a similar way to a Snapchat photo. This gives the patient peace of mind of excellent care independent of whichever staff happen to be on duty when they arrive, e.g. at 4am. It also ensures the data is visible and accessible to the hospital only for a specific time, to enable real-time monitoring, before being locked and protected again so that the patient owns their own data and chooses when to open access to it. Patients can broadcast huge amounts of data to enable better care in their interest, while protecting it from misuse for discrimination.
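As a rough illustration of the Sentinel idea, the sketch below flags vital-sign readings that deviate sharply from a rolling statistical baseline. The window size, threshold and heart-rate values are illustrative assumptions, not clinical parameters; a deployed system would use validated early-warning criteria.

```python
import statistics

def sentinel_alerts(readings, window=10, threshold=3.0):
    """Flag readings that deviate sharply from the rolling baseline.

    readings: time-ordered vital-sign values (e.g. heart rate in bpm).
    Returns (index, value) pairs that sit more than `threshold` standard
    deviations from the mean of the preceding `window` readings.
    """
    alerts = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mean = statistics.mean(baseline)
        sd = statistics.stdev(baseline) or 1e-9  # guard against a flat baseline
        if abs(readings[i] - mean) / sd > threshold:
            alerts.append((i, readings[i]))
    return alerts

# Stable heart rate around 72 bpm, then a sudden spike the sentinel catches.
hr = [72, 71, 73, 72, 74, 73, 72, 71, 72, 73, 72, 73, 118]
alerts = sentinel_alerts(hr)
```

Only the final spike breaches the baseline, so only it would be escalated to staff; the routine readings generate no noise.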

Helper – We are all constantly messaging and communicating, but we often lack specific, actionable information. If you are a diabetic with an anaemic tendency and are considering dietary changes, e.g. becoming vegan or vegetarian, you need to know what to eat to ensure you don’t slip into a negative health state. Or if you are a mother concerned about your child’s symptoms (they are coughing and wheezing), is this significant or not? A connected A.I helper in an app on your phone could ask questions automatically, accept photos of your child along with audio recordings of their breathing, and provide you with actionable advice and information. The helper could then check in after the recommended intervention, collect data on the outcome, and use it as part of an ever-learning system that provides better care automatically for pennies.
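A minimal sketch of one Helper triage step might look like the following. The symptom names, thresholds and advice strings are hypothetical placeholders, not clinical guidance; a real helper would combine learned models with validated clinical protocols.

```python
# Hypothetical red-flag symptoms for illustration only, not clinical advice.
RED_FLAGS = {"blue lips", "struggling to breathe", "unresponsive"}

def triage(symptoms, age_months):
    """Map a parent's symptom report to a suggested next action."""
    reported = {s.strip().lower() for s in symptoms}
    if reported & RED_FLAGS:
        return "emergency: call an ambulance now"
    if age_months < 3 and "fever" in reported:
        return "urgent: see a doctor today"  # young infants escalate quickly
    if {"coughing", "wheezing"} <= reported:
        return "book appointment: record breathing audio for review"
    return "self-care: monitor and check in again in 4 hours"

advice = triage(["coughing", "wheezing"], age_months=36)
```

The mother in the example above would be asked to record breathing audio and book an appointment, while harmless reports get a self-care check-in, the kind of escalation ladder the outcome data could refine over time.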

Why is this so important?

Medicine and healthcare are collections of expert systems. Experts are traditionally educated in different verticals of medicine such as oncology, orthopaedics, gynaecology, haematology, dermatology and ophthalmology. It can take 10–15 years for a doctor to complete their training and become a consultant in their field. Expert AI systems can be created which exceed human diagnostic performance in all these areas, and they can also incorporate neighbouring expert specialities to enable a multi-specialty assessment of a patient. This is particularly useful in complex diseases such as diabetes, which require multiple specialist assessments to be expertly managed. With human doctors this is only possible through cross-specialty consultations, which are extremely time consuming, resource constrained and expensive. This will enable greater understanding and breakthroughs in complex disease systems.


Cost

IBM’s systems are now available via API (Application Programming Interface), meaning other programs and systems can tap into Watson’s power on demand on a pay-as-you-go basis. This presents unparalleled computing power at low cost, often from as low as $300/month.
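As a sketch of what API access looks like in practice, the snippet below builds (but does not send) a request to a hypothetical diagnostic endpoint. The URL, field names and key scheme are invented for illustration and are not IBM’s actual Watson API; a real call would go over HTTPS with a library such as urllib or requests.

```python
import json

def build_diagnosis_request(patient_id, symptoms, api_key):
    """Assemble a JSON request for a hypothetical pay-as-you-go diagnostic API."""
    return {
        "url": "https://api.example-health-ai.com/v1/diagnose",  # hypothetical endpoint
        "headers": {
            "Authorization": f"Bearer {api_key}",  # typical pay-as-you-go auth pattern
            "Content-Type": "application/json",
        },
        "body": json.dumps({"patient_id": patient_id, "symptoms": symptoms}),
    }

req = build_diagnosis_request("p-001", ["cough", "wheeze"], "demo-key")
```

The point is the economics, not the plumbing: any clinic app that can build a request like this can rent world-class analysis per call instead of owning the infrastructure.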

Future of AI / Machine Learning / Deep Learning

With the processing power of AI-based systems we are entering a new era of cognitive computing. Computers already exceed human performance in certain diagnostic areas, and over the coming years we will see AI transform every area of healthcare it touches. It is the decentralised, always-on, always-alert brain that can power, detect, analyse and inform to help everyone live longer, healthier lives. There will be huge demand for experts who can organise the datasets and set up the AI systems to power this new AI-based healthcare.

Aalok Yashwant Shukla
Above Intelligent™ — Latest in Artificial Intelligence

Co-founder + Director of Innovation, Data & Technology straightteethdirect.com #healthtech #mhealth #medtech #digitalhealth #artificialintelligence