Spotlight: new frontiers in technology for dementia

Notes from the Technology and Dementia preconference — Alzheimer’s Association International Conference (Los Angeles, 14–18 July)

Dr Maura Bellio
UCLIC
Dec 13, 2019


When we hear about clinical dementia research, our first thoughts could well be of the latest molecule being investigated for drug development, or of the newest hard-to-pronounce neurological region mapped with functional medical imaging. The what and why (where?) of biology and medicine.

Photo by Jens Johnsson on Unsplash

However, it is not just about that.

Dementia research brings together a whole universe of research themes and newly discovered targets and subjects for investigation. One of them is how technology can support, facilitate, and even trigger exciting new discoveries.

Every year, at the Alzheimer’s Association International Conference (AAIC), the Technology Professional Interest Area (TPIA) holds a preconference day that focuses on technology-based research (including mobile, home-based, or social-networking technologies) and how it is applied to innovative study designs and potential solutions in the field. The whole event is an insightful exchange of long talks, short lightning sessions, poster displays, and — of course — a lot of dedicated discussion.

In this post, I will share a snapshot of how technology is being used in this space and of its current potential. References to the conference abstracts are available at this link.

Technology for exploitation

In a scenario of converging disciplines, success is determined by the ability to achieve mutual comprehension and bridge the gaps. All the works presented in this section contributed to a better understanding of how we can translate technology amongst research, industry, and clinical practice. Research on modelling, tracking, and monitoring techniques can produce systems that are able to detect the condition at early stages. Despite the solid contribution of making complex technical models more accessible in the form of web platforms or mobile apps, the question to address remains: how can we best design these innovations for healthcare and patients’ use? More specifically, how can we build systems that support instead of disrupt, that build trust and fit well with the user’s beliefs? (Bellio et al., Berisha et al., Yu et al.)

Technology for clinical applications

The range of clinical applications was the richest and most diverse among the preconference themes. Three main threads capture this variety of interests.

Assessment

Technology is used intensively to automate processes, supporting medical specialists across the various phases of clinical assessment. In particular, it can be applied to the collection of patients’ responses, such as surveys or cognitive tests, and to the digital storage of data.

The assessment of a complex condition such as Alzheimer’s disease (AD) requires the collection of multiple types of clinical data. In this context, the most represented were:

· Computerised assessment. What has been known for decades as pen-and-paper testing can provide numerous advantages in its digital form. Not only can data be safely stored and investigated, but the whole process becomes automated and digitised, making it suitable for data mining techniques. For example, verbal assessment, one of the main indicators of early deterioration, is getting revolutionary support from automated speech analysis procedures. Digital speech features such as pausing patterns (Lai et al.) might represent new markers in disease detection; a minimal sketch of such features follows this list. The option to run these assessments remotely is an advantage for accessibility and outreach (Taptiklis et al., Cormack et al.).

· Behaviour. Changes in behaviour are possibly among the hardest factors for technology to quantify. One of the works (Robert et al.) showed how mini-games can be used to assess apathy, a behavioural and psychological condition characterised by a lack of interest or emotion. Another work described an in-home monitoring set-up that targets apathy as well as sleep patterns and agitation (Vahia et al.).

· Physiology. More sophisticated technology is used to digitise new biomarkers based on retinal blood flow and structural disturbances in the eye (DeBuc), or on patterns in the electrical activity of the brain (Jiang et al.).
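
As a concrete illustration of the “pausing patterns” idea above, here is a minimal Python sketch, not taken from any of the cited works: it assumes word-level timestamps are already available (for example, from a forced aligner or a speech-to-text service), and the `min_pause` threshold and feature names are hypothetical choices.

```python
# Illustrative sketch only: derive simple pause-pattern features from
# word-level timestamps. All values and thresholds are hypothetical.

def pause_features(word_spans, min_pause=0.25):
    """word_spans: list of (start_s, end_s) tuples for consecutive words."""
    pauses = [
        nxt_start - prev_end
        for (_, prev_end), (nxt_start, _) in zip(word_spans, word_spans[1:])
        if nxt_start - prev_end >= min_pause
    ]
    speech_time = sum(end - start for start, end in word_spans)
    total_time = word_spans[-1][1] - word_spans[0][0]
    return {
        "pause_count": len(pauses),
        "mean_pause_s": sum(pauses) / len(pauses) if pauses else 0.0,
        "pause_rate_per_s": len(pauses) / total_time,
        "phonation_ratio": speech_time / total_time,  # speaking vs. total time
    }

# Toy recording: four words with a long hesitation in the middle.
spans = [(0.0, 0.4), (0.5, 0.9), (2.1, 2.5), (2.6, 3.0)]
print(pause_features(spans))
```

Features like these could then feed the kinds of models discussed in the next sections.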

Digital markers

With the digitisation of assessment procedures, markers for AD can themselves become digital. The primary advantage of digital markers is that they are more accurate indicators, which ensures data quality in AD clinical trials while reducing costs (Solomon). Another implication, as previously mentioned, comes from the development of modelling techniques such as machine learning and predictive models. These will support improvements in screening procedures (König), early detection of the disease (Lancaster), and data mining techniques.
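
To give a feel for how a predictive model over digital markers might be trained and evaluated, here is a toy scikit-learn sketch. The two “markers” and the labels are entirely synthetic; real studies would rely on validated features and far more careful evaluation.

```python
# Illustrative sketch only: a toy early-detection classifier trained on
# synthetic "digital markers". Nothing here reflects real patient data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200
# Two made-up marker columns, e.g. mean pause length and tapping variance.
X = rng.normal(size=(n, 2))
# Synthetic labels loosely driven by the first marker.
y = (X[:, 0] + 0.5 * rng.normal(size=n) > 0).astype(int)

model = LogisticRegression()
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"cross-validated AUC: {scores.mean():.2f}")
```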

Data mining

Data mining for AD is made possible by the increasing availability of digital markers and big datasets. Automated AI procedures can extract additional knowledge from the available data in a way that would not be achievable through human analysis alone. More specifically, data mining can define disease trajectories (König) and discover subtypes within a particular disease (Alexander et al.); a toy sketch of the latter follows below. The real advantage of these capabilities will become much more evident when disease-modifying treatments are available: we will then be able to dispense different medications according to an individual’s subtype or trajectory, fostering precision medicine. However, data mining remains quite a technical approach and is still at too early a stage to be immediately beneficial in AD clinical practice. Some research is therefore looking at how we can bridge this gap and design interfaces that allow clinicians to interact with these algorithms and be supported in their decisions (Van Maurik et al., Bellio et al.). Another way in which data mining can advance understanding and timely intervention in AD is by detecting prodromal features of the disease (Mackintosh, Moore).
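
The sketch below shows the subtype-discovery idea in its simplest possible form: patients as points in a digital-marker space, grouped by k-means clustering. The cohort is synthetic and the two-cluster structure is planted, so this is the shape of the approach rather than any presenter’s method.

```python
# Illustrative sketch only: "subtype" discovery by clustering a synthetic
# cohort in a two-dimensional digital-marker space.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Two hidden groups with different (made-up) marker profiles.
group_a = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(60, 2))
group_b = rng.normal(loc=[2.0, 1.5], scale=0.5, size=(60, 2))
markers = np.vstack([group_a, group_b])

X = StandardScaler().fit_transform(markers)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print("patients per putative subtype:", np.bincount(labels))
```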

Assistive technology

Perhaps the most intuitive application of technology for dementia, assistive technology fills an intriguing space in this area. Works presented at the conference focused mainly on two topics: monitoring and smart environments. Monitoring tools include research on wearables for sleep and physical-activity patterns (De Vito), as well as the more ambitious goal of monitoring cognition (Campbell); a naive sketch of how wearable output might be summarised follows below. The increased availability of monitoring technologies, and the effort to integrate them into a more ecological setting, motivates research towards smart environments. These are typically home-based solutions that combine a multitude of monitoring systems, cameras, and robots (Kim), with the goal of assisting and monitoring patients’ daily-life activities. It is important to mention that these are mostly exploratory works or feasibility studies, so more validation is required before we can see their impact in real-world settings.
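
For a sense of how raw wearable output might be turned into the sleep and activity summaries mentioned above, here is a naive sketch operating on per-minute activity counts. The `wake_threshold` is a hypothetical placeholder; validated actigraphy algorithms such as Cole-Kripke are considerably more sophisticated.

```python
# Illustrative sketch only: naive sleep/activity summary from per-minute
# wrist-actigraphy counts. The threshold is a made-up placeholder.

def daily_summary(counts_per_min, wake_threshold=40):
    asleep = [c < wake_threshold for c in counts_per_min]
    sleep_minutes = sum(asleep)
    # Longest unbroken run of below-threshold ("quiet") minutes.
    longest = run = 0
    for quiet in asleep:
        run = run + 1 if quiet else 0
        longest = max(longest, run)
    return {
        "sleep_minutes": sleep_minutes,
        "sleep_fraction": sleep_minutes / len(counts_per_min),
        "longest_sleep_bout_min": longest,
    }

# Toy day: eight "quiet" hours followed by sixteen active ones.
counts = [5] * (8 * 60) + [120] * (16 * 60)
print(daily_summary(counts))
```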

Technology for interventions

Beyond assistive devices, technologies have been developed as part of protocols to track or stimulate disrupted abilities, or to support wellbeing. Works presented at the conference address these needs in three areas: investigation, cognitive intervention, and the patient/caregiver experience. The investigation branch includes innovations that can track or detect disease severity, with interesting contributions from Chen et al. on 3D skeleton trackers. For cognitive intervention, Scullin et al. propose a personal assistant that can support memory for future events and planned activities, with the indirect goal of training memory through consistent use of the device. Finally, some applications in the patient/caregiver experience refer to works exploring digital support for caregivers through telehealth interventions (Shofner et al., Utz). Other works, instead, aim to improve the overall visit experience for patients and their caregivers through games played on a touch-screen app (McCabe et al.). As with assistive technologies, most of this work is still exploratory or at the design stage. Nonetheless, it shows good potential to improve both our understanding of AD and patients’ and caregivers’ daily experience and quality of life.

The future of this work

Resources and funding allocated to technologies that support understanding, tracking, and intervention in Alzheimer’s disease research and practice have grown rapidly in recent years. Moreover, we have seen a variety of fields get involved, from the computational to engineering and the clinical. The innovations proposed at this conference seem more focused on increasing experts’ (that is, scientists’ or clinicians’) knowledge of the disease than on having a direct impact on patients’ lives. This might be because most of them are at an early, testing stage, and it is therefore hard to reach both the readiness level and the longitudinal evidence needed to assess their efficacy. The future of this research looks exciting and promising, especially with new and more sophisticated technologies constantly hitting the scene. In parallel, with the shift towards digitisation and big data, ethical practice and policies in data management and sharing are becoming highly relevant.
