Future Wearables: Smart Contact Lenses Analyze Your Tears

Alex Senemar
Published in Sherbit News
6 min read · Oct 30, 2015

The Sherbit Blog is back after a short hiatus! We were hard at work finishing the beta version of our iPhone client, and we’ve finally been approved for the App Store! (You’ll need an invitation code to start using the app. If you haven’t signed up yet, just click “Get Early Access” above; if you’ve already signed up, don’t worry — we’ll start moving people off the waitlist soon!) This week, we’ll be continuing our series on the “Future of Wearables.” Previously, we looked at “smart” underwear and “ingestible” sensors, two fascinating forays into medical technology. Now, Google is developing “smart” contact lenses to analyze key biomarkers found in your tears — and patent filings and public statements suggest even more ambitious plans for the technology.

Earlier this year, Google’s leadership announced a major restructuring of the multinational conglomerate, separating many components of the business and placing them under an umbrella corporation with the friendly name “Alphabet.” The profitable parts of Google — YouTube, web search, Maps, and advertising — will remain a part of ‘core Google.’ Meanwhile, Google’s experimental, research-oriented divisions — like the secretive Google X, and the home automation team at Nest Labs — will be spun off as independent companies, to make clearer to investors where and how Alphabet is earning (and losing) money.

A map of Google’s new corporate structure.

Google co-founder Sergey Brin recently revealed a new Alphabet subsidiary called “Google Life Sciences,” formerly a division of Google X, dedicated to healthcare and biotechnology research. We’ve covered some of Alphabet’s ambitious healthcare-related projects in previous posts — from ‘curing death’ with “anti-aging” technology to ‘precision medicine’ using a global database of DNA. To support the Life Sciences division, Google has been funding and acquiring various biotechnology companies, in fields as diverse as oral drug delivery, bioinformatics, autism and cancer detection, and even tremor-correcting “smart” spoons for patients with Parkinson’s Disease. But Google Life Sciences’ first, highly publicized project is a venture into a different (and very profitable) sector of the healthcare industry: diabetes.

‘Diabetes’ refers to a group of metabolic diseases related to the regulation of insulin, a hormone that helps convert sugar (glucose) in the blood into energy. Without insulin, the body cannot use glucose for fuel, so it begins burning fat for energy — releasing toxic levels of ketones into the bloodstream. In diabetic patients, either the pancreas does not produce enough insulin (“type 1”), or cells do not respond properly to the insulin produced (“type 2”). It is estimated that nearly 10 percent of adults worldwide have diabetes, and each year the disease kills more people than breast cancer and AIDS combined. The underlying causes of diabetes are not well understood, but it is clear that its incidence is much higher in ‘developed,’ highly industrialized countries.

People with diabetes must check their blood sugar levels throughout the day, and inject insulin at appropriate times to keep their blood sugar under control. Many things can cause blood sugar levels to change, often unexpectedly, and ‘pricking’ yourself four or five times a day can become very inconvenient. Some non-needle glucose monitoring systems have been explored in the past: a “thumb cuff” that applies pressure to temporarily occlude blood flow (like taking a blood pressure measurement), then detects light transmitted through the finger to make an inference about the blood’s content; or a wristwatch that uses low-level electric currents to pull fluid from the skin. But Google is betting that passively monitoring glucose levels in tears will be even less invasive, and more comprehensive, than other technologies for managing diabetes.

“Google reveals smart contact lens” (The Telegraph)

The lens consists of a wireless chip and a miniaturized glucose sensor, sandwiched between two soft layers of lens material — so the electronics never come into contact with the surface of the eye. They are arranged outside the pupil and iris to avoid obstructing the wearer’s vision; a small pinhole in the lens allows tear fluid to seep into the sensor, and a tiny antenna transmits the collected data to an external device, where it can be read and analyzed. According to the patent filed earlier this year, the system will be powered by drawing energy from radio-frequency waves transmitted by a ‘reader’ device, possibly integrated into eyeglasses, jewelry, or clothing — anything close enough to communicate effectively with the lenses.
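To make that data flow concrete, here is a minimal Python sketch of how a passive lens and an external reader might interact, based only on the architecture described above. The class names, sample values, and units are illustrative assumptions rather than Google’s actual design.

```python
# Hypothetical sketch of the lens <-> reader data flow described above.
# All class and field names are illustrative; Google has not published an API.
import random
from dataclasses import dataclass


@dataclass
class TearSample:
    glucose_mg_dl: float  # tear glucose concentration (much lower than blood glucose)


class SmartLens:
    """Models the sensor sandwiched between the two soft lens layers."""

    def read_tear_glucose(self) -> TearSample:
        # Stand-in for the electrochemical sensor fed by the pinhole in the lens.
        return TearSample(glucose_mg_dl=random.uniform(2.0, 10.0))


class Reader:
    """Models the external RF 'reader' (glasses, jewelry, clothing) that powers the lens."""

    def __init__(self, lens: SmartLens) -> None:
        self.lens = lens

    def poll(self) -> TearSample:
        # In the patented design the lens is passive: the reader's radio-frequency
        # field supplies power, and the lens antenna returns the measurement.
        return self.lens.read_tear_glucose()


if __name__ == "__main__":
    reading = Reader(SmartLens()).poll()
    print(f"Tear glucose: {reading.glucose_mg_dl:.1f} mg/dL")
```

The key design point, as described in the patent, is that the lens itself carries no battery; everything that stores, displays, or interprets the data lives in the reader.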

The “smart” contact lens project began with a research paper published by electrical engineers at the University of Washington, proposing lenses with built-in electronics for ‘augmented reality’: perhaps to aid people with impaired hearing, or as an indicator in a videogame. They pursued the idea with limited funding from the National Science Foundation, but were lured to Google by the enormous resources it offered, particularly its cutting-edge microchips. In July 2014, the Swiss pharmaceutical conglomerate Novartis announced it would join the Google project, contributing expertise from its eye care division Alcon (the second-largest contact lens supplier in the world); in exchange, Novartis will receive exclusive licensing rights to distribute the product in the future.

The technology’s financial promise lies in its multitude of applications. Beyond glucose, tears could be tested for levels of cholesterol, sodium, and potassium — or for lacryglobin, a biomarker for many types of cancer, as a warning system for cancer patients in remission. Beyond passive sensors, the Google team also plans to develop more assistive applications; one prototype uses photodiodes to detect light hitting the eye and infer where the eye is looking, so the lens can “autofocus” by adjusting its shape. Another team of researchers at the University of Michigan is experimenting with graphene to develop lenses that detect infrared wavelengths — opening up the possibility of night-vision contact lenses. Google’s patent filing suggests that the lenses could eventually incorporate cameras — for example, to alert a blind wearer to an approaching car by capturing an image and converting the information to sound.
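As a rough illustration of the first step in the “autofocus” idea (inferring where the eye is looking), the toy Python function below turns relative photodiode readings into a gaze offset. The four-diode layout, the function name, and the simple differencing scheme are assumptions made for illustration; Google has not published the prototype’s actual signal processing.

```python
# Toy illustration of gaze inference from photodiode readings.
# The four-diode layout and the simple differencing scheme are assumptions;
# the prototype's actual geometry and signal processing have not been published.
from typing import Dict, Tuple


def infer_gaze_offset(intensities: Dict[str, float]) -> Tuple[float, float]:
    """Return a rough (x, y) gaze offset from the relative light on each diode."""
    total = sum(intensities.values()) or 1.0
    x = (intensities["right"] - intensities["left"]) / total
    y = (intensities["top"] - intensities["bottom"]) / total
    return x, y


# Example: more light reaching the right-hand diode suggests the eye is turned right.
print(infer_gaze_offset({"top": 0.2, "bottom": 0.2, "left": 0.1, "right": 0.5}))
```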

“Google Glass”

There are still many obstacles for researchers to overcome before “smart” contact lenses become viable medical products: the safety of the devices, the cost to manufacture them, and the accuracy of their measurements, to name just a few. Still, the engineers at Google are optimistic. Before Babak Parviz joined Google as one of the project’s leads, he predicted in a 2009 article a sort of “Google Glass” in a contact lens: a ‘heads-up’ display using an array of microlenses that “pushes” a virtual image forward, to within the range of a human eye’s focus. In combination with a smartphone, a high-resolution lens could then translate speech into captions, show visual cues from a navigation system, or provide any of the features of the now-infamous “Google Glass” teaser (above). “With basic image processing and Internet access,” Parviz wrote, “a contact-lens display could unlock whole new worlds of visual information, unfettered by the constraints of a physical display.”

The British writer Jesse Armstrong imagined an evolved form of a similar technology in “The Entire History of You,” an episode of the sci-fi anthology series Black Mirror. (You can view the episode in its entirety for free on YouTube, or via Netflix.) In this vision of the future, ‘smart lenses’ are no longer merely ‘wearables’ — they are implants inserted at birth, persistently recording all human activity and uploading it to a ‘cloud,’ becoming inseparable from human experience in extremely affluent societies. This is not so different from Alphabet’s vision. Ray Kurzweil, Director of Engineering at Google, recently predicted that in just fifteen years, human brains will be able to connect directly to the cloud: “Our thinking,” he proclaimed, “will be a hybrid of biological and non-biological thinking.” In Armstrong’s story, the protagonist is tormented by his inability to escape the images recorded by his ‘non-biological’ brain; if Kurzweil’s prophecy comes true, will humans be celebrating the triumph of Google’s technological genius… or cursing the cruel dystopia that came at its price? Stay tuned.

Originally published at www.sherbit.io on October 30, 2015.
