Google Glass Might One Day Diagnose And Track Diseases Like HIV

Ozcan group member Steve Feng uses Google Glass to read a rapid diagnostic test. Credit: ACS Nano

When Google began releasing its new head-mounted computer to beta testers last year, technology enthusiasts were pumped. After all, the futuristic device, called Glass, might one day enable people to answer email hands-free or view driving directions projected onto the road in front of them. Others, though, have complained that Google Glass is a cool piece of tech that hasn’t yet justified its existence. (Still others have complained that Glass is creepy, but that’s a story for another day.)

Slowly but surely, though, beta testers in Google’s Explorers program have been making a case for the sophisticated eyewear by demonstrating its unique, sometimes scientific, capabilities. Physics teacher Andrew Vanden Heuvel famously shared his visit to the Large Hadron Collider, in Switzerland, with his students via Glass. Ohio surgeon Christopher Kaeding gave medical students a live, bird’s-eye view of a knee operation he performed while wearing the device.

And now, a research team led by Aydogan Ozcan of the University of California, Los Angeles, is using Google Glass to help diagnose and track disease. The engineers designed an app for the wearable computer that images and reads rapid diagnostic tests such as pregnancy pee sticks. It also links the results to a scannable QR code, stores them, and tags them geographically.

“The new technology could enhance the tracking of dangerous diseases and improve response time in disaster-relief areas or quarantine zones where conventional medical tools are not available or feasible,” Ozcan says.

Among the first to be selected by Google as Explorers, Ozcan and his team demonstrated the capabilities of their new app by using it to read a few types of home HIV and prostate cancer tests—ones that require an oral swab or a drop of blood to work. They recently published their efforts in ACS Nano (2014, DOI: 10.1021/nn500614k).

True, it doesn’t take much to read one of these tests: in the case of the HIV tests, either lines appear or they don’t. But the app could save time for clinicians who routinely have to read many different types of sticks and remember which symbols and lines signify a positive or negative result, Ozcan points out. After a single calibration run, the online tool recognizes a particular test stick and can even assign a biomarker concentration to the lines that appear.
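The paper doesn’t spell out its image-analysis code here, but the general idea behind machine reading of a test strip can be sketched in a few lines of Python. Everything below is hypothetical: the file name, pixel boxes, and thresholds are invented placeholders, not values from Ozcan’s app.

    # A minimal sketch of image-based test-strip reading -- not Ozcan's
    # actual algorithm. File name, pixel boxes, and thresholds are all
    # hypothetical placeholders.
    import numpy as np
    from PIL import Image

    def line_signal(image_path, box):
        """Mean darkness (0-255) inside a box drawn around one line."""
        strip = Image.open(image_path).convert("L")   # grayscale
        region = np.asarray(strip.crop(box), dtype=float)
        return 255.0 - region.mean()                  # darker line -> stronger signal

    control = line_signal("strip.jpg", (40, 100, 80, 160))
    test = line_signal("strip.jpg", (40, 220, 80, 280))

    if control < 20:   # no control line means the test itself failed
        print("invalid test")
    else:
        print("positive" if test > 20 else "negative")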

These rapid diagnostic tests typically use nanoparticles to create those lines (which is why Ozcan’s study appears in ACS Nano). Coated with an antibody, the particles recognize a specific biomarker in blood, urine, or saliva samples and bind to it. As the particle-biomarker complex flows down the test stick, rows of a different type of antibody already adsorbed to the stick recognize the biomarker too. Once this second set of antibodies binds to the biomarker, the nanoparticles become immobilized on the test strip, trapped in an antibody-biomarker-antibody sandwich. And once enough of the particles accumulate at a line, they form a visible red or blue stripe.
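Why line darkness tracks the amount of biomarker present can be captured with a toy model: the fraction of antibody sites filled at the test line rises with concentration and eventually saturates, roughly like a Langmuir binding isotherm. The constants in the sketch below are invented purely for illustration; real assays are calibrated empirically.

    # Toy Langmuir-style model of test-line capture. Invented numbers,
    # for illustration only.
    def captured_fraction(conc_ng_ml, k_half=5.0):
        """Fraction of antibody sites bound at a given concentration.
        k_half is a hypothetical concentration giving half saturation."""
        return conc_ng_ml / (conc_ng_ml + k_half)

    for c in (0.5, 5.0, 50.0):
        print(f"{c:5.1f} ng/mL -> {captured_fraction(c):.2f} of sites bound")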

According to Ozcan, once calibrated, his group’s Google Glass app can read biomarker lines on these test sticks with sensitivity down to a few parts per billion, far beyond what the naked eye can detect. So in addition to helping clinicians catalog their results, the app might eliminate interpretation errors or even analyze line intensities to gauge the aggressiveness of a disease such as, say, prostate cancer.
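That kind of quantitative readout depends on the calibration run mentioned above. In spirit, it amounts to mapping a measured line intensity back onto a curve of known concentrations, something like the sketch below, where all the numbers are invented for illustration.

    # Hypothetical calibration data: known concentrations (ng/mL) and the
    # line intensities they produced during a calibration run.
    import numpy as np

    cal_conc = np.array([0.0, 1.0, 5.0, 10.0, 50.0])
    cal_intensity = np.array([2.0, 15.0, 60.0, 110.0, 200.0])

    def concentration(intensity):
        """Interpolate the calibration curve; assumes a monotonic response."""
        return float(np.interp(intensity, cal_intensity, cal_conc))

    print(concentration(85.0))  # falls between the 5 and 10 ng/mL points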

Although there’s no new chemistry in Ozcan’s Google Glass report, “the paper is great as a message that the convergence of nanoparticles and consumer electronic devices has tremendous potential for making quantitative diagnostic technologies available outside laboratory settings, such as point-of-care and personalized medicine applications,” says Russ Algar, a chemist at the University of British Columbia. Algar and others have been adapting common electronics such as cell phones to image cells and viruses and to detect proteins and chemicals for point-of-care use in developing countries. The hands-free aspect of Google Glass could be advantageous in many settings, Algar says, but because the technology is new and has limited computational power, it’s still unclear whether it will eventually outperform cell phones and other more powerful mobile devices.

Ozcan acknowledges the still-nascent state of Glass. One challenge his team faced in getting the app up and running was Google’s frequent software updates to the device. “This sometimes entirely failed our app, and we needed to fix it on a continuous basis,” Ozcan says. “But with each successive iteration of Glass, we have experienced better performance.”

The next step for the UCLA team’s app is commercialization, Ozcan says. Los Angeles-based Holomic, a firm Ozcan cofounded, is already—perhaps not surprisingly—interested.

Author: Lauren Wolf
