
Eye Scans to Detect Cancer and Alzheimer’s Disease

Posted by Neelansh Bhartiya, 03/09/2017

It is said that eyes are the window to the soul. Well, what about the window to our health?

We recently reported on two AI systems trained to detect eye diseases, specifically diabetic retinopathy and congenital cataracts. Now, we’ve found groups extending that concept to illnesses beyond the eye. Two new projects pair imaging systems with advanced software in an effort to catch early symptoms of pancreatic cancer and Alzheimer’s disease.

At the University of Washington, a team led by computer scientist Shwetak Patel created a smartphone app that screens for pancreatic cancer with a quick selfie. After developing the system over the last year and a half, the team recently tested it in a clinical study of 70 people, identifying cases of concern with 89.7 percent sensitivity and 96.8 percent accuracy.
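For readers who want to see how those two figures relate to raw screening results, here is a minimal sketch in Python; the counts are invented for illustration and are not the study's data.

```python
# Illustration only: how sensitivity and accuracy are computed from
# raw screening counts. The counts below are invented, not study data.

def sensitivity(tp: int, fn: int) -> float:
    """Share of true disease cases that the screen correctly flags."""
    return tp / (tp + fn)

def accuracy(tp: int, tn: int, fp: int, fn: int) -> float:
    """Share of all screened people, healthy or not, classified correctly."""
    return (tp + tn) / (tp + tn + fp + fn)

# Hypothetical counts for a 70-person study (made up for the example):
tp, tn, fp, fn = 26, 40, 1, 3
print(f"sensitivity = {sensitivity(tp, fn):.1%}")          # 89.7% with these counts
print(f"accuracy    = {accuracy(tp, tn, fp, fn):.1%}")     # 94.3% with these counts
```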

While the app is not yet ready for use as a diagnostic tool on the basis of a single small trial, it could soon give doctors a way to monitor disease progression in patients undergoing treatment with a simple photograph rather than a blood test, says first author Alex Mariakakis, a UW graduate student.

The app, called BiliScreen, monitors levels of bilirubin, a metabolic byproduct that builds up in cases of jaundice and causes a yellowing of the skin and the whites of the eyes. Increased bilirubin in the blood is one of the earliest symptoms of pancreatic cancer, but today it is detected with a blood test in a doctor's office, typically ordered only for patients at high risk or who show other symptoms.
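BiliScreen's exact color model and its calibration against blood bilirubin aren't described here, so the snippet below is only a rough sketch of how scleral "yellowness" could be quantified, using the b* channel of the CIELAB color space (OpenCV stores 8-bit b* offset by 128); the function name and inputs are ours, not the app's.

```python
# Rough sketch, NOT BiliScreen's actual model: quantify yellowness of the
# sclera via the b* channel of CIELAB, where larger b* means more yellow.
import cv2
import numpy as np

def mean_scleral_yellowness(bgr_image: np.ndarray, sclera_mask: np.ndarray) -> float:
    """Mean b* over the masked sclera pixels (mask is a 0/255 uint8 image)."""
    lab = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2LAB)
    # OpenCV stores 8-bit b* shifted by 128; subtract to center it at 0.
    b_star = lab[:, :, 2].astype(np.float32) - 128.0
    return float(b_star[sclera_mask > 0].mean())
```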

To create an easy-to-use, non-invasive early screen for pancreatic cancer—which claimed the life of study co-author James Taylor’s father—the researchers designed a three-step system. First, users take a selfie with their smartphone using one of two accessories to control for environmental conditions: either a cardboard box to block out the light, or colored glasses to give a color reference. In the study, the team found that the box was slightly better at controlling light conditions than the glasses.
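The post doesn't spell out how the color reference is used, but a simple way to exploit a patch of known color (such as a region of the glasses) is per-channel gain correction; the sketch below is a hypothetical illustration of that idea, not the app's published pipeline.

```python
# Hypothetical illustration: correct an image's color cast using a patch
# whose true color is known (e.g. part of the reference glasses).
import numpy as np

def calibrate_to_reference(image: np.ndarray,
                           ref_patch: np.ndarray,
                           ref_true_rgb: np.ndarray) -> np.ndarray:
    """Scale each channel so the reference patch matches its known color.

    image:        H x W x 3 float array in [0, 1]
    ref_patch:    crop of the reference region taken from the same image
    ref_true_rgb: the patch's known color under neutral light, shape (3,)
    """
    observed = ref_patch.reshape(-1, 3).mean(axis=0)    # measured patch color
    gains = ref_true_rgb / np.maximum(observed, 1e-6)   # per-channel correction
    return np.clip(image * gains, 0.0, 1.0)
```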

Once an image is captured, the software relies on computer vision algorithms to isolate the sclera, or white of the eye, from the skin and pupil. The clinical trial version of the software was based on an algorithm from Microsoft Research called GrabCut, says Mariakakis. A newer iteration, which works even better to isolate the sclera, uses a fully convolutional neural network to identify the most important local features in each image.
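GrabCut is available in OpenCV, so a minimal version of that first segmentation step might look like the sketch below; the bounding box, iteration count, and post-processing are assumptions on our part, and a real pipeline would still need to separate the sclera from the iris and skin inside the segmented region.

```python
# Minimal GrabCut sketch with OpenCV. The study's exact parameters and
# pre/post-processing are not given here; eye_box is a hypothetical
# bounding box around the eye, e.g. from a facial-landmark detector.
import cv2
import numpy as np

def segment_eye_region(bgr_image: np.ndarray, eye_box: tuple) -> np.ndarray:
    """Return a 0/255 mask of likely-foreground pixels inside eye_box (x, y, w, h)."""
    mask = np.zeros(bgr_image.shape[:2], np.uint8)
    bgd_model = np.zeros((1, 65), np.float64)  # GrabCut's internal GMM state
    fgd_model = np.zeros((1, 65), np.float64)
    cv2.grabCut(bgr_image, mask, eye_box, bgd_model, fgd_model,
                5, cv2.GC_INIT_WITH_RECT)
    # Keep definite and probable foreground; zero out background labels.
    fg = (mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD)
    return np.where(fg, 255, 0).astype(np.uint8)
```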

For more details, see: https://spectrum.ieee.org/the-human-os/biomedical/diagnostics/eye-scans-to-detect-cancer-and-alzheimers-disease

Author’s Bio

...

Author’s name: Neelansh Bhartiya
