Researchers at the University of Washington have developed an app, BiliScreen, that could allow people to easily screen for pancreatic cancer and other diseases by taking a selfie.
Pancreatic cancer has one of the worst survival rates of any cancer, largely because there are no telltale symptoms or non-invasive screening tools to catch a tumor before it spreads.
BiliScreen uses a smartphone camera, computer vision algorithms and machine learning tools to detect elevated bilirubin levels in a person’s sclera, the white part of the eye.
Lead author Alex Mariakakis, a doctoral student at the Paul G. Allen School of Computer Science & Engineering, said:
“The problem with pancreatic cancer is that by the time you’re symptomatic, it’s frequently too late. The hope is that if people can do this simple test once a month — in the privacy of their own homes — some might catch the disease early enough to undergo treatment that could save their lives.”
The new app is scheduled to be presented on Sept. 13 at Ubicomp 2017, the Association for Computing Machinery’s International Joint Conference on Pervasive and Ubiquitous Computing.
Symptoms of pancreatic cancer that the BiliScreen app tracks
One of the symptoms of pancreatic cancer is jaundice – a yellow discoloration of the skin and eyes caused by a buildup of bilirubin in the blood. Identifying jaundice when bilirubin levels are still minimal is difficult because the discoloration may not yet be visible to the naked eye. Since early detection is key in tackling cancer, this delay puts individuals at greater risk.
BiliScreen could enable an entirely new screening program for at-risk individuals. In a clinical study of 70 people, the BiliScreen app, used in conjunction with a 3-D printed box that controls the eye’s exposure to light, correctly identified cases of concern 89.7 percent of the time, compared with the blood test currently used.
BiliScreen uses a smartphone’s built-in camera and flash to collect pictures of a person’s eye as they snap a selfie. The team developed a computer vision system to automatically isolate the white parts of the eye. The app then calculates color information from the sclera, based on the wavelengths of light being reflected and absorbed, and correlates it with bilirubin levels using machine learning algorithms.
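The pipeline described above — isolate the sclera, extract color information, and map it to a bilirubin estimate — can be sketched in very simplified form. The feature set (mean RGB plus a "yellowness" proxy) and the plain least-squares model below are illustrative assumptions, not the study's actual algorithm:

```python
import numpy as np

def sclera_color_features(pixels):
    """pixels: (N, 3) array of RGB values sampled from the sclera mask."""
    mean_rgb = pixels.mean(axis=0)
    # Illustrative "yellowness" proxy: red and green high relative to blue
    # is characteristic of the yellow cast seen in jaundice.
    yellowness = (mean_rgb[0] + mean_rgb[1]) / 2.0 - mean_rgb[2]
    return np.array([mean_rgb[0], mean_rgb[1], mean_rgb[2], yellowness])

def fit_bilirubin_model(feature_rows, bilirubin_levels):
    """Fit an ordinary least-squares model (with intercept) from color
    features to measured bilirubin levels from blood tests."""
    X = np.column_stack([feature_rows, np.ones(len(feature_rows))])
    coeffs, *_ = np.linalg.lstsq(X, bilirubin_levels, rcond=None)
    return coeffs

def predict_bilirubin(coeffs, features):
    """Estimate bilirubin for a new eye image's color features."""
    return float(np.dot(coeffs[:-1], features) + coeffs[-1])
```

In practice a system like this would be trained on many (eye image, blood-test bilirubin) pairs and would likely use a more robust regression method than plain least squares.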
In order to account for different lighting conditions, the team tested BiliScreen with two different accessories: paper glasses printed with colored squares to help calibrate color, and a 3-D printed box that blocks out ambient lighting. When the BiliScreen app was used with the box accessory, it produced slightly better results.
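As a rough illustration of what calibrating against printed reference squares involves, the sketch below applies a simple white-patch gain correction: the photographed color of a square with a known reference color is used to divide out the lighting cast. This is a generic technique chosen for illustration; the study's actual calibration procedure is not detailed here:

```python
def calibration_gains(observed_white, reference_white=(255.0, 255.0, 255.0)):
    """Per-channel gains mapping the photographed white square
    back to its known reference color."""
    return tuple(ref / obs for ref, obs in zip(reference_white, observed_white))

def correct_color(rgb, gains):
    """Apply the gains to a measured color, clamping to the valid range."""
    return tuple(min(255.0, channel * gain) for channel, gain in zip(rgb, gains))
```

Under different lighting, the same sclera photographs with different color casts; dividing out the cast measured on a known patch makes readings more comparable across conditions, which is why the light-blocking box (a fully controlled environment) performed slightly better still.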
The team continues to work on testing the app on a wider range of people at risk for jaundice and underlying conditions, as well as continuing to make usability improvements.