The processing power of today’s smartphones, along with their high-quality cameras and an array of ever-improving sensors, makes them potentially useful in the medical world for diagnosing different conditions quickly. Let’s take a look at a few early-stage but highly promising examples.
A smartphone test for female fertility: For couples trying to conceive, knowing when a woman is ovulating can be key, and a smartphone-based saliva test could remove a lot of the guesswork. The technology was developed by scientists at Brigham and Women's Hospital and demonstrated in 2018, offering a possible alternative to urine tests or basal body temperature analysis as a way of gauging fertility.
The test consists of a glass slide onto which a saliva sample is deposited. As the fluid dries, it crystallizes into a fern-like structure with distinct patterns, a process known as salivary ferning, which can reveal which stage of the menstrual cycle the subject is in. The slide is inserted into an optical device placed over the smartphone's camera. An artificial intelligence app then analyzes the sample, with experiments showing that it could identify the ovulation phase of the menstrual cycle with 99 percent accuracy.
Type 2 diabetes: A promising recent study used a smartphone to detect type 2 diabetes with striking accuracy, using nothing but the camera. The method relies on photoplethysmography (PPG), in which changes in blood volume are detected by shining light onto tissue.
A team from the University of California, San Francisco used this approach, with the help of a smartphone camera and flash, and a deep-learning algorithm trained on 2.6 million PPG recordings, to detect diabetes in three separate cohorts. The technique accurately detected diabetes in around 80 percent of subjects and proved even more precise when basic patient data was factored in, such as body mass index and age.
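The camera-based half of this pipeline is easy to illustrate. With a fingertip pressed over the lens and flash, the average brightness of each video frame rises and falls with blood volume, yielding a crude PPG waveform. The sketch below is a minimal toy illustration using synthetic frames, not the UCSF team's algorithm; the function names and the simple peak-counting heart-rate estimate are hypothetical simplifications.

```python
import numpy as np

def ppg_from_frames(frames):
    """Average the red channel of each frame to get one PPG sample per frame.

    frames: array of shape (n_frames, height, width, 3), RGB.
    Skin pressed against the lens brightens and darkens slightly as
    blood volume pulses, so this per-frame average traces the pulse.
    """
    return frames[..., 0].mean(axis=(1, 2))

def estimate_bpm(signal, fps):
    """Rough heart-rate estimate: count waveform peaks above the mean."""
    centered = signal - signal.mean()
    # A peak is a sample larger than both neighbours and above zero.
    peaks = [i for i in range(1, len(centered) - 1)
             if centered[i] > centered[i - 1]
             and centered[i] > centered[i + 1]
             and centered[i] > 0]
    duration_s = len(signal) / fps
    return 60.0 * len(peaks) / duration_s

# Synthetic demo: 10 s of 30 fps "video" whose red channel
# pulses at 1.2 Hz, i.e. 72 beats per minute.
fps, seconds = 30, 10
t = np.arange(fps * seconds) / fps
pulse = 128 + 10 * np.sin(2 * np.pi * 1.2 * t)
frames = np.tile(pulse[:, None, None, None], (1, 8, 8, 3))
signal = ppg_from_frames(frames)
print(round(estimate_bpm(signal, fps)))  # prints 72, the synthetic pulse rate
```

The real study, of course, fed the raw waveforms to a deep-learning model trained on millions of recordings rather than counting peaks, but the input signal is the same kind of camera-derived trace.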
Concussions: One way scientists are looking to improve how we detect and treat concussions is by looking into the eyes, where erratic movements or an inability to track moving objects can be indicative of a brain injury. An app developed by University of Washington scientists uses the smartphone's flash to stimulate the eye, its camera to record a three-second video, and a deep-learning algorithm to detect changes in the way pupils respond to the light. In a pilot study involving 48 subjects, the team was able to use this approach to diagnose concussions with almost perfect accuracy.
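The measurement underneath that deep-learning step can be sketched in a few lines: track how much the pupil (the darkest region of the eye) shrinks after the flash. The toy version below uses simple pixel thresholding on synthetic frames in place of the deep network described above; all names here are hypothetical, and a real pupillometry pipeline is far more robust.

```python
import numpy as np

def pupil_area(frame_gray, threshold=60):
    """Pupil area in pixels: count pixels darker than the threshold."""
    return int((frame_gray < threshold).sum())

def constriction_amplitude(frames_gray, threshold=60):
    """Fraction by which the pupil shrinks over the clip.

    A sluggish or reduced constriction after a light stimulus is the
    kind of signal a concussion screen looks for.
    """
    areas = np.array([pupil_area(f, threshold) for f in frames_gray])
    baseline = areas[0]
    return (baseline - areas.min()) / baseline

def disc_frame(radius, size=64):
    """Synthetic eye image: a dark disc (the pupil) on a bright field."""
    y, x = np.ogrid[:size, :size]
    frame = np.full((size, size), 200, dtype=np.uint8)
    frame[(y - size // 2) ** 2 + (x - size // 2) ** 2 <= radius ** 2] = 20
    return frame

# A short clip in which the pupil constricts after the flash.
frames = [disc_frame(r) for r in [12, 12, 10, 8, 7, 7]]
print(round(constriction_amplitude(frames), 2))
```

A healthy pupil constricts quickly and substantially; the app's model learns what deviations from that response look like across many injured and uninjured eyes.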
Pancreatic cancer: Another app from the University of Washington detects signs of jaundice, one of pancreatic cancer's early symptoms. More specifically, the app uses the smartphone's camera and computer vision algorithms to search for elevated levels of a substance called bilirubin, which leads to the yellowing of the skin and eyes seen in jaundice. It does this by assessing the wavelengths of light absorbed by the white part of the eye. In testing, the app correctly detected early signs of pancreatic cancer 89.7 percent of the time.
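The color physics involved can be sketched simply: bilirubin absorbs strongly at blue wavelengths, so a yellowed sclera reflects relatively little blue light compared with red and green. The snippet below is a crude, hypothetical proxy for scleral yellowing, not the app's actual computer-vision model, which also has to handle lighting, camera calibration, and locating the sclera in the image.

```python
import numpy as np

def yellowness_index(sclera_rgb):
    """Crude yellowing score: the blue deficit relative to overall brightness.

    Bilirubin absorbs blue light, so a jaundiced sclera's blue channel
    drops relative to red and green. Higher score = more yellowing.
    (Hypothetical proxy for illustration only.)
    """
    pixels = sclera_rgb.reshape(-1, 3).astype(float)
    r, g, b = pixels.mean(axis=0)
    return 1.0 - b / ((r + g + b) / 3.0)

# Healthy-looking sclera: nearly white, all channels similar.
healthy = np.full((10, 10, 3), [230, 228, 225], dtype=np.uint8)
# Jaundiced-looking sclera: blue channel suppressed.
jaundiced = np.full((10, 10, 3), [230, 215, 140], dtype=np.uint8)
print(yellowness_index(healthy) < yellowness_index(jaundiced))  # prints True
```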
Skin cancer: Researchers at Stanford University showed how a camera and artificial intelligence can detect early-stage melanomas, after the AI was trained on more than 100,000 images of skin lesions. The team then ran experiments comparing the system's performance with that of professional dermatologists and found that it could classify skin cancers on a comparable level to the trained experts.
HIV and syphilis: Columbia University created a multifunctional dongle that can be plugged into a smartphone to enable it to detect both HIV and syphilis. The platform relies on disposable plastic cassettes loaded with reagents that can detect antibodies for both conditions, and needs just a drop of the patient's blood to do it. Results arrive in 15 minutes. Considering the low manufacturing cost of the dongle, this device could be a promising healthcare solution in developing areas.