Google tests catching heart, eye issues from smartphone sensors

Oakland, Calif. (Reuters): Google on Thursday announced its latest plans to use smartphones to monitor health, saying it will test whether capturing heart sounds and eyeball images can help people identify issues from home.

The company, a unit of Alphabet Inc., is investigating whether a smartphone's built-in microphone can detect heartbeats and murmurs when the phone is placed over the chest, head of health AI Greg Corrado told reporters. The readings may enable early detection of heart valve disorders, he said.

“It’s not at the level of diagnosis, but at the level of knowing whether there is a high risk,” Corrado said, adding that questions remain about accuracy.

The eye research focuses on detecting diabetes-related diseases from photographs. Google said it had reported “early promising results” using tabletop cameras in clinics and would now test whether smartphone photos might work as well.

Corrado said his team “sees a future where people can better understand and make decisions about their health status from home with the help of their doctors.”

Google also plans to test whether its artificial intelligence software can analyze ultrasound screenings taken by less-skilled technicians, as long as they follow a set pattern. The technology could address a shortage of highly skilled workers and allow expectant parents to be evaluated at home.

The projects follow announcements last year about measuring heart and breathing rates using smartphone cameras – features now available on multiple devices via the Google Fit app.

While Google has long sought to bring its technical expertise to health care, it has said little about whether the efforts are generating significant revenue or usage.

Corrado said launching the capability was “a big step” and that adoption would take time.

“When you think about breathing and heart rate, any level of adoption today just scratches the surface,” he said.