Microscopes outfitted with AI-driven augmented reality could improve the accuracy of cancer diagnoses.
What’s happened: Google Health developed an attachment for analog microscopes that outlines signs of breast and prostate cancer in real time.
How it works: A computer-vision system spots cancer in a cell slide, while augmented-reality tech superimposes the AI’s prediction over the slide at around 27 frames per second.
- The developers reworked the Inception V3 image classifier into a fully convolutional neural network, which lets the system flag tumorous patterns across an entire field of view far faster than classifying small patches one at a time.
- A camera captures the same field of view the pathologist sees, and a display projects the AI's prediction into the microscope eyepiece so the outline appears superimposed on the slide itself (a minimal version of this loop is sketched below).
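The capture-predict-overlay loop might look roughly like the sketch below. This is not Google's code: the `load_heatmap_model` placeholder, the 0.5 probability threshold, and the OpenCV window standing in for the eyepiece display are all assumptions for illustration.

```python
# Minimal sketch of a real-time AR-microscope overlay loop (illustrative only).
import cv2
import numpy as np

def load_heatmap_model():
    """Placeholder for a fully convolutional classifier (e.g. an Inception V3
    backbone adapted for dense inference). Returns a dummy all-zero heatmap
    so the loop runs end to end without a trained model."""
    def predict(frame_rgb: np.ndarray) -> np.ndarray:
        h, w = frame_rgb.shape[:2]
        # One tumor-probability value per coarse patch of the field of view.
        return np.zeros((h // 16, w // 16), dtype=np.float32)
    return predict

def main():
    model = load_heatmap_model()
    cap = cv2.VideoCapture(0)  # camera aimed at the microscope's field of view
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        heatmap = model(rgb)                                    # coarse tumor-probability map
        heatmap = cv2.resize(heatmap, (frame.shape[1], frame.shape[0]))
        mask = (heatmap > 0.5).astype(np.uint8)                 # threshold into a binary region
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        cv2.drawContours(frame, contours, -1, (0, 255, 0), 2)   # outline suspected tumor tissue
        cv2.imshow("overlay", frame)                            # stand-in for the eyepiece display
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    main()
```

In the real device the pathologist still views the slide optically; only the prediction is projected into the light path, so the overlay must keep pace with the frame rate to track as the slide moves.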
Behind the news: Pathologists use microscopes to measure the size of tumors that have spread to nearby lymph nodes and to count cells nearing or undergoing mitosis. That information tells them how aggressively a patient’s cancer is growing and spreading.
Why it matters: Interpreting cell slides is subjective, and one pathologist’s understanding can differ greatly from another’s. Patients in locations where trained pathologists are scarce tend to suffer most from this inconsistency. AI-enhanced tools could help make diagnoses more reliable.
We’re thinking: AI is a natural complement to digital microscopes, but analog microscopes are far more common. This technology promises to upgrade those tools at a fraction of the cost of replacing them.