
Google says its Lens image search can now help identify skin conditions

‘Just take a picture or upload a photo through Lens, and you’ll find visual matches to inform your search.’

Illustration: The Verge

Google Lens is now able to offer more information on that pesky rash that you’re not sure whether to worry about. In a blog post published this week, Google outlined how the Lens image search feature built into its apps on iOS and Android can “search for skin conditions” like an “odd mole or rash.” It’ll also work on other parts of your body if you want more information about a bump on your lip, line on a nail, or hair loss from your scalp. 

“Just take a picture or upload a photo through Lens, and you’ll find visual matches to inform your search,” the blog post reads. Crucially, however, Google specifically warns that results are “informational only and not a diagnosis” and says users should “consult your medical authority for advice.” The feature is available to everyone in the US, across all languages, Google spokesperson Craig Ewer confirmed to The Verge.

Google says Lens can identify skin conditions from a photograph. Image: Google

Google has been exploring the use of AI image recognition for skin conditions for years. At its I/O developer conference in 2021, the company previewed a tool that attempted to identify skin, hair, and nail conditions using a combination of photos and survey responses. At the time, Google said the tool could recognize 288 different conditions and would present the correct condition in its top three suggestions 84 percent of the time.

As of this writing, Google’s website for its DermAssist tool says it’s “currently undergoing further market testing through a limited release.” It adds that while it’s CE-marked as a Class 1 Medical Device (a label for goods in the European Economic Area), it’s not been evaluated by the United States FDA and “is intended for informational purposes only and does not provide a medical diagnosis.”

That’s all well and good, but it won’t prevent people from trying to use tools like these for diagnosis. Arguably, adding that sort of disclaimer only shifts liability onto the user while letting Google offer the same underlying service.

There’s good reason, too, to be cautious about AI diagnostic tools. One persistent criticism of software that identifies skin conditions is that it’s less accurate for users with darker skin tones. Research cited by The Guardian in 2021 noted a lack of skin type category data across many freely available image databases used to train AI systems, as well as a lack of images of dark-skinned individuals in databases that did include this information.

Google has been working to address these complaints. Last year, it partnered with Harvard professor Ellis Monk to promote his Monk Skin Tone (MST) Scale and best practices for how it can be used in AI development. The search giant says its teams have been using the MST Scale for “ML labeling tasks, ethnographic research, and fairness testing.”

The company also suggested in 2021 that its deep learning system was actually more accurate at identifying skin conditions for Black patients. In slides provided by Google to Motherboard, the company said its system had an accuracy rate of 87.9 percent for Black patients, higher than for any other ethnicity.

In response to The Verge’s questions about how well the feature works across different skin tones, Ewer said the company attempted to build the feature in an equitable way by working with organizations and clinicians that serve patients from “diverse backgrounds.” He added that the company worked with dermatologists who are experts in different skin tones to curate thumbnail images.

Update June 16th, 3:15AM ET: Updated with comment from Google.