The algorithm was trained on nearly 130,000 images of moles, rashes and lesions using a deep learning image recognition technique. It was then tested head-to-head against 21 human dermatologists, performing “at least” 91% as well as the doctors. In the future, the researchers suggest the program could be used to create a mobile app for detecting skin cancer — the most common form of cancer diagnosed in the U.S. — at home.
“We realized it was feasible, not just to do something well, but as well as a human dermatologist,” said Sebastian Thrun, founder of research and development lab Google X and lead author of the study. “That’s when our thinking changed. That’s when we said, ‘Look, this is not just a class project for students, this is an opportunity to do something great for humanity.'”
The algorithm uses a type of computer software called a convolutional neural network, which learns to recognize different concepts. By feeding the computer digital images labeled as showing skin cancer or not, researchers can “tell” it which is which. The machine then tries to learn rules that predict whether a new image shows cancer.
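The labeled-training loop described above can be sketched with a toy example. This is not the study's convolutional network — it is a minimal logistic-regression classifier on synthetic four-pixel "images", with all names and data invented for illustration — but it shows the same idea: labeled examples drive the model to learn a rule that separates the two classes.

```python
import math
import random

# Toy stand-in for supervised training: instead of a deep convolutional
# network on 130,000 dermatology photos, a tiny logistic-regression model
# learns from 4-pixel synthetic "images". Class 1 ("lesion") images are
# bright; class 0 ("clear skin") images are dark. Purely illustrative.
random.seed(0)

def make_image(label):
    base = 0.8 if label == 1 else 0.2
    return [base + random.uniform(-0.1, 0.1) for _ in range(4)]

# Labeled dataset: researchers "tell" the model which images show disease.
data = [(make_image(lbl), lbl) for lbl in [0, 1] * 50]

weights = [0.0] * 4
bias = 0.0

def predict(img):
    z = sum(w * x for w, x in zip(weights, img)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid squashes score to [0, 1]

# Gradient-descent training loop: nudge weights to reduce prediction error.
for _ in range(200):
    for img, label in data:
        err = predict(img) - label
        for i in range(4):
            weights[i] -= 0.5 * err * img[i]
        bias -= 0.5 * err

# After training, bright images score near 1 and dark ones near 0.
bright = predict([0.8, 0.8, 0.8, 0.8])
dark = predict([0.2, 0.2, 0.2, 0.2])
```

A real convolutional network differs mainly in scale and architecture — stacked convolution layers learn visual features instead of a single weight per pixel — but the train-on-labeled-examples loop is the same.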
Each year in the U.S., some 5.4 million new cases of skin cancer are diagnosed. The usual process for identifying the disease involves a visual examination of moles or other marks on the skin by a dermatologist.
Melanomas represent less than 5% of all skin malignancies diagnosed each year in the U.S., yet they account for nearly 75% of all deaths related to this type of cancer. If detected early, the five-year survival rate for melanoma is 99%. When detected in its latest stage, the survival rate is just 14%.
The goal of the technology is not to replace human dermatologists, according to the researchers, but to offer people an inexpensive option for early screening. However, the AI would need additional training, and its safety would need more rigorous assessment, before such an app could go public.