Artificial intelligence detects skin cancer

31 January 2017

Interview with

Andre Esteva, Stanford University, and Sancy Leachman, Oregon Health and Science University.

Engineers in America have developed a computer programme that trains itself to spot skin cancers in photos of a patient's skin and, in tests, it does so as successfully as a panel of trained skin specialists. Stanford PhD student Andre Esteva is the inventor…

Andre - What we’ve done is to build a computer algorithm, like a computer programme, that can match the performance of board-certified dermatologists at identifying whether a skin lesion in an image is benign or malignant. And we’ve tested it across three really important medical diagnostic use cases, which include distinguishing carcinomas, including basal and squamous cell carcinomas, from their benign counterparts, as well as distinguishing malignant melanoma from normal, ordinary moles.

Chris - And you do this by showing the computer programme images of these respective lesions?

Andre - That’s correct. We use a data-driven approach. In contrast to previous computer programmes, where you would tell the computer "do step one, do step two, do step three", what we do instead is feed the computer a massive amount of data. We show it images and we tell it what those images are of, for instance malignant melanoma, and it learns, through a training process, how to distinguish between benign and malignant all on its own.

Chris - Now, when you say you feed it a massive amount of data, just define what that means in practical terms: how much data?

Andre - We’re using about 1.4 million total images. We use about 1.28 million images of normal everyday objects. You see, training this algorithm is split into two steps. In the first step, you’re sort of teaching the algorithm what the world looks like: you show it images of everyday objects like cats and dogs, and tables and chairs. In the second step, you show it images of skin disease, and there we use almost 130,000 images of skin disease covering over 2,000 different disease types.
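The two-step training Andre describes is what machine-learning researchers call transfer learning: pretrain on a large generic dataset, then continue training the same weights on a much smaller specialist dataset rather than starting from scratch. The actual system is a deep convolutional neural network; purely as a toy sketch of the workflow (all data and names below are hypothetical, and a single logistic unit stands in for the network):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(data, w, b, lr=0.5, epochs=300):
    """Train one logistic unit by gradient descent; data = [([x0, x1], label)]."""
    for _ in range(epochs):
        for x, y in data:
            p = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
            g = p - y  # gradient of the log-loss with respect to the logit
            w = [w[0] - lr * g * x[0], w[1] - lr * g * x[1]]
            b -= lr * g
    return w, b

def predict(w, b, x):
    return 1 if sigmoid(w[0] * x[0] + w[1] * x[1] + b) > 0.5 else 0

# Step 1 ("what the world looks like"): train on a larger, generic task.
generic = [([1.0, 0.2], 1), ([0.8, -0.1], 1), ([-0.9, 0.1], 0),
           ([-1.1, -0.3], 0), ([1.2, 0.0], 1), ([-0.7, 0.4], 0)]
w, b = train(generic, [0.0, 0.0], 0.0)

# Step 2 (the specialist task): keep the learned weights and continue
# training on a much smaller, related dataset instead of starting from zero.
specialist = [([0.9, 0.3], 1), ([-0.8, 0.2], 0)]
w, b = train(specialist, w, b, epochs=50)

print([predict(w, b, x) for x, _ in specialist])  # prints [1, 0]
```

The point of the second step is that the weights arrive already knowing something useful, so far fewer specialist examples are needed than if training began from zero.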

Chris - So it first learns what it’s looking for, as an a priori thing: ah, that is skin, that is a skin lesion. And then, once it knows what it’s looking for, it begins to extract the corresponding data that tells it what the diagnosis might be: benign, malignant, and what sort of malignant disease?

Andre - That’s about right, yes.

Chris - How does it know it’s got it right?

Andre - We know the ground truth. We have a test set of images that the algorithm has never seen before, and after we train the algorithm we test it on just under 2,000 different images, all biopsy-proven, which means that a pathologist has confirmed whether they’re benign or malignant, and so we can gauge its accuracy.

Chris - And how accurate was it? In other words, how good is it?

Andre - What we did in this work was an image-by-image comparison. We showed the dermatologist an image of a lesion and then we showed the algorithm an image of the exact same lesion. We asked each: would you biopsy or treat this lesion, or would you reassure the patient? That allowed us to determine a sensitivity and specificity for each. What we found is that the algorithm performed on par with all tested experts.
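Sensitivity and specificity, the two numbers Andre mentions, fall straight out of the biopsy-or-reassure decisions once the biopsy-proven ground truth is known: sensitivity is the fraction of malignant lesions correctly flagged, specificity the fraction of benign lesions correctly left alone. A minimal sketch, using made-up decisions on eight hypothetical lesions:

```python
def sensitivity_specificity(cases):
    """Return (sensitivity, specificity) from (truth, biopsied) pairs,
    where truth is "malignant" or "benign" and biopsied is True/False."""
    tp = sum(1 for truth, act in cases if truth == "malignant" and act)
    fn = sum(1 for truth, act in cases if truth == "malignant" and not act)
    tn = sum(1 for truth, act in cases if truth == "benign" and not act)
    fp = sum(1 for truth, act in cases if truth == "benign" and act)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical decisions on eight biopsy-proven lesions:
cases = [
    ("malignant", True), ("malignant", True), ("malignant", True), ("malignant", False),
    ("benign", False), ("benign", False), ("benign", True), ("benign", False),
]
sens, spec = sensitivity_specificity(cases)
print(f"sensitivity={sens:.2f} specificity={spec:.2f}")
# prints sensitivity=0.75 specificity=0.75
```

Computing the same pair for the algorithm and for each dermatologist on identical images is what makes the head-to-head comparison fair.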

Chris - In fact, it performed as well as or better than a panel of 21 dermatologists, or skin doctors. So what do the experts make of it?

Sancy - My name is Sancy Leachman. I’m the Chair of the Department of Dermatology at the Oregon Health and Science University.

What we struggle with in dermatology is not being able to see quickly enough the patients who have something that might be concerning. What this particular machine does is allow their moles or skin lesions to be checked really quickly by an objective source, without necessarily having to have a dermatologist on hand to do it.

Chris - Do we know whether a picture of a particular skin complaint is as good as showing the dermatologist the skin complaint literally in the flesh?

Sancy - We actually do have some data on that. There have been some papers published looking at whether or not digital images are just as good as an in-person exam by a human. And it turns out it’s not perfect, it’s not quite as good, but it’s very, very close. It’s close enough that it’s probably good enough to triage people: to be able to tell people, do you really need to see a doctor, or is this clear enough that we can avoid that office visit? That’s huge when you have an overburdened health care system.

Chris - We should know all about that with the NHS. But what do you think is the scale of the problem that it can solve? How big is this?

Sancy - Well, when you’re talking about just this first step, thinking about all of dermatology, if you really used it to detect all kinds of skin diseases, that would be pretty big. But if you think about getting it to work for dermatology and then having it extend to radiology, or pathology, or ophthalmology, then you’re talking about it extending throughout the entire field of medicine, and it’s huge, it’s absolutely huge!

Chris - Are you comfortable with that, though? Do you not think there might be some shortcomings here? Because you are replacing a human being with a computer programme, and computer programmes don’t have emotions, they don’t have human instincts. Moreover, they may also fail to spot other glaring diagnoses: a person with a mole that’s actually benign needn’t worry about it, but the other thing that will kill them next week could be completely overlooked?

Sancy - Yes, that’s obviously a concern. It’s not that different from how we use automatic flight systems (guidance systems) in an aeroplane: you still need the pilot there to be able to override. I think it’s similar with this kind of technology, in that you want to have a backup person. And the false-sense-of-security part of this is that you still need a person to decide which lesion needs to be examined by the machine. So you might end up with a person who wants to check something they think is bad, but it turns out they have something much, much worse on their back that they don’t even know about, and if they’d gone in to see the doctor in person, that might have been detected.