Intelligent Edge: Building a Skin Cancer Prediction Application using Azure Machine Learning, CoreML and Xamarin

Anusua Trivedi
MICCAI Educational Initiative
5 min read · Aug 29, 2019

Anusua Trivedi, Senior Data Scientist, AI for Good Research Lab, Microsoft

Motivation

Artificial Intelligence (AI) has emerged as one of the most disruptive forces behind the digital transformation that is revolutionizing the way we live, work, and play. AI-powered experiences augment human capabilities, and they hold enormous potential for helping us lead healthier lives.

Introduction

AI is accelerating this digital transformation for clinicians by giving them deeper insights to support better decisions. The potential to save lives and money is tremendous. At Microsoft, the Healthcare NExT project is exploring innovative ways to fuse research, AI, and industry expertise to enable a new wave of healthcare innovations. The Microsoft AI platform empowers every developer to innovate and to accelerate the development of real-time intelligent apps on edge devices. A real-time intelligent app runs on the target device itself, which has two advantages:

Low latency for local decision making

Reduced reliance on internet connectivity

Imagine environments with limited or no connectivity, whether because of a lack of communications infrastructure or because of the sensitivity of the operations and information involved. There, the only alternative to cloud servers is a proprietary data center, which is costly to set up and maintain. Such remote locations, which could benefit immensely from artificial intelligence, have limited access to AI applications because of their poor connectivity. As IoT moves into more remote and disconnected environments, hybrid architectures that combine cloud and edge computing become more and more necessary.

Take a skin cancer detection app as an example. Skin cancer is the most common form of cancer, accounting for at least 40% of cancer cases globally. If detected at an early stage, it can be controlled. What if we could create a real-time AI app that quickly suggests whether a patient needs to seek help? Such an app would flag a set of suspicious images, which in turn would help doctors be more efficient and focus on the most critical patients first.

This model/application is intended for research and development use only. The model/application is not intended for use in clinical diagnosis or clinical decision-making or for any other clinical use and the performance of the application for clinical use has not been established.

Dataset and Preprocessing

For this work, we use the ISIC Skin Cancer Research dataset. We split the ISIC dataset into training and testing sets, with 80% of the images used for training and 20% for scoring. The training data contains 2,000 images, including 374 "melanoma" images, 254 "seborrheic keratosis" images, and the remaining 1,372 benign (nevus) images. It is provided as a ZIP file containing dermoscopic lesion images in JPEG format and a CSV file with clinical metadata for each image; here we use only the JPEG images to train our AI model. The ISIC dataset has far fewer melanoma examples than seborrheic keratosis and nevus examples: only about 20% of the images, 374 in total, are malignant. This skewed distribution has a big impact on how we train our classifier and how we judge it, so we apply augmentation techniques (such as rotation and cropping) to balance the dataset, as sketched below.
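The following is a minimal sketch of that preprocessing step using Keras' ImageDataGenerator. The directory layout (data/train and data/test with benign/ and malignant/ subfolders), image size, batch size, and augmentation parameters are illustrative assumptions rather than the exact values we used.

```python
from keras.preprocessing.image import ImageDataGenerator

IMG_SIZE = (224, 224)   # assumed input resolution
BATCH_SIZE = 32          # assumed batch size

# Random rotations, shifts, zooms (a form of cropping), and flips generate extra
# variants of the under-represented malignant images and help balance the classes.
train_datagen = ImageDataGenerator(
    rescale=1.0 / 255,
    rotation_range=40,
    width_shift_range=0.1,
    height_shift_range=0.1,
    zoom_range=0.2,
    horizontal_flip=True,
)

# No augmentation for the held-out 20% used for scoring.
test_datagen = ImageDataGenerator(rescale=1.0 / 255)

# Assumes the 80/20 split has been materialized on disk as
# data/train/{benign,malignant}/ and data/test/{benign,malignant}/.
train_generator = train_datagen.flow_from_directory(
    "data/train", target_size=IMG_SIZE, batch_size=BATCH_SIZE, class_mode="binary")
test_generator = test_datagen.flow_from_directory(
    "data/test", target_size=IMG_SIZE, batch_size=BATCH_SIZE, class_mode="binary")
```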

Building Intelligent Skin Cancer Prediction App

For code, please refer to this GitHub link. For more details, refer to our video.

Training the AI model for the app

We build the AI model using Microsoft Azure Machine Learning Workbench. Azure Machine Learning Workbench is a cross-platform application that makes modelling and model deployment much faster than was previously possible. We create a deep learning model using open-source packages supported in Azure ML, building it with Keras on a TensorFlow backend.

First, we tried a transfer learning approach for training the AI model. Applying transfer learning to a standard ImageNet-pretrained ResNet-50 model did not give good results on such a small, domain-specific dataset as ISIC. As seen in Figure 1, we therefore used a smaller network with two convolution layers and a sigmoid classifier. After some hyperparameter tuning, ReLU activations with the Adam optimizer worked best for this model. Trained on the ISIC dataset, this model achieves ~97% accuracy on our training set and ~89% accuracy on our test set.
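As a rough illustration of that architecture, here is a minimal Keras sketch of a two-convolution-layer binary classifier with ReLU activations, a sigmoid output, and the Adam optimizer. The filter counts, dense width, dropout rate, learning rate, and epoch count are assumptions for illustration, not the exact configuration shown in Figure 1; train_generator and test_generator come from the preprocessing sketch above.

```python
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense, Dropout
from keras.optimizers import Adam

# Two convolution blocks, a small dense head, and a sigmoid output for the
# binary benign-vs-malignant decision. Layer sizes are illustrative guesses.
model = Sequential([
    Conv2D(32, (3, 3), activation="relu", input_shape=(224, 224, 3)),
    MaxPooling2D((2, 2)),
    Conv2D(64, (3, 3), activation="relu"),
    MaxPooling2D((2, 2)),
    Flatten(),
    Dense(128, activation="relu"),
    Dropout(0.5),
    Dense(1, activation="sigmoid"),
])

model.compile(optimizer=Adam(lr=1e-4),
              loss="binary_crossentropy",
              metrics=["accuracy"])

# Train on the augmented data and evaluate on the held-out 20%.
model.fit_generator(train_generator, validation_data=test_generator, epochs=20)
model.save("skin_cancer_model.h5")  # assumed filename, reused in the CoreML step
```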

Deploying the trained AI model as an intelligent app

We want to run this trained model on an iPhone, so we use CoreML to convert the trained Keras model into an iPhone-compatible format. CoreML brings machine learning to iOS: apps can take advantage of trained machine learning models to perform all sorts of tasks, from problem solving to image recognition. We pip installed the coremltools package in our AML environment, ran the CoreML converter in AML Workbench, and created an .mlmodel file (Figure 2).
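Below is a minimal sketch of that conversion step using the coremltools Keras converter. The saved model filename, input and output names, and image scaling are assumptions carried over from the training sketch above, not the exact settings shown in Figure 2.

```python
import coremltools  # pip install coremltools

# Convert the saved Keras model into a CoreML model that accepts an image input.
coreml_model = coremltools.converters.keras.convert(
    "skin_cancer_model.h5",          # assumed filename from the training sketch
    input_names="image",
    image_input_names="image",       # expose the input as an image on iOS
    image_scale=1.0 / 255,           # match the rescaling used during training
    output_names="malignancy_probability",
)

coreml_model.short_description = "Skin lesion screening model (research use only)"
coreml_model.save("SkinCancerModel.mlmodel")
```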

Using Xamarin to develop an Intelligent Application

A key benefit of using Xamarin is that the UI uses native controls on each platform, so the resulting apps are indistinguishable from native iOS or Android applications. We take a sample Xamarin app from this GitHub link. Next, we change the name of the model in the view controller file and load the compiled CoreML model. In the same view controller file, we change the result extraction function to output the messages we want the app to display (Figure 3). By changing only the highlighted lines of code in the sample Xamarin app, we can run any AI model on our phone.

Combining the three steps above, we get an intelligent skin cancer prediction app on iOS (Figure 4).

Figure 1. Training DNN in AML
Figure 2. Convert Keras model to CoreML in AML
Figure 3. Intelligent Xamarin app
Figure 4. Skin Cancer Screening Architecture

Conclusion

In this blog post, we showed how we use Azure Machine Learning to train and test an AI model and how we can turn it into an intelligent app on iOS. Such intelligent apps can support time-critical decisions at the edge and defer to the cloud when more intensive computation or historical analysis is needed. We look forward to seeing how you use the combination of the Intelligent Cloud and the Intelligent Edge to build more for your business.

Conflict of Interest Disclosure: Anusua Trivedi is an employee of Microsoft.
