Artificial Intelligence

Apply machine learning and deep learning

MATLAB wins Hackathon

This post is from Paola Jaramillo, Application Engineer from the Benelux office.
Back in February, I attended a hackathon hosted by Itility: meeting for 3 hours to solve an image classification problem while also enjoying pasta and networking with peers. I was there primarily to learn and see how other engineers and researchers were using machine learning in daily-life applications. As the title of this blog post indicates, my team ended up getting impressive results and winning the hackathon!

The Challenge

The goal of the hackathon was to solve an image classification problem with ties to real-life research:
Given a simplified dataset of specific species of plants, can machine learning correctly identify the species in the images?
 
The original link to the meetup is here. We were not given any restrictions on the language or method to use for this classification task. We broke off into teams, and each team began brainstorming. Teams decided to tackle the problem with various approaches:
  • Traditional image processing: use pixel values to correctly identify the images
  • R and Python, based on prior experience with those tools
  • Machine learning, preprocessing the images to identify features

My Approach

My group and I had no prior expertise in plant seedlings or image processing, so we were not in a position to engineer the right features by hand; instead, we decided to apply deep learning techniques to the raw images. Given the size of the dataset and the limited time, we used a simple approach popular in the deep learning community known as transfer learning, rather than training a network from scratch.
While other people were inspecting the images and looking for the right libraries and packages to get started, I fired up MATLAB and searched the documentation for a transfer learning example. (https://www.mathworks.com/help/deeplearning/examples/transfer-learning-using-alexnet.html)
The original example shows completely different objects in the images, so it wasn't clear this would work for our data, but the example shows that by applying transfer learning, the pretrained model AlexNet is able to learn features and classify new images.
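The documentation example starts by loading the pretrained network, which ships as a free support package. The two lines below are the usual starting point and define the layers variable used further down (shown here for completeness; the original post jumps straight to the data):
net = alexnet;          % pretrained AlexNet (requires the Deep Learning Toolbox Model for AlexNet support package)
layers = net.Layers;    % layer array referenced in the resizing and training steps below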
First, I changed the input to point to the location of the new data:
imagepath = fullfile(pwd,'Subset_from_NonsegmentedV2');
imds = imageDatastore(imagepath, 'IncludeSubfolders',true,...
    'LabelSource','FolderNames')
The images were not individually labeled, but they were separated into folders named after the specific species. imageDatastore can automatically label images based on the folder name, which saved us quite a lot of effort.
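As a quick sanity check on the automatic labeling (this snippet is illustrative and not from the original code), you can list the classes and image counts:
countEachLabel(imds)    % table of species labels and the number of images per label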
Before spending time preprocessing the images, we decided to explore the results of retraining AlexNet on the raw image data. For this, we only needed to resize the images, which is handled by the read function of imageDatastore:
imagesize = layers(1).InputSize          % AlexNet expects 227x227x3 inputs
outputSize = imagesize(1:2);
imds.ReadFcn = @(img)imresize(imread(img),outputSize);   % resize each image as it is read
*Note: as of R2019a, you can also resize images with the newer augmentedImageDatastore function; a sketch of that route appears after the train/validation split below.
We then split the dataset into training and validation. A separate folder of images was provided for testing.
[trainDS,valDS] = splitEachLabel(imds,0.7,'randomized')
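For reference, here is roughly how the augmentedImageDatastore route mentioned in the note above would look, wrapping the split datastores so images are resized on the fly (a sketch, not part of our hackathon code):
augTrainDS = augmentedImageDatastore(outputSize, trainDS);   % resizes each training image to the network's input size on read
augValDS   = augmentedImageDatastore(outputSize, valDS);     % same for the validation images
% augTrainDS and augValDS can then replace trainDS and valDS in trainingOptions and trainNetwork.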
Then we ran the training on the simple AlexNet model. Training took approximately 7 minutes on my laptop with a GPU; MATLAB automatically detected the GPU and used it for training.
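The post doesn't spell out how layers_to_train was put together; following the documentation example, it would look roughly like the sketch below, which keeps AlexNet's feature layers and swaps in new final layers sized to our plant species (the learn-rate factors are the example's typical values, assumed here):
layersTransfer = net.Layers(1:end-3);              % keep all of AlexNet except the last three layers
numClasses = numel(categories(trainDS.Labels));    % number of plant species in the training set
layers_to_train = [
    layersTransfer
    fullyConnectedLayer(numClasses,'WeightLearnRateFactor',20,'BiasLearnRateFactor',20)
    softmaxLayer
    classificationLayer];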
opts = trainingOptions('sgdm','InitialLearnRate',0.0001,...
    'ValidationData',valDS,...
    'Plots','training-progress',...
    'MiniBatchSize', 8,... %change according to the memory availability
    'ValidationPatience', 3,...
    'ExecutionEnvironment','auto') %'multi-gpu' or 'parallel' for scaling up to HPC

hackathon_net = trainNetwork(trainDS, layers_to_train, opts);
[Figure: training progress plot of the initial model]
The first training run produced an accuracy of 92%. Not bad, but was it enough to win? To improve it, I balanced the dataset to use only 100 images of each category:
imds = splitEachLabel(imds,100,'randomized');
With the balanced dataset, accuracy improved to 97% on the test dataset by the end of the session. We were able to try a variety of options and iterations, and found that the simple AlexNet model produced the best results.
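For completeness, generating predictions for the separate test folder would look roughly like this; the folder name and variable names here are assumptions rather than the original code:
testDS = imageDatastore(fullfile(pwd,'TestSet'));             % hypothetical path to the provided test images
testDS.ReadFcn = @(img)imresize(imread(img),outputSize);      % same resize as used for training
predictedLabels = classify(hackathon_net, testDS);            % predicted species for each test image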
Here is a table of the results by approach taken:
| Tools | Model | Techniques | Accuracy |
| --- | --- | --- | --- |
| MATLAB | AlexNet | Dataset balancing | 97% |
| PyTorch | ResNet-50 | Adam optimization | 88% |
| Python | VGG-16 | Data augmentation (more data) | 80% |
| Python | 2-layer CNN | By color channel | 77% |
| Python | Random Forest | | 53% |
| TensorFlow-Keras | InceptionV3 | | 22% |
You can read more about the hackathon challenge here. Here is a quote from the blog post:
"In the end the winning team used a rather simple 8-layer AlexNet model – but managed to reach an accuracy of 97% on the unlabeled dataset! And here is an interesting detail – not only did this team obtain the highest accuracy, they were also the only ones not using R or Python, but MATLAB."
It seemed that people were expecting an open-source toolchain to win this challenge, but MATLAB was the winner!

Summary

This was a great opportunity to work with the Machine Learning community in a real-life challenge and I felt great about my participation and the results.
It's important to remember that I am an engineer with a background in signal processing systems and a basic understanding of machine learning, who uses MATLAB to solve a wide variety of problems. I was able to apply deep learning techniques to image data without any previous background, simply by searching the documentation for the right example and using a pretrained model.
Overall, this was a quick way to get started with deep learning and put together a working model to solve, and win, a real-life challenge!
 
Thanks again to Paola for her participation in this event and her impressive results with the team. You can download the code from File Exchange. Leave a comment below if you have any questions for Paola about this event.
