Behind the Headlines

MATLAB and Simulink behind today’s news and trends


Bug Brain Beats Machine Learning

Posted by Lisa Harvey

Living organisms have long provided inspiration for technology. Biomimicry of birds helped us design our first aircraft, while the structure of seed burs was copied for Velcro. Today, biomimicry is being applied to advanced technology such as robotics and computer vision.

Another modern-day application of biomimicry is in artificial intelligence (AI).  With AI, machines take on natural cognitive functions such as learning and problem-solving. Artificial neural networks (ANNs) take biomimicry a step further by creating computing systems inspired by the brains of living organisms.

But just how intelligent can a system be that is modeled after a relatively unsophisticated biological brain? It turns out, thanks to evolution, even relatively simple brains of living creatures can be very intelligent when it comes to a task that is necessary for their survival. For a moth, that means the sense of smell.

Close-up image of moth head.

Sometimes, smaller is better

Even though a moth’s brain is the size of a pinhead, it is highly efficient at learning new odors. Its sense of smell is needed for finding food and finding mates, both critical tasks for the species’ survival.

Researchers from the University of Washington developed a neural network, dubbed MothNet, based on the structure of a moth’s brain.

“The moth olfactory network is among the simplest biological neural systems that can learn,” the researchers, Charles B. Delahunt, Jeffrey Riffell, and J. Nathan Kutz, stated in their paper, Biological Mechanisms for Learning: A Computational Model of Olfactory Learning in the Manduca sexta Moth, with Applications to Neural Nets.

The MIT Technology Review article, Why even a moth’s brain is smarter than AI, described the biological system copied by MothNet: “The olfactory learning system in moths is relatively simple and well mapped by neuroscientists. It consists of five distinct networks that feed information forward from one to the next.”

 

Diagram showing the basic structure of a moth’s brain used to model MothNet.

MothNet was designed to mimic the structure of a moth’s brain. Image credit: Delahunt and Kutz.
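For intuition, the shrinking feed-forward cascade in the diagram can be sketched in a few lines of Python (the study itself used MATLAB, so this is only an illustration, not the authors’ code). The layer sizes below are assumptions: the first two roughly follow the article’s ~30,000-receptor to ~4,000-cell ratio, scaled down tenfold to keep the sketch light, and the later stages are made up.

```python
import numpy as np

rng = np.random.default_rng(0)
# Illustrative five-stage cascade; each layer is smaller than the last.
sizes = [3_000, 400, 100, 30, 10]

def cascade(x, sizes, rng):
    """Feed a signal forward through successively smaller random layers."""
    for n_in, n_out in zip(sizes[:-1], sizes[1:]):
        W = rng.standard_normal((n_out, n_in)) / np.sqrt(n_in)
        x = np.maximum(W @ x, 0.0)  # keep only the positive responses
    return x

out = cascade(rng.random(sizes[0]), sizes, rng)
print(out.shape)  # (10,)
```

Each stage passes only a compressed summary forward, which is the filtering idea the researchers highlight below.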

 

Instead of identifying scents, the researchers used supervised learning to train the ANN to recognize handwritten numbers with just 15 to 20 images of each digit from zero to nine. The training samples came from the MNIST (Modified National Institute of Standards and Technology) database of handwritten digits commonly used for training and testing in machine learning. Some examples from the MNIST database are shown below:

 

Handwritten numbers from a sample of the MNIST database.

Sample from the MNIST database. Image credit: Wikipedia, CC 4.0.
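As a rough illustration of that few-shot setup, here is a short Python sketch (the paper’s actual code is in MATLAB) that builds a training set with about 15 samples per digit. scikit-learn’s small 8×8 digits dataset stands in for MNIST so the example is self-contained:

```python
import numpy as np
from sklearn.datasets import load_digits

# Load a small handwritten-digit dataset (a stand-in for MNIST).
digits = load_digits()
X, y = digits.data, digits.target

# Draw 15 random samples of each digit 0-9, mirroring the few-shot setting.
rng = np.random.default_rng(0)
per_class = 15
idx = np.concatenate([
    rng.choice(np.flatnonzero(y == d), size=per_class, replace=False)
    for d in range(10)
])
X_few, y_few = X[idx], y[idx]
print(X_few.shape)  # (150, 64): 15 images of each of the 10 digits
```

A training set this small is what makes the few-shot result notable: conventional networks are usually trained on thousands of examples per class.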

They found MothNet could learn much faster than conventional machine learning systems. MothNet “learned” to recognize digits from just a few training samples, reaching an accuracy of 75 to 85 percent. A typical convolutional neural network, by comparison, requires thousands of training examples to achieve 99 percent accuracy.

Developing Better Machine Learning Algorithms

The researchers found that the moth’s biological system was efficient at learning due to three main characteristics which could aid in the development of new machine learning algorithms:

  • First, it learned quickly by filtering information at each step and passing along only the most critical information to the next phase in the system. While the first of the five distinct networks starts with nearly 30,000 receptors in the antennae, the second network comprises about 4,000 cells. By the time the information reaches the last network in the system, the neurons number in the tens.
  • Second, the filtering process had the added benefit of removing noise from the signals. The sparse layer between the first two networks acts as an effective noise filter, protecting the downstream neurons from the noisy signal received by the “antennas”.
  • Lastly, the brain “rewarded” successful odor identification with the release of a chemical neurotransmitter called octopamine, reinforcing the successful connections in the neural wiring. The active connections for an assigned digit strengthen, while the rest wither away.
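The third mechanism can be pictured as reward-modulated Hebbian learning. The toy Python sketch below is not the paper’s model; the weights, rates, and sizes are all illustrative. It strengthens the connections active when the rewarded digit fires and lets every other connection slowly decay:

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_out = 20, 10          # toy sizes: 20 inputs, one unit per digit
W = rng.random((n_out, n_in)) * 0.1

def octopamine_update(W, x, target, lr=0.5, decay=0.01):
    """Reinforce weights between active inputs and the rewarded unit;
    all other connections slowly 'wither away'."""
    W = W * (1.0 - decay)     # global decay of unused connections
    W[target] += lr * x       # reward: strengthen the active pathway
    return W

x = rng.random(n_in)          # a toy input pattern
W_new = octopamine_update(W, x, target=3)
# The rewarded row (digit 3) grows; every other row only decays.
```

In the biological system the octopamine release plays the role of the reward signal, gating which connections get reinforced.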

 

The green lines highlight the pathways in MothNet, the artificial neural network, and the blue lines are the biological pathways. Image credit: Delahunt and Kutz.

 

“The results demonstrate that even very simple biological architectures hold novel and effective algorithmic tools applicable to [machine learning] tasks, in particular, tasks constrained by few training samples or the need to add new classes without full retraining,” the researchers stated in the paper.

All coding was done in MATLAB. The code for this paper can be found here.

7 Comments

Erick Oberstar replied on : 2 of 7
Kind of a joke to claim code is published. The above linked git repo is empty minus a readme and license file... i.e. NO Code.
Lisa Harvey replied on : 3 of 7
Thanks for pointing it out. The author's comment is "We are currently refactoring the Matlab code base for easier use."
Charles Delahunt replied on : 4 of 7
The full Matlab code for simulation of moths learning to read MNIST is now available at github/charlesDelahunt/PuttingABugInML. Thank you, Charles Delahunt
Charles Delahunt replied on : 6 of 7
Good morning- We have added new Matlab code to this github repo to enable training of Insect Cyborgs: A moth brain is trained, and its output neurons are fed as input features into a standard machine learning algorithm (e.g. a neural net). Because the insect architecture is naturally good at clustering, the moth's output neurons have strong classification information. The cyborg improves accuracy by 10% to 30% relative to a pure-ML method baseline. The paper can be found at https://arxiv.org/abs/1808.08124 Thank you, Charles Delahunt
