Deep Learning

Understanding and using deep learning networks


Deep Learning in Action – part 1

Posted by Johanna Pingel

Hello everyone! Allow me to quickly introduce myself. My name is Johanna, and Steve has allowed me to take over the blog from time to time to talk about deep learning. Today I'd like to kick off a series called "Deep Learning in Action: Cool projects created at MathWorks." This aims to give you... read more >>

Semantic Segmentation Using Deep Learning

Posted by Steve Eddins

Today I want to walk through a documentation example that shows how to train a semantic segmentation network using deep learning and the Computer Vision System Toolbox. A semantic segmentation network classifies every pixel in an image, resulting in an image that is segmented... read more >>
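For readers who want the flavor of the workflow before clicking through, here is a minimal sketch of applying an already-trained semantic segmentation network to an image; `net`, the image file name, and the overlay step are my own illustrative choices, not the example's actual code:

```matlab
% Minimal sketch: per-pixel classification with a trained
% semantic segmentation network. 'net' is assumed to be a
% trained network; the file name is a placeholder.
I = imread('streetScene.png');   % image to segment
C = semanticseg(I, net);         % one categorical label per pixel
B = labeloverlay(I, C);          % draw the labels over the image
imshow(B)
```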

Exporting to ONNX

Posted by Steve Eddins

The MathWorks Neural Network Toolbox Team has just posted a new tool to the MATLAB Central File Exchange: the Neural Network Toolbox Converter for ONNX Model Format. ONNX, or Open Neural Network Exchange Format, is intended to be an open format for representing deep learning models. You need the latest release (R2018a)... read more >>
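Once the converter is installed, exporting comes down to a single call; a hedged sketch (the function name comes from the File Exchange submission, and `net` stands for any trained network):

```matlab
% Sketch: export a trained network to the ONNX model format.
% Requires R2018a plus the Neural Network Toolbox Converter for
% ONNX Model Format from the File Exchange. 'net' is assumed to
% be a trained SeriesNetwork or DAGNetwork.
exportONNXNetwork(net, 'myNetwork.onnx')  % write an ONNX file
```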

Deep Learning Network Analyzer

Posted by Steve Eddins

Earlier this month, the Neural Network Toolbox team submitted a new Deep Learning Network Analyzer tool to the File Exchange. (Note: it requires the R2018a release.) This very useful tool helps you spot problems if you are building a network from scratch, or if you are modifying the structure of... read more >>

New Deep Learning Features in R2018a

Posted by Steve Eddins

MathWorks shipped our R2018a release last month. As usual (lately, at least), there are many new capabilities related to deep learning. I showed one new capability, visualizing activations in DAG networks, in my 26-March-2018 post. In this post, I'll summarize the other new capabilities.... read more >>

Visualizing Activations in GoogLeNet

Posted by Steve Eddins

The R2018a release has been available for almost two weeks now. One of the new features that caught my eye is that computing layer activations has been extended to GoogLeNet and Inception-v3. Today I want to experiment with GoogLeNet. net = googlenet net = DAGNetwork with properties: ... read more >>
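A hedged sketch of the kind of experiment the post works through; the layer name and test image here are my own illustrative picks, not the post's:

```matlab
% Sketch: compute and view activations from one GoogLeNet layer
% (activations on DAG networks requires R2018a; googlenet needs
% its support package). Layer name and image are illustrative.
net = googlenet;
sz = net.Layers(1).InputSize;                 % network input size
im = imresize(imread('peppers.png'), sz(1:2));
act = activations(net, im, 'conv1-7x7_s2');   % an early conv layer
imshow(mat2gray(act(:,:,1)))                  % view the first channel
```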

Creating a DAG Network from DAG Parts

Posted by Steve Eddins

In my 14-Feb-2018 blog post about creating a simple DAG network, reader Daniel Morris wanted to know if there's a less tedious way, compared to adding layers one at a time, to combine two (or more) DAGs into a network. I asked the development team about this. I learned that,... read more >>

Create a Simple DAG Network

Posted by Steve Eddins

Today I want to show the basic tools needed to build your own DAG (directed acyclic graph) network for deep learning. I'm going to build this network and train it on our digits dataset. As the first step, I'll create the main branch, which follows the... read more >>
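As a preview of the tools involved, here is a hedged sketch of building a small DAG with one skip connection; the layer sizes and names are illustrative, not the post's actual network:

```matlab
% Sketch: build a small DAG network with one skip connection.
% Layer names and sizes are illustrative.
layers = [
    imageInputLayer([28 28 1], 'Name', 'input')
    convolution2dLayer(3, 16, 'Padding', 1, 'Name', 'conv_1')
    reluLayer('Name', 'relu_1')
    additionLayer(2, 'Name', 'add')    % merges the two branches
    fullyConnectedLayer(10, 'Name', 'fc')
    softmaxLayer('Name', 'softmax')
    classificationLayer('Name', 'output')];
lgraph = layerGraph(layers);           % main branch, connected in order
% Add a 1x1 convolution as the skip branch and wire it in.
lgraph = addLayers(lgraph, convolution2dLayer(1, 16, 'Name', 'conv_skip'));
lgraph = connectLayers(lgraph, 'input', 'conv_skip');
lgraph = connectLayers(lgraph, 'conv_skip', 'add/in2');
plot(lgraph)                           % visualize the graph
```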

Defining Your Own Network Layer (Revisited)

Posted by Steve Eddins

Today I want to follow up on my previous post, Defining Your Own Network Layer. There were two reader comments that caught my attention. The first comment, from Eric Shields, points out a key conclusion from the Clevert, Unterthiner, and Hochreiter paper that I overlooked. I initially focused just on the... read more >>

Defining Your Own Network Layer

Posted by Steve Eddins

Note: Post updated 27-Sep-2018 to correct a typo in the implementation of the backward function. One of the new Neural Network Toolbox features of R2017b is the ability to define your own network layer. Today I'll show you how to make an exponential linear unit (ELU) layer. Joe helped me with today's... read more >>
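To make the idea concrete, here is a hedged sketch of what such a custom layer can look like; the class name is mine, and only the forward computation is shown (the post itself also covers the backward function):

```matlab
% Sketch of a custom ELU layer class; the name is illustrative
% and only the forward (predict) computation is shown.
classdef sketchELULayer < nnet.layer.Layer
    properties
        Alpha   % scale applied to negative inputs
    end
    methods
        function layer = sketchELULayer(name, alpha)
            layer.Name = name;
            layer.Alpha = alpha;
            layer.Description = "Exponential linear unit";
        end
        function Z = predict(layer, X)
            % ELU: x for x > 0, alpha*(exp(x) - 1) otherwise
            Z = max(X, 0) + layer.Alpha .* (exp(min(X, 0)) - 1);
        end
    end
end
```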

These postings are the author's and don't necessarily represent the opinions of MathWorks.