R2024b: A Peek into New AI Features
MATLAB R2024b is the latest release and is available for you to try. I am here to talk specifically about the new AI features in this release; if you're interested in other features, check out the MATLAB blog and Simulink blog.
This blog post offers only a quick peek at the new AI features, but I will point you to resources for learning more about all that's new in AI. Comment below to tell the AI community which new features are your favorites, why, and how you are applying them to your projects.
Deep Learning
- Neural Network Testing and Metrics - Test neural networks by using the testnet function. Given a neural network and a test data set, the function computes metrics such as accuracy, precision, and recall. There are many more metrics that you can request, and the function works for images, sequences, features, and combinations of data types (a quick sketch follows this list). Since we are talking about metrics, it’s worth mentioning that in R2024b you can also use new and updated metrics to monitor the training progress of neural networks.
- Learning Rate Schedule - The learning rate is one of the most important hyperparameters in deep learning training. You can now specify a learning rate schedule when training a deep neural network by using the LearnRateSchedule name-value argument of the trainingOptions function (see the sketch after this list). This option gives you the flexibility and control over training that you need to improve training speed and network accuracy.
- Low-Code App Templates - The start page of the Deep Network Designer app now provides templates for 1-D convolutional neural networks. You can use these templates to quickly create a 1-D convolutional neural network suitable for sequence-to-label and sequence-to-sequence classification tasks.
- PyTorch 2.0 Import - Among other new import and export capabilities, such as support for additional PyTorch, TensorFlow, and ONNX layers, you can now import deep learning models from PyTorch 2.0 by using the importNetworkFromPyTorch function (a sketch of the import step follows below). To learn more about interoperability, see Convert Deep Learning Models between PyTorch, TensorFlow, and MATLAB.
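Here is a minimal sketch of testing a trained network with testnet. The network and datastore names are placeholders, and I'm assuming the calling form that takes a single datastore holding both predictors and targets; check the testnet documentation for the exact signatures and the full list of supported metrics.

% Sketch: evaluate a trained network on held-out data with testnet (R2024b).
% trainedNet and dsTest are placeholders for your own network and test datastore.
metricValues = testnet(trainedNet, dsTest, ["accuracy" "precision" "recall"]);
disp(metricValues)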
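And here is a rough sketch of requesting a learning rate schedule through trainingOptions. The "piecewise" schedule and the drop settings are illustrative choices, not recommendations; see the trainingOptions documentation for the schedules available in R2024b.

% Sketch: drop the learning rate by half every 10 epochs during training.
options = trainingOptions("adam", ...
    InitialLearnRate=1e-3, ...
    LearnRateSchedule="piecewise", ...
    LearnRateDropFactor=0.5, ...
    LearnRateDropPeriod=10, ...
    MaxEpochs=30, ...
    Plots="training-progress");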
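Finally, a sketch of the PyTorch import step. The file name model.pt is a placeholder, and the model must first be traced and saved from Python; the imported network may also need an input layer added before you can use it for prediction.

% Sketch: import a traced PyTorch model file into MATLAB as a network object.
net = importNetworkFromPyTorch("model.pt");   % "model.pt" is a placeholder file name
summary(net)                                  % inspect layers and learnable parameters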
Machine Learning
- Drift Detection in Simulink - An incremental learning model fits data quickly and efficiently, which means it can adapt in real time to drift in the data distribution. So, when you are building an incremental learning system in Simulink, it’s important to be able to integrate drift detection, and now you can with the Detect Drift block. For an example, see Monitor Drift Using Detect Drift Block.
- Synthetic Data Generation - To overcome limited data availability, you can complement real data with synthetic data. You can now generate synthetic data from existing tabular data. The generation involves some complicated math, which MATLAB developers have packaged into a function so that you don’t have to implement it yourself.
- Neural Network Integration - In Statistics and Machine Learning Toolbox, you can create feedforward, fully connected neural networks for classification (with the fitcnet function) and regression (with the fitrnet function). A new feature provides better integration between the two core AI toolboxes: you can now convert shallow neural networks created in Statistics and Machine Learning Toolbox to objects that can be used in Deep Learning Toolbox (see the relevant release note).
net = dlnetwork(mdl)
where mdl is the neural network created in Statistics and Machine Learning Toolbox, and net is the corresponding Deep Learning Toolbox network. A small end-to-end sketch follows below. To learn more about new machine learning features, see the Statistics and Machine Learning Toolbox release notes.
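As a minimal sketch of the conversion, using the built-in fisheriris data set (the model settings here are the defaults and purely illustrative):

% Sketch: train a shallow classifier in Statistics and Machine Learning Toolbox,
% then convert it to a Deep Learning Toolbox network object.
load fisheriris                     % built-in example data: measurements and species labels
mdl = fitcnet(meas, species);       % shallow, fully connected classification network
net = dlnetwork(mdl);               % convert for use with Deep Learning Toolbox
summary(net)                        % inspect the converted network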