Podcast Alert: Deploying Edge and Embedded AI Systems

The following blog post is from Daniel Prieve, Digital Marketing Manager.
Last month, Heather Gorr was interviewed for the TWIML AI Podcast (hosted by Sam Charrington). Heather shared the knowledge she has gained as a MATLAB Product Manager on how to prepare and test AI models before deploying them to edge devices and embedded systems.
You can find the podcast, “Deploying Edge and Embedded AI Systems,” here: Podcast on deploying edge and embedded AI systems
In this blog post, we highlight a few key points from the TWIML podcast on Edge AI, but you will learn much more by listening to the full episode.
Data Preparation: When you prepare data for training and testing an AI model that will later be deployed to the edge, you must account for hardware limitations and how they will affect data quality and the streaming process. This is particularly important for data streamed from sensors.
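To make this concrete, here is a minimal sketch (in Python, not tied to any specific product) of how streaming sensor data might be downsampled and windowed to respect the memory and sample-rate limits of an edge device. The rates, window size, and function names are illustrative assumptions, not part of the podcast.

```python
import numpy as np

def prepare_sensor_stream(samples, source_rate_hz, target_rate_hz, window_size):
    """Downsample a raw sensor stream and split it into fixed-size windows.

    Averaging within each downsampling group acts as a crude anti-aliasing
    filter; float32 keeps memory use in line with typical edge hardware.
    """
    factor = source_rate_hz // target_rate_hz
    trimmed = samples[: len(samples) - len(samples) % factor]
    downsampled = trimmed.reshape(-1, factor).mean(axis=1)
    n_windows = len(downsampled) // window_size
    windows = downsampled[: n_windows * window_size].reshape(n_windows, window_size)
    return windows.astype(np.float32)

# Example: 10 s of a 1 kHz vibration signal, reduced to 100 Hz, in 50-sample windows
raw = np.random.randn(10_000)
windows = prepare_sensor_stream(raw, 1000, 100, 50)
print(windows.shape)  # (20, 50)
```

The same windows can then feed both training on the desktop and inference on the device, so the model sees data shaped the way the hardware will actually deliver it.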
Model Preparation: Research models might perform well on the desktop, but AI practitioners should take additional steps before deploying their models to the edge. They need to consider (1) compressing the models (for example, with quantization) to ensure the models will fit on the target device, (2) applying explainability techniques to add transparency to AI decisions, and (3) verifying the models’ robustness with testing and validation (for example, checking robustness against adversarial examples).
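As a rough illustration of point (1), the sketch below shows symmetric post-training quantization of a float32 weight matrix to int8 in plain NumPy. This is a toy version of what deployment toolchains do internally; the function names and the symmetric-scale scheme are assumptions for illustration only.

```python
import numpy as np

def quantize_int8(weights):
    """Symmetric post-training quantization of a float weight matrix to int8.

    Returns the int8 weights and the scale factor needed to dequantize.
    The rounding error per weight is at most half a quantization step.
    """
    scale = np.abs(weights).max() / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Map int8 weights back to approximate float32 values."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(size=(32, 64)).astype(np.float32)
q, scale = quantize_int8(w)
print(w.nbytes, "->", q.nbytes)  # 4x smaller: 8192 -> 2048 bytes
```

The 4x size reduction is what makes a model fit in the flash and RAM budget of a small device; the price is the bounded rounding error, which is why the podcast stresses re-testing the compressed model before deployment.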
Simulation: Simulating an AI model in a system-wide context before deployment tests how well the model integrates with other parts of the system. Simulating a physical system can also generate synthetic data when not enough measured data is available for training an AI model.
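The synthetic-data idea can be sketched as follows: simulate a simple physical system (here a hypothetical damped oscillator standing in for, say, a vibrating machine part) under varied parameters, add sensor noise, and label the results. The classes, parameter ranges, and noise level are all invented for illustration.

```python
import numpy as np

def simulate_oscillator(damping, n_steps=200, dt=0.01, freq_hz=5.0):
    """Closed-form damped oscillation: x(t) = exp(-damping*t) * sin(2*pi*f*t)."""
    t = np.arange(n_steps) * dt
    return np.exp(-damping * t) * np.sin(2 * np.pi * freq_hz * t)

def make_synthetic_dataset(n_per_class=100, seed=0):
    """Generate labeled signals for two hypothetical conditions:
    lightly damped ("healthy", label 0) vs heavily damped ("worn", label 1)."""
    rng = np.random.default_rng(seed)
    signals, labels = [], []
    for label, (lo, hi) in enumerate([(0.5, 1.5), (4.0, 6.0)]):
        for _ in range(n_per_class):
            x = simulate_oscillator(rng.uniform(lo, hi))
            x += 0.02 * rng.standard_normal(x.shape)  # simulated sensor noise
            signals.append(x)
            labels.append(label)
    return np.stack(signals), np.array(labels)

X, y = make_synthetic_dataset()
print(X.shape, y.shape)  # (200, 200) (200,)
```

Because the labels come from the simulation parameters, the dataset is perfectly annotated, which is exactly the advantage of synthetic data when real failure examples are scarce.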
Collaborative Effort: Embedding AI models into hardware systems requires collaboration across many teams: hardware experts, data scientists, and domain-focused engineers. These teams, which might work on different platforms and have different skill sets, must be given the right tools to communicate, collaborate, and share outcomes successfully.
Is deploying AI models to the edge part of your workflow? What do you spend most of your time on: data preparation, model preparation, or system simulation? What have you learned from the podcast that you can apply to your work? Share your comments and thoughts below.

