Guy on Simulink

Simulink & Model-Based Design

Using Test Points to Log Signals for Software-In-The-Loop Simulation

This week's post is a suggestion from Raymond Estrada of MATLAB and Simulink Consulting Services. Raymond reached out recently to share a tip for verifying and validating logged signals in code generated from Simulink models.

The Problem

It's a common practice in the verification and validation world to run simulations, look at the values of outputs and (sometimes) intermediate signals, and compare them to expected values.
When generating code using Simulink Coder and Embedded Coder, you want to verify that those values are the same (within tolerance) as what you got in simulation. That's what we call "equivalence testing" or "back-to-back testing": you compare the results from normal mode simulation to results from Software-in-the-Loop (SIL) and Processor-in-the-Loop (PIL) simulations.
Now here is where you can run into a problem: some of the values you need to observe don't exist in the generated code. Intermediate signals may not be available in generated code due to various optimizations like Signal storage reuse and Eliminate superfluous local variables (Expression folding). Expression folding is an optimization that collapses block computations into single expressions in the generated code.
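If you want to check whether these optimizations are enabled for your model, you can query the corresponding configuration parameters from the command line. A quick sketch (OptimizeBlockIOStorage and ExpressionFolding are the programmatic names of these two options):

% Check whether the two optimizations are enabled for a model
mdl = bdroot;   % the model currently in scope
get_param(mdl,'OptimizeBlockIOStorage')   % Signal storage reuse
get_param(mdl,'ExpressionFolding')        % Expression folding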
To illustrate this, let's use a very simple model where I add two signals and multiply the result by 2. I want to verify the results for the logged signal "x".
Using the Simulink Test Manager, we can create an Equivalence Test. This test simulates the model in normal mode and in Software-in-the-Loop (SIL) mode, and compares the results.
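If you prefer scripting, a similar equivalence test can be created with the sltest.testmanager API. This is only a sketch: it assumes the model name SumAndGain used later in this post, and the property strings may vary by release.

% Create a test file with an equivalence test case comparing
% normal mode (simulation 1) against SIL mode (simulation 2)
tf = sltest.testmanager.TestFile('SILTest.mldatx');
ts = getTestSuites(tf);
tc = createTestCase(ts,'equivalence','Basic SIL Equivalence');
setProperty(tc,'Model','SumAndGain','SimulationIndex',1);
setProperty(tc,'Model','SumAndGain','SimulationIndex',2, ...
    'SimulationMode','Software-in-the-Loop (SIL)');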
We run the test and see that it failed. When expanding the test results, we can see that the logged signal x is not present in the SIL results.
When hovering over the equivalence criteria result, we see this note:
One or more signals did not match within tolerance specified
Let's see how to debug and fix this.

The Explanation

In the Test Manager, if you look carefully at the Logs section of the test results, you will find more explanation.
Let's also run the test programmatically to make the log easier to search and copy. For that, I like to use the function matlab.unittest.TestSuite.fromProject. This function conveniently finds all the tests in a project and creates a test suite.
% Create a suite from all tests found in the currently open project
suite = matlab.unittest.TestSuite.fromProject(currentProject);
% Run the suite and capture the results
results = run(suite);
Setting up ProjectFixture
Done setting up ProjectFixture: Project 'SIL_TestPoints' is already loaded. Setup is not required.
__________

Running SILTest > SIL Equivalence
================================================================================
Verification failed in SILTest > SIL Equivalence/Basic SIL Equivalence.
    ---------------------
    Framework Diagnostic:
    ---------------------
    Failed criteria: Equivalence
    --> Logs:
        Simulation 2: Model, SumAndGain, simulating in SIL or PIL mode, contains these signals, which are configured for signal logging, but are not supported and are not logged during SIL or PIL simulation:
        SumAndGain/Add, 'Output Port 1': Signal not found in code description, or signal is virtual, inactive, or has a variable size.
        For list of signals that are nonvariable size and produce message "Signal not found in code description, or signal is virtual, inactive, or has a variable size.", this warning might be avoidable:
        1. For each signal, select Signal Properties > Test point.
        2. Clear model configuration parameter 'Ignore test point signals'.
        3. For virtual subsystem output signals, make subsystem atomic (nonvirtual). Select subsystem block parameter 'Treat as atomic unit'.
        4. If generated code is C++ and model configuration parameter 'CodeInterfacePackaging' is set to 'C++ class', to allow access to internal data, set parameter 'InternalMemberVisibility' to 'public' or in Code Mappings editor, in Data Visibility column, for 'Signals, states, and internal data', select 'public'.
        5. If you are testing an atomic subsystem in a Simulink Test harness, make changes in original model that contains subsystem.
        Alternatively, for listed signals, disable signal logging.
    --> Simulink Test Manager Results:
        Results: 2025-Mar-02 12:57:59/SILTest/SIL Equivalence/Basic SIL Equivalence
================================================================================
.
Done SILTest > SIL Equivalence
__________

Tearing down ProjectFixture
Done tearing down ProjectFixture: Teardown is not required.
__________

Failure Summary:

     Name                                              Failed  Incomplete  Reason(s)
    ==============================================================================================
     SILTest > SIL Equivalence/Basic SIL Equivalence     X                 Failed by verification.
What this log tells us is that the generated code does not contain a variable corresponding to the logged signal x.
Looking at the generated model step function, we can confirm that expression folding has been applied, combining the Sum and Gain blocks into a single expression.
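Here is a representative sketch of what that folded step function looks like (identifier names follow typical Embedded Coder conventions and are assumed):

/* Model step function (representative sketch) */
void SumAndGain_step(void)
{
  /* Outport: '<Root>/y' incorporates:
   *  Gain: '<Root>/Gain'
   *  Sum: '<Root>/Add'
   */
  SumAndGain_Y.y = (SumAndGain_U.u1 + SumAndGain_U.u2) * 2.0;
}

Because the Sum output is folded into this single expression, the intermediate signal x never exists as a variable that could be logged.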

The Solution

Let's apply the first suggestion in the above log:
1. For each signal, select Signal Properties > Test point.
If you are not familiar with test points, I recommend this documentation: Configure Signals as Test Points.
In this case, a simple way to add the test point is using the Property Inspector:
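If you are scripting your workflow, the same test point can also be set from the command line. A small sketch, using the SumAndGain/Add block path reported in the log above:

% Mark the output of the Add block as a test point
ph = get_param('SumAndGain/Add','PortHandles');
set_param(ph.Outport(1),'TestPoint','on');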
With this change, we can run the test suite again to confirm that the test point fixes the issue and enables the test to pass.
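For example, rerunning the suite programmatically and checking the outcome:

% Re-run the full suite; signal x now exists in the generated code
suite = matlab.unittest.TestSuite.fromProject(currentProject);
results = run(suite);
all([results.Passed])   % returns true once the equivalence criteria match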

Conclusion & Considerations

Adding a Test Point is one solution that allows verification of requirements using signals that would otherwise be optimized away by expression folding. However, note that adding the test point did change the generated code. In this case, the model step function becomes:
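A representative sketch (identifier names assumed, following typical Embedded Coder conventions), where the test point now gets its own variable in the block signals structure:

/* Model step function with the test point in place (sketch) */
void SumAndGain_step(void)
{
  /* Sum: '<Root>/Add' */
  SumAndGain_B.x = SumAndGain_U.u1 + SumAndGain_U.u2;

  /* Outport: '<Root>/y' incorporates:
   *  Gain: '<Root>/Gain'
   */
  SumAndGain_Y.y = SumAndGain_B.x * 2.0;
}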
Since adding test points changes the generated code and can block optimizations, you only want to use them where truly necessary. Also, because they impact the generated code, test points should persist throughout the design: do not toggle them on and off between stages (development, testing and verification, code generation).
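Before generating production code, it can be useful to audit which test points a model contains. One possible sketch (assuming find_system can filter ports by the TestPoint parameter, as shown):

% List all signals marked as test points in the model
tpPorts = find_system('SumAndGain','FindAll','on', ...
    'Type','port','TestPoint','on');
for p = tpPorts(:)'
    fprintf('%s, port %d\n', get_param(p,'Parent'), get_param(p,'PortNumber'));
end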

Now it's your turn

Are you verifying and validating signals internal to your algorithms? Let us know in the comments below if you have tips and tricks for that.

