The Problem
It's common practice in the verification and validation world to run simulations, look at the values of outputs and (sometimes) intermediate signals, and compare them to expected values.
Now here is where you can run into a problem: some of the values you need to observe don't exist in the generated code. Intermediate signals may be unavailable in generated code due to various optimizations like Signal storage reuse and Eliminate superfluous local variables (Expression folding). Expression folding is an optimization that collapses block computations into single expressions in the generated code. To illustrate this, let's use a very simple model where I add two signals and multiply the result by 2. I want to verify the results for the logged signal "x".
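To make the effect concrete, here is a minimal before/after sketch in C (the function and variable names are placeholders for illustration only, not identifiers produced by the code generator):

/* Before folding: the intermediate signal x gets its own variable */
double fold_before(double In1, double In2) {
    double x = In1 + In2;
    return 2.0 * x;
}

/* After expression folding: x disappears into a single expression */
double fold_after(double In1, double In2) {
    return 2.0 * (In1 + In2);
}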
We run the test and see that it failed. When expanding the test results, we can see that the logged signal x is not present in the SIL results.
When hovering over the equivalence criteria result, we see this note:
Let's see how to debug and fix this.
The Explanation
In the Test Manager, if you look carefully at the Logs section of the test results, you will find more explanation:
Let's also run the test programmatically to make the log easier to search and copy. For that, I like to use the function matlab.unittest.TestSuite.fromProject, which conveniently finds all the tests in a project and creates a test suite.

suite = matlab.unittest.TestSuite.fromProject(currentProject);
results = run(suite);
Setting up ProjectFixture
Done setting up ProjectFixture: Project 'SIL_TestPoints' is already loaded. Setup is not required.
__________
Running SILTest > SIL Equivalence
================================================================================
Verification failed in SILTest > SIL Equivalence/Basic SIL Equivalence.
---------------------
Framework Diagnostic:
---------------------
Failed criteria: Equivalence
--> Logs:
Simulation 2:
Model, SumAndGain, simulating in SIL or PIL mode, contains these signals, which are configured for signal logging, but are not supported and are not logged during SIL or PIL simulation:
SumAndGain/Add, 'Output Port 1': Signal not found in code description, or signal is virtual, inactive, or has a variable size.
For list of signals that are nonvariable size and produce message "Signal not found in code description, or signal is virtual, inactive, or has a variable size.", this warning might be avoidable:
1. For each signal, select Signal Properties > Test point.
2. Clear model configuration parameter 'Ignore test point signals'.
3. For virtual subsystem output signals, make subsystem atomic (nonvirtual). Select subsystem block parameter 'Treat as atomic unit'.
4. If generated code is C++ and model configuration parameter 'CodeInterfacePackaging' is set to 'C++ class', to allow access to internal data, set parameter 'InternalMemberVisibility' to 'public' or in Code Mappings editor, in Data Visibility column, for 'Signals, states, and internal data', select 'public'.
5. If you are testing an atomic subsystem in a Simulink Test harness, make changes in original model that contains subsystem.
Alternatively, for listed signals, disable signal logging.
--> Simulink Test Manager Results:
Results: 2025-Mar-02 12:57:59/SILTest/SIL Equivalence/Basic SIL Equivalence
================================================================================
.
Done SILTest > SIL Equivalence
__________
Tearing down ProjectFixture
Done tearing down ProjectFixture: Teardown is not required.
__________
Failure Summary:
Name                                               Failed  Incomplete  Reason(s)
==============================================================================================
SILTest > SIL Equivalence/Basic SIL Equivalence      X                 Failed by verification.
What this log tells us is that the generated code does not contain a variable corresponding to the logged signal x.
When looking at the generated model step function, we can confirm that expression folding has been applied, combining the Sum and Gain blocks into a single expression.
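Here is a sketch of what that folded step function can look like; the identifier names (SumAndGain_step, SumAndGain_U, SumAndGain_Y, and the type names) are assumptions based on default Simulink Coder naming conventions, not copied from the actual generated file:

/* External inputs and outputs (names assumed, following default conventions) */
typedef struct { double In1; double In2; } ExtU_SumAndGain_T;
typedef struct { double Out1; } ExtY_SumAndGain_T;

ExtU_SumAndGain_T SumAndGain_U;
ExtY_SumAndGain_T SumAndGain_Y;

void SumAndGain_step(void)
{
  /* Sum and Gain are folded into the Outport assignment:
     no standalone variable exists for the logged signal x */
  SumAndGain_Y.Out1 = 2.0 * (SumAndGain_U.In1 + SumAndGain_U.In2);
}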
The Solution
Let's apply the first suggestion in the above log:
In this case, a simple way to add the test point is to select the signal x and enable the Test point property in the Property Inspector.
With this change, we can run the test suite again to confirm that the test point fixes the issue and enables the test to pass.
Conclusion & Considerations
Adding a Test Point is one solution that allows you to verify requirements using signals that would otherwise be optimized away by expression folding. However, note that adding the test point did change the generated code: the model step function now computes x in its own variable before applying the gain.
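For comparison, a sketch of the step function after adding the test point (again, the names are assumptions; test-pointed signals typically land in the block I/O structure, shown here as SumAndGain_B):

/* Block I/O structure (name assumed); the test point reserves storage for x */
typedef struct { double x; } B_SumAndGain_T;

B_SumAndGain_T SumAndGain_B;

void SumAndGain_step(void)
{
  /* The test point forces the Add block output x into its own variable... */
  SumAndGain_B.x = SumAndGain_U.In1 + SumAndGain_U.In2;

  /* ...which the Gain and Outport then consume */
  SumAndGain_Y.Out1 = 2.0 * SumAndGain_B.x;
}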
Since test points change the generated code and can block optimizations, use them only where truly necessary. Also, because they impact the generated code, test points should persist throughout the design and should not be toggled between the different stages of development (development, testing & verification, code generation).
Now it's your turn
Are you verifying and validating signals internal to your algorithms? Let us know in the comments below if you have tips and tricks for that.