
Additional Ways to Consume the Secret Sauce

For today's blog entry I'd like to introduce Stephen Carter. Stephen is an engineer on the MATLAB Test Frameworks team and has been involved in many exciting new features over the past few releases (with more on the way!). He was happy to write a post or two with a few hints to help your test diagnostic workflows.

As Andy pointed out in a recent post, we try to cook up our secret sauce in a way that makes it taste just right. But is taste everything? For example, although the skeptical fellow whom Sam-I-Am pesters in Dr. Seuss's "Green Eggs and Ham" came to realize that he did in fact like to eat them in a boat, with a goat, in the rain, and on a train, I personally don't share his viewpoint. Even if they are enjoyable on a boat or on a train, I don't think I would want them with a goat or in the rain.

What I am getting at is this: in addition to enhancing the diagnostics themselves, we recognize the need to keep providing new ways to consume them so that they remain useful for people with different preferences and use cases. Our recent posts have demonstrated ways to consume diagnostics in a continuous integration workflow, but there are many more use cases to consider.

First, let me start with a scenario that some of you might relate to. You have just made some final changes to your source code and have already run a handful of unit tests, giving you enough confidence to run your full suite of tests before submitting your changes. You run the full suite using runtests, which takes, let's say, five to ten minutes, and you see afterward that there are a few failures to address. In the MATLAB command window, you scroll up to your first failure diagnostic, read it, and try a few things at the command line to understand the failure better. And then, of course, you do what you always do when you are done playing around: you type clc to clear the window.

Wait... what... oh no! You just lost your entire diagnostic output, and you haven't even read the diagnostic messages for the other failures. At this point you are thinking, "Do I really need to rerun my entire suite to find my failures?"

Before rerunning your entire test suite, first check to see if you still have the TestResult output that runtests produced.
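
If you assigned the output when you kicked off the run, that TestResult array is still sitting in your workspace; clc only clears the command window text, not your variables. Here is a minimal sketch of such a run (the 'SecretSauceTests' name is just a placeholder for your own test folder or suite):

% Run the full suite and hold on to the TestResult array
% ('SecretSauceTests' is a placeholder -- point runtests at your own tests)
results = runtests('SecretSauceTests');

% Later, even after clc, the variable is still there:
whos results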

Prior to R2016a, displaying that results variable would show:

     
results = 

  1x1151 TestResult array with properties:

    Name
    Passed
    Failed
    Incomplete
    Duration
 
Totals:
   1148 Passed, 3 Failed, 0 Incomplete.
   266.5616 seconds testing time.
 

You'll notice that pass/fail information is available. You can use this to shrink the test suite down so that you only have to rerun the tests that did not pass:

didNotPassResults = results(~[results.Passed]); % failed or incomplete results
testsToRerun = {didNotPassResults.Name};        % names of those test elements
runtests(testsToRerun); % Give me my diagnostics back!

Starting in R2016a, we have provided a more efficient solution to this problem by serving up the diagnostics for you in a new way. You will now see the following TestResult display:

results
results = 

  1×1151 TestResult array with properties:

    Name
    Passed
    Failed
    Incomplete
    Duration
    Details

Totals:
   1148 Passed, 3 Failed, 0 Incomplete.
   261.3082 seconds testing time.

Do you see what's new? I'll give you a minute... Do you see the difference now? Ah yes, the new Details property. Let's take a look at the details of the first failing result:

firstFailingIndex = find([results.Failed],1);
firstFailingDetails = results(firstFailingIndex).Details
firstFailingDetails = 

  struct with fields:

    DiagnosticRecord: [1×1 matlab.unittest.plugins.diagnosticrecord.QualificationDiagnosticRecord]

firstFailingDetails.DiagnosticRecord
ans = 

  QualificationDiagnosticRecord with properties:

         TestDiagnosticResult: {1×1 cell}
    FrameworkDiagnosticResult: {'Negated IsSubsetOf failed....'}
                        Stack: [1×1 struct]
                        Event: 'VerificationFailed'
                        Scope: 'ATitanicTest/testAvoidanceSystem'
                       Report: '==========================================...'

firstFailingDetails.DiagnosticRecord.Report
ans =

================================================================================
Verification failed in ATitanicTest/testAvoidanceSystem.

    ----------------
    Test Diagnostic:
    ----------------
    Let's try to avoid hitting things that could end in disaster.

    ---------------------
    Framework Diagnostic:
    ---------------------
    Negated IsSubsetOf failed.
    --> The actual value is a subset of the prohibited superset.
    
    Actual Value (cell):
            'iceberg'
    Prohibited Superset (cell):
            'land'    'iceberg'    'boat'

    ------------------
    Stack Information:
    ------------------
    In H:\Documents\MATLAB\SecretSauce\ATitanicTest.m (ATitanicTest.testAvoidanceSystem) at 5
================================================================================


As you can see, runtests now captures failing diagnostics right on the TestResult output, which means you can walk back through every failure without rerunning a thing (see the sketch just below). The driving force behind this is actually a new DiagnosticsRecordingPlugin. Manually adding this plugin provides users with even more options, like recording passing diagnostics and controlling the verbosity at which logged diagnostics are recorded.
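
Here is that sketch: loop over the failing results and print each stored Report. The variable names and the loop are mine, not part of the framework, and everything reads straight from the results array, so nothing gets rerun.

failedResults = results([results.Failed]);   % just the failing results

for k = 1:numel(failedResults)
    records = failedResults(k).Details.DiagnosticRecord;
    for r = 1:numel(records)                 % a test can record several events
        disp(records(r).Report)
    end
end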

Now, back to the plugin. To record passing diagnostics, for example, you can construct a test runner and add the plugin yourself:

import matlab.unittest.TestRunner;
import matlab.unittest.plugins.DiagnosticsRecordingPlugin;

% Build a runner with the default text output and add the recording
% plugin, this time asking it to keep passing diagnostics as well
runner = TestRunner.withTextOutput;
plugin = DiagnosticsRecordingPlugin('IncludingPassingDiagnostics',true);
runner.addPlugin(plugin);

% testsuite (called with no arguments) creates a suite from the current folder
results = runner.run(testsuite);

Now we can find the first passing diagnostic recorded:

allDetails = [results.Details];
allDiagnosticRecords = [allDetails.DiagnosticRecord]
passedRecords = selectPassed(allDiagnosticRecords)
passedRecords(1) % look at the record for the first passing event
allDiagnosticRecords = 

  1×1151 QualificationDiagnosticRecord array with properties:

    TestDiagnosticResult
    FrameworkDiagnosticResult
    Stack
    Event
    Scope
    Report


passedRecords = 

  1×1148 QualificationDiagnosticRecord array with properties:

    TestDiagnosticResult
    FrameworkDiagnosticResult
    Stack
    Event
    Scope
    Report


ans = 

  QualificationDiagnosticRecord with properties:

         TestDiagnosticResult: {}
    FrameworkDiagnosticResult: {'IsEqualTo passed....'}
                        Stack: [1×1 struct]
                        Event: 'VerificationPassed'
                        Scope: 'CapitalGroupStrategyTest/testStockTicker'
                       Report: '==========================================...'

passedRecords(1).Report
ans =

================================================================================
Verification passed in CapitalGroupStrategyTest/testStockTicker.

    ---------------------
    Framework Diagnostic:
    ---------------------
    IsEqualTo passed.
    --> StringComparator passed.
        --> The character arrays are equal (ignoring case).
        
        Actual char:
            Goog
        Expected char:
            GOOG

    ------------------
    Stack Information:
    ------------------
    In H:\Documents\MATLAB\SecretSauce\CapitalGroupStrategyTest.m (CapitalGroupStrategyTest.testStockTicker) at 5
================================================================================
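
One last knob worth mentioning is the verbosity control from earlier. If your tests call the log method, the plugin can record those logged diagnostics too, and its 'LoggingLevel' option sets the verbosity level at which they are kept. Treat this as a sketch rather than a definitive recipe, and double-check the option names against the DiagnosticsRecordingPlugin documentation for your release:

import matlab.unittest.TestRunner;
import matlab.unittest.plugins.DiagnosticsRecordingPlugin;
import matlab.unittest.Verbosity;

runner = TestRunner.withTextOutput;

% Record logged diagnostics up through Verbosity.Detailed
% (the 'LoggingLevel' name-value pair is assumed from the plugin's
% documentation -- verify it in your release)
plugin = DiagnosticsRecordingPlugin('LoggingLevel', Verbosity.Detailed);
runner.addPlugin(plugin);

results = runner.run(testsuite);

And mirroring selectPassed above, the diagnostic record arrays also offer selectFailed, selectIncomplete, and selectLogged methods for slicing out just the records you care about.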


Stay tuned for even more ways to consume our secret sauce.




Published with MATLAB® R2016b
