Comma Separated Goodness
Hi folks, today I'd like to introduce ChangQing Wang. ChangQing is the lead developer on MATLAB's performance framework, and in addition to all the great performance testing features he has delivered, he has also found a really easy way to integrate performance results into Jenkins. Check it out!
MATLAB performance testing on Jenkins
"Is it build 3301 or build 3319?" CQ scratched his head, confusion written all over his face. Somehow he noticed a significant increase in the code runtime, but had no clue which change caused it. He wished he had logged the performance of every change in the project.
As continuous integration becomes one of the key principles in Agile processes, and as more and more products adopt continuous delivery practices, performance testing is a crucial step to add to the workflow. CQ never wanted to introduce a performance regression, yet realized too late that there is a real risk of doing so every time he touches the code for a bug fix or a new feature. "A passing build does not mean everything is OK", CQ pondered, "How can I monitor the performance of my MATLAB project on a CI system?"
The answer to this question is actually twofold:
- He needs to add performance tests for the project.
- He needs to schedule performance testing runs for each build and report the result.
If you have a MATLAB project and wonder how to write a performance test using the latest (and coolest) testing framework in MATLAB, this page is a good starting point. You can also look at other blog posts we've written on the topic in the past. In this post, I will not go through the details of how to write performance tests but rather show an example project with the performance tests already written. The project I am using for this example is a super lightweight "library" of three matrix operations: computing the mean, sum, and eigenvalues of a given matrix.
matrix_mean.m
function out = matrix_mean(M)
% CQ's library for matrix operation
sum = matrix_sum(M);
nrow = size(M, 1);
ncol = size(M, 2);
out = sum/(nrow*ncol);
end
matrix_sum.m
function out = matrix_sum(M)
% CQ's library for matrix operation
out = 0;
nrow = size(M,1);
ncol = size(M,2);
for i = 1:nrow
    for j = 1:ncol
        out = out + M(i,j);
    end
end
end
matrix_eig.m
function out = matrix_eig(M)
% CQ's library for matrix operation
out = roots(round(poly(M)));
end
tMatrixLibrary.m
classdef tMatrixLibrary < matlab.perftest.TestCase
    properties(TestParameter)
        TestMatrix = struct('midSize', magic(600),...
            'largeSize', magic(1000));
    end

    methods(Test)
        function testSum(testCase, TestMatrix)
            matrix_sum(TestMatrix);
        end

        function testMean(testCase, TestMatrix)
            matrix_mean(TestMatrix);
        end

        function testEig(testCase, TestMatrix)
            testCase.assertReturnsTrue(@() size(TestMatrix,1) == size(TestMatrix,2), ...
                'Eig only works on square matrix');
            testCase.startMeasuring;
            matrix_eig(TestMatrix);
            testCase.stopMeasuring;
        end
    end
end
The performance test tMatrixLibrary has one parameterized test for each of the three source files. Notice that in testEig we use an assertReturnsTrue qualification to guarantee that the matrix passed into the test is square, as well as startMeasuring/stopMeasuring to designate the measurement boundary within the test point. There are multiple ways to run performance tests in MATLAB, but the easiest is probably to use runperf to obtain the results. Once we have the results, it is easy to get a high-level overview using sampleSummary:
results = runperf('tMatrixLibrary.m')
results.sampleSummary
Running tMatrixLibrary
.......... .......... .......... .......... .......... .......... .......... ......
Done tMatrixLibrary
__________

results =

  1×6 MeasurementResult array with properties:

    Name
    Valid
    Samples
    TestActivity

Totals:
   6 Valid, 0 Invalid.

ans =

  6×7 table

                        Name                          SampleSize      Mean       StandardDeviation       Min         Median        Max
    _____________________________________________    __________    _________    _________________    _________    _________    _________

    tMatrixLibrary/testSum(TestMatrix=midSize)             7        0.0021399        0.00013117       0.0020023    0.0020896    0.0023467
    tMatrixLibrary/testSum(TestMatrix=largeSize)          17        0.0082113        0.00092846       0.0050781    0.0084503    0.0095599
    tMatrixLibrary/testMean(TestMatrix=midSize)           12        0.0021527        0.00020086       0.0019554    0.0021054     0.002559
    tMatrixLibrary/testMean(TestMatrix=largeSize)          8        0.0085206        0.00062801       0.0077265    0.0084615    0.0093073
    tMatrixLibrary/testEig(TestMatrix=midSize)             4          0.15444         0.0010901         0.15364      0.15405      0.15604
    tMatrixLibrary/testEig(TestMatrix=largeSize)           4          0.41783          0.013677         0.40623      0.41421      0.43668
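As an aside, runperf is not the only way to collect these measurements. If you want tighter control over how many samples are taken, you can run the same suite through a fixed-sample-size TimeExperiment. This is not needed for the Jenkins workflow below; it is just a minimal sketch of the alternative:

% Optional: run the suite with a fixed number of samples per test
suite = testsuite('tMatrixLibrary.m');
experiment = matlab.perftest.TimeExperiment.withFixedSampleSize(4);
results = run(experiment, suite);

Either way, we end up with the same kind of measurement results and summary table.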
These are nice numbers to evaluate the performance of the project from the MATLAB Command Window. Now let's see how we can report them in a CI system. If we use Jenkins as an example, we can create a "Simple Matrix Library" project containing the source and test files shown above:
As a prerequisite for logging performance data on Jenkins, you can use the performance plugin, which can be found and installed from the Jenkins plugin manager. It enables a post-build step that captures reports from major testing tools and generates trend plots over the build history. In addition, it allows marking the latest build as passed, unstable, or failed based on the reported error percentage. Several report formats are supported, including Final Stats XML, JMeter format, JUnit XML, and so forth. We pick the JMeter CSV format for our MATLAB project since the measurement result object returned by runperf already stores the information in tabular form, and as you will see, it is quite straightforward to generate a JMeter CSV out of these tables. Here are the detailed steps:
Step 1: Convert the performance results to CSV format
To kick off, we will create a JMeter CSV file from a measurement result object. First we need to gather the required information. The standard JMeter CSV format includes 16 variables: timeStamp, elapsed, label, responseCode, responseMessage, threadName, dataType, success, failureMessage, bytes, sentBytes, grpThreads, allThreads, latency, idleTime, and connect. Some of these variables are important for our use case and some we can ignore. Four of them are available in the TestActivity table of the measurement result: timeStamp (from "Timestamp"), elapsed (from "MeasuredTime"), label (from "Name"), and success (from "Passed"). So let's use the results from our runperf call above. We can extract these columns into a samplesTable and rename the variables:
activityTable = vertcat(results.TestActivity);
activityTable.Properties.VariableNames'
ans =

  12×1 cell array

    {'Name'         }
    {'Passed'       }
    {'Failed'       }
    {'Incomplete'   }
    {'MeasuredTime' }
    {'Objective'    }
    {'Timestamp'    }
    {'Host'         }
    {'Platform'     }
    {'Version'      }
    {'TestResult'   }
    {'RunIdentifier'}
samplesTable = activityTable(activityTable.Objective == categorical({'sample'}),:);
nrows = size(samplesTable, 1);

% Trim the table and change variable names to comply with JMeter CSV format
samplesTable = samplesTable(:, {'Timestamp', 'MeasuredTime', 'Name', 'Passed'});
samplesTable.Properties.VariableNames = {'timeStamp', 'elapsed', 'label', 'success'};
A couple of things to note: the timestamp in JMeter is in Unix epoch format, and the elapsed time in JMeter is reported in milliseconds, both of which differ from the MATLAB measurement result. Also, for failed cases, we need to replace the missing values NaN and NaT in the measurement result with values JMeter accepts. Let's address both of these cleanup items:
% Convert timestamp to unix format, and fill NaT with previous available time
samplesTable.timeStamp = fillmissing(samplesTable.timeStamp,'previous');
samplesTable.timeStamp = posixtime(samplesTable.timeStamp)*1000;

% Convert MeasuredTime to milliseconds, and fill NaN with 0
samplesTable.elapsed = fillmissing(samplesTable.elapsed,'constant',0);
samplesTable.elapsed = floor(samplesTable.elapsed*1000);
The "Passed" column by default stores logical values, we need to convert them to string for JMeter CSV:
% Convert pass/fail logical to string
samplesTable.success = string(samplesTable.success);
Next, we need to create some default values for the 12 other variables that are less important for us, and append them to the samplesTable:
% Generate additional columns required in JMeter CSV format
responseCode = zeros(nrows, 1);
responseMessage = strings(nrows, 1);
threadName = strings(nrows, 1);
dataType = strings(nrows, 1);
failureMessage = strings(nrows, 1);
bytes = zeros(nrows, 1);
sentBytes = zeros(nrows, 1);
grpThreads = ones(nrows, 1);
allThreads = ones(nrows, 1);
latency = zeros(nrows, 1);
idleTime = zeros(nrows, 1);
connect = zeros(nrows, 1);

auxTable = table(responseCode, responseMessage, threadName, dataType, ...
    failureMessage, bytes, sentBytes, grpThreads, allThreads, ...
    latency, idleTime, connect);

% Append additional columns to the original table
JMeterTable = [samplesTable, auxTable];
Voila! We now have a table in JMeter format with the full set of 16 variables, and we can simply use the writetable function to write it to a CSV file. Notice the strings are quoted to ensure the commas in the test names are not treated as delimiters.
% Write the full table to a CSV file
writetable(JMeterTable, 'PerformanceTestResult.csv', 'QuoteStrings', true);
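If you want to sanity-check the file before handing it to Jenkins, one optional approach (just a quick sketch using standard table functions) is to read it back and preview the first few rows:

% Optional: read the CSV back and preview the first few rows
checkTable = readtable('PerformanceTestResult.csv');
head(checkTable, 4)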
Step 2: Configure build and post-build actions
Now we can set up performance monitoring on Jenkins! The good news is that after the hard work in step 1, the rest is super easy. Just put the conversion code we've developed above into a function (we called it convertToJMeterCSV) and make sure it is available from the workspace of your Jenkins build. Then you just invoke that function as part of the Jenkins build. Open the project configuration page, add an "Execute Windows batch command" build step and write the following into the command:
matlab -nodisplay -wait -log -r "convertToJMeterCSV(runperf('tMatrixLibrary.m')); exit"
The output from runperf will be converted and saved to "PerformanceTestResult.csv" locally.
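For reference, here is one way the convertToJMeterCSV function mentioned above could look, simply assembling the snippets from Step 1 into a single file (the function and file names are just the ones used in this example, so adapt them to your project):

function convertToJMeterCSV(results)
% Convert measurement results from runperf into a JMeter-style CSV file
% that the Jenkins performance plugin can parse.

% Keep only the sample rows from the test activity
activityTable = vertcat(results.TestActivity);
samplesTable = activityTable(activityTable.Objective == categorical({'sample'}),:);
nrows = size(samplesTable, 1);

% Trim the table and rename variables to comply with JMeter CSV format
samplesTable = samplesTable(:, {'Timestamp', 'MeasuredTime', 'Name', 'Passed'});
samplesTable.Properties.VariableNames = {'timeStamp', 'elapsed', 'label', 'success'};

% Convert units, fill missing values, and convert logicals to strings
samplesTable.timeStamp = fillmissing(samplesTable.timeStamp,'previous');
samplesTable.timeStamp = posixtime(samplesTable.timeStamp)*1000;
samplesTable.elapsed = fillmissing(samplesTable.elapsed,'constant',0);
samplesTable.elapsed = floor(samplesTable.elapsed*1000);
samplesTable.success = string(samplesTable.success);

% Generate the remaining columns required by the JMeter CSV format
responseCode = zeros(nrows, 1);
responseMessage = strings(nrows, 1);
threadName = strings(nrows, 1);
dataType = strings(nrows, 1);
failureMessage = strings(nrows, 1);
bytes = zeros(nrows, 1);
sentBytes = zeros(nrows, 1);
grpThreads = ones(nrows, 1);
allThreads = ones(nrows, 1);
latency = zeros(nrows, 1);
idleTime = zeros(nrows, 1);
connect = zeros(nrows, 1);
auxTable = table(responseCode, responseMessage, threadName, dataType, ...
    failureMessage, bytes, sentBytes, grpThreads, allThreads, ...
    latency, idleTime, connect);

% Append the additional columns and write everything to a CSV file
JMeterTable = [samplesTable, auxTable];
writetable(JMeterTable, 'PerformanceTestResult.csv', 'QuoteStrings', true);
end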
Next, click "Add a post-build action". With the performance plugin successfully installed on Jenkins, the "Publish Performance test result report" option should appear. Select it, and enter the CSV file name in the "Source data files" field. There are other options to tweak as well, but we will leave them as they are for now. Click the Save button to exit the configuration page.
Step 3: Build the project and review the trend
Everything is now in place. Build the project several times, then click the "Performance Trend" link on the left to view the trend plots of the response time and the percentage of errors:
Notice that the statistics in the response time trend are calculated over all tests, which is why the median can differ greatly from the average: the eigenvalue tests take hundreds of milliseconds while the sum and mean tests take only a few, pulling the average well above the median. We can get another view of the trend by clicking into any build (say #14 in our case) and then clicking the "Performance Trend" link:
Here we see a nice summary table presenting the stats of all tests, with green/red indicators showing the difference compared to the previous build. Any non-passing tests would show up in red; fortunately, we don't have any of those.
That's how we can add a performance report for our MATLAB project on Jenkins. Isn't that easy? Share your thoughts on how the process could be improved. Are there any other performance statistics you would like to trend for your MATLAB project?
Category: Continuous Integration, Performance, Testing