{"id":1857,"date":"2018-09-18T16:58:12","date_gmt":"2018-09-18T20:58:12","guid":{"rendered":"https:\/\/blogs.mathworks.com\/developer\/?p=1857"},"modified":"2018-09-18T16:58:46","modified_gmt":"2018-09-18T20:58:46","slug":"jmeter-results-jenkins","status":"publish","type":"post","link":"https:\/\/blogs.mathworks.com\/developer\/2018\/09\/18\/jmeter-results-jenkins\/","title":{"rendered":"Comma Separated Goodness"},"content":{"rendered":"<div class=\"content\"><!--introduction--><p><i>Hi folks, today I'd like to introduce ChangQing Wang. ChangQing is the lead developer on MATLAB's performance framework, and in addition to all the great performance testing features he has delivered, he has also found a really easy way to integrate performance results into Jenkins. Check it out!<\/i><\/p><!--\/introduction--><h3>Contents<\/h3><div><ul><li><a href=\"#7bdcae1d-efa4-43f2-985f-85eff603e114\">MATLAB performance testing on Jenkins<\/a><\/li><li><a href=\"#aba2ac37-5e59-4d2e-9906-7bd7676352fe\">Step 1: Convert the performance results to CSV format<\/a><\/li><li><a href=\"#79a8f9d1-c53c-41a4-948d-852f40171898\">Step 2: Configure build and post-build actions<\/a><\/li><li><a href=\"#18a3f122-4228-4f9b-b313-129baeb20cc9\">Step 3: Build the project and review the trend<\/a><\/li><\/ul><\/div><h4>MATLAB performance testing on Jenkins<a name=\"7bdcae1d-efa4-43f2-985f-85eff603e114\"><\/a><\/h4><p>\"Is it build 3301 or build 3319?\" CQ scratched his head, confusion written all over his face. Somehow he noticed a significant increase in the code runtime, but had no clue which change caused it. He wished he had logged the performance of every change in the project.<\/p><p>As continuous integration is becoming one of the key principles in Agile processes, and as more and more products are adopting continuous delivery practices, performance testing is a crucial step to add to the workflow. 
CQ never wanted to introduce a performance regression, yet realized too late that there is a real risk of this every time he touches the code for a bug fix or a new feature. \"A passing build does not mean everything is OK\", CQ pondered, \"How can I monitor the performance of my MATLAB project on a CI system?\"<\/p><p>The answer to this question is actually twofold:<\/p><div><ol><li>He needs to add performance tests for the project.<\/li><li>He needs to schedule performance testing runs for each build and report the results.<\/li><\/ol><\/div><p>If you have a MATLAB project and wonder how to write a performance test using the latest (and coolest) testing framework in MATLAB, <a href=\"https:\/\/www.mathworks.com\/help\/matlab\/matlab_prog\/overview-of-performance-test-framework.html\">this page<\/a> is a good starting point. You can also look at the <a href=\"https:\/\/blogs.mathworks.com\/developer\/category\/performance\/?s_tid=Blog_developer_Category\">other blog posts<\/a> we've made in the past on the topic. In this post, I will not go through the details of how to write performance tests but rather show an example project with performance tests already written. 
The project I am using to highlight this example is a super lightweight \"library\" for three different matrix operations, computing the mean, sum, and eigenvalues of a given matrix.<\/p><p><img decoding=\"async\" vspace=\"5\" hspace=\"5\" src=\"http:\/\/blogs.mathworks.com\/developer\/files\/y2018project_overview.png\" alt=\"\"> <\/p><p><b>matrix_mean.m<\/b><\/p><pre class=\"language-matlab\">\r\n<span class=\"keyword\">function<\/span> out = matrix_mean(M)\r\n<span class=\"comment\">% CQ's library for matrix operation<\/span>\r\n\r\nsum = matrix_sum(M);\r\nnrow = size(M, 1);\r\nncol = size(M, 2);\r\nout = sum\/(nrow*ncol);\r\n\r\n<span class=\"keyword\">end<\/span>\r\n\r\n<\/pre><p><b>matrix_sum.m<\/b><\/p><pre class=\"language-matlab\">\r\n<span class=\"keyword\">function<\/span> out = matrix_sum(M)\r\n<span class=\"comment\">% CQ's library for matrix operation<\/span>\r\n\r\nout = 0;\r\nnrow = size(M,1);\r\nncol = size(M,2);\r\n<span class=\"keyword\">for<\/span> i = 1:nrow\r\n    <span class=\"keyword\">for<\/span> j = 1:ncol\r\n        out = out + M(i,j);\r\n    <span class=\"keyword\">end<\/span>\r\n<span class=\"keyword\">end<\/span>\r\n\r\n<span class=\"keyword\">end<\/span>\r\n\r\n<\/pre><p><b>matrix_eig.m<\/b><\/p><pre class=\"language-matlab\">\r\n<span class=\"keyword\">function<\/span> out = matrix_eig(M)\r\n<span class=\"comment\">% CQ's library for matrix operation<\/span>\r\n\r\nout = roots(round(poly(M)));\r\n\r\n<span class=\"keyword\">end<\/span>\r\n\r\n<\/pre><p><b>tMatrixLibrary.m<\/b><\/p><pre class=\"language-matlab\">\r\n<span class=\"keyword\">classdef<\/span> tMatrixLibrary &lt; matlab.perftest.TestCase\r\n    \r\n    <span class=\"keyword\">properties<\/span>(TestParameter)\r\n        TestMatrix = struct(<span class=\"string\">'midSize'<\/span>, magic(600),<span class=\"keyword\">...<\/span>\r\n            <span class=\"string\">'largeSize'<\/span>, magic(1000));\r\n    <span class=\"keyword\">end<\/span>\r\n    \r\n    <span 
class=\"keyword\">methods<\/span>(Test)\r\n        <span class=\"keyword\">function<\/span> testSum(testCase, TestMatrix)\r\n            matrix_sum(TestMatrix);\r\n        <span class=\"keyword\">end<\/span>\r\n        \r\n        <span class=\"keyword\">function<\/span> testMean(testCase, TestMatrix)\r\n            matrix_mean(TestMatrix);\r\n        <span class=\"keyword\">end<\/span>\r\n        \r\n        <span class=\"keyword\">function<\/span> testEig(testCase, TestMatrix)\r\n            \r\n            testCase.assertReturnsTrue(@() size(TestMatrix,1) == size(TestMatrix,2), <span class=\"keyword\">...<\/span>\r\n                <span class=\"string\">'Eig only works on square matrix'<\/span>);\r\n            testCase.startMeasuring;\r\n            matrix_eig(TestMatrix);\r\n            testCase.stopMeasuring;\r\n            \r\n        <span class=\"keyword\">end<\/span>\r\n    <span class=\"keyword\">end<\/span>\r\n<span class=\"keyword\">end<\/span>\r\n\r\n<\/pre><p>The performance test tMatrixLibrary has three parameterized tests, one for each of the source files. Notice in testEig, we use an <a title=\"https:\/\/www.mathworks.com\/help\/matlab\/ref\/matlab.unittest.qualifications.assertable.asserttrue.html (link no longer works)\">assertReturnsTrue<\/a> qualification to guarantee the matrix passed in the test is square, as well as <a href=\"https:\/\/www.mathworks.com\/help\/matlab\/ref\/matlab.perftest.testcase.startmeasuring.html\">start\/stopMeasuring<\/a> to designate the measurement boundary in the test point. There are multiple ways to run the performance tests in MATLAB, but the easiest is probably to use <a href=\"https:\/\/www.mathworks.com\/help\/matlab\/ref\/runperf.html\">runperf<\/a> to obtain the results. 
Once we have the results it is easy to get a high level overview using <a href=\"https:\/\/www.mathworks.com\/help\/matlab\/ref\/matlab.unittest.measurement.measurementresult.samplesummary.html\">sampleSummary<\/a>:<\/p><pre class=\"codeinput\">results = runperf(<span class=\"string\">'tMatrixLibrary.m'<\/span>)\r\nresults.sampleSummary\r\n<\/pre><pre class=\"codeoutput\">Running tMatrixLibrary\r\n..........\r\n..........\r\n..........\r\n..........\r\n..........\r\n..........\r\n..........\r\n......\r\nDone tMatrixLibrary\r\n__________\r\n\r\n\r\nresults = \r\n\r\n  1&times;6 MeasurementResult array with properties:\r\n\r\n    Name\r\n    Valid\r\n    Samples\r\n    TestActivity\r\n\r\nTotals:\r\n   6 Valid, 0 Invalid.\r\n\r\n\r\nans =\r\n\r\n  6&times;7 table\r\n\r\n                        Name                         SampleSize      Mean       StandardDeviation       Min        Median         Max   \r\n    _____________________________________________    __________    _________    _________________    _________    _________    _________\r\n\r\n    tMatrixLibrary\/testSum(TestMatrix=midSize)            7        0.0021399       0.00013117        0.0020023    0.0020896    0.0023467\r\n    tMatrixLibrary\/testSum(TestMatrix=largeSize)         17        0.0082113       0.00092846        0.0050781    0.0084503    0.0095599\r\n    tMatrixLibrary\/testMean(TestMatrix=midSize)          12        0.0021527       0.00020086        0.0019554    0.0021054     0.002559\r\n    tMatrixLibrary\/testMean(TestMatrix=largeSize)         8        0.0085206       0.00062801        0.0077265    0.0084615    0.0093073\r\n    tMatrixLibrary\/testEig(TestMatrix=midSize)            4          0.15444        0.0010901          0.15364      0.15405      0.15604\r\n    tMatrixLibrary\/testEig(TestMatrix=largeSize)          4          0.41783         0.013677          0.40623      0.41421      0.43668\r\n\r\n<\/pre><p>These are nice numbers to evaluate the performance of the project from the 
MATLAB Command Window. Now let's see how we can report them in a CI system. Using Jenkins as an example, we can create a \"Simple Matrix Library\" project containing the source and test files shown above:<\/p><p><img decoding=\"async\" vspace=\"5\" hspace=\"5\" src=\"http:\/\/blogs.mathworks.com\/developer\/files\/y2018JenkinsProject.png\" alt=\"\"> <\/p><p>As a prerequisite, to enable logging of performance data on Jenkins, you can use the <a href=\"https:\/\/wiki.jenkins.io\/display\/JENKINS\/Performance+Plugin\">performance plugin<\/a>, which can be found and installed from the Jenkins <a href=\"https:\/\/jenkins.io\/doc\/book\/managing\/plugins\/\">plugin manager<\/a>. It enables a post-build process that captures reports from major testing tools and then generates trend plots over the build history. In addition, it allows setting the latest build status to passed, unstable, or failed, based on the reported error percentage. There are several supported report formats, including <a title=\"http:\/\/gettaurus.org\/docs\/Reporting\/?utm_source=jenkins&amp;utm_medium=link&amp;utm_campaign=wiki#BlazeMeter-Reporter (link no longer works)\">Final Stats XML<\/a>, JMeter format, JUnit XML, and so forth. We pick the JMeter CSV format for our MATLAB project because the measurement result object returned by runperf already stores the information in tabular form, and, as you will see, it is quite straightforward to generate a JMeter CSV from these tables. Here are the detailed steps:<\/p><h4>Step 1: Convert the performance results to CSV format<a name=\"aba2ac37-5e59-4d2e-9906-7bd7676352fe\"><\/a><\/h4><p>To kick off, we will create a JMeter CSV file from a measurement result object. First we need to gather the required information. 
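<\/p><p>For orientation, here is roughly what a JMeter-style CSV could look like for our project, in the column order we are about to build (a hypothetical sample with made-up values, just to illustrate the target format):<\/p><pre class=\"codeinput\">timeStamp,elapsed,label,success,responseCode,responseMessage,threadName,dataType,failureMessage,bytes,sentBytes,grpThreads,allThreads,latency,idleTime,connect\r\n1537303092000,2,\"tMatrixLibrary\/testSum(TestMatrix=midSize)\",\"true\",0,,,,,0,0,1,1,0,0,0\r\n<\/pre><p>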
The standard JMeter CSV format includes 16 variables: <i>timeStamp<\/i>, <i>elapsed<\/i>, <i>label<\/i>, <i>responseCode<\/i>, <i>responseMessage<\/i>, <i>threadName<\/i>, <i>dataType<\/i>, <i>success<\/i>, <i>failureMessage<\/i>, <i>bytes<\/i>, <i>sentBytes<\/i>, <i>grpThreads<\/i>, <i>allThreads<\/i>, <i>latency<\/i>, <i>IdleTime<\/i>, and <i>connect<\/i>. Some of these variables are important for our use case and some we can ignore. Four of them are available in the <a href=\"https:\/\/www.mathworks.com\/help\/matlab\/ref\/matlab.unittest.measurement.measurementresult-class.html#bu4b8w4-1-TestActivity\">TestActivity<\/a> table from the measurement result: timeStamp, elapsed (from \"MeasuredTime\"), label (from \"Name\"), and success (from \"Passed\"). So let's use the results from our runperf call above. We can extract these columns into a samplesTable and rename the variables:<\/p><pre class=\"codeinput\">activityTable = vertcat(results.TestActivity);\r\nactivityTable.Properties.VariableNames'\r\n<\/pre><pre class=\"codeoutput\">\r\nans =\r\n\r\n  12&times;1 cell array\r\n\r\n    {'Name'         }\r\n    {'Passed'       }\r\n    {'Failed'       }\r\n    {'Incomplete'   }\r\n    {'MeasuredTime' }\r\n    {'Objective'    }\r\n    {'Timestamp'    }\r\n    {'Host'         }\r\n    {'Platform'     }\r\n    {'Version'      }\r\n    {'TestResult'   }\r\n    {'RunIdentifier'}\r\n\r\n<\/pre><pre class=\"codeinput\">samplesTable = activityTable(activityTable.Objective == categorical({<span class=\"string\">'sample'<\/span>}),:);\r\nnrows = size(samplesTable, 1);\r\n\r\n<span class=\"comment\">% Trim the table and change variable names to comply with JMeter CSV format<\/span>\r\nsamplesTable = samplesTable(:, {<span class=\"string\">'Timestamp'<\/span>, <span class=\"string\">'MeasuredTime'<\/span>, <span class=\"string\">'Name'<\/span>, <span class=\"string\">'Passed'<\/span>});\r\nsamplesTable.Properties.VariableNames = {<span 
class=\"string\">'timeStamp'<\/span>, <span class=\"string\">'elapsed'<\/span>, <span class=\"string\">'label'<\/span>, <span class=\"string\">'success'<\/span>};\r\n<\/pre><p>A couple of things to note: the timestamp in JMeter is in Unix-style format, and the elapsed time reported in JMeter is in milliseconds, both of which differ from the MATLAB measurement result. Also, for failed cases, we need to replace the missing values NaN and NaT in the measurement result with values acceptable to JMeter. Let's address both of these cleanup items:<\/p><pre class=\"codeinput\"><span class=\"comment\">% Convert timestamp to unix format, and fill NaT with previous available time<\/span>\r\nsamplesTable.timeStamp = fillmissing(samplesTable.timeStamp,<span class=\"string\">'previous'<\/span>);\r\nsamplesTable.timeStamp = posixtime(samplesTable.timeStamp)*1000;\r\n\r\n<span class=\"comment\">% Convert MeasuredTime to millisecond, and fill NaN with 0<\/span>\r\nsamplesTable.elapsed = fillmissing(samplesTable.elapsed,<span class=\"string\">'constant'<\/span>,0);\r\nsamplesTable.elapsed = floor(samplesTable.elapsed*1000);\r\n<\/pre><p>The \"Passed\" column stores logical values by default; we need to convert them to strings for the JMeter CSV:<\/p><pre class=\"codeinput\"><span class=\"comment\">% Convert pass\/fail logical to string<\/span>\r\nsamplesTable.success = string(samplesTable.success);\r\n<\/pre><p>Next, we need to create default values for the 12 other variables that are less important for us and append them to the samplesTable:<\/p><pre class=\"codeinput\"><span class=\"comment\">% Generate additional columns required in JMeter CSV format<\/span>\r\nresponseCode = zeros(nrows, 1);\r\nresponseMessage = strings(nrows, 1);\r\nthreadName = strings(nrows, 1);\r\ndataType = strings(nrows, 1);\r\nfailureMessage = strings(nrows, 1);\r\nbytes = zeros(nrows, 1);\r\nsentBytes = zeros(nrows, 1);\r\ngrpThreads = ones(nrows, 1);\r\nallThreads = ones(nrows, 
1);\r\nlatency = zeros(nrows, 1);\r\nidleTime = zeros(nrows, 1);\r\nconnect = zeros(nrows, 1);\r\n\r\nauxTable = table(responseCode, responseMessage, threadName, dataType, <span class=\"keyword\">...<\/span>\r\n    failureMessage, bytes, sentBytes, grpThreads, allThreads, <span class=\"keyword\">...<\/span>\r\n    latency, idleTime, connect);\r\n\r\n<span class=\"comment\">% Append additional columns to the original table<\/span>\r\nJMeterTable = [samplesTable, auxTable];\r\n<\/pre><p>Voila! We now have a table in JMeter format with the full set of 16 variables. We can now simply use the <a href=\"https:\/\/www.mathworks.com\/help\/matlab\/ref\/writetable.html\">writetable<\/a> function to write it to a CSV file. Notice the strings are quoted to ensure the commas in the test name are not treated as delimiters.<\/p><pre class=\"codeinput\"><span class=\"comment\">% Write the full table to a CSV file<\/span>\r\nwritetable(JMeterTable, <span class=\"string\">'PerformanceTestResult.csv'<\/span>, <span class=\"string\">'QuoteStrings'<\/span>, true);\r\n<\/pre><h4>Step 2: Configure build and post-build actions<a name=\"79a8f9d1-c53c-41a4-948d-852f40171898\"><\/a><\/h4><p>Now we can set up performance monitoring on Jenkins! The good news is that after the hard work in step 1, the rest is super easy. Just put the conversion code we've developed above into a function (we called it <tt><b>convertToJMeterCSV<\/b><\/tt>) and make sure it is available from the workspace of your Jenkins build. Then you just invoke that function as part of the Jenkins build. Open the project configuration page, add an \"Execute Windows batch command\" build step and write the following into the command:<\/p><pre>matlab -nodisplay -wait -log -r \"convertToJMeterCSV(runperf('tMatrixLibrary.m')); exit\"<\/pre><p>The output from runperf will be converted and saved to \"PerformanceTestResult.csv\" locally.<\/p><p>Next, click \"Add a post-build action\". 
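<\/p><p>For reference, here is roughly what that <tt><b>convertToJMeterCSV<\/b><\/tt> function could look like with the snippets from step 1 collected together (a sketch; adapt the output file name and error handling to your setup):<\/p><pre class=\"codeinput\">function convertToJMeterCSV(results)\r\n% Convert runperf measurement results to a JMeter-style CSV file\r\n\r\nactivityTable = vertcat(results.TestActivity);\r\nsamplesTable = activityTable(activityTable.Objective == categorical({'sample'}),:);\r\nnrows = size(samplesTable, 1);\r\n\r\n% Trim the table and rename variables to comply with the JMeter CSV format\r\nsamplesTable = samplesTable(:, {'Timestamp', 'MeasuredTime', 'Name', 'Passed'});\r\nsamplesTable.Properties.VariableNames = {'timeStamp', 'elapsed', 'label', 'success'};\r\n\r\n% Unix-style timestamps and elapsed times in milliseconds, with missing\r\n% values filled for failed cases\r\nsamplesTable.timeStamp = fillmissing(samplesTable.timeStamp,'previous');\r\nsamplesTable.timeStamp = posixtime(samplesTable.timeStamp)*1000;\r\nsamplesTable.elapsed = fillmissing(samplesTable.elapsed,'constant',0);\r\nsamplesTable.elapsed = floor(samplesTable.elapsed*1000);\r\nsamplesTable.success = string(samplesTable.success);\r\n\r\n% Default values for the remaining JMeter variables\r\nresponseCode = zeros(nrows, 1);\r\nresponseMessage = strings(nrows, 1);\r\nthreadName = strings(nrows, 1);\r\ndataType = strings(nrows, 1);\r\nfailureMessage = strings(nrows, 1);\r\nbytes = zeros(nrows, 1);\r\nsentBytes = zeros(nrows, 1);\r\ngrpThreads = ones(nrows, 1);\r\nallThreads = ones(nrows, 1);\r\nlatency = zeros(nrows, 1);\r\nidleTime = zeros(nrows, 1);\r\nconnect = zeros(nrows, 1);\r\n\r\nauxTable = table(responseCode, responseMessage, threadName, dataType, ...\r\n    failureMessage, bytes, sentBytes, grpThreads, allThreads, ...\r\n    latency, idleTime, connect);\r\n\r\nJMeterTable = [samplesTable, auxTable];\r\nwritetable(JMeterTable, 'PerformanceTestResult.csv', 'QuoteStrings', true);\r\n\r\nend\r\n<\/pre><p>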
With the performance plugin successfully installed on Jenkins, the \"Publish Performance test result report\" option should appear. Select that, and enter the CSV file name in the \"Source data files\" field. There are other options to tweak, but we will leave them as they are for now. Click the save button to exit the configuration page.<\/p><p><img decoding=\"async\" vspace=\"5\" hspace=\"5\" src=\"http:\/\/blogs.mathworks.com\/developer\/files\/y2018PostBuild.png\" alt=\"\"> <\/p><h4>Step 3: Build the project and review the trend<a name=\"18a3f122-4228-4f9b-b313-129baeb20cc9\"><\/a><\/h4><p>Once everything is done, you can build the project several times and click the \"Performance Trend\" link on the left to view the trend plots of the response time and percentage of errors:<\/p><p><img decoding=\"async\" vspace=\"5\" hspace=\"5\" src=\"http:\/\/blogs.mathworks.com\/developer\/files\/y2018trendPlot.png\" alt=\"\"> <\/p><p>Notice the statistics in the response time trend are calculated over all tests, which is why the median value can be very different from the average. We can reach another view of the trend by clicking into any build (say #14 in our case) and then clicking the \"Performance Trend\" link:<\/p><p><img decoding=\"async\" vspace=\"5\" hspace=\"5\" src=\"http:\/\/blogs.mathworks.com\/developer\/files\/y2018trendPlot_Build.png\" alt=\"\"> <\/p><p>Here we see a nice summary table presenting the stats of all tests, with green\/red indicators showing the difference compared to the previous build. All non-passing tests show up in red; we're glad we don't have any of those.<\/p><p>That's how we can add a performance report for our MATLAB project on Jenkins. Isn't that easy? Share your thoughts on how the process can be improved. 
Is there any other performance statistics trend you would like to follow for your MATLAB project?<\/p><p style=\"text-align: right; font-size: xx-small; font-weight:lighter;   font-style: italic; color: gray\">Published with MATLAB&reg; R2018a<br><\/p><\/div>","protected":false},"excerpt":{"rendered":"<div class=\"overview-image\"><img src=\"https:\/\/blogs.mathworks.com\/developer\/files\/y2018trendPlot_Build.png\" class=\"img-responsive attachment-post-thumbnail size-post-thumbnail wp-post-image\" alt=\"\" decoding=\"async\" loading=\"lazy\" \/><\/div><!--introduction--><p><i>Hi folks, today I'd like to introduce ChangQing Wang. ChangQing is the lead developer on MATLAB's performance framework, and in addition to all the great performance testing features he has delivered, he has also found a really easy way to integrate performance results into Jenkins. Check it out!<\/i>... 
<a class=\"read-more\" href=\"https:\/\/blogs.mathworks.com\/developer\/2018\/09\/18\/jmeter-results-jenkins\/\">read more >><\/a><\/p>","protected":false},"author":90,"featured_media":1869,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[4,13,7],"tags":[],"_links":{"self":[{"href":"https:\/\/blogs.mathworks.com\/developer\/wp-json\/wp\/v2\/posts\/1857"}],"collection":[{"href":"https:\/\/blogs.mathworks.com\/developer\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blogs.mathworks.com\/developer\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blogs.mathworks.com\/developer\/wp-json\/wp\/v2\/users\/90"}],"replies":[{"embeddable":true,"href":"https:\/\/blogs.mathworks.com\/developer\/wp-json\/wp\/v2\/comments?post=1857"}],"version-history":[{"count":11,"href":"https:\/\/blogs.mathworks.com\/developer\/wp-json\/wp\/v2\/posts\/1857\/revisions"}],"predecessor-version":[{"id":1889,"href":"https:\/\/blogs.mathworks.com\/developer\/wp-json\/wp\/v2\/posts\/1857\/revisions\/1889"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/blogs.mathworks.com\/developer\/wp-json\/wp\/v2\/media\/1869"}],"wp:attachment":[{"href":"https:\/\/blogs.mathworks.com\/developer\/wp-json\/wp\/v2\/media?parent=1857"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blogs.mathworks.com\/developer\/wp-json\/wp\/v2\/categories?post=1857"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/blogs.mathworks.com\/developer\/wp-json\/wp\/v2\/tags?post=1857"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}