Building a Load and Performance Reporting solution using JMeter, MySQL, Pentaho Data Integration, JasperReports and iReport, JasperServer and Shell Scripting
The subject, I think, has been making the rounds in the testing community for a while, and I am sure everybody comes up with a custom solution to the problem. My solution a couple of years ago, when I last performed a load testing project, was composed of:
OpenSTA for load testing
Microsoft Visual Basic for Applications (VBA) for data import
Microsoft Excel for Pivot Tables and Graphics
I can still remember spending about three days building my VBA scripts in order to have a pseudo-automated solution for generating test reports.
Well, here I am in 2009, doing load and performance testing for a new project. This time Excel was out of the question. Why, someone still using it might ask? Because its reports are so hard to manage; history reports do not exist, and if they do, they require a lot of maintenance; it is time consuming; and it is so tied to the Windows platform (no portability whatsoever).
Now, in a time when ETL (Extract, Transform, Load) is on everybody's lips, I decided to go the hard way. That's what they say, isn't it? The hard way is always the best way.
Therefore, in the article that follows, I will describe the solution I implemented using shell scripts, JMeter, MySQL, Pentaho Data Integration, iReport, JasperReports, JasperServer and Hudson for a fully automated load and performance test and reporting setup. It sounds a bit complicated, with so many tools involved, but trust me, it is worth the time!
Here we go. Let me explain a little how I imagined this would work, and how I implemented it.
Part 1 – JMeter and Shell Scripting – WHY?
Well, the first thing I asked myself was: how can I build those scripts so that I can group them into “performance scenarios”? It is a question that automated tools answer by providing test scenario functionality. But I did not have any test scenarios; I only had JMeter test plans. That was my starting point. And I needed scenarios:
- I figured the best way to control JMeter scripts is through the command line interface. (1)
- Simply starting a JMeter test plan from the command line was surely not enough; I wanted to be able to run JMeter test plans with arguments. (2)
- I also wanted to be able to group multiple JMeter test plans into a single test scenario, so I could control the workflow. (3)
- Starting one JMeter test after the other was not exactly what I had in mind; I needed to control what happens at the beginning, in between, and when the tests ended. (4)
So here I was, with four “requirements” all pointing in one and the same direction: SHELL SCRIPTING.
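To make the four requirements concrete, here is a minimal sketch of such a scenario script. Everything specific in it is hypothetical (the JMETER_HOME location, the test plan files, the threads property name); only the JMeter flags -n (non-GUI mode), -t (test plan), -J (set a property) and -l (result log) are real command line options:

```shell
#!/bin/sh
# Sketch of a "test scenario" wrapper around JMeter's non-GUI mode.
# JMETER_HOME default, test plan names and the "threads" property are made up.

JMETER="${JMETER_HOME:-/opt/jmeter}/bin/jmeter"
RESULTS_DIR=results

run_plan() {
    plan="$1"; threads="$2"; log="$3"
    # (1)+(2): a test plan started from the command line, with arguments
    # passed in as JMeter properties via -J
    cmd="$JMETER -n -t $plan -Jthreads=$threads -l $RESULTS_DIR/$log"
    echo "RUN: $cmd"
    # "$JMETER" -n -t "$plan" -Jthreads="$threads" -l "$RESULTS_DIR/$log"  # uncomment to execute
}

# (4): do something before the scenario starts
echo "scenario start"
mkdir -p "$RESULTS_DIR"

# (3): several test plans grouped into one scenario, run in a controlled order
run_plan login.jmx    10 login.jtl
run_plan search.jmx   25 search.jtl
run_plan checkout.jmx  5 checkout.jtl

# (4): and something when the tests end, e.g. trigger the results import
echo "scenario end"
```

The actual jmeter invocation is left commented out so the skeleton can be dry-run first; the echoed command lines double as a poor man's log of what the scenario would do.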
Part 2 – Collection of JMeter test results – Where?
I definitely needed a central place to store the JMeter results in order to be able to create reports from them. And creating dynamic reports requires a database, of course. So I turned to my old friend, MySQL.
Basically, I created five tables to store the JMeter samples and the JMAP samples (heap statistics), but also information about the tests I ran. I wanted a full report, not a partial one.
I needed one table for storing the version of the application, one table for information about the test scripts I ran, one table for the relations between script and version, one table for the actual JMeter results, and last but not least (it was the last one added, that's true), one table for the JMAP heap statistics (the number of objects in the heap, and their size).
Five tables altogether. That is all you need to get started. The design of the tables, especially the one for the JMeter samples, depends very much on what information you want to collect and filter.
I will get back to this subject and detail the structure in a follow-up post.
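Just to give the five tables some shape in the meantime, here is a rough sketch, with all table and column names invented for illustration (the real structure comes in the follow-up post). Keeping the DDL in a shell script means it can be replayed against MySQL from the same scenario scripts:

```shell
#!/bin/sh
# Sketch of the five tables; every identifier below is hypothetical.
DDL=$(cat <<'EOF'
CREATE TABLE app_version    (id INT PRIMARY KEY AUTO_INCREMENT, version VARCHAR(32));
CREATE TABLE test_script    (id INT PRIMARY KEY AUTO_INCREMENT, name VARCHAR(128), started DATETIME);
CREATE TABLE script_version (script_id INT, version_id INT);
CREATE TABLE jmeter_sample  (id BIGINT PRIMARY KEY AUTO_INCREMENT, script_id INT,
                             label VARCHAR(128), elapsed_ms INT, success CHAR(1), ts DATETIME);
CREATE TABLE jmap_heap      (id BIGINT PRIMARY KEY AUTO_INCREMENT, script_id INT,
                             class_name VARCHAR(255), instances INT, bytes BIGINT, ts DATETIME);
EOF
)
echo "$DDL"
# echo "$DDL" | mysql -u loadtest -p loadtest_db   # uncomment to create the tables
```

The jmeter_sample columns mirror what a JMeter result log typically carries (label, elapsed time, success flag, timestamp); what you keep there depends on what you want to filter on later.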
Part 3 – ETL – Extract, Transform, Load
There were several options here, like creating some stored procedures that could be called from the shell, but I wanted something more user friendly: easier to maintain, easier to design, and easier to integrate. I needed an ETL tool. And Pentaho had the solution to my problem.
Pentaho has gone truly far in this direction. They offer Business Intelligence products of amazing quality, and I take the time to congratulate them on their products. They offer Java-based tools for designing transformations, grouping them into jobs, and executing them via a GUI or the command line, and so on.
I decided to give Pentaho Data Integration a try, and although it took me more time than expected to get it all together, it was worth every minute.
So, using Spoon, the GUI for designing transformations in Pentaho Data Integration, I managed to create a data transformation that extracts and loads test information and test results into my database, and, even better, to automate it via shell scripts.
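A sketch of what that automation can look like: Pentaho Data Integration ships a command line runner for transformations, pan.sh, which takes the transformation file via -file and a logging verbosity via -level. The transformation file name and PDI_HOME default below are made up:

```shell
#!/bin/sh
# Sketch: run a PDI transformation (designed in Spoon, saved as a .ktr file)
# from a scenario script. PDI_HOME and the .ktr file name are hypothetical.
PDI_HOME="${PDI_HOME:-/opt/pentaho/data-integration}"
TRANS="import_jmeter_results.ktr"

cmd="$PDI_HOME/pan.sh -file=$TRANS -level=Basic"
echo "RUN: $cmd"
# sh "$PDI_HOME/pan.sh" -file="$TRANS" -level=Basic || echo "import failed, rc=$?"
```

Called at the end of a test scenario, a script like this pushes the freshly written JMeter result logs into the MySQL tables without any manual step.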
On a scale of complexity, I think this was close to the highest point, only a few points below integrating JasperServer. But, as I said, it was worth every minute.
Part 4 – Creating and running the reports
Now I had the data in a database. The queries were quite simple, so all I had to do was find a smart tool to generate the reports, and in a DYNAMIC way. Any desktop based application was out of the question; I needed these reports to be callable at any time, by anyone, from everywhere. I needed WEB BASED REPORTS.
So I turned to JasperReports and JasperServer, and of course iReport for designing the reports. All of these come as part of the JasperServer package (no need for the Professional version; I used the free one).
That is all I needed to get started. And the best part of it? It is free, and it is customizable!
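For completeness, a sketch of how a published report can then be pulled by any script or user over plain HTTP. The host and the report unit path are made up, and the output=pdf parameter may differ between JasperServer versions; the flow.html?_flowId=viewReportFlow&reportUnit=... pattern is the URL scheme JasperServer uses for viewing report units in the browser:

```shell
#!/bin/sh
# Sketch: fetch a report from JasperServer over HTTP. The host and the
# report unit path /reports/loadtest/summary are hypothetical, and the
# output parameter name may vary between JasperServer versions.
JS_URL="${JASPER_URL:-http://localhost:8080/jasperserver}"
REPORT_UNIT="/reports/loadtest/summary"

url="$JS_URL/flow.html?_flowId=viewReportFlow&reportUnit=$REPORT_UNIT&output=pdf"
echo "GET: $url"
# wget -q -O summary.pdf "$url"   # uncomment to download the rendered report
```

This is what makes the reports truly web based: the same URL works from a browser, a cron job, or the scenario script itself.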
To summarize, once again, this is what you need:
MySQL, JMeter, shell scripting, JasperServer, iReport, Pentaho Data Integration
My next post will start detailing the first steps: the database structure and importing the data using Pentaho Data Integration. I will talk about the transformation itself: how to create it and what you have to take care of, and I will provide a sample of both the SQL structure and the Pentaho transformation.
Cheers,
Alex
