Flow of a Performance Test

The basic flow of a performance test, as I've run them, for those interested:

1) Use The Grinder's TCPProxy to record a series of user stories, making sure to add comments as you record
2) Trim the scripts down to only the necessary requests, and remove the recorded sleep times
3) Add text handling and error checking to the scripts: parameterize operations such as searches with variables, and turn custom site error responses (such as login redirects) into Grinder errors
4) Create a main script that dispatches to the individual scripts, varying which script each worker runs based on thread number and random numbers to produce the required vUser distribution
5) Create a grinder.properties file that runs the test in question (baseline, stress, etc.)
6) Run nmon or perfmon on the web server, app server, database server, etc.
7) Start the test from the Grinder Console
8) Stop the test and gather the results
9) Use CSVSee and the Grinder output files to produce the transaction statistics
10) Use a stat analyzer such as the nmon analyser to turn the raw monitoring data into meaningful statistics
11) Gather the statistics and write a summary report covering the test, the gathered statistics, analysis, and predictions based on the results
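For step 5, a grinder.properties file for a baseline run might look something like this. The property names are The Grinder's standard ones; the values (script name, thread count, host) are illustrative assumptions you would tune per test type.

```properties
# Illustrative baseline configuration - values are examples only.
grinder.script = main.py
grinder.processes = 1
grinder.threads = 50
grinder.runs = 0              # 0 = run until stopped from the console
grinder.useConsole = true
grinder.consoleHost = consolehost
grinder.consolePort = 6372
grinder.logDirectory = log
```

A stress test would typically reuse the same file with higher process/thread counts.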
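Step 3 can be sketched in plain Python (Grinder scripts are Jython, so the same style applies). Everything here is illustrative: the search terms, URL, error markers, and `ScriptError` class are assumptions, not part of The Grinder's API.

```python
import random

# Hypothetical exception standing in for however the real scripts
# report a failure back to The Grinder.
class ScriptError(Exception):
    pass

# Illustrative list of search terms to vary between runs.
SEARCH_TERMS = ["widgets", "gadgets", "gizmos"]

def build_search_url(base="http://example.com/search"):
    # Variable-based search: pick a term per call instead of replaying
    # the single term captured by the TCPProxy recording.
    term = random.choice(SEARCH_TERMS)
    return "%s?q=%s" % (base, term)

def check_response(status, body):
    # Turn custom site error responses into script errors, so a login
    # redirect or an error page is not silently counted as a success.
    if status in (301, 302) and "login" in body.lower():
        raise ScriptError("redirected to login - session expired?")
    if "site error" in body.lower():
        raise ScriptError("custom site error page returned")
    return True
```

In a real Grinder script, `check_response` would be called on each HTTPResponse so that these conditions show up as errors in the test statistics rather than as passing transactions.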
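Step 4, the vUser distribution, can be sketched as a weighted pick per worker thread. The story names and percentages below are made-up examples; only the technique (thread-seeded random selection against a cumulative weight table) is what the step describes.

```python
import random

# Desired mix, as an example: 60% browsers, 30% searchers, 10% buyers.
DISTRIBUTION = [("browse", 60), ("search", 30), ("buy", 10)]

def pick_story(thread_number, rand=None):
    # Seeding with the thread number keeps each vUser's behaviour
    # stable across runs while still varying across threads.
    if rand is None:
        rand = random.Random(thread_number)
    roll = rand.uniform(0, 100)
    cumulative = 0
    for story, weight in DISTRIBUTION:
        cumulative += weight
        if roll < cumulative:
            return story
    return DISTRIBUTION[-1][0]
```

In the main script, each worker thread would call something like `pick_story(grinder.threadNumber)` and then run the matching recorded script, so the overall mix of running scripts converges on the target distribution.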

