Flow of a Performance Test
The basic flow of a performance test, as I've run them, for those interested:
1) Use The Grinder's TCPProxy to record a series of user stories, making sure to put in comments as you record
2) Trim the scripts down to issue only the necessary commands, and delete the sleep times
3) Add text handling and error checking to the scripts: parameterize things like searches with variables, and turn custom site error responses (such as login redirects) into Grinder errors
4) Create a main script that dispatches to the individual scripts, choosing which one each thread runs based on thread number and random numbers to produce the required vUser distribution
5) Create a grinder.properties file that runs the test in question (baseline, stress, etc.)
6) Run nmon or perfmon on the web server, app server, db server, etc.
7) Start the test using the Grinder Console
8) Stop the test and gather the results
9) Use CSVSee and The Grinder's output format to create the transaction statistics
10) Use a stat analyzer such as nmon analyzer to create meaningful statistics
11) Pull everything together into a summary report outlining the test, the gathered statistics, your analysis, and predictions based on the results
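And for step 5, a grinder.properties for one test variant might look roughly like this (the script name, host, and worker/thread counts are placeholders; the property names are The Grinder 3's):

```properties
grinder.script = main.py
grinder.processes = 2
grinder.threads = 25
# 0 runs = loop until the console stops the test
grinder.runs = 0
grinder.consoleHost = 192.168.1.10
grinder.consolePort = 6372
```

Keeping one properties file per test type (baseline.properties, stress.properties, and so on) makes it easy to rerun any scenario later.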
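To make step 3 concrete: an HTTP 200 isn't a success if the site quietly bounced you to a login page. Here's a minimal check in plain Python (the marker string is a made-up example; in a real Grinder script you'd flag the failure on the run's statistics, with something like grinder.statistics.getForLastTest().setSuccess(0) — check the script API for your version):

```python
LOGIN_REDIRECT_MARKER = "/login?next="  # hypothetical marker in the response body

def response_ok(status, body):
    """Return False for anything the site considers an error,
    even if it arrived with a 200 status."""
    if status != 200:
        return False
    if LOGIN_REDIRECT_MARKER in body:
        # Session expired: the site served a login redirect instead of content
        return False
    return True
```

Without a check like this, a timed-out session makes every subsequent "transaction" look fast and successful, which quietly poisons the results.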
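The dispatch logic in step 4 can be sketched in plain Python (The Grinder's scripts are Jython, so the same code works there; the story names, weights, and the even-thread rule here are all illustrative):

```python
import random

# Hypothetical user stories and their target share of virtual users.
STORY_WEIGHTS = [
    ("browse_catalog", 0.60),  # 60% of vUsers browse
    ("search", 0.30),          # 30% search
    ("checkout", 0.10),        # 10% check out
]

def pick_story(thread_number, rng=random):
    """Pick a user story for this thread.

    Even-numbered threads always browse (a slice fixed by thread
    number); the rest draw a story at random per the weights.
    """
    if thread_number % 2 == 0:
        return "browse_catalog"
    roll = rng.random()
    cumulative = 0.0
    for name, weight in STORY_WEIGHTS:
        cumulative += weight
        if roll < cumulative:
            return name
    return STORY_WEIGHTS[-1][0]  # guard against float rounding
```

In the real main script, the thread number would come from the Grinder script context (grinder.getThreadNumber()), and each story name would map to a recorded script's TestRunner.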