Thursday, February 09, 2006

Performance Testing Project

This is the project that we were engaged in for the last two weeks. Even though it did not happen, there are things I learned throughout the process.

Business

The project came to us as return business. Basically, when party B contracted with party A to build a system, they were required to find a third party to do performance testing on the system, so that they could prove the system would work under the estimated workload.

It sounded like a routine contract that would benefit all parties: party B gets to finish its contract and get paid, and party A gets a better understanding of the non-functional behavior of the system. As I have learned, the rate for a performance testing contract is normally higher than for other contracts. So it would have been a good short contract for me (working with a long-time ThoughtWorker, Matthew Short) before I head back to China, and ThoughtWorks wouldn't have to put me on the beach.

For reasons that I cannot publicly post, the contract fell through, even though technically we had a pretty good idea of whom we needed to work with, what we needed to produce, and how to get the data.

So now everyone loses.

This is my first project to fall through. I am not totally surprised, since I cannot imagine that every engagement we have turns into a project, but I am still a bit uneasy about the fact that despite all our efforts, we just could not influence the other parties to keep them from taking actions that jeopardized the project.

Or can we, I wonder?

Technically

I have done some performance testing on various projects, but always as part of a project's iterative tuning process. It mainly focused on profiling rather than on performance benchmarking as a whole.

In my last project in China, the client who hired us had a contract with their customer that included specific performance requirements. Reading through that part of the contract, it looked pretty vague to me: the numbers could be easily met if you were free to choose the hardware, or almost impossible if the hardware was fixed.

In preparing for this project by referring back to the one we did before, I learned about performance testing methodology and about the Transaction Processing Performance Council and its benchmarks (http://www.tpc.org).

Because the application was going to be deployed on a Linux system (while the last one was on Windows), with a little research and great help from Barrow Kwan, we were able to get a Linux box at our disposal, run resource monitoring programs on it, process the data into CSV files, pull them into Microsoft Excel, and generate the performance charts.
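
The monitoring-to-spreadsheet part of that pipeline is simple enough to script. Here is a minimal sketch in Python of that kind of setup; the use of vmstat, the sampling interval, and the sample count are my illustrative choices, not necessarily the exact tools or numbers we used:

    #!/usr/bin/env python3
    # Sketch: sample Linux resource usage with vmstat and dump it to a
    # CSV file for charting in a spreadsheet such as Excel.
    import csv
    import subprocess

    INTERVAL_SECONDS = 5   # sampling interval (illustrative assumption)
    SAMPLE_COUNT = 60      # number of samples (illustrative assumption)

    # vmstat prints a group line, a column-name line, then one row per sample.
    output = subprocess.run(
        ["vmstat", str(INTERVAL_SECONDS), str(SAMPLE_COUNT)],
        capture_output=True, text=True, check=True,
    ).stdout.splitlines()

    header = output[1].split()  # column names: r, b, swpd, free, buff, ...
    # Keep only numeric data rows, skipping any repeated header lines.
    rows = [line.split() for line in output[2:]
            if line.split() and line.split()[0].isdigit()]

    with open("vmstat.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(header)
        writer.writerows(rows)

The resulting vmstat.csv opens directly in Excel, where the CPU and memory columns can be charted over time.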

I guess I didn't totally lose on this after all.
