Lately I’ve been fortunate to conduct an array of load tests at very large scale, and I believe I’ve stumbled upon something interesting within the Load/Performance testing community (or rather something I might not understand yet, that’s also possible): a standardized way to load test.
Let’s take an example. Suppose I want to load test (not stress test) an application and I’m using CCU (concurrent users: how many users are active on the site in a single second). Someone might inform me that I should instead use Page Views per hour or Unique Users per hour. Let’s say we have 100 Unique Users per hour on the site. If I load test using this figure for a duration of an hour to meet the expected 100 Unique Users, this is where I encounter the dilemma, and sometimes confusion.
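Part of the dilemma is that an hourly figure like this only pins down an average concurrency, not a peak. Here’s a minimal sketch of that conversion using Little’s Law (average concurrency = arrival rate × average session time); the 60-second session duration is my own assumed number, purely for illustration:

```python
# Little's Law: average concurrency L = arrival rate (lambda) * avg session time W.
# Assumed numbers for illustration: 100 unique users/hour, 60-second sessions.
unique_users_per_hour = 100
avg_session_seconds = 60

arrival_rate = unique_users_per_hour / 3600   # users arriving per second
avg_ccu = arrival_rate * avg_session_seconds  # average concurrent users

print(round(avg_ccu, 2))  # -> 1.67
```

So under these assumptions, 100 Unique Users per hour averages out to fewer than 2 concurrent users, which tells you nothing about what the worst second of that hour looks like.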
Did these 100 Unique Users arrive in the first minute of the hour? The first second of the hour? Did 10 users arrive every 10 seconds for 100 seconds, never to return again? So far, from my reading, these questions are of no concern as long as you hit the desired load within the duration of an hour. I believe this is the wrong way to look at it, and the wrong way to test. If you’re given the task of load testing an application, you must check how well your infrastructure deals with the peak CCU load; you might find out that hitting your server with 100 CCU completely destroys it (if that happens at this number, you have many problems).
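To see how much the arrival pattern matters, here’s a small simulation of the scenarios above. The `peak_ccu` helper and the 60-second session duration are my own illustrative assumptions, not anything a load-testing tool prescribes:

```python
from collections import Counter

def peak_ccu(arrival_times, session_seconds=60):
    """Given user arrival times (seconds into the hour), return the peak
    number of users active in any single second, assuming each user stays
    on the site for session_seconds seconds (an assumed figure)."""
    active = Counter()
    for t in arrival_times:
        for s in range(t, t + session_seconds):
            active[s] += 1
    return max(active.values())

# Three ways the same "100 Unique Users per hour" can arrive:
burst = [0] * 100                          # all 100 in the first second
batches = [10 * (i // 10) for i in range(100)]  # 10 users every 10 seconds
even = [i * 36 for i in range(100)]        # spread evenly across 3600 seconds

print(peak_ccu(burst))    # -> 100
print(peak_ccu(batches))  # -> 60
print(peak_ccu(even))     # -> 2
```

Same hourly total, yet the peak the server must survive ranges from 2 to 100 concurrent users, which is exactly why hitting the number "within the hour" is not a meaningful pass criterion on its own.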
Going forward, I’ll discuss a bit more on Benchmarking, Load Testing, and Profiling, sometimes with a ZF twist and sometimes without.