Monday 7 September 2009

If you are not thinking about performance, you are not thinking

Performance testing is a funny old thing, in that whenever the subject comes up, people get all hot and bothered about it. The thing that really tickles my fancy is when developers suddenly get righteous about testing!

Testers and developers have a totally different view of the world. The best testers I have worked with have a real need to dig into systems. Even with black-box testing they find a way to work out what a system does, way beyond its simple inputs and outputs. They can't help themselves. It is almost like they can't pass Go if they don't break the system; it's almost an addiction (or is that an affliction?).

Now that the developers find themselves writing unit tests, integration tests and acceptance tests, they think that overnight they have learnt everything there is to know about testing, right? Wrong!

Yes, sure, a developer can write a test, but they often struggle with the intent of the test, and more so with non-functional testing like performance testing. Let me shake it down.

Ok, so the business wants to monetise their existing data by presenting it in a new way, for example "Email Alerts", you know the sort of thing. You create a search, and when your criteria are met you get sent an email.

The developer sits down to think about performance testing, and thinks about how the system works. In our example here, the system will fire the searches every night, when the database is relatively quiet so that we don't overload the system during peak hours.

So the developer thinks: OK, I'll create a load of these "alerts" using SQL inserts, fire up the system and see how fast it can work through them.

They do just that and get back some statistics: the number of threads, the amount of memory the JVM consumed, how many connections to the DB were needed, how many searches were executed, how long it took to execute a search, that sort of thing. They call meetings and stroke their chins in a sage-like way. The figures look about right.
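
To put some flesh on that, the naive version looks something like the sketch below. The alerts table, its columns and the connection details are all made up for illustration; the point is just to bulk-insert synthetic alerts over JDBC and then time how fast the nightly batch chews through them.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    // A minimal sketch of the "SQL insert" style load test.
    // The table name, columns and connection details are hypothetical.
    public class NaiveAlertLoadTest {

        public static void main(String[] args) throws Exception {
            try (Connection con = DriverManager.getConnection(
                    "jdbc:postgresql://localhost/alerts", "test", "test")) {
                PreparedStatement ps = con.prepareStatement(
                    "INSERT INTO email_alert (user_id, query, email) VALUES (?, ?, ?)");
                // Bulk-insert 100000 identical-looking alerts in one go.
                for (int i = 0; i < 100000; i++) {
                    ps.setLong(1, i);
                    ps.setString(2, "flat AND 2 bedrooms AND london");
                    ps.setString(3, "user" + i + "@example.com");
                    ps.addBatch();
                    if (i % 1000 == 0) {
                        ps.executeBatch();
                    }
                }
                ps.executeBatch();
            }

            // Then kick off the nightly batch job and time it end to end.
            long start = System.currentTimeMillis();
            runNightlyBatch();
            System.out.println("Batch took " + (System.currentTimeMillis() - start) + " ms");
        }

        private static void runNightlyBatch() {
            // Placeholder: in reality this would trigger the alerting batch job.
        }
    }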

But in real life the database would never have the alerts inserted into it in that way. It's probable that users would be inserting data at the same time as it was being read out. Also, the product isn't likely to go live and have 100% take-up overnight. It's more probable that take-up would be slower, perhaps taking weeks or months and never reaching 100%. Old alerts would be expiring and some users would renew those, while new ones are being created and others are being edited (change of email address, etc.).
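
If you wanted to model something closer to reality, the load shape looks more like the sketch below: creates, edits, renewals and expiries all happening at once, ramping up over time rather than arriving in one big lump. The operation mix, the timings and the method names here are assumptions for illustration only.

    import java.util.Random;
    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    // A sketch of a more realistic, mixed workload. The operation mix, the
    // ramp and the method names are invented, not the real system.
    public class RealisticAlertLoad {

        private static final Random random = new Random();

        public static void main(String[] args) {
            ScheduledExecutorService pool = Executors.newScheduledThreadPool(4);
            // Take-up builds gradually: every second a few users create, edit,
            // renew or expire alerts while the batch may be reading them out.
            pool.scheduleAtFixedRate(RealisticAlertLoad::tick, 0, 1, TimeUnit.SECONDS);
        }

        private static void tick() {
            int roll = random.nextInt(100);
            if (roll < 50) {
                createAlert();          // new users signing up over weeks, not overnight
            } else if (roll < 75) {
                editAlert();            // e.g. change of email address
            } else if (roll < 90) {
                renewExpiredAlert();    // an old alert being renewed
            } else {
                expireAlert();          // an old alert lapsing
            }
        }

        private static void createAlert()       { /* INSERT a new alert */ }
        private static void editAlert()         { /* UPDATE an existing alert */ }
        private static void renewExpiredAlert() { /* flip an expired alert back on */ }
        private static void expireAlert()       { /* mark an alert as expired */ }
    }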

The crux of the matter is mindset. The tester sits down and thinks: what could go wrong? What happens if the DB is unavailable at the time the batch job runs? What happens if the DB needs to be taken down for maintenance during a batch run; will the batch pick up where it left off? Can the batch job complete before the off-peak period comes to an end? Can the mail server handle the number of emails to be sent? What happens to email that bounces? In other words, the tester takes a step back and looks at the system holistically, because a user doesn't give a damn if your search engine can execute a query in 33ms if they don't get the email until 12 hours after it was relevant.
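
Take just one of those questions, picking up where the batch left off after the DB went away mid-run. One way of handling it, sketched below with a hypothetical checkpoint table, is to have the batch record the last alert it processed so a restart carries on rather than starting again from scratch.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    // Sketch of a resumable batch run: the checkpoint table and column names
    // are hypothetical; the point is only that the batch records its progress.
    public class ResumableAlertBatch {

        void run(Connection con) throws Exception {
            long lastProcessed = readCheckpoint(con);
            PreparedStatement select = con.prepareStatement(
                "SELECT id, query, email FROM email_alert WHERE id > ? ORDER BY id");
            select.setLong(1, lastProcessed);
            try (ResultSet rs = select.executeQuery()) {
                while (rs.next()) {
                    long id = rs.getLong("id");
                    // ... execute the search and queue the email for this alert ...
                    writeCheckpoint(con, id); // so a restart picks up where we left off
                }
            }
        }

        private long readCheckpoint(Connection con) throws Exception {
            try (ResultSet rs = con.prepareStatement(
                    "SELECT last_alert_id FROM batch_checkpoint").executeQuery()) {
                return rs.next() ? rs.getLong(1) : 0L;
            }
        }

        private void writeCheckpoint(Connection con, long id) throws Exception {
            PreparedStatement ps = con.prepareStatement(
                "UPDATE batch_checkpoint SET last_alert_id = ?");
            ps.setLong(1, id);
            ps.executeUpdate();
        }
    }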

Now on the current project we have completely rewritten the platform. New version of Java, new styles of writing code, new infrastructure, etc. The search engine technology is the same; however, during the life of the project the API has been updated and a particular feature enabled. This feature allows us to search in a related way. Generally speaking, it allows us to do the Amazon-type thing: "People who search for X also search for Y". But it comes at a cost: it takes longer to return the result set (of course it does; it's got to do another search under the hood).
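
The cost of that feature is exactly the sort of thing the testing should have measured on its own. A crude way to do it is sketched below; the searchPlain and searchRelated calls are placeholders standing in for the real query with the feature off and on, and the idea is simply to compare the latency distributions.

    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;
    import java.util.function.Supplier;

    // A crude latency comparison. searchPlain and searchRelated are placeholders
    // for the real search call with the related-search feature off and on.
    public class RelatedSearchCost {

        static List<Long> measure(Supplier<Object> search, int runs) {
            List<Long> samplesMs = new ArrayList<>();
            for (int i = 0; i < runs; i++) {
                long start = System.nanoTime();
                search.get();
                samplesMs.add((System.nanoTime() - start) / 1000000); // nanos to millis
            }
            return samplesMs;
        }

        static long percentile(List<Long> samplesMs, double p) {
            Collections.sort(samplesMs);
            return samplesMs.get((int) Math.ceil(p * samplesMs.size()) - 1);
        }

        public static void main(String[] args) {
            List<Long> plain = measure(RelatedSearchCost::searchPlain, 200);
            List<Long> related = measure(RelatedSearchCost::searchRelated, 200);
            System.out.println("plain   p95: " + percentile(plain, 0.95) + " ms");
            System.out.println("related p95: " + percentile(related, 0.95) + " ms");
        }

        static Object searchPlain()   { return null; } // stand-in for the normal query
        static Object searchRelated() { return null; } // stand-in for the "also searched for" query
    }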

Again, during "testing" the developers just threw as much load as they could muster at the application. But guess what: now it's live, the figures they got during testing don't match, not even close.

It isn't like I hadn't been bleating on about performance for months. I even printed out posters that read "If you are not thinking about performance, you are not thinking" and stuck them above the urinals in the toilets.

It's only now that the team are in the spotlight (it's so bright it burns) that they have come to us to ask for our help. Once again we are trying to polish a rough product instead of building the quality in from the start. Once again we can't.

It doesn't matter a damn that we went all-out Agile, TDD and XP if the thing doesn't perform. The end user doesn't care that we have continuous integration; they know it's damn slow and they vote with their feet (or keyboards and mice).
