Make performance testing mainstream

Companies must ensure the software or apps they develop do what they were designed to do.

By Mario Matthee
Johannesburg, 17 Nov 2016

Performance testing is often seen as the poor distant relative of the more established testing disciplines I've pored over in my previous Industry Insights.

This is strange, given that, for starters, the value of an application to the business ultimately depends on how its users perceive its performance, and secondly, that performance testing is far more complex - and expensive - than any other form of testing.

The latter reason should provide the first clue as to why performance testing is either non-existent or only casually flirted with: it simply costs more, and the benefits are (mostly) hidden from the managers and developers responsible for releasing an app in the first place. Put another way: development typically happens so fast, and new software versions are released so frequently, that an adequate and accurate assessment of an app's performance is somewhat dated by the time it is processed in the feedback loop.

This is until, of course, the day comes that an app falls over, business grinds to a halt, and lots of fingers get pointed in lots of different directions.

Unlike other forms of testing, performance testing plays a role similar to that of a car garage. The mechanics at the garage are not responsible for building the car or testing it before sale. They're also not responsible for testing any features or third-party add-ons. Their job is to maintain the engine, make sure the car runs smoothly, and if there are any issues, report them back to the owner or manufacturer.

It's fair to say many car owners don't exactly keep up with the recommended maintenance schedules on their cars. The cars themselves are expensive enough to run without the additional cost of someone looking under the hood, turning a screw or two, and charging a premium for the benefit. So, owners skip the checklist, and when the engine starts making unusual noises or the windscreen wipers stop working, they reluctantly book in the car for a service.

To me, this is completely counterintuitive. Just like with cars, people spend a lot of time, money and consulting fees to develop the software their business - and their customers - often depend on. Best-practice guidelines are followed in developing it, and it is continually tested against a working baseline before it is released for general use. Then even more resources are invested in gathering user feedback on how to improve the software, in adding features, and in all manner of manual, automated and automated regression testing to make sure it works well across many different devices and through numerous legacy versions.

Stopping short

What businesses don't do is take the extra step of making sure the software - or app or Web site - actually does what it was designed to do, in an acceptable and rewarding fashion for users. In other words, the software isn't tested for actual performance, at least not as often as it should be.

A good example of this issue is the Web site of a popular online retailer that promoted a massive discount on goods for Mother's Day a few years back. The offer was so good, and the retailer's marketing was so efficient, that come Mother's Day, the Web site was flooded with so many users that it crashed within minutes and brought the entire business to a standstill.

Until that day, no one had thought to test how the Web site performed under sudden spikes in demand, yet everyone seemed surprised when it couldn't handle the load.
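To make the point concrete: even a crude spike test would have surfaced the problem before launch day. The sketch below, in plain Python using only the standard library, is a deliberately minimal illustration of the idea, not a substitute for a proper tool such as NeoLoad; the URL and user count are placeholders.

```python
# Minimal spike-test sketch: fire a burst of concurrent requests at a
# page and report the success rate and slowest response time.
# The URL and user count below are placeholders, not a real site.
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

URL = "https://example.com/specials"  # hypothetical promotion page
CONCURRENT_USERS = 50                 # simulated simultaneous visitors

def visit(_):
    """Fetch the page once; return (elapsed seconds, succeeded?)."""
    start = time.perf_counter()
    try:
        with urlopen(URL, timeout=10) as resp:
            resp.read()
        return time.perf_counter() - start, True
    except OSError:  # covers URLError, timeouts, connection resets
        return time.perf_counter() - start, False

with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
    results = list(pool.map(visit, range(CONCURRENT_USERS)))

times_ok = [t for t, ok in results if ok]
print(f"{len(times_ok)}/{len(results)} requests succeeded")
if times_ok:
    print(f"slowest successful response: {max(times_ok):.2f}s")
```

Dedicated tools add the ramp-up profiles, realistic user journeys and reporting that a throwaway script like this lacks, but even this much would have revealed a site that falls over under a burst of 50 visitors.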

Performance anxiety

This seemingly obvious (in hindsight) example is hardly extreme or uncommon. It continues to happen every day in different industries in all parts of the world. For me, it's just a reminder of the persistent disconnect between business, marketing and IT, and how promises made by the one don't necessarily align with expectations from the others, leading to a distorted image of the business that only shows up when the fires start burning.

In truth, all - or most - of the issues exposed by such examples could be avoided if performance testing moved into the mainstream.

Now, before choking on your breakfast cereal, let me add that while it's true that good performance testing is not nearly as mature as other forms of software testing - which is why it's still so expensive - this is changing fast. DVT uses the industry-standard performance testing tool, NeoLoad, which has just been released as a Community Edition, offering free performance testing to any business, limited to 50 virtual users.

If a company needs to simulate more than 50 users - and most medium-sized businesses do - cost is certainly a factor in the decision. However, it's a sign of the times that previously exclusive and expensive tools are dropping down the price pole a notch or two. The more this happens and the more users start testing for performance, the more prices are likely to drop. So, without me having to do anything other than get the word out there, the performance testing mainstream movement is already well under way.

But, don't start knocking on the CTO's door demanding regular performance testing just yet, because, like every good story, this one has a few caveats.

For one, it's hard enough finding the right people with the right skills for mainstream testing, let alone finding those with the experience and talent for performance testing. Moreover, since performance testing is still niche - despite its mainstream leanings - companies would be hard-pressed to fill a performance tester's day with revenue-generating work; they're most likely going to need multi-skilled staff who can juggle performance testing alongside a list of other duties.

Performance testing also tends to be more technical than the norm, so qualified testers will need to be technically inclined, with an excellent eye for detail and a clear understanding of user requirements, coupled with business acumen. Not that I'm asking for much.

Despite these challenges, consider the alternatives. When was the last time you used an app that failed to load or crashed more than once or twice? Chances are you've already deleted it from your smartphone. When was the last time you patiently sat and waited for a Web site to load, watching the hourglass or spinning beach ball do its thing?

As the business world gets consumed by the next wave of Gen Y customers, can companies afford to gamble with their livelihood by not testing the performance of their bread-and-butter software as often as it is tested for everything else? Think of the time, cost, and loss of reputation and revenue when it fails, and tell me whether regular performance testing is any more expensive.

I didn't think so.