Is any managerial platitude more often repeated, and less often observed, than “What’s measured is what matters”?
Before you count another line of code written, another percentage point of server uptime or another dollar of infrastructure capital spending, I ask you to spend some time counting the various ways that you measure the efforts you make and the results that those efforts produce.
I ask you to look unblinkingly at the list of things that you count with razor-sharp precision, and then to visualize—I warn you, this may be painful—the phantom list of things that are tremendously important but that you don’t measure at all.
If yours is a typical business, you know to the penny what you spend on office supplies, but you have no idea what you spend on unproductive hours caused by inadequate training in office software skills.
You know the amount of vacation time that an employee has accrued—to the nearest tenth of an hour—but you have no idea of how much time it takes to respond to a customer e-mail, and even less of an idea of the relationship between response time and the customer’s contribution to your earnings.
You know how much revenue you earn from sales of your company’s products, but you have no consistent way of associating revenue growth with customer-facing Web site application performance. You have even less of an idea of the connection between sales improvements on the one hand, and the difference between the work of your best and your worst application developers on the other.
You’re like a driver whose fuel gauge and speedometer function perfectly, but whose windshield is so dirty and scratched that he can barely see if it’s light or dark ahead of him—let alone which way the road leads. You’d never tolerate this situation in any other part of your life, but in the realm of enterprise performance measurement, it’s the norm.
One of the earliest strategic metrics that I remember encountering was 3M’s explicit measurement of its success in maintaining a fresh portfolio of innovative products.
I’m pretty sure I saw this mentioned first in Tom Peters’ and Robert Waterman’s 1982 book “In Search of Excellence,” but 3M was still using this measure in 1997—when it achieved the impressive statistic of generating 30 percent of annual revenue from products that had been introduced within the previous four years.
I suspect that calculating the comparable figure of merit for most companies, even in the supposedly fast-paced IT sector, would yield much less vigorous values—especially if one doesn’t count as “new” a product that seems, to all but the expert user, only superficially different from its previous release.
A product such as 3M’s Post-It wasn’t a re-release of anything, but rather a basically new approach to a common need; a product such as Microsoft’s Windows Vista solves Microsoft’s problem of maintaining a revenue stream more than it solves the problems of any customers I can identify.
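3M’s freshness metric is straightforward to compute for any product portfolio: divide revenue from products introduced within the window by total revenue. Here is a minimal sketch, with made-up product names and revenue figures purely for illustration:

```python
from datetime import date

def fresh_revenue_share(products, as_of, window_years=4):
    """Share of revenue from products introduced within the last
    `window_years` — 3M's target in 1997 was 0.30 over four years.
    `products` is a list of (introduction_date, revenue) pairs."""
    cutoff = as_of.replace(year=as_of.year - window_years)
    fresh = sum(rev for intro, rev in products if intro >= cutoff)
    total = sum(rev for _, rev in products)
    return fresh / total

# Hypothetical two-product portfolio (illustrative numbers only):
portfolio = [
    (date(1995, 6, 1), 30.0),  # introduced within the window
    (date(1988, 3, 1), 70.0),  # older product
]
fresh_revenue_share(portfolio, as_of=date(1997, 1, 1))  # → 0.3
```

The hard part, as the Vista example suggests, isn’t the arithmetic — it’s deciding which releases count as genuinely “new.”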
Another metric that’s surfaced much more recently is Sun’s proposed SWAP (Space, Watts and Performance) composite measure of server design. The SWAP figure is computed by dividing performance (choose a relevant measure, please) by the product of rack space and power consumption. At least these things can be objectively measured—it’s just a matter of changing the viewpoint, not of learning to live with a softer focus.
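The SWAP arithmetic described above is simple enough to sketch. The server specs below are invented for illustration; only the formula — performance divided by the product of rack space and power draw — comes from Sun’s proposal:

```python
def swap_score(performance, rack_units, watts):
    """SWAP = performance / (space * power).
    Higher is better: more work per unit of rack space and electricity.
    `performance` is whatever relevant benchmark figure you choose."""
    return performance / (rack_units * watts)

# Hypothetical comparison (made-up numbers): a dense 1U box beats a
# faster but bulkier, hungrier 4U box on SWAP.
dense = swap_score(performance=1000, rack_units=1, watts=400)   # 2.5
legacy = swap_score(performance=1200, rack_units=4, watts=900)  # ~0.33
```

Note how the composite reverses the ranking that raw performance alone would give — which is exactly the change of viewpoint the metric is meant to force.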
There are two challenges facing anyone who wants to change the measures that guide enterprise decisions. First, the costs of collecting data have to be recognized as cheap compared with the costs of not doing so. Second, responding to surprising results will be uncomfortable enough that it demands top management’s buy-in.
It’s a mistake to underestimate these challenges, but it’s a bigger mistake if you just keep on driving—and hope that you’re still on the road.
Technology Editor Peter Coffee can be reached at email@example.com .