The Myth of Faster, Bigger, Better
By Channel Insider Staff
Opinion: The IT industry needs a unified services architecture and toolset that bridges different technologies and processes.
Still blindly following the hardware-centric myth that "better, faster, cheaper" will solve all ills, the networking industry continues to revel in the speeds and feeds of its latest products while IT managers are drowning in the cost and complexity of making the current ones work.
Today's approaches to application networks do not reduce cost and complexity, keep up with the pace of business change, or address what is most important to the business, because they were not designed to do so.
What is needed is a unified services architecture and toolset that bridges the gap between the different technologies and operational processes still separating the computing and communications domains.
In the converged world of distributed applications, those persistent distinctions have become outdated and irrelevant.
The emerging promise of SOA (service-oriented architecture) has considerable potential, but the SOA announcements of the major vendors fall woefully short by offering "application awareness" as a solution.
Enterprise IT managers don't need "application-aware" networks; they need an application-centric network solution designed around the application session and the end-user experience.
To do that, applications must be liberated from legacy network environments, not further entwined in them, with new levels of complexity like network APIs.
The requirements for running distributed application networks today are so enmeshed in the daunting complexities of provisioning and configuring a variety of hardware components that IT managers have neither the time nor the tools to focus on the fundamental purpose of the network infrastructure.
This is simply to make sure that the right organizations and people get reliable, secure access to the right systems, services and information when they need them, no matter where they are.
The tools that exist today all presume that IT organizations are working exclusively on their own private networks. But critical enterprise traffic now traverses the private networks of customers and supply chain partners, and the largest enterprise traffic growth rates are over public IP networks.
As a result, IT managers are being held accountable for the performance of applications over infrastructures that they neither own, nor directly control.
Only by giving them the tools to control the application session independent of physical network elements will they have the capabilities to deliver what the business requires of them.
The complexity, cost and hardware-centric nature of today's application networking solutions are reminiscent of the complexity and design rigidity that existed in the database world before the introduction of relational technology.
Relational technology emerged as a software abstraction layer that isolated the business application from the cost, complexity and constraints of creating and managing separate data configurations for each application.
Applications were then able to be built and changed more cost effectively and rapidly because their primary focus was around the business problem, the users and the information that they needed.
Today we need the same relational capabilities and virtual software abstraction layer for application networks. This application network overlay would liberate distributed application networks from the hardware layer much as relational database technology liberated applications from hardware-centric data management tools.
These new tools will manage the applications, content and the authorized users of that content across a global network of connections and relationships between enterprises, computers and users.
With a network-independent services architecture, enterprise IT managers can focus on the end-to-end performance of their applications rather than the integration, provisioning, configuration and maintenance of a highly distributed inventory of hardware and software components.
By freeing the application from the constraints and idiosyncrasies of network hardware, IT managers will have simpler, less costly, yet pervasive capabilities that give them the ability to monitor, control and prioritize individual application sessions across global networks in real time.
The emerging IP telephony industry is developing a variety of measures for the customer's "experience" of quality during a phone conversation.
One, the MOS (Mean Opinion Score), is based on an actual human test. The industry is settling on the R-Factor as an automated measure that reliably approximates the MOS.
This measure has allowed the IP telephony industry to focus on what really matters, the end-to-end quality and performance of a conversation measured at the points of service, rather than on the speeds and feeds of individual hardware components in the infrastructure.
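The R-Factor-to-MOS mapping the article alludes to is standardized in ITU-T G.107 (the E-model). A minimal sketch of that conversion, assuming the standard narrowband formula and clamping ranges:

```python
def r_to_mos(r: float) -> float:
    """Estimate a Mean Opinion Score from an E-model R-factor.

    Implements the R-to-MOS mapping defined in ITU-T G.107:
    MOS is clamped to 1.0 below R=0 and 4.5 above R=100.
    """
    if r <= 0:
        return 1.0
    if r >= 100:
        return 4.5
    return 1.0 + 0.035 * r + r * (r - 60) * (100 - r) * 7e-6


# An impairment-free narrowband call has a default R of about 93.2,
# which maps to roughly MOS 4.41 -- the practical ceiling for toll-quality voice.
print(round(r_to_mos(93.2), 2))
```

Because the R-factor is computed automatically from measurable impairments (delay, loss, codec distortion), tools can report a user-experience score at the points of service without running live listener panels.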
It is time to give IT managers the tools that provide these kinds of objectively measured, end user-focused performance standards for all IP networked applications, not just the latest one.
This type of instrumentation, control and transparency would go a long way in establishing a new benchmark for world-class IT application networks.
It would allow the IT world to raise its sights above the technology and focus on one of the only truths that matters in IT: the end-user experience.
Jim Zucco is Chairman and CEO of Corente, a provider of integrated software-based services for the secure delivery and management of distributed applications and diverse networks.