
Interop, running May 20-25 in Las Vegas, will attempt to answer yet again the question: Whither network access control?

Cisco, Microsoft and the Trusted Computing Group, backed by an entire NAC (network access control) Day track and a host of lesser players offering so-called "NAC right now" products, will attempt to answer that question by asserting that the network is the best place to control access.

The NAC Interop Lab is certainly a place to go to get some nitty-gritty questions answered. The lab has been up and running for at least the last two shows and has plenty of handouts and demonstrations for attendees to observe. Keep in mind, however, that the NAC Interop Lab is tilted toward securing guest access by using the network as an access choke point.

Despite what NAC vendors at Interop will argue, the question of whether the network is the best place for checking endpoint compliance with security policies remains unsettled for now. For one thing, using NAC devices and services to assess endpoint health—including the currency of antivirus signatures, operating system and application patches, and firewall status—and to ensure that unauthorized programs and malware are not running on endpoints threatens to be a policy-writing nightmare.
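To make the scale of that policy-writing problem concrete, here is a minimal sketch of what a single endpoint-health policy might look like in code. All names, thresholds and banned-process entries here are hypothetical illustrations, not any vendor's actual policy format; a real deployment would need a rule set like this for every OS version, antivirus product and patch baseline in the environment.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class Endpoint:
    """Simplified posture data an agent might report (illustrative only)."""
    av_signature_date: date
    os_patch_level: int
    firewall_enabled: bool
    running_processes: set = field(default_factory=set)

# Hypothetical policy baseline -- real policies vary per OS and AV vendor.
REQUIRED_PATCH_LEVEL = 7
MAX_SIGNATURE_AGE = timedelta(days=7)
BANNED_PROCESSES = {"p2p_client.exe", "irc_bot.exe"}

def assess(endpoint: Endpoint, today: date) -> list:
    """Return a list of policy violations; an empty list means compliant."""
    violations = []
    if today - endpoint.av_signature_date > MAX_SIGNATURE_AGE:
        violations.append("stale antivirus signatures")
    if endpoint.os_patch_level < REQUIRED_PATCH_LEVEL:
        violations.append("missing OS patches")
    if not endpoint.firewall_enabled:
        violations.append("firewall disabled")
    if endpoint.running_processes & BANNED_PROCESSES:
        violations.append("unauthorized software running")
    return violations
```

Even this toy version shows the problem: every check embeds assumptions about how the endpoint reports its state, and those assumptions break as soon as a different operating system or security product enters the picture.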

What’s more, NAC tools today often use a tightly constrained definition of "endpoint" that usually means supporting various flavors of the Windows operating system. As a result, Mac OS, Linux, Unix and myriad handset operating systems, as well as network appliances such as printers and copiers, are almost always excluded from NAC conversations.

Today, NAC tools are aimed primarily at endpoints used by contractors and auditors, machines that lie outside the strict control of central IT. It’s clear that many of these systems must be allowed onto controlled networks to provide valuable services and to ensure compliance with a burgeoning host of regulations.

Contractor and auditor systems that are outside central IT control are, as our tests have shown, some of the most resistant to being checked as safe to use. For one thing, it’s hard to put an agent on these systems to run the checks. For another, it’s almost impossible to remediate them because of licensing concerns, assuming that the host network even has access to, for example, the antivirus signatures used by a particular endpoint.

Other Approaches

The other way to combat viruses and malware carried into the network by visiting systems is to harden internal clients and servers so they can better withstand the onslaught that infected systems will inevitably bring.

Server managers can take a page from the trusted-operating-system functionality and best practices emerging on the Linux and Solaris platforms. Servers can also be protected by taking advantage of their special location in the data center, where firewalls and identity-based access systems can be combined effectively to ensure that only authorized users are able to reach protected resources.

We’ve talked for years about ways that client-side systems can be protected, and we stand by those recommendations today. Least user privilege combined with strict system lockdown is still one of the best ways to ensure that systems aren’t susceptible to the effects of malware. IT managers who haven’t taken this mantra to heart should pay careful attention to Interop exhibitors that provide endpoint configuration products and services.

Virtualization can also become part of the solution on both the server and client sides of the problem. We are especially interested in virtualized security appliances that can be placed in front of virtualized servers to protect individual applications. This same technology is starting to appear on user systems as well.

At the end of the day, it will likely take some form of network-based access control combined with much tighter client-side configuration to solve the problem of providing network access to information without destroying the network or the clients attached to it. Having said this, however, we think the questions Interop will attempt to answer are worth asking, and we will likely see the NAC technologies being put forward today for several years to come.
