
IT Customers Push Cloud Standards


Aug 5, 2011

The newly formed Open Data Center Alliance is using an array of usage models to weld cloud-using customers into a force that prevents vendor lock-in. At the same time, the group is promoting the secure movement of virtual workloads from one provider to another. The organization is made up of more than 200 members, including JPMorgan Chase, Lockheed Martin and Marriott.

While other nascent cloud-user organizations are forming, notably the CSCC (Cloud Standards Customer Council), the ODCA (Open Data Center Alliance) in June issued eight usage models that organizations can apply today when specifying baseline requirements for cloud projects.

The formation of both customer groups comes at a seminal moment for data center design. Virtualization is driving compute, storage, networking, application and desktop IT managers to drastically increase the efficient use of costly resources. And the option to outsource some or all virtual workloads is a bell that cannot be un-rung.

The emergence of the ODCA's usage models is a recognition of the seismic changes in data center operations. IT managers must now provide strategic guidance to C-level managers and line-of-business leaders for incorporating the changes being wrought by virtualization and cloud computing while avoiding vendor lock-in.

The usage models from the ODCA can help in this effort. However, my analysis shows that many of the guidelines can be immediately strengthened and made more practical. For example, the ODCA Security Provider Assurance guide doesn't spell out exactly what level of law enforcement action is needed for the provider to turn over your data. In a private data center, there are understood procedures and boundaries on the execution of search warrants. In a hosted environment, data protections from unwarranted law enforcement searches are murky. Therefore, IT managers should demand very specific answers from providers about the safeguards in place to prevent data loss when a government agency comes knocking.

The Usage Models

Altogether, there are eight published usage models that fit into four general categories: secure federation, automation, common management and policy, and transparency.

Secure federation is made up of the SM (Security Monitoring) and SPA (Security Provider Assurance) models. The SM usage model depends heavily on work being done at the Cloud Security Alliance and CloudAudit, both of which are made up primarily of security service vendors. Among the more interesting usage requirements is the daunting expectation that the cloud provider supply dedicated capabilities, with specific resources reserved for specific customers.

The SPA document has three stated purposes that are backed up with a four-category, bronze-to-platinum rating system. The publication also enables cloud consumers to compare security levels from one provider to another and between internally and externally hosted clouds. The SPA usage model should make it easier for cloud consumers to understand and select among various levels of security offered by providers. As previously stated, this usage model should be augmented to probe when a search warrant would result in the loss of data control.
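To make that comparison concrete, here is a minimal sketch of how a cloud consumer might line up SPA assurance levels across an internal cloud and outside providers. The provider names and assessed tiers below are hypothetical illustrations, not ODCA data; only the bronze-to-platinum ladder comes from the usage model.

```python
# Minimal sketch: rank hypothetical providers against an internal SPA baseline.
# Tier names follow the SPA's bronze-to-platinum ladder; everything else is assumed.

SPA_TIERS = {"bronze": 1, "silver": 2, "gold": 3, "platinum": 4}

# Assumed example assessments: an internal cloud plus two external offers.
assessments = {
    "internal-private-cloud": "gold",
    "provider-a": "silver",
    "provider-b": "platinum",
}

def meets_baseline(assessed_tier: str, required_tier: str) -> bool:
    """Return True if an assessed SPA tier meets or exceeds the baseline tier."""
    return SPA_TIERS[assessed_tier] >= SPA_TIERS[required_tier]

required = "gold"  # assumed requirement for the workload in question
for name, tier in assessments.items():
    verdict = "meets" if meets_baseline(tier, required) else "falls short of"
    print(f"{name}: rated {tier}, {verdict} the {required} baseline")
```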

The automation category encompasses I/OC (I/O Control) and VMI (VM Interoperability). The I/OC is a short document that addresses one of the big problems raised by increased virtual-machine density in a cloud environment: I/O contention. The I/OC weighs in on the side of work being done by NIST (National Institute of Standards and Technology) and the DMTF (Distributed Management Task Force) when it comes to I/O bottlenecks. To control I/O bottlenecks, the I/OC publication focuses on monitoring, SLA metrics, APIs, timeslice controls and I/O reservations, with the expectation that these requirements could be met in multi-vendor environments using non-proprietary protocols.
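As a rough illustration of what such requirements could look like in practice, the sketch below checks observed per-tenant I/O metrics against reserved values. The metric names, reservation figures and the monitoring call are assumptions made for illustration; a real provider would expose its own (ideally non-proprietary) interface.

```python
# Minimal sketch: compare observed per-tenant I/O metrics against SLA reservations.
# All names and numbers here are hypothetical, not taken from the I/OC document.

from dataclasses import dataclass

@dataclass
class IOReservation:
    tenant: str
    min_iops: int           # reserved I/O operations per second
    max_latency_ms: float   # latency ceiling agreed in the SLA

def fetch_observed_metrics(tenant: str) -> dict:
    """Placeholder for a call to a provider monitoring API (assumed, not real)."""
    return {"iops": 8200, "latency_ms": 4.7}

def check_sla(reservation: IOReservation) -> list[str]:
    """Return a list of SLA violations for the tenant, empty if none."""
    observed = fetch_observed_metrics(reservation.tenant)
    violations = []
    if observed["iops"] < reservation.min_iops:
        violations.append(
            f"IOPS {observed['iops']} below reserved {reservation.min_iops}")
    if observed["latency_ms"] > reservation.max_latency_ms:
        violations.append(
            f"latency {observed['latency_ms']}ms above ceiling "
            f"{reservation.max_latency_ms}ms")
    return violations

if __name__ == "__main__":
    res = IOReservation(tenant="tenant-42", min_iops=10000, max_latency_ms=5.0)
    for line in check_sla(res) or ["SLA met"]:
        print(line)
```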

To read the original eWeek article, click here: Cloud Standards Get Customer Push
