
Intel Aims to Improve Big Data Economics


Written By
Michael Vizard
Jan 21, 2014

Intel’s broad strategy to significantly increase the utilization rate of x86 servers is part of a sweeping effort to make enterprise computing more efficient, an effort that could have far-reaching implications for the channel.

Given all the cores that Intel is adding with each successive wave of new processors, the challenge now is finding ways to run, for example, transaction-processing and batch-oriented applications efficiently on the same servers. To that end, Intel has been making significant investments in big data technologies such as Hadoop, including adding the ability to deploy a graph database on top of version 3.0 of its Hadoop distribution.

Graph databases are an emerging class of database systems that are growing in popularity, but rather than asking organizations to deploy and manage yet another standalone database, Intel is making the case for running a graph engine on top of Hadoop. That approach not only simplifies the overall IT environment by reducing the number of databases that have to be managed, but also gives organizations another reason to invest in a batch-oriented platform such as Hadoop.
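
Intel has not detailed the internals of that graph engine here, but the underlying idea, running graph-style analytics directly against data that already lives in Hadoop rather than exporting it to a separate graph database, can be sketched with a minimal Hadoop Streaming job. The example below is a generic, hypothetical illustration (the file name degree.py and the HDFS paths are placeholders), not Intel's implementation: it computes the degree of every vertex in an edge list stored in HDFS.

```python
#!/usr/bin/env python
# Hypothetical sketch: vertex-degree computation over an edge list in HDFS,
# run as a Hadoop Streaming job. One "src<TAB>dst" edge per input line.
# A generic example of graph-style analytics layered on Hadoop, not Intel's
# graph engine.
import sys


def mapper():
    # Emit both endpoints of every edge with a count of 1.
    for line in sys.stdin:
        parts = line.strip().split("\t")
        if len(parts) != 2:
            continue
        src, dst = parts
        print("%s\t1" % src)
        print("%s\t1" % dst)


def reducer():
    # Streaming input arrives sorted by key, so counts can be summed per vertex.
    current, total = None, 0
    for line in sys.stdin:
        vertex, count = line.strip().split("\t")
        if vertex != current:
            if current is not None:
                print("%s\t%d" % (current, total))
            current, total = vertex, 0
        total += int(count)
    if current is not None:
        print("%s\t%d" % (current, total))


if __name__ == "__main__":
    # Invoke as "degree.py map" for the map phase or "degree.py reduce" for
    # the reduce phase when wiring the script into hadoop-streaming.
    mapper() if len(sys.argv) > 1 and sys.argv[1] == "map" else reducer()
```

Submitted through the stock hadoop-streaming jar (via its -input, -output, -mapper and -reducer options), a job like this runs as an ordinary batch workload on the same cluster that serves other applications, which is exactly the consolidation argument Intel is making.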

With Hadoop increasingly emerging as the primary source of big data in the enterprise, Intel now views Hadoop data as one more data type that needs to be orchestrated alongside the other data types and workloads running on an x86 server, said Jason Fedder, general manager of channels, marketing and business operations for Intel’s Datacenter Software Division. That’s significant, explained Fedder, because it means extracting the maximum amount of utilization from x86 processors that historically ran only one type of application at a time.

“We foresee a lot of exponential growth in terms of data types and associated application workloads that will drive up utilization,” Fedder said.

For solution providers in the channel, the implications of that strategy are profound. While it may not shrink the physical data center, it does mean that x86 servers will become more efficient in terms of the number and types of workloads they can run concurrently. That should make it more economically feasible for organizations to deploy a higher number of workloads per server, Fedder said.

As part of the effort to make its Hadoop distribution even more appealing, Intel is also enhancing the security and management tools that come embedded in the latest release of the platform. In the meantime, Hadoop itself is evolving in a way that allows multiple types of engines to be layered on top of a common big data substrate.
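
That layering is already visible in stock Hadoop: with YARN managing cluster resources, the same files in HDFS can be processed by a MapReduce job, a SQL engine or an in-memory engine such as Apache Spark without the data moving anywhere. As a hedged illustration (the paths and application name are placeholders), the PySpark sketch below reruns the vertex-degree calculation from the earlier example through a different engine against the same data.

```python
# Hypothetical sketch: a second engine (Spark) working over the same HDFS
# "substrate" as the MapReduce job above. Paths and the app name are
# placeholders; nothing here is specific to Intel's distribution.
from pyspark import SparkContext

sc = SparkContext(appName="degrees-over-hdfs")

edges = sc.textFile("hdfs:///data/edges")  # same edge list, different engine

degrees = (edges
           .map(lambda line: line.strip().split("\t"))
           .filter(lambda parts: len(parts) == 2)
           .flatMap(lambda parts: parts)        # emit both endpoints
           .map(lambda vertex: (vertex, 1))
           .reduceByKey(lambda a, b: a + b))    # degree per vertex

degrees.saveAsTextFile("hdfs:///data/degrees_spark")
sc.stop()
```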

As Intel invests heavily in Hadoop and nonvolatile memory technologies to address longstanding criticisms of x86 utilization rates, the impact on the channel could be far-reaching, Fedder said. The economic effects could stem from a variety of factors, he said, ranging from the number of physical servers sold, given that more workloads will be able to run on any one server, to the total cost of managing a data center environment in which virtual server density is about to become that much greater.

Meanwhile, Hadoop in particular and big data in general are emerging as massive opportunities for the channel. The only question now is how best to take advantage of them.

Michael Vizard has been covering IT issues in the enterprise for 25 years as an editor and columnist for publications such as InfoWorld, eWEEK, Baseline, CRN, ComputerWorld and Digital Review.

