Analytics Emerges as a Means to an AI Solution's End

By Mike Vizard  |  Posted 2017-03-02
Analytics and AI

Most advanced analytics applications today rely on statistical models to project an outcome based on all the data currently available. While not always 100 percent accurate, those models provide much higher degrees of certainty than previous generations of enterprise applications.
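To make the idea concrete, here is a minimal sketch (an illustration, not drawn from any product mentioned in this article) of the kind of statistical projection such applications perform: fitting a least-squares trend line to historical values and extrapolating the next one. All the numbers are hypothetical.

```python
# Minimal illustration: fit a least-squares line to historical
# values and project the next period's outcome.

def fit_line(xs, ys):
    """Return (slope, intercept) of the least-squares line through the points."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return slope, mean_y - slope * mean_x

# Hypothetical quarterly revenue, projected one quarter ahead.
quarters = [1, 2, 3, 4]
revenue = [100.0, 110.0, 121.0, 128.0]
slope, intercept = fit_line(quarters, revenue)
print(round(slope * 5 + intercept, 1))  # projection for quarter 5
```

The projection is only as good as the trend in the data, which is why such models are "not always 100 percent accurate."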

Now, however, vendors are embedding machine- and deep-learning algorithms into their applications, promising to transform IT with artificial intelligence (AI).

Many of the machine- and deep-learning algorithms employed in the latest generation of applications are not new. Previously, however, there was not enough data collected centrally for these algorithms to establish relationships between events consistently.

With the advent of the cloud and data lakes based on Hadoop, the cost of aggregating massive amounts of data has dropped dramatically. It's now economically feasible to apply algorithms to massive amounts of data to enable machines to understand the relationships between various sets of data and then determine actions based on what has occurred.

A New Generation of Intelligent Applications

Over the next few years, just about every application will be replaced by a new generation of intelligent applications. That will create an upgrade cycle for solution providers that is likely to be unrivaled in the history of IT in terms of size and scope.

In fact, Accenture predicts that the IT industry is about to enable a new era of the "intelligent enterprise." Accenture CTO Paul Daugherty said the most striking aspect of that transformation will be the disappearance of traditional user interfaces. Instead of engaging with intelligent applications through a graphical user interface (GUI), users will interact directly with algorithms via natural language, using bots that understand spoken words.

"AI is the UI," Daugherty said. "The interface disappears."

The two best-known examples of applications that use advanced algorithms to embed AI functionality based on analytics are the IBM Watson and Salesforce Einstein platforms. In both cases, IBM and Salesforce take advantage of large amounts of data in the cloud, applying algorithms that analyze it to power their cloud applications.

"Because of the cloud, it's actually possible to do something with big data," noted Jeff Kaplan, managing director of THINKstrategies, a consulting firm that specializes in cloud computing.

But those results are not achieved via traditional programming models. Rather, Watson and Einstein are taught to recognize the relationships between various sets of data. Over time, the algorithms enable the machines to continually extrapolate relationships at scales traditional analytics applications can't match.
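One way to picture the difference from traditional programming is a classifier that is given no explicit rules, only labeled examples, and infers the relationship itself. The sketch below is an illustration of that idea using a simple nearest-neighbor rule; it is not the actual method behind Watson or Einstein, and the training data is hypothetical.

```python
# Instead of hand-coding rules, "teach" a model with labeled examples;
# it classifies new points by similarity to what it has already seen.

def nearest_label(examples, point):
    """Return the label of the training example closest to `point`."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(examples, key=lambda ex: dist(ex[0], point))[1]

# Hypothetical training data: (order_size, days_late) -> churn risk.
training = [
    ((5, 0), "low-risk"),
    ((4, 1), "low-risk"),
    ((1, 9), "high-risk"),
    ((2, 8), "high-risk"),
]
print(nearest_label(training, (2, 7)))  # closest to (2, 8) -> "high-risk"
```

The more labeled data such a model sees, the finer the relationships it can capture, which is why centrally aggregated cloud data matters so much.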


Applying Machine Learning to Data in Legacy Apps

While machine- and deep-learning algorithms are being applied most aggressively to data in the cloud, IBM and SAP have signaled their intention to also apply machine-learning algorithms to data residing in new and legacy applications running on-premises. IBM has announced it is moving to make machine-learning algorithms available on the operating systems that run its mainframes, starting with z/OS.

"We think there's a high opportunity to apply algorithms against legacy applications," said Dinesh Nirmal, vice president of analytics development at IBM. "You’d be able to apply machine learning wherever the application is."

SAP, meanwhile, is focusing its efforts on applying machine-learning algorithms wherever it makes the most sense.

Bernd Leukert, a member of the executive board who oversees product development across SAP, said that in the company's view, it makes more sense to bring algorithms to the data rather than move data to some central repository. In fact, he noted, moving data into a central repository often creates additional cost and security risks that many enterprise IT organizations already find untenable.

Regardless of where it is applied, however, the days of applying analytics against data residing in a separate repository are coming to a close, Leukert said. "Bringing machine learning to the data will be game-changing," he added.

In fact, developers of internet of things (IoT) applications expect entire business processes to be automated via a combination of analytics and machine- and deep-learning algorithms. Sensify CTO Sathish Gajaraju said his company, which partners with SAP, already is working with a client that is enabling direct shipment of paper goods from factories in Asia to specific retail outlets in the United States, eliminating the need for a warehouse.

The process is possible because it's now economically feasible to attach RFID tags to even the most basic of commodity items, Gajaraju said, adding that "It's all about capturing data at the relevant point."
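The kind of automation Gajaraju describes can be pictured as a simple event-driven rule: a scan captured "at the relevant point" triggers a routing decision with no human in the loop. The sketch below is entirely hypothetical (the field names and routing logic are invented for illustration), not Sensify's or SAP's actual system.

```python
# Hypothetical sketch: an RFID scan event drives an automated routing
# decision, sending a factory pallet straight to its retail destination.

def route(event):
    """Pick a shipping action based on a captured scan event."""
    if event["scan_point"] == "factory-exit" and event["destination"]:
        return f"ship-direct:{event['destination']}"
    return "hold-for-review"

event = {"tag": "PLT-0042", "scan_point": "factory-exit",
         "destination": "store-117"}
print(route(event))  # -> ship-direct:store-117
```

In a real deployment the decision would be driven by learned models rather than a single hand-written rule, but the principle is the same: the data captured at the tag scan is what makes the warehouse step unnecessary.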

The upside for solution providers is the massive opportunity brought by the new era of the intelligent enterprise. The downside is that the expertise required to tap into that opportunity is getting more expensive with each passing day.

But the path to the intelligent enterprise invariably starts with analytics: When all is said and done, analytics is simply a means to an AI end that soon will be a requirement of any application worth building.