Cloud AI services slowly but surely lure enterprise IT shops


As IT pros experiment with AI, many will do so on the public cloud. But choosing from an ever-growing list of AI services from AWS, Azure and others is no easy task.

Artificial intelligence is no longer reserved for sci-fi flicks, but the technology is still alien to many enterprise IT teams. Nevertheless, one adoption trend is clear: the public cloud will be the go-to destination for most enterprise AI workloads.

"I'm not saying that AI won't happen in the enterprise in people's data centers, but this is a workload that will predominantly happen in the cloud," said Rob Koplowitz, principal analyst at Forrester Research.

Some organizations will choose to keep AI applications -- particularly those that contain sensitive customer data -- in-house, as they do for other workloads with strict security or compliance requirements. But, in general, public cloud AI services will be the predominant model, agreed Adrian Bowles, lead analyst at Aragon Research.

One of the biggest reasons the cloud is a particularly good fit for AI is experimentation, Bowles said. Because most organizations are still exploring potential uses for technologies such as machine learning, predictive analytics or natural language processing, they want an environment that lets them experiment without significant financial investment or risk.

"A large percentage of the enterprises that are using the public cloud right now for AI are using it as a test bed -- an inexpensive way to get started, and to figure out which applications are going to be amenable to different forms of AI," Bowles said.

Public cloud platforms including Amazon Web Services (AWS) and Microsoft Azure allow organizations to test different machine learning algorithms, for example, to see what might be possible with their data. From there, organizations can choose between two options: fail it or scale it.

"If your [AI application is] going to fail, great -- then you can move on to something else," Bowles said. "If it scales, then you're already in a place where you can scale rapidly … [The cloud] frees you up for that experimentation."

What's more, organizations often choose the public cloud for AI deployments because of the range of resources available.

"[In the cloud], it's easier to say, 'Well, I'm going to start with some natural language processing,'" said Nicola Morini-Bianzino, global lead of the artificial intelligence practice at Accenture, a consulting and professional services firm. "Then, you can move some of your data onto the cloud and decide to do something different around computer vision, and just extend and use those APIs on top of the infrastructure and data you have already created."

Public cloud also eliminates the need for organizations to invest in expensive, specialty hardware that many AI workloads require. Most of the major public cloud providers, for example, now offer cloud instances based on GPUs, which are especially beneficial for compute-intensive AI workloads, Koplowitz said.

Challenges with AI in the public cloud

Of course, any emerging technology, including AI, presents the enterprise with a learning curve. IT teams may not need to overhaul their underlying cloud infrastructure for an AI deployment, but they must evolve their skill sets in other ways to adopt a more data-centric mentality, Morini-Bianzino said.

For a successful AI deployment, IT teams must hone their data analysis skills and learn to recognize certain patterns or relationships within large enterprise data sets -- because AI is only as valuable as the data you feed into it.

"The value of [a machine learning] algorithm is a direct function of the value of that data that you are pushing through the algorithm," he said. "So if the data is not good, the algorithm is not good either."

Data analysis skills are increasingly important as IT teams pursue machine learning, agreed Bowles. Part of this is because, with machine learning, IT systems can improve their performance over time through exposure to data, rather than through reprogramming.

In addition, infrastructure management teams should attempt to break down barriers with developers. The more admins learn how AI applications are built and consumed, the smarter choices they can make from an infrastructure perspective, said Lori Brown, managing consultant for the healthcare IT group at PA Consulting.

"As [IT teams] see the way AI development changes, and what impact that has on their infrastructure and consumption, they can be wiser about how they make their purchases of public cloud services to support AI," Brown said.

Pick your brain

Another big challenge IT teams face with AI applications is how to choose a cloud provider. As vendors release new cloud AI services at a dizzying pace, it can be tough to know where to start.

The top public cloud providers have emerged as the dominant AI vendors as well: AWS, Azure, Google and IBM. Each has different strengths, weaknesses and use cases, but their services cover several common AI capabilities: machine learning, image recognition, natural language processing and text-to-speech. Niche players in the cloud provider market have yet to mount a serious challenge.

AWS, the leader in public cloud adoption, pulled back the curtain on three AI-based services at its re:Invent conference in 2016. Amazon Rekognition provides a platform for image processing, Amazon Polly uses deep learning to turn text into speech, and Amazon Lex uses the same automatic speech recognition technology as Alexa so developers can build conversational interfaces with voice and text.
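For teams evaluating these services, the barrier to a first experiment is low. The sketch below is one illustrative way to call Rekognition and Polly through the boto3 SDK; it assumes AWS credentials are already configured in the environment, and the S3 bucket and file names are placeholders.

    # Illustrative sketch: call two Amazon AI services with boto3.
    # Bucket and object names are placeholders; assumes AWS credentials
    # are already configured in the environment.
    import boto3

    # Detect labels (objects, scenes) in an image stored in S3 with Rekognition.
    rekognition = boto3.client("rekognition")
    labels = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
        MaxLabels=5,
    )
    for label in labels["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))

    # Turn a short string of text into speech with Polly.
    polly = boto3.client("polly")
    speech = polly.synthesize_speech(
        Text="Your package has shipped.",
        OutputFormat="mp3",
        VoiceId="Joanna",
    )
    with open("speech.mp3", "wb") as f:
        f.write(speech["AudioStream"].read())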

The ability to integrate AWS' various compute, storage, content delivery and developer tools entices enterprises as much as, or more than, the Amazon AI suite alone.

In addition to AWS' popularity as an app-dev platform, the Amazon Echo smart-home device gives enterprises an inroad to package applications that interact with consumers.

"In the same way that we used to think about capturing eyeballs, if voice is going to become a prevalent way of interacting with computers, then there's a great deal of value in being the system that captures your words," Koplowitz said. "There were an awful lot of [Amazon Echo devices] sold at Christmas last year, and there's a pretty good chance that I can reach you through that device."

At its Build conference in early May, Microsoft appealed to enterprise employees with Microsoft Graph, a service that draws insights from employee activity to improve productivity and to suggest meeting times and collaborators for projects. Microsoft Cognitive Services provides a broad set of APIs that enable speech, language, knowledge, search and vision technologies for AI developers.
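As with the AWS services, a first call to Cognitive Services can be a short script. The sketch below posts text to a Text Analytics sentiment endpoint with Python's requests library; the region, API version and response fields shown are assumptions based on the general Cognitive Services REST pattern, and the subscription key is a placeholder to be replaced with one from the Azure portal.

    # Hedged sketch: call a Cognitive Services REST API (Text Analytics sentiment).
    # The region, API version and response fields are assumptions; substitute the
    # endpoint and key shown in your Azure portal.
    import requests

    ENDPOINT = "https://westus.api.cognitive.microsoft.com/text/analytics/v2.0/sentiment"
    SUBSCRIPTION_KEY = "your-cognitive-services-key"  # placeholder

    payload = {
        "documents": [
            {"id": "1", "language": "en", "text": "The new dashboard is fantastic."}
        ]
    }
    response = requests.post(
        ENDPOINT,
        headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
        json=payload,
    )
    response.raise_for_status()
    for doc in response.json()["documents"]:
        # Scores near 1.0 indicate positive sentiment; near 0.0, negative.
        print(doc["id"], doc["score"])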

Microsoft's Cortana, a front-end natural language understanding (NLU) digital assistant, provides another customer-facing service in line with Amazon Alexa and Google Assistant -- one that also appeals to enterprise customers in some industries.

Independent software vendors building AI systems for clients generally turn to AWS and Azure cloud AI services, owing to the popularity of those providers. "They almost always will offer AWS and Azure very early in their life, as they’re trying to create a business model for AI as a service," Bowles said. "That's where the people are."

Google's advantage lies in the data access and processing expertise it has honed internally. In addition to the open source machine learning library TensorFlow, Google Cloud Platform's APIs enable a range of AI capabilities, but none more promising than the predictive analytics of its machine learning tool.
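For organizations that start with TensorFlow before scaling up to Google's managed machine learning service, the first model can be modest. The sketch below trains a tiny predictive model on synthetic data; the data, network shape and training parameters are illustrative assumptions only, not a recommended architecture.

    # Illustrative sketch: a small predictive model in TensorFlow (Keras API).
    # The synthetic data and model shape are assumptions for demonstration only.
    import numpy as np
    import tensorflow as tf

    # Synthetic data: predict a binary outcome from ten numeric features.
    features = np.random.rand(1000, 10).astype("float32")
    labels = (features.sum(axis=1) > 5.0).astype("float32")

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(16, activation="relu", input_shape=(10,)),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.fit(features, labels, epochs=5, batch_size=32, verbose=0)

    print("training accuracy:", model.evaluate(features, labels, verbose=0)[1])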

"Data is going