Keri’s nine predictions about the future of AI:
AI Will Be Transformational
This one doesn’t exactly require a crystal ball. But to truly grasp the magnitude of the coming change, it’s worth thinking beyond abstract concepts like “revolutionary” and “transformational” to really visualize the eventual outcomes of a technology that is constantly improving itself through machine learning.
“It may be obvious that AI is going to transform things,” Keri said. “What may be less obvious is that I think we’ll actually use AI as the basis for interstellar travel.”
A Trough Is Coming
Keri noted that there is a “huge amount” of excitement surrounding AI, and he compared it to the hype that surrounded Bitcoin several years ago.
“This is what I call ‘the cab ride test,’” Keri said.
“When you ride in a taxi, and the driver asks you what you think of AI, that’s usually a sign that the bubble is about to froth and spill over.”
This doesn’t mean that companies should pull back their investments, though. On the contrary, Keri said, the projected AI trough presents an opportunity for organizations to move beyond the buzz and do the “real work” of building out and testing new applications.
Model Management Becomes Critical
As AI models proliferate and evolve, organizations will need to ensure that they are up-to-date, secure, and functioning optimally, Keri said.
Effective model management is necessary to guarantee that AI systems are reliable and trustworthy and that they can adapt to changes in the field.
Companies Increasingly Turn to the Hybrid Cloud
According to Keri, AI is the “ultimate” hybrid cloud use case.
“You use public data to create a foundational model, but that’s not going to be enough for your business,” he said.
“You need to refine and augment a foundational model to make it specific to your business, and that can only really be done in your data center, because the moment you do it with infrastructure as a service, you have lost control of your data. And then all of the inferencing happens at the edge.”
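Keri’s description maps to a three-stage workflow: start from a foundation model trained on public data, refine it on private data inside your own data center, and run inferencing at the edge. The snippet below is a hedged sketch of the middle step only, not anything from the article; it assumes the Hugging Face transformers and datasets libraries, and the model name, file paths, and “text”/“label” column names are placeholders.

```python
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# 1. Start from a foundation model trained on public data.
base = "distilbert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForSequenceClassification.from_pretrained(base, num_labels=2)

# 2. Refine it on proprietary data that stays in your own data center.
#    The CSV is assumed to have "text" and "label" columns.
private = load_dataset("csv", data_files="/secure/local/support_tickets.csv")["train"]
private = private.map(
    lambda rows: tokenizer(rows["text"], truncation=True,
                           padding="max_length", max_length=128),
    batched=True,
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="/secure/local/models/tickets-v1",
                           num_train_epochs=1,
                           per_device_train_batch_size=8),
    train_dataset=private,
)
trainer.train()

# 3. Save the refined model so it can be shipped to the edge for inferencing.
trainer.save_model("/secure/local/models/tickets-v1")
```

The specific libraries matter less than where each step runs: the base model can come from the public cloud, but the refinement data and the refined weights stay on infrastructure you control.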
Linear Algebra Makes a Comeback
It’s time to dust off those textbooks, as the tricky subject of linear algebra is key to advancing AI applications like natural language processing and computer vision.
Many operations in AI, such as transformations, rotations, and scaling, are linear algebra operations.
“It turns out that a lot of AI operations involve multiplying matrices and vectors,” Keri noted.
For example, words can be represented as vectors and images can be represented as matrices of pixels.
A grasp of linear algebra also helps in understanding how changes to a model’s inputs and parameters affect its output, which is useful for debugging and improving AI models, and it is key to analyzing and finding meaning in the large, complex datasets that AI applications depend on.
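To make that concrete, here is a small NumPy sketch (not from the article; the word vectors, pixel values, and layer weights are made-up toy numbers) of the matrix-and-vector arithmetic Keri is referring to:

```python
import numpy as np

# Words as vectors: tiny 4-dimensional "embeddings" for two words.
king = np.array([0.8, 0.3, 0.9, 0.1])
queen = np.array([0.7, 0.9, 0.9, 0.1])

# Their similarity is a normalized dot product (cosine similarity).
similarity = king @ queen / (np.linalg.norm(king) * np.linalg.norm(queen))
print(f"similarity(king, queen) = {similarity:.2f}")

# Images as matrices of pixels: a 3x3 grayscale image.
image = np.array([[  0, 128, 255],
                  [ 64, 128, 192],
                  [255, 128,   0]])

# Scaling the brightness is multiplying the matrix by a scalar.
dimmed = 0.5 * image

# And a neural-network layer is a matrix-vector multiplication:
# a 2x4 weight matrix times a 4-dimensional input gives a 2-dimensional output.
weights = np.array([[0.2, -0.5,  0.1, 0.7],
                    [0.4,  0.3, -0.2, 0.6]])
layer_output = weights @ king
print("layer output:", layer_output)
```

Scaled from a four-dimensional toy embedding to thousands of dimensions and billions of parameters, this same matrix arithmetic is what the accelerators discussed below spend their time on.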
Organizations Streamline Infrastructure
Developers should not have to think about the infrastructure, said Keri.
“The developers are thinking about the hybrid cloud app and managing their models,” he said.
“What they want, when they’re doing refinement, is for the right model to show up, and the infrastructure can help with that. If the infrastructure understands what a model is – and what a version of a model is – you can get that model from the public cloud and make it available for refinement, without the developer having to fetch data.”
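One way to read that is as a call for platform-managed model registries, where a developer names a model and a version and the infrastructure resolves and fetches the artifact. The sketch below (not from the article) illustrates the idea using MLflow’s model registry as one possible implementation; the tracking URI, model name, version, and feature columns are placeholders.

```python
import mlflow
import pandas as pd

# Point at a registry run by the infrastructure team; the developer only
# names the model and the version they want.
mlflow.set_tracking_uri("https://mlflow.internal.example.com")

# The registry resolves where that versioned artifact actually lives
# (public cloud, object store, data center) and fetches it, so the
# developer never copies model data around by hand.
model = mlflow.pyfunc.load_model("models:/customer-churn/3")

# Refinement or inferencing then proceeds against the local copy.
sample = pd.DataFrame({"tenure_months": [4], "monthly_spend": [52.0]})
print(model.predict(sample))
```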
GPUs Will Be Dethroned
Graphics Processing Units (GPUs) have reigned supreme in the high-performance computing that powers AI systems, particularly for tasks that require parallel processing, such as video rendering and deep learning.
However, other technologies are poised to challenge GPUs as researchers advance the use of Tensor Processing Units (TPUs), Field-Programmable Gate Arrays (FPGAs), and even general-purpose central processing units (CPUs).
Keri said, “GPUs won’t be king forever.”
Software will eventually help IT systems choose the most available, efficient processing resources.
Scale-Out Infrastructure Will Be Key
Unlike “scale-up” architecture, where the expansion is vertical and involves adding more power to an existing machine (such as more CPUs), scale-out infrastructure is horizontal and involves adding more machines or nodes to a network to increase capacity.
“Because it’s a hybrid cloud app – and because the models have to traverse all the way from the edge, to the data center, to the public cloud – you really need to do this with scale-out infrastructure,” Keri said.
Watch Out for Apple
Keri noted that the world’s largest company has been conspicuously absent from conversations about the future of AI. He said he doesn’t expect that to last long.
“What Apple has done with its chipset is amazing,” Keri said.
“The M2 has a GPU, it has a CPU, it has a matrix engine. They’re not doing anything on the server side—but if they ever decided to get into the server side, I think it will be just fantastic. They’ll give incumbents a real run for their money.”