AI is driving rapid evolution of networked computing and data centers (Reader Forum)
In the world of networked computing and data centers, any 2022 planning report for business decision makers is likely to have three familiar words in the executive summary: distributed computing infrastructure.
The accelerated changes brought on by the pandemic are forcing businesses large and small to rethink the data center. To stay relevant to customers and to employees working in hybrid locations, companies are increasingly choosing among public cloud, on-premises computing, and colocation to determine what best suits their business needs.
While each company’s strategy may differ, a common denominator for many is an eagerness to put AI and data analytics at the center of their planning. AI technologies such as recommender systems, simulation software, and natural language processing are seen as key to boosting productivity, delivering new products and services, solving massive supply chain problems, and more.
Researcher 650 Group predicts enterprise and cloud data center equipment spending will exceed $200 billion in 2022, growing more than 6% from 2021. Enterprise spending remains focused on supporting multi-cloud and hybrid computing, while cloud spending continues to be driven by new workloads and applications.
Based on conversations with partners and customers, some key NVIDIA initiatives include:
AI as a service
Companies that are hesitant to spend time and resources investing in their own AI infrastructure, whether for financial or other reasons, will begin to turn to third-party vendors to accelerate time to market.
Large enterprises, including the Fortune 500, will deploy a hybrid approach to AI by choosing a combination of on-premises and cloud solutions, said Alan Weckel, founder and technology analyst at 650 Group.
Small and medium-sized businesses will primarily rely on AI-as-a-service offerings for their AI workloads. Within AI workloads, compute and networking have significantly outpaced overall industry growth over the past five years, Weckel said.
The data center is the new computing unit
Applications that previously ran on a single computer no longer fit in a single box. The new world of computing will increasingly be software-defined and hardware-accelerated.
As applications are broken into distributed services that mine massive datasets, the network will be seen as the fast lane between many servers acting together like one huge computer. Software-defined data processing units will serve as distributed switches, load balancers, firewalls, and virtualized storage devices that bring this computer together at data center scale.
Growing trust in zero trust
As applications and devices move seamlessly between the data center and the edge, enterprises will need to validate and compose these applications from microservices. Zero trust assumes that everything and everyone connected to a business system must be authenticated and monitored to verify that malicious actors are not attempting to enter the network. Everything must be protected both at the edge and on every node in the network. Data will need to be encrypted using IPsec and TLS, and each node protected by advanced routers and firewalls.
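In code terms, the zero-trust posture described above means no node ever accepts an unauthenticated peer. A minimal sketch, using Python's standard `ssl` module (the function name and defaults here are illustrative, not from the article):

```python
import ssl

def make_strict_tls_context(ca_file=None):
    # Zero-trust posture: verify the peer's certificate on every connection,
    # never fall back to an unauthenticated channel. ca_file would point at
    # an internal CA bundle in a real deployment (hypothetical parameter).
    ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH, cafile=ca_file)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # reject legacy protocol versions
    ctx.verify_mode = ssl.CERT_REQUIRED           # authenticate every peer
    ctx.check_hostname = True                     # peer name must match its certificate
    return ctx

ctx = make_strict_tls_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True
```

A full zero-trust rollout would add a client certificate on each node (mutual TLS) so that servers authenticate clients as well, plus continuous monitoring; this sketch shows only the "authenticate everything" baseline.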
The next enterprise data centers will belong to someone else
Many companies gave up owning their own data centers when they moved to cloud computing. In 2022, enterprises will realize it’s time to start leveraging colocation services for high-performance AI infrastructure. Ease of deployment and access to infrastructure experts who can help ensure 24/7/365 availability will allow more businesses to benefit from on-demand resources delivered securely, where and when they are needed.
As businesses increasingly embrace AI to increase efficiency and revenue, the number of applications that will be made available in many of the world’s largest industries will grow exponentially.
This will create a huge pipeline of data flowing into giant data centers like Microsoft Azure from self-driving cars, robots in the factory, cameras in the warehouse, and medical equipment in the hospital.
This pipeline will enable inference learning at the edge and iterative model training at the core. At the same time, updated deep learning models and inference rules will flow from data centers to users, computers and smartphones at the edge.
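The loop described above (data flowing from edge devices to the core, models retrained there, and updated parameters pushed back out for edge inference) can be sketched in a few lines. Everything here is illustrative: the function names, the dictionary "model," and the toy running-mean "training" stand in for real training and deployment pipelines.

```python
def train_at_core(model, batches):
    # Toy stand-in for iterative model training at the core data center:
    # fold each incoming batch of edge observations into a running mean.
    for batch in batches:
        for x in batch:
            model["count"] += 1
            model["mean"] += (x - model["mean"]) / model["count"]
    return model

def infer_at_edge(model, x):
    # Edge inference using the latest parameters pushed down from the core.
    return x > model["mean"]

model = {"mean": 0.0, "count": 0}
edge_data = [[1.0, 2.0], [3.0, 6.0]]       # observations streaming in from edge devices
model = train_at_core(model, edge_data)    # iterative training at the core
print(infer_at_edge(model, 5.0))           # updated model now serves edge inference
```

The point of the sketch is the direction of flow: raw data moves edge-to-core, and only the compact, updated model moves core-to-edge.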