A significant portion of large enterprises have adopted machine learning and artificial intelligence services from public cloud service providers. These companies are using the cloud providers' advanced AI technologies and pretrained machine learning models for proof-of-concept applications, decision-making analytics and data-driven task automation.
With the explosion of generative AI (GenAI), the business decision to use AI in cloud computing follows a familiar path. Gartner reported that approximately 70% of enterprises already use some form of public cloud services provided by the big three -- AWS, Microsoft Azure and Google Cloud -- although Alibaba Cloud and Huawei Cloud also own notable IaaS market share.
IaaS providers run numerous AI and machine learning (ML) workloads in multi-tenant cloud environments. Organizations use public cloud to implement AI to access a complete stack of technologies and services, including computational power to process large amounts of data, as well as storage, data analytics, large language models (LLMs), AI algorithms, APIs and a host of tools maintained by the cloud service providers (CSPs). All these technologies are accessible via the internet. Companies subscribe to these services, often on a monthly pay-as-you-go basis, and use service-level agreements (SLAs) to define performance guarantees, security measures, downtime limits and disaster recovery plans, as well as compensation if something goes wrong.
Early AI implementations in the cloud focused on access to computational resources and database storage. These projects often required data science expertise and the ability to bring your own resources. GenAI flipped the switch. GenAI finally gained traction when OpenAI launched ChatGPT, an early text-based chatbot technology, in November 2022.
GenAI's humanlike text generation requires LLMs that are continuously trained on large text-based data sets. As medium and large companies begin to adopt AI policies and test drive AI and ML capabilities in the public cloud, more organizations are investing in ways to create business value by taking advantage of GenAI's economies of scale.
Three-fourths of organizations worldwide are running their GenAI workloads on public cloud providers, according to a September survey by TechTarget's Enterprise Strategy Group. IT and business decision-makers responding to the multiple-response survey are involved in their company's GenAI initiatives, which range from proof of concept to production. By contrast, 42% of respondents are using on-premises data centers and 24% on-premises edge computing. "Most companies, the bigger they are, use a 'blend' because they have their own data centers," said Mark Beccue, principal analyst at Enterprise Strategy Group. "They have their own cloud. They kind of hedge their bets."
Cloud AI benefits are many and varied, including the ability to access services from any location, pay only for services as needed, experiment with projects cost-effectively, build LLMs on AI infrastructure and collaborate with experienced cloud AI teams.
Public cloud providers offer a faster and sometimes cost-effective way to try out proof-of-concept projects using the latest AI services with access to prebuilt models and tools. For many enterprises, these resources and collaboration with highly trained AI cloud computing teams offer valuable partnerships not available on-premises. All three major cloud providers, Beccue noted, are at the top of the AI field and have been investing in AI for a long time.
Updating on-premises infrastructure to provide the computing power (GPUs), storage capacity and network bandwidth needed to support the data pipeline and training of GenAI models is a major investment. Businesses can benefit from the cost savings of AI in cloud computing when they only pay for what they use. "Companies are picking public cloud right now because it is pay as you go," Beccue said. "When you are testing the waters, this is a great way to do that. You can spin things up pretty quickly."
Cloud computing offers a way to test, build and scale LLMs on AI infrastructure. Project managers can adjust their servers and databases to meet demand by scaling the systems up or down. This flexibility can help businesses handle peak usage, high volumes of data and unexpected events.
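The scale-up/scale-down behavior described above can be sketched as a simple capacity policy. This is a minimal illustration with hypothetical utilization thresholds, not any specific provider's autoscaling API:

```python
# Minimal autoscaling policy sketch: adjust replica count based on
# average utilization. Threshold values are illustrative assumptions.

def desired_replicas(current: int, avg_utilization: float,
                     low: float = 0.3, high: float = 0.75,
                     min_r: int = 1, max_r: int = 10) -> int:
    """Return the replica count after one scaling decision."""
    if avg_utilization > high:          # overloaded: add capacity
        return min(current + 1, max_r)
    if avg_utilization < low:           # underused: shed capacity
        return max(current - 1, min_r)
    return current                      # within band: hold steady

# Peak usage pushes the system up; quiet periods scale it back down.
print(desired_replicas(3, 0.9))   # 4
print(desired_replicas(3, 0.1))   # 2
print(desired_replicas(3, 0.5))   # 3
```

Because the business pays per replica-hour under a pay-as-you-go model, the scale-down branch is what turns elasticity into cost savings.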
Companies that have adopted distributed and remote work environments can access their AI services and technologies from anywhere if their teams have internet connectivity.
Public cloud providers offer a multi-tenancy model with shared resources. A major concern for many companies is where the data for the AI technologies resides. Heavily regulated industries can face strict data privacy, security and compliance requirements.
Companies comply with data privacy regulations (HIPAA and GDPR) and privacy standards (ISO 31700, ISO 29100, ISO 27701, FIPS 140-3 and NIST Privacy Framework) or risk penalties. AI projects increase the risks because massive amounts of real data dictate the behavior of ML training models. Model developers need to ensure that data is treated with fairness and transparency, said Rob van der Veer, senior principal expert at software assurance platform provider Software Improvement Group, co-editor of the EU's AI Act security standard and advisor to ISO/IEC and the Open Worldwide Application Security Project. The data privacy requirements in the EU's GDPR are not specific to AI. Meanwhile, California, one of the U.S. states with data privacy regulations (CCPA and CPRA), is taking the lead in passing GenAI-related bills. Data residency and geolocation are also concerns with cloud AI, especially in Brazil, Singapore and the EU. Companies can set boundaries around data location in their SLAs.
Public cloud providers offer security and compliance frameworks that can aid anomaly detection in real time. Many companies adopt cloud data storage running on CSPs, but their sensitive data remains on-premises to meet information security and compliance requirements. Cloud data storage and analytics platform maker Snowflake, which processes proprietary and sensitive data of many Forbes Global 2000 companies on AWS, Azure and Google Cloud, was breached when, according to the company, a user logged in and failed to use multifactor authentication. Companies that use public cloud AI services should examine the CSP's monitoring and logging tools, employ data encryption at rest and in transit, require strict identity and access management controls, and perform regular audits for compliance.
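The controls listed above -- encryption at rest and in transit, strict access management and regular audits -- lend themselves to automated checks. The sketch below shows the idea with hypothetical resource records; a real audit would pull configuration from the CSP's own APIs:

```python
# Sketch of a configuration audit: flag cloud resources that lack
# encryption at rest, TLS-only access, or MFA on accessing identities.
# The resource records and field names here are illustrative.

RESOURCES = [
    {"name": "customer-db", "encrypted_at_rest": True,
     "tls_only": True, "mfa_required": True},
    {"name": "training-data-bucket", "encrypted_at_rest": False,
     "tls_only": True, "mfa_required": False},
]

def audit(resources):
    """Return (resource, failed_check) pairs for every control gap."""
    findings = []
    for r in resources:
        for check in ("encrypted_at_rest", "tls_only", "mfa_required"):
            if not r[check]:
                findings.append((r["name"], check))
    return findings

for name, check in audit(RESOURCES):
    print(f"{name}: failed {check}")
```

Running such a check on a schedule, and feeding the findings into the CSP's monitoring and logging tools, is one way to catch gaps like the missing multifactor authentication cited in the Snowflake incident before an attacker does.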
For many companies, the cost of cloud AI services is difficult to gauge. Shadow AI, similar to shadow IT, is another concern. The FinOps Open Cost and Usage Specification (FOCUS 1.0), released in June, aims to normalize cloud billing for IaaS by using a common taxonomy and metrics for cost and usage data sets. AWS, Google, Microsoft and Oracle contributed to the open source project, which is hosted on GitHub. FOCUS can be extended to SaaS.
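The normalization idea behind FOCUS can be sketched briefly: map each provider's billing fields onto one common schema so cost data becomes comparable. The field names below are illustrative assumptions, not the specification's actual column definitions:

```python
# Sketch of FOCUS-style billing normalization: translate
# provider-specific export fields into one shared schema.
# FIELD_MAP entries are assumptions for illustration only.

FIELD_MAP = {
    "aws":   {"cost": "UnblendedCost", "service": "ProductName"},
    "azure": {"cost": "CostInBillingCurrency", "service": "MeterCategory"},
}

def normalize(provider: str, record: dict) -> dict:
    """Map one raw billing record into a common cost schema."""
    m = FIELD_MAP[provider]
    return {"provider": provider,
            "billed_cost": float(record[m["cost"]]),
            "service_name": record[m["service"]]}

rows = [
    normalize("aws", {"UnblendedCost": "12.40",
                      "ProductName": "Amazon SageMaker"}),
    normalize("azure", {"CostInBillingCurrency": "9.75",
                        "MeterCategory": "Azure OpenAI"}),
]
total = sum(r["billed_cost"] for r in rows)
print(f"Total GenAI spend: {total:.2f}")  # Total GenAI spend: 22.15
```

Once every provider's data lands in the same shape, spend across AWS, Azure and Google Cloud can be summed and compared in one report, which is the visibility problem shadow AI makes worse.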
Public cloud providers might favor integration with their own services instead of third-party applications, which could lead to vendor lock-in. Data integration remains a major challenge for AI deployments. AI models require massive amounts of structured and unstructured data often coming from fragmented systems whose protocols and APIs might need updating to facilitate data exchange. Many organizations rely on legacy IT systems that may not be compatible with modern AI technologies and standards.
Finding personnel with cloud expertise is challenging enough -- not to mention data scientists, AI and ML engineers, or the nearly impossible-to-find prompt engineer, a role that all too often gets added to the duties of another member of the AI team. Google Cloud offers a prompt-grounding tool designed to address prompting tasks. Project managers need to ensure that AI and software teams follow best practices. Data engineers might not know about standard software development practices, such as versioning, unit testing and keeping documentation up to date -- even when experimenting with AI. Software engineering teams tasked with AI model alignment -- ensuring the AI system matches the designer's goals and is ethically sound -- might lack AI expertise.
Commercial data sets that augment the data pipeline used to train and fine-tune the LLMs needed for AI can help companies get started. But some data sets may offer limited information, which can lead to bias or diversity issues. Improper handling of private, confidential and copyrighted data used to train AI models can result in compliance violations and lawsuits.
Cloud AI business applications, such as the following, permeate numerous industries, including retail, customer service, financial services, product development, manufacturing and IT automation:
Many companies with proof-of-concept AI projects fail to make it to the production application phase, partly because no one has clearly identified the business objectives.
"What business problem is the organization trying to solve?" Beccue offered. A cost-benefit analysis could determine if cloud AI is the answer. "The business case will give you the answer of where you want to go," he explained. Companies should also look at their core capabilities and determine if they should build, buy or partner with other companies.
"There has been a proliferation of companies that all claim that they have generative AI capabilities to offer to organizations," said Eric Buesing, partner at McKinsey & Company. Business leaders can be overwhelmed by the evangelism. "I think the hyperscalers are the most advantaged," he reasoned, pointing to Microsoft, AWS, Google and Salesforce, among others. "These organizations are investing a tremendous amount. They already have the contact center technology infrastructure, in many cases, in many organizations. They are betting heavily that they will unlock use cases not only for agents, but for the end customer."
As more organizations invest in hybrid and multi-cloud architectures, a host of cloud AI startups offer GPU infrastructure, storage and related services. IBM Cloud and Oracle Cloud Infrastructure are also in the mix for public and hybrid clouds, especially for companies that want industry-specific AI tools or systems integration. "Public cloud," Beccue said, "is set up well as a key piece of AI going forward."
Kathleen Richards is a freelance journalist and industry veteran. She's a former features editor for TechTarget's Information Security magazine.