Welcome back! OpenAI and Amazon last week said they were developing a new cloud service for a new generation of AI agents, but their description of the service generated confusion among AI buyers. Let's clear it up.

The new AI cloud service aims to help Amazon Web Services customers develop custom AI agents, powered by OpenAI technology, to automate business tasks. AWS would sell other OpenAI software to manage such agents.

The service would run on AWS as a "stateful runtime environment," which is a lot of jargon that essentially means the agents would remember specific details about a customer, such as information about their business, or where their last AI chatbot conversation left off. That differs from the way AI largely works today, which responds to questions or requests in isolation. (Chatbots like ChatGPT and Gemini remember things about their customers but tap a variety of generic models to answer questions, rather than models tailored to the customers.)

A "stateful" AI service, which OpenAI said would launch in the next few months, could be particularly useful for upcoming AI agents that require a good memory of actions they've previously taken as well as up-to-date information about the customer. Such agents might track and audit customer finances or monitor websites to fix outages.

The reason this is significant is that, thanks to its early investment in OpenAI, Microsoft has exclusive rights to sell "stateless," or generic, versions of OpenAI models to cloud customers. The new Amazon-OpenAI product seems to partly get around those rights by selling a service for businesses to develop AI agents, rather than selling access to the raw models themselves the way Microsoft does. Depending on the task, a stateful AI service may need to access the stateless models from time to time.
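To make the stateless/stateful distinction concrete, here is a minimal, purely illustrative Python sketch. The class names and data are hypothetical (this is not the actual OpenAI or AWS API); it only shows the structural difference between a call that stands alone and a runtime that carries customer context between requests.

```python
# Hypothetical sketch, not a real OpenAI/AWS API. A stateless call treats
# every request independently; a stateful runtime retains customer details
# and conversation history across requests.

class StatelessModel:
    """Each call stands alone; nothing is remembered between requests."""
    def answer(self, prompt: str) -> str:
        # No access to prior turns or customer-specific data.
        return f"generic answer to: {prompt}"

class StatefulAgent:
    """Keeps customer details and conversation history across calls."""
    def __init__(self, customer_profile: dict):
        self.profile = customer_profile   # e.g. business-specific facts
        self.history: list[str] = []      # where the last conversation left off

    def answer(self, prompt: str) -> str:
        self.history.append(prompt)
        context = f"{self.profile['company']}, turn {len(self.history)}"
        return f"answer to '{prompt}' using context ({context})"

stateless = StatelessModel()
agent = StatefulAgent({"company": "Acme Corp"})  # fictional customer

# The stateless model produces the same kind of reply no matter who asks.
print(stateless.answer("audit our Q3 invoices"))

# The stateful agent accumulates context with each request,
# so later answers can reference earlier turns.
print(agent.answer("audit our Q3 invoices"))
print(agent.answer("what did I ask last time?"))
```

The design point is simply that the stateful agent carries per-customer memory in the runtime itself, which is why the article notes it suits agents that must remember prior actions, such as finance auditing or outage monitoring.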
That means customers using the AWS product could draw from the OpenAI models hosted on Microsoft's cloud, in which case Microsoft would generate revenue from the process, the companies implied in a statement. But in most cases, customers of the stateful AWS product will be able to rely entirely on customized versions of open source OpenAI models that run on AWS without needing the Microsoft-hosted versions, according to an Amazon spokesperson.

The new OpenAI-AWS service won't be for everyone. It's geared to companies that work closely with OpenAI, including those relying on the AI firm's consultants, which it calls "forward deployed engineers," to develop custom versions of models that they would run as part of a product called Frontier, a person who was briefed about the details said. (We covered Frontier and its implications here, and OpenAI's FDE hiring and custom-model work here.)

In other words, it remains to be seen whether the Amazon product will gain steam with customers in the way Microsoft's sales of OpenAI's raw models have. For the time being, a majority of AI app developers are more likely to pay for stateless versions of OpenAI's models because they're much cheaper than stateful versions for powering simple applications like generalized chatbots that are used by a large number of people at the same time, according to an OpenAI employee with knowledge of its business.

Using a stateful model is more expensive and more appropriate for specialized uses, such as a company's customer support chatbot, an internal IT chatbot, and many AI-powered applications in fields like finance, gaming and commerce. Given how quickly AI agents are progressing in areas like coding and general white-collar tasks (for instance, Anthropic's Cowork and OpenClaw-based agents), it's possible that usage of stateful AI, including OpenAI's, could eventually overtake stateless AI.
So far, application programming interfaces for buying stateless models have been a blockbuster business for Anthropic, OpenAI and Microsoft, each of which is generating billions of dollars a year from such API sales. But OpenAI has projected its sales of AI agents and other non-API products will be larger than API sales by 2028, The Information has previously reported. Sri Muppidi contributed to this article.