With the aim of making artificial intelligence (AI) more accessible in everyday life and at work, OpenAI presented new developments at its developer event in San Francisco on Tuesday. The focus is on AI agents capable of taking on complex tasks and enabling human-like interactions.
"We want to make interacting with AI as easy as it is with another human," said Kevin Weil, Chief Product Officer at OpenAI. "These advanced agent-based systems are becoming a reality, and I think 2025 will be the year they reach the mainstream."
OpenAI announced the availability of its new "o1" model series at the event, which features enhanced logical reasoning and extended language capabilities. With these developments, the company aims to strengthen its position in the race for market leadership in AI technology. Particularly interesting for developers: the models can respond to voice commands in real time and hold live interactions similar to a telephone conversation.
According to the company, which is currently restructuring into a profit-oriented entity, this step is considered crucial for turning OpenAI's technical advances into future profits.
In parallel with these technological advances, OpenAI is conducting a financing round of 6.5 billion US dollars. At a targeted valuation of 150 billion US dollars, the company has negotiated in recent weeks with prominent investors including Microsoft, Nvidia, and SoftBank, as well as the venture capital firms Thrive Capital and Tiger Global, according to people familiar with the matter.
OpenAI is not the only company that is increasingly focusing on AI agents. Just last month, Microsoft, Salesforce, and Workday put AI agents at the center of their strategies. Google and Meta have also announced their intentions to integrate AI models into their products to leverage the new technology.
Despite numerous past attempts, only now, through the use of Large Language Models (LLMs), has the industry succeeded in making interactions smoother and significantly improving the level of understanding. These advances enable agents to perform tasks more precisely and autonomously.
Last year, OpenAI released an "Assistants API" intended to let developers create their own AI agents based on the company's technology. Due to the technical limitations of earlier models, however, the project encountered hurdles. With the latest advancements, the foundation has now been laid to significantly improve these assistant functions, Weil said.
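For readers unfamiliar with the Assistants API, the sketch below shows roughly what creating such an agent looks like with OpenAI's official Python SDK. The agent name, instructions, and model choice here are purely illustrative assumptions, not details from the event; the actual API call (commented out) requires an API key and network access.

```python
# Illustrative parameters for an agent built on the Assistants API.
# All values below are hypothetical examples, not OpenAI's own configuration.
assistant_params = {
    "name": "Errand Helper",  # hypothetical agent name
    "instructions": "Help the user run simple errands, such as placing orders by phone.",
    "model": "gpt-4o",        # illustrative model choice
}

# With the official SDK installed and OPENAI_API_KEY set, the call would
# look roughly like this:
#
#   from openai import OpenAI
#   client = OpenAI()
#   assistant = client.beta.assistants.create(**assistant_params)

print(assistant_params["name"])
```

The dictionary-plus-unpacking pattern keeps the configuration separate from the network call, which makes sketches like this easy to adapt.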
OpenAI demonstrated the technology's potential at the event with an example: a user asked an AI system to buy strawberries locally. The AI took the instructions and called the store to place the order, including detailed specifications such as quantity and desired price. The company emphasized that the technology always identifies itself as an AI, never as a human.
"If we do it right, we live in a world where we can spend more time on the things that really matter to us and a little less on constantly staring at our phones," said Weil.