Apple is working on a version of its Siri voice assistant that will use advanced AI powered by large language models (LLMs), Bloomberg has reported. The technology will allow users to perform specific app functions with their voices, such as opening documents and sending emails.
The new version of Siri will work only with Apple's own apps to start with. It won't arrive with iOS 18 but may follow as an update early next year, the report states.
The assistant will be able to analyze your phone’s activity and automatically enable Siri-controlled features. It’ll support “hundreds” of commands but will only be able to process one at a time at first, according to the article. Later, Siri will support multiple tasks in a single request.
At first, supported commands will include sending or deleting emails, opening a specific site in Apple News, emailing a web link or asking for an article summary. Once multiple commands are enabled, Siri will be able to summarize a recorded meeting and then text the summary to a colleague, all in one request. “Or, an iPhone could theoretically be asked to crop a picture and then email it to a friend,” Bloomberg’s Mark Gurman wrote.
It’s unclear which LLM Apple will use to power this version of Siri, but the company reportedly just reached a deal with OpenAI to integrate ChatGPT into iOS 18. At the same time, Apple may also be negotiating with Google to integrate Gemini AI into search on iPhones. Apple will reportedly handle many AI requests on-device, while using the cloud for more complex commands.
Apple is expected to focus on AI at its WWDC 2024 conference, which runs from June 10 to June 14. At that point, it may confirm the reported deal with OpenAI to integrate ChatGPT into iOS 18. We may also see AI-powered features like voice memo transcriptions and summaries, website recaps, automated message replies and more.