ChatGPT turned out to be a revolution that changed, perhaps forever, the way we find information and solve problems. Apple does not intend to sit out this revolution and is preparing its own assistant powered by generative artificial intelligence.
Hardly anyone expected the repercussions of the premiere of ChatGPT, which will mark exactly two years in a few days. NVIDIA has moved into the vanguard of the world's most valuable companies, and one firm after another is competing to present ever more advanced models. OpenAI still appears to lead this race, but hot on its heels are Google with its Gemini model and Anthropic with its Claude AI model. Apple seems to be lagging a bit behind, though probably not for long.
LLM Siri in preparation
Bloomberg's reliable Mark Gurman reports that Apple has decided to significantly expand the capabilities of the Siri assistant. The Cupertino giant already offers its Apple Intelligence service, which is based on generative artificial intelligence, but it intends to go a step further. Siri handles simple tasks well, but given the development of Google's Gemini and Amazon's Alexa, it can't afford to be left behind. That's why Apple has started working on a new version of the assistant, currently called LLM Siri (Large Language Model Siri). It will be an assistant you can talk to in natural language and ask for information, while still being able to open the garage door or turn on the lights in the living room.
The LLM Siri assistant is scheduled to be unveiled in spring 2026 and will appear on both Apple smartphones and speakers. We should learn more details next year, probably during the WWDC conference. It also looks like Apple intends to build its own model rather than rely on other companies; Apple Intelligence is currently based largely on OpenAI's solutions. The company's representatives have emphasized their commitment to privacy for many years, so it is possible that LLM Siri will be a small model that can run locally, using the NPU built into the processor. One thing is certain: if another deep-pocketed giant joins the race, the boom in processors for AI model training has probably not yet reached its peak.
Source: antyweb.pl