Apple LLM
In developing its own large language model (LLM) technology, Apple appears more interested in on-device processing and practical applications than in public-facing chatbots, even as other companies race to ship exactly those.
Ajax, Apple's in-house LLM
According to AppleInsider, the company is focused on enhancing its existing apps and services and plans to roll out several artificial intelligence (AI) features in the upcoming iOS 18 release. The new features will be powered by Apple's large language model, codenamed Ajax, which aims to deliver useful capabilities while protecting users' privacy through on-device processing.
One anticipated iOS 18 feature is Siri's ability to review and summarise text messages in the Messages app, sparing users from reading through entire threads to grasp the main points of long conversations.
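To make that concrete, here is a purely illustrative sketch of what thread summarisation might look like under the hood: the thread is flattened into a single prompt for an on-device model. The build_prompt() and summarize() helpers are hypothetical stand-ins, not Apple's API.

```python
# A hypothetical sketch of on-device message-thread summarisation.
# Nothing here is Apple's actual implementation.
def build_prompt(thread: list[tuple[str, str]]) -> str:
    """Flatten a Messages thread into a single summarisation prompt."""
    lines = [f"{sender}: {text}" for sender, text in thread]
    return "Summarise the key points of this conversation:\n" + "\n".join(lines)

def summarize(thread: list[tuple[str, str]]) -> str:
    prompt = build_prompt(thread)
    # Stand-in for the local model call; a real system would generate text here.
    return f"[model summary of {len(thread)} messages]\n--- prompt sent ---\n{prompt}"

thread = [
    ("Ana", "Dinner Friday? Thinking 7pm at the new ramen place."),
    ("Ben", "Works for me, but I can't stay past 9."),
    ("Ana", "Fine, I'll book for 7. Bring Sam's gift!"),
]
print(summarize(thread))
```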
Spotlight and Safari will soon get AI upgrades
Beyond Messages, AppleInsider reports that Safari will gain an "Intelligent Search" feature that lets users generate summaries of web pages. The company is also working to make Spotlight Search more context-aware, using AI to surface relevant information in response to user queries.
Apple's Ajax LLM can reportedly produce simple responses entirely on-device, which means faster processing and stronger privacy protection. For more complex queries, however, the software may need to fall back on server-side processing, which could involve a partnership with Google or OpenAI.
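Here is a minimal sketch of that hybrid routing idea, assuming a hypothetical local model and a remote fallback. The heuristic and every function name are invented for illustration; Apple has not described how Ajax actually decides.

```python
# A hypothetical on-device/server routing sketch; none of these names
# come from Apple, and the complexity heuristic is deliberately crude.
def run_on_device(query: str) -> str:
    # Placeholder for a small local model (fast, private).
    return f"[on-device answer to: {query!r}]"

def run_on_server(query: str) -> str:
    # Placeholder for a larger server-side model (e.g., a partner API).
    return f"[server answer to: {query!r}]"

def answer(query: str, max_local_words: int = 8) -> str:
    """Route short, simple queries locally; fall back to the server otherwise."""
    is_simple = len(query.split()) <= max_local_words
    return run_on_device(query) if is_simple else run_on_server(query)

print(answer("What time is it in Tokyo?"))
print(answer("Write a detailed comparison of the last five iPhone camera systems."))
```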
According to AppleInsider, the company will give users control over their data by displaying privacy alerts before Ajax accesses information from Safari or Messages.
On a recent earnings call, Apple CEO Tim Cook expressed confidence in the company's AI initiatives and highlighted what sets them apart. According to Cook, Apple stands out in the AI space because of its "unwavering focus on privacy." Details on the exact AI features in iOS 18 remain scarce, but we expect to learn more at the Worldwide Developers Conference (WWDC) on June 10.
Some argue that Apple is lagging in AI. After ChatGPT took off in late 2022, most of Apple's rivals raced to ship competing products, while Apple, despite introducing a number of AI-inspired devices and talking up AI, appeared to be treading carefully.
But rumours suggest that Apple has simply been biding its time. According to recent reports, it has been working on Ajax for months and has spoken with Google and OpenAI about powering some of its AI capabilities.
Apple's publicly available research offers a window into its approach to AI. Drawing conclusions about products from research papers alone is unreliable, because the path from research to store shelves is winding and full of obstacles. Still, Apple is expected to unveil its AI technologies, and how they might work, at WWDC, its annual developer conference, in June.
Smaller, more efficient models
A better Siri appears to be on the way. Research from Apple and across the tech industry assumes that large language models will make virtual assistants smarter more or less automatically. For that to improve Siri, Apple needs models it can run quickly, and everywhere, which in practice means on the device itself.
According to Bloomberg, all of iOS 18's AI features will run offline, entirely on the device. Building a robust, multipurpose model is hard even with hundreds of state-of-the-art GPUs and a network of data centres; doing it within the constraints of a smartphone is harder still. So Apple has to innovate.
In "LLM in a flash: Efficient Large Language Model Inference with Limited Memory," Apple researchers devised a way to store model data in flash storage rather than RAM. Doing so let them run models up to twice the size of the available DRAM, with inference speedups of 4-5x on CPU and 20-25x on GPU compared with standard loading approaches. Models that exploit the cheapest and most plentiful storage on a device, they found, can run faster and more efficiently.
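The core trick can be shown in a few lines: keep the full weight matrix on disk, memory-map it, and read only the rows a given input actually needs. This is a toy sketch of the idea, not the paper's implementation; the sizes are invented and the cheap scoring step merely stands in for the paper's learned sparsity predictor.

```python
# A toy illustration of flash-offloaded inference: weights stay on disk,
# and only the "active" neuron rows are pulled into memory per input.
import numpy as np

HIDDEN, FFN = 512, 2048
rng = np.random.default_rng(0)

# Pretend this file is the model shipped on flash storage.
np.save("ffn_weights.npy", rng.standard_normal((FFN, HIDDEN)).astype(np.float32))

# mmap_mode="r" maps the file without copying it all into RAM.
w_flash = np.load("ffn_weights.npy", mmap_mode="r")

def sparse_ffn(x: np.ndarray, top_k: int = 128) -> np.ndarray:
    """Apply the layer using only the top_k most relevant neurons."""
    # Stand-in for the paper's low-rank predictor: score neurons cheaply...
    scores = w_flash[:, :32] @ x[:32]
    active = np.argsort(np.abs(scores))[-top_k:]
    # ...then read just those rows from flash.
    w_active = np.asarray(w_flash[active])
    return w_active @ x

out = sparse_ffn(rng.standard_normal(HIDDEN).astype(np.float32))
print(out.shape)  # (128,)
```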
Apple researchers also built EELBERT, a technique for compressing an LLM without substantially degrading it. Their compressed version of Google's BERT model was 15 times smaller, just 1.2 megabytes, while losing only 4 percent of its quality, though the approach did trade away some latency.
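EELBERT's headline idea is to compute embeddings on the fly rather than storing a giant lookup table. Below is a rough sketch of that style of compression using hashed character n-grams; the dimensions and hashing scheme are illustrative only, not the paper's exact method.

```python
# A sketch of lookup-free embeddings: build each token's vector from
# hashed character n-grams instead of a large stored embedding matrix.
import hashlib
import numpy as np

DIM = 64  # tiny embedding size, for the demo

def ngram_hash_embedding(token: str, n: int = 3) -> np.ndarray:
    """Build a token embedding from hashed character n-grams."""
    padded = f"#{token}#"
    grams = [padded[i:i + n] for i in range(len(padded) - n + 1)] or [padded]
    vec = np.zeros(DIM)
    for g in grams:
        # Deterministic hash -> seed for a tiny pseudo-random projection.
        seed = int(hashlib.md5(g.encode()).hexdigest(), 16) % 10_000
        vec += np.random.default_rng(seed).standard_normal(DIM)
    return vec / len(grams)

# Similar-looking tokens share n-grams, so their embeddings correlate.
a, b = ngram_hash_embedding("playing"), ngram_hash_embedding("played")
cos = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
print(f"cosine(playing, played) = {cos:.2f}")
```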
Broadly, Apple is trying to resolve a central tension in the model world: bigger models tend to be better and more useful, but they are also bulkier, slower, and more power-hungry. Like many others, the company is trying to have it all, balancing every one of those demands.
Apple's flagship AI product is Siri, a virtual assistant that handles reminders, information retrieval, and task completion. It's hardly surprising, then, that the main focus of Apple's AI research is what would happen if Siri were exceptionally intelligent.
One group of Apple engineers is working on a way to use Siri without a wake phrase at all: instead of listening for "Hey Siri" or "Siri," the device would simply detect whether you are speaking to it. The researchers note that the absence of a leading trigger phrase marking the start of a command makes this problem far harder than standard speech trigger detection. Perhaps for that reason, another group developed a more accurate wake-word detection system, and a separate paper trained a model to better understand the uncommon words that assistants often struggle with.
In both cases, the appeal of an LLM is that it can, in theory, process much more information much faster. In the wake-word paper, for example, the researchers found that feeding the model everything, unnecessary noise included, and letting it pick out what matters produced better results than trying to filter the noise away first.
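A toy version of that finding might look like the following: rather than trimming "irrelevant" audio before classification, hand the model the whole rolling window and let it decide. The scoring function here is a stand-in for a real detector, not Apple's.

```python
# A toy sketch of whole-window wake-word scoring: the model sees the
# entire rolling buffer, noise and all. score_wake_word() fakes the model.
from collections import deque
import numpy as np

WINDOW = 16000 * 2  # two seconds of 16 kHz audio
buffer = deque(maxlen=WINDOW)

def score_wake_word(window: np.ndarray) -> float:
    """Stand-in model: returns P(wake word) for a full audio window."""
    # A real system would run a small neural net; we fake a score from energy.
    return float(np.clip(np.abs(window).mean() * 5, 0, 1))

def on_audio_chunk(chunk: np.ndarray, threshold: float = 0.6) -> bool:
    """Append new samples and classify the entire window, unfiltered."""
    buffer.extend(chunk.tolist())
    if len(buffer) < WINDOW:
        return False
    return score_wake_word(np.fromiter(buffer, dtype=np.float32)) > threshold

rng = np.random.default_rng(1)
print(on_audio_chunk(rng.uniform(-0.05, 0.05, WINDOW)))  # quiet noise -> False
print(on_audio_chunk(rng.uniform(-0.5, 0.5, WINDOW)))    # louder, speech-like -> True
```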
Once Siri hears you, Apple is doing plenty of work to help it understand and respond. One paper introduced STEER (Semantic Turn Extension-Expansion Recognition), which improves back-and-forth conversation by recognising whether an utterance is a follow-up or a brand-new query. Another uses LLMs to interpret "ambiguous queries," working out what you mean no matter how you phrase it: "In order to solve problems more effectively, intelligent conversational agents may need to take the initiative to reduce their uncertainty by asking good questions proactively in uncertain situations." Yet another project used LLMs to make assistants' answers less verbose and easier to understand.
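The STEER idea, deciding whether an utterance extends the previous turn or starts a new request, can be sketched with a crude lexical heuristic standing in for the paper's trained model:

```python
# A heuristic stand-in for follow-up detection; the real system is a
# trained classifier, and these cue lists are invented for the demo.
FOLLOW_UP_CUES = ("what about", "how about", "and")
ANAPHORA = {"it", "that", "there", "them"}

def is_follow_up(utterance: str, prior_turn: str | None) -> bool:
    """Pronouns, ellipsis, or cue phrases after a prior turn => follow-up."""
    if prior_turn is None:
        return False
    lowered = utterance.lower()
    words = lowered.split()
    starts_with_cue = any(lowered.startswith(c + " ") for c in FOLLOW_UP_CUES)
    has_anaphora = any(w.strip("?.,!") in ANAPHORA for w in words)
    return starts_with_cue or has_anaphora or len(words) <= 3

print(is_follow_up("What's the weather in Paris?", None))        # False: new query
print(is_follow_up("What about tomorrow?", "weather in Paris"))  # True: follow-up
```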
Memoji, image editors, and health AI
When Apple talks publicly about AI, it tends to emphasise practical applications over raw technological power. And while Siri gets most of the attention, especially as it competes with devices like the Humane AI Pin and the Rabbit R1 and with Google's steady integration of Gemini into Android, Apple sees plenty of other uses for AI.
Health is a likely priority: LLMs could help you make sense of the vast amounts of biometric data your devices collect. Apple has been researching ways to track and interpret heart-rate data, to identify you through gait recognition via your headphones, and to gather and analyse motion data. It also collected data from 50 participants wearing multiple on-body sensors and released "the largest multi-device multi-location sensor-based human activity dataset."
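For a flavour of what this kind of sensor analysis involves, here is a small, self-contained sketch that derives toy gait features from a simulated accelerometer trace. The features and thresholds are illustrative, not taken from Apple's papers.

```python
# Derive simple gait features from a 1-D vertical-acceleration trace.
import numpy as np

def gait_features(accel: np.ndarray, hz: int = 100) -> dict:
    """Extract toy features: cadence and movement intensity."""
    accel = accel - accel.mean()                  # remove the gravity offset
    spectrum = np.abs(np.fft.rfft(accel))         # dominant frequency ~ cadence
    freqs = np.fft.rfftfreq(len(accel), d=1 / hz)
    cadence_hz = freqs[spectrum[1:].argmax() + 1]  # skip the DC bin
    return {
        "cadence_steps_per_min": float(cadence_hz * 60),
        "intensity_rms": float(np.sqrt((accel ** 2).mean())),
    }

# Simulate 10 s of walking at ~1.8 steps/second plus sensor noise.
t = np.arange(0, 10, 1 / 100)
walk = np.sin(2 * np.pi * 1.8 * t) + 0.1 * np.random.default_rng(2).standard_normal(t.size)
print(gait_features(walk))  # cadence comes out near 108 steps/min
```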
Apple also sees AI as a creative tool. For one study, researchers interviewed animators, designers, and engineers while building Keyframer, a tool that "enable[s] users to iteratively construct and refine generated designs." Instead of typing a prompt, getting an image, and typing another prompt to get another image, you start with a prompt and then get a toolkit for adjusting and fine-tuning parts of the result. This kind of back-and-forth creative process could show up anywhere from Memoji to Apple's professional artistic tools.
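Conceptually, the workflow looks something like the sketch below: one prompt yields a structured, editable design object, and refinement means tweaking its properties rather than re-prompting from scratch. The Design class and generate() call are invented for illustration.

```python
# A conceptual sketch of prompt-then-refine editing, not Keyframer's API.
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Design:
    shape: str
    color: str
    duration_s: float  # animation length

def generate(prompt: str) -> Design:
    """Stand-in for a model call that returns a structured design."""
    return Design(shape="circle", color="orange", duration_s=2.0)

base = generate("a bouncing ball loading animation")
# Iterative refinement: adjust one field at a time, keep everything else.
v2 = replace(base, color="blue")
v3 = replace(v2, duration_s=1.2)
print(base, v2, v3, sep="\n")
```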
In a separate study, Apple presents MGIE, a tool that lets you edit an image simply by describing the changes you want: "add some rocks," "make my face less weird," "make the sky more blue," and so on. Rather than working from brief but ambiguous guidance, the researchers found, MGIE derives an explicit, visually aware intent, which leads to more reasonable edits. Its early experiments were impressive, though not flawless.
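MGIE is described as a two-stage system: a multimodal LLM first turns a terse instruction into an explicit, visually grounded one, and an editing model then applies it. The sketch below mirrors that flow; both functions are hypothetical stand-ins, not the released models.

```python
# A rough, hypothetical sketch of MGIE's derive-intent-then-edit flow.
def derive_intent(instruction: str) -> str:
    """Stand-in for the multimodal LLM that makes vague instructions explicit."""
    expansions = {
        "make the sky more blue": (
            "increase the saturation and deepen the hue of the sky region "
            "while leaving the foreground untouched"
        ),
    }
    return expansions.get(instruction, instruction)

def apply_edit(image_path: str, explicit_instruction: str) -> str:
    """Stand-in for the diffusion-based editor; returns an output path."""
    print(f"editing {image_path}: {explicit_instruction}")
    return "edited.png"

apply_edit("photo.png", derive_intent("make the sky more blue"))
```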
Expect Apple to lean into this kind of work in the future, particularly in iOS. Some of it will be built into Apple's own apps, and some will be offered to third parties through APIs, much like the recent Journaling Suggestions feature. Apple has long taken pride in its hardware, particularly compared with Android phones; privacy-first, on-device AI could become a significant differentiator.
Ferret, Apple's multimodal LLM
For a look at Apple's biggest, most ambitious AI effort, meet Ferret: a multimodal large language model that can follow instructions, focus on a specific object you point it to, and understand the world around it. It was designed for the now-common AI use case of asking a device about your surroundings, but it may also be able to interpret your screen. In the Ferret paper, the researchers show how it could help you navigate apps, answer questions about App Store ratings, describe what you're looking at, and more. The implications for accessibility alone could change how you use your phone, smart glasses, or the Vision Pro.
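Schematically, a query of this kind bundles an image, a region of interest, and a question about that region. The request format and ask_about_region() below are hypothetical, sketched only to mirror the refer-and-ground behaviour the paper describes.

```python
# A hypothetical region-grounded query, in the spirit of Ferret.
from dataclasses import dataclass

@dataclass
class RegionQuery:
    image_path: str
    box: tuple[int, int, int, int]  # (x1, y1, x2, y2) pixel coordinates
    question: str

def ask_about_region(q: RegionQuery) -> str:
    """Stand-in for a multimodal model call."""
    x1, y1, x2, y2 = q.box
    return f"[answer about region ({x1},{y1})-({x2},{y2}) of {q.image_path}]"

# e.g. "What does the button I'm pointing at do?" on a screenshot.
print(ask_about_region(RegionQuery("screenshot.png", (120, 40, 260, 90),
                                   "What does this button do?")))
```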
We're getting ahead of ourselves here, but you can see how this might combine with Apple's other work. A Siri that understands what you want, paired with a device that can see and comprehend everything on your screen, is a phone that can effectively use itself: without needing deep integrations with every app, it could open the right apps and press the right buttons automatically.
Once more, this is research, and getting all of it working properly this spring would be a serious technical feat. If you've ever used a chatbot, you know how rough they can be. But big AI announcements are coming at WWDC; Tim Cook teased as much in February and effectively committed to it on this week's earnings call. What is evident is that Apple is pursuing AI in a big way, and the iPhone may change dramatically as a result. Perhaps you'll even be happy to use Siri! That would be really noteworthy.