Apple’s new AI system can ‘see’ and could be a game-changer for Siri

Apple is diving head first into artificial intelligence, and according to reports citing the Apple researchers behind these new systems, one in particular is designed to take on OpenAI’s GPT products.
Reports indicate that Apple is developing ReALM, which stands for “Reference Resolution As Language Modeling,” a system designed to make interacting with AI much more natural. ReALM can also “see” on-screen content, and the researchers behind the project say it outperforms OpenAI’s GPT-4, the underlying technology powering ChatGPT, at determining context and interpreting linguistic expressions.

Additionally, the researchers behind the project believe ReALM is “an ideal choice” for a context-deciphering system that could exist “on-device without compromising on performance.” So, how would it work? Imagine asking Siri to show you a list of grocery stores near your location. Once Siri has brought up that list, you could then say, “Call the bottom one.” With ReALM, Siri would be able to identify the bottom option on screen and place the call. Apple’s researchers say the system outperformed GPT-4 at exactly this kind of context deciphering.
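The kind of on-screen reference resolution described above can be sketched in a toy form. The sketch below is purely illustrative and is not Apple’s implementation: the entity structure, the positional matching rules, and all names are invented assumptions to show the basic idea of mapping a spoken reference like “the bottom one” onto an item visible on screen.

```python
# Toy illustration of reference resolution against on-screen entities.
# NOT Apple's system: all structures, rules, and names here are invented.
from dataclasses import dataclass
from typing import Optional

@dataclass
class OnScreenEntity:
    label: str     # text shown on screen, e.g. a store name
    phone: str     # action payload, e.g. a phone number to call
    position: int  # vertical order on screen (0 = top)

def resolve_reference(utterance: str,
                      entities: list[OnScreenEntity]) -> Optional[OnScreenEntity]:
    """Resolve simple positional references like 'the top one' / 'the bottom one'."""
    text = utterance.lower()
    ordered = sorted(entities, key=lambda e: e.position)
    if "bottom" in text or "last" in text:
        return ordered[-1]
    if "top" in text or "first" in text:
        return ordered[0]
    # Fall back to matching an entity label mentioned verbatim.
    for entity in entities:
        if entity.label.lower() in text:
            return entity
    return None

# Hypothetical list of grocery stores Siri has just displayed.
screen = [
    OnScreenEntity("Fresh Mart", "555-0101", 0),
    OnScreenEntity("Corner Grocery", "555-0102", 1),
    OnScreenEntity("Daily Greens", "555-0103", 2),
]

target = resolve_reference("Call the bottom one", screen)
print(target.label)  # Daily Greens
```

A real system would, of course, use a language model rather than hand-written rules, but the input/output contract is the same: on-screen items in, a resolved entity out.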

“Human speech typically contains ambiguous references such as ‘they’ or ‘that,’ whose meaning is obvious (to other humans) given the context,” the researchers wrote about ReALM’s abilities. “Being able to understand context, including references like these, is essential for a conversational assistant that aims to allow a user to naturally communicate their requirements to an agent, or to have a conversation with it.”
