At its WWDC 2024 developer conference on Monday, Apple showed off Safari Highlights, a browser feature that resembles Google’s AI Overviews.
If you witnessed the rough start for AI Overviews, you might be wondering why Apple would follow suit, or at least how similar Safari Highlights will be.
In brief, Safari Highlights uses machine learning to display relevant information at the top of certain web pages as you browse.
It’s part of Apple’s big new push into generative AI. At WWDC, the tech company announced a long-awaited AI framework called Apple Intelligence, which will power new features like text summaries and image personalization, along with a revamped version of Siri, which will use AI to become more conversational and personal.
There was a lot of anticipation leading up to WWDC about what exactly Apple had up its sleeve with generative AI. Though its Big Tech peers long ago planted their flags, Apple has characteristically taken its time. Now we have a better sense of how the company plans to catch up, which also includes a deal with OpenAI to bring the popular ChatGPT chatbot to the iPhone with iOS 18 later this year.
Here’s how Safari Highlights works. The feature displays directions, summaries and links to help you learn more about the people, music, movies and TV shows you’re researching via the browser. Based on the examples shown during Monday’s event, Highlights taps information from sources like Apple Music, Apple TV Plus and Wikipedia.
For instance, if you’re reading an article about Dua Lipa, Safari Highlights might pull an album for you from Apple Music, or if you’re reading a review of Palm Royale, it could feature the show’s Apple TV Plus page.
It wasn’t immediately clear if Apple is using other sources for Highlights or if other topics will spur callouts. Apple didn’t respond to a request for comment.
Apple announced a similar feature, Safari Summaries, for its Reader mode. There, Apple will remove “distractions” like ads from articles and add a table of contents and a summary of what you’re about to read. Videos will also automatically expand to fill the window.
By comparison, Google’s AI Overviews were supposed to usher in a new era of search with a custom Gemini model that better understands our intent and quickly addresses what we’re looking for, adding summaries at the top of search engine results pages — including tackling questions we didn’t even ask yet.
These summaries cover a much wider range of information and source responses from across the web. There’s also more of a personalized aspect here as Google’s AI systems learn from user behavior.
But Google quickly scaled back AI Overviews after the feature sometimes returned bizarre responses, such as suggesting that people eat rocks or put glue on pizza. The company refined the queries that yield AI Overviews, saying the feature would no longer appear for certain health-related queries or when it senses users are trying to trip it up.
Google’s Gemini and other chatbots are known to struggle with AI hallucinations, in which a generative AI model presents false or misleading information as fact. Hallucinations can arise from flawed training data, algorithmic errors or misinterpretations of context.
Editors’ note: CNET used an AI engine to help create several dozen stories, which are labeled accordingly. The note you’re reading is attached to articles that deal substantively with the topic of AI but are created entirely by our expert editors and writers. For more, see our AI policy.