Apple scaled back its AI focus during this year’s Worldwide Developers Conference (WWDC) compared to last year’s keynote, choosing instead to spotlight updates to its operating systems and software. Alongside a new design aesthetic dubbed “Liquid Glass,” the company still made several noteworthy AI announcements, including new tools for image analysis and live translation.
One of the standout features is Visual Intelligence, Apple’s AI-based image analysis tool that gives users information about their surroundings: it can identify plants, pull up restaurant details, and recognize clothing. The feature now extends to content on the iPhone screen; users can, for example, run an image search on a social media post through Google or ask ChatGPT about it. Visual Intelligence can be opened from the Control Center or assigned to the Action button, starting with iOS 26, scheduled for release later this year.
Apple has also integrated ChatGPT into Image Playground, its AI-driven image generation app. Users can now create images in styles such as “anime,” “oil painting,” and “watercolor,” or send a prompt to ChatGPT to generate further images.
The company also unveiled Workout Buddy, an AI-powered coach that delivers encouraging commentary during workouts. Built on a text-to-speech model, it plays the role of a personal trainer, offering motivational cues at key moments in activities like running and summarizing performance metrics once a workout ends.
Apple also introduced a live translation feature for Messages, FaceTime, and phone calls. It translates text and speech in real time, displaying live captions during FaceTime calls and speaking translated dialogue aloud during phone conversations.
For users wary of unknown callers, two AI features have been introduced. The call screening function automatically answers calls from unfamiliar numbers in the background, informing users of the caller’s identity and purpose before they choose to answer. The hold assist feature detects hold music while users await a call center agent and allows them to continue using their phone until an agent is available.
In the Messages app, Apple has added a poll suggestion function, which uses Apple Intelligence to recommend polls based on the context of conversations—helping group chats make decisions, such as where to dine.
The Shortcuts app is gaining AI capabilities as well: users will soon be able to call on Apple’s AI models to build features like summarization into custom shortcuts. Spotlight gets more modest improvements, using Apple Intelligence to surface contextually relevant suggestions based on users’ activities.
Developers, meanwhile, get the new Foundation Models framework, which lets third-party apps tap Apple’s on-device AI models, even offline, to add AI capabilities of their own, potentially widening competition in the AI space.
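To make that concrete, here is a minimal Swift sketch of what a third-party summarization feature built on the framework could look like. It follows the session-based pattern Apple has shown publicly; the summarize(_:) helper and its prompt are illustrative assumptions, and exact signatures may differ in the shipping SDK.

```swift
import FoundationModels

// Sketch of calling Apple's on-device foundation model from a third-party
// app. The summarize(_:) helper and its prompt are illustrative, not part
// of the framework itself.
func summarize(_ text: String) async throws -> String {
    // A session represents an exchange with the on-device model; the
    // instructions act roughly like a system prompt.
    let session = LanguageModelSession(
        instructions: "Summarize the user's text in two sentences."
    )
    // The request is handled on device, so it works without a network
    // connection and without sending user data to a server.
    let response = try await session.respond(to: text)
    return response.content
}
```

Because inference runs locally, a call like this would incur no per-request API fees, which helps explain why the framework could put pressure on cloud-based AI offerings.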
The notable disappointment concerned Siri. Despite hopes for major updates to Apple’s voice assistant, Craig Federighi, Apple’s SVP of Software Engineering, said more details will not be shared until next year, leaving many to wonder about Apple’s direction in the fast-moving market for AI-powered assistants.