Technology
14 June 2025

Apple Delays Siri Launch While Unveiling AI Revolution

At WWDC 2025, Apple spotlights its AI-driven Apple Intelligence across its devices, but the overhauled Siri assistant has slipped to 2026 amid technical and organizational challenges.

Apple's Worldwide Developers Conference 2025 (WWDC25) wrapped up with a bang, unveiling a slew of fresh operating systems including iOS 26, iPadOS 26, and macOS 26. Yet, conspicuously absent from the spotlight was the much-anticipated new version of Siri, Apple’s voice assistant. According to recent reports from Bloomberg, the fully AI-enhanced Siri won’t debut until iOS 26.4, expected to launch around May 2026, marking a significant delay from initial projections.

Originally, Apple planned to introduce this revamped Siri alongside the iPhone 16 in late 2024. However, the timeline has since been pushed back multiple times—from late 2024 to early 2025, and now to spring 2026. Craig Federighi, Apple’s Senior Vice President of Software Engineering, explained at WWDC 2025 that the delay stems from the company’s commitment to achieving the "highest quality standards," necessitating more development time.

The upgrade promises a Siri that is far more intelligent and context-aware. The new assistant will better understand what users are doing on their screens and grasp their personal context, enabling it to carry out complex, multi-step commands smoothly. This is made possible by a reimagined backend architecture, referred to as Siri LLM, designed to support the enhanced capabilities. In addition, deeper integration with the App Intents framework is meant to let Siri control apps more precisely than before.
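In developer terms, App Intents is the Swift framework through which an app declares actions that Siri and the system can trigger. The sketch below shows the general shape of such a declaration; the note-archiving intent and its parameter are hypothetical examples rather than anything from a shipping app.

```swift
import AppIntents

// Hypothetical intent exposing an "archive note" action to Siri and Shortcuts.
struct ArchiveNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "Archive Note"

    // Siri could fill this parameter from conversation or on-screen context.
    @Parameter(title: "Note Title")
    var noteTitle: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Real app logic would look up and archive the matching note here.
        return .result(dialog: "Archived \(noteTitle).")
    }
}
```

Once an app exposes actions this way, the system can chain them together, which is the plumbing the multi-step commands described above would rely on.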

Yet the hurdles Apple faced weren’t just technical; internal conflicts also played a significant role in the delay. John Giannandrea, once a key leader of Apple’s AI efforts, was reportedly stripped of responsibility for Siri after struggles with a hybrid system that combined old and new technologies. That system failed in roughly one out of every three attempts, forcing Apple to scrap it and rebuild Siri from scratch. The project is now overseen by Craig Federighi and Mike Rockwell, the latter known for leading the Vision Pro team, as they steer the assistant through redevelopment.

The ripple effects of Siri’s postponement extend beyond software. Other Apple hardware initiatives, such as smart home devices that rely on Siri for central control, have been put on indefinite hold. Meanwhile, Apple continues to lean on external AI providers like OpenAI and Google for certain features, including image analysis, which is slated to play a role in the launch of new smart glasses next year.

Apple’s vision for Siri is ambitious: transforming it from a mere voice assistant into a constant “Co-pilot” for users. This new iteration aims to comprehend the user’s environment, engage in human-like conversations, and may even introduce a new AI chat application called Knowledge, which would retrieve information from the web similarly to ChatGPT. While promising, these features remain in experimental stages, and their arrival depends on overcoming ongoing technical challenges.

Amid this backdrop, WWDC 2025 showcased Apple's broader AI strategy under the banner of Apple Intelligence. Tim Cook, Apple’s CEO, emphasized that Apple Intelligence is the "core power" behind all Apple products, integrating deeply with devices from iPhones and iPads to Macs, Vision Pro, and Apple Watches. Unlike many AI systems that rely heavily on cloud computing, Apple Intelligence is designed to operate primarily on-device, enhancing speed, privacy, and functionality even without internet access.

Developers now have access to the Foundation Models Framework, enabling them to embed large language models (LLMs) into their apps. For instance, Kahoot can generate quizzes from user notes, and AllTrails can offer personalized hiking route recommendations. Apple also prioritizes privacy by ensuring data processing stays on-device or, when cloud use is necessary, through Private Cloud Compute, a system that even Apple cannot access.
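As a rough illustration of what that embedding could look like, the snippet below asks the on-device model to draft quiz questions from a user's notes, loosely mirroring the Kahoot example. The helper function is hypothetical, and the type and method names follow Apple's WWDC25 material rather than a verified shipping API.

```swift
import FoundationModels

// Hypothetical helper that asks the on-device model to draft quiz questions
// from a user's notes; exact signatures in the released SDK may differ.
func draftQuizQuestions(from notes: String) async throws -> String {
    let session = LanguageModelSession()  // talks to the on-device model
    let prompt = """
    Write three short multiple-choice quiz questions based on these notes:
    \(notes)
    """
    let response = try await session.respond(to: prompt)
    return response.content
}
```

Because a request like this never has to leave the device, it would still work offline and keeps the user's notes local, which is the privacy argument Apple makes for on-device inference.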

One of the most visually striking announcements was the introduction of the “Liquid Glass” design language for iOS 26, the most significant UI overhaul since iOS 7. This new aesthetic features fluid, translucent elements that respond dynamically to touch, gaze, and context, unifying the look and feel across all Apple platforms, including iPadOS, macOS, watchOS, tvOS, and visionOS. Dynamic tab bars and buttons adapt to real-time usage, app icons gain layered, light-reactive designs, and wallpapers can now generate 3D spatial scenes from user photos.

The Camera and Photos apps received major usability upgrades. Frequent modes like Photo and Video are now more accessible, with seamless swiping between Cinematic and Portrait modes. Users can access advanced settings such as 4K resolution and lens selection via intuitive gestures. Photos are organized into “Library” and “Collections” for easier navigation, and a new Spatial Photo View enables 3D photo editing.

Apple also revamped the Phone app’s interface for the first time in years, focusing on safety and convenience. It consolidates Favorites, Recents, and Voicemail into a single screen and introduces features like Hold Assist, which plays music while waiting on hold and alerts users when someone picks up. Call Screening intelligently answers unknown numbers, conducting a brief screening before deciding whether to ring the user.

Messages received enhancements that make chatting smarter and more secure. Users can customize group chat backgrounds, create polls with AI suggestions, transfer money via Apple Cash within groups, and see typing indicators for group members. Spam detection and screening of new senders help filter unwanted messages.

Live Translation capabilities now span Messages, Phone, and FaceTime, offering real-time text and voice translation. For example, users can speak with friends abroad without language barriers, with FaceTime providing live captions to facilitate understanding.

Apple unveiled creative tools like Image Playground, which allows users to create “Genmoji” by blending emojis or combining emojis with descriptive text. Users can also generate emojis from photos, altering hairstyles or moods, and produce images in ChatGPT-inspired styles such as sketches or paintings.

Maps became smarter by learning users’ frequent routes and suggesting preferred paths while providing real-time traffic alerts. The app remembers visited places and allows easy sharing, all while keeping that history end-to-end encrypted so that even Apple cannot access it.

Visual Intelligence lets users search for products or objects in photos or screenshots with Visual Look Up. They can also ask ChatGPT-style questions about images, like identifying musical instruments in a picture.

Apple Wallet and Apple Pay are getting upgrades, including support for Digital IDs linked to U.S. passports for TSA travel, automatic order summaries pulled from emails, and new features for tracking loyalty points and installment plans. A redesigned boarding pass interface includes airport maps and luggage tracking through Find My.

watchOS 26 introduces an AI-powered personal trainer called Workout Buddy, which offers encouragement during exercise via Apple Watch and AirPods. New gestures like “Wrist Flick” allow users to dismiss notifications or answer calls without touching the screen.

macOS Tahoe debuts with a refreshed interface and enhanced Apple Intelligence. Spotlight search is faster and more comprehensive, and the Phone app is now available on Mac with full call screening, hold assist, and translation features. Apple Intelligence also helps generate new shortcuts, such as automatically creating document taglines.

iPadOS 26 gets a major multitasking overhaul, allowing users to resize and position windows much as on a Mac. It adds a menu bar and an Exposé-style view, supports folders in the Dock, and brings a Preview app for PDFs. Audio and video recording also improve, with studio-quality voice capture using AirPods and local capture for high-quality podcast and interview recordings.

Despite these exciting innovations, Apple’s stock took a hit shortly after the keynote, dropping over 2.5%—a loss of roughly $75 billion in market value—largely attributed to concerns over the delayed Siri update. Craig Federighi reassured investors and users, stating, "We are still developing features to make Siri a more personal assistant, but this development requires additional time to meet our high-quality standards. We hope to share more progress next year."

Looking ahead, iOS 26 will support iPhone 11 and newer models, while Apple Intelligence features will require devices powered by the A17 Pro chip or newer, such as the iPhone 16 series and select iPads and Macs with M1 chips or later.

Tim Cook wrapped up the event by reiterating that Apple Intelligence is not just a feature but a foundational power that will drive all Apple products for years to come, seamlessly connecting experiences across platforms with a fresh, unified design. With workshops and engineering sessions continuing throughout the week, developers and enthusiasts alike eagerly await the innovations still to come.