New year, new dubdub. As always, it starts at 10 AM PDT and I’ll do my best to narrate it.
Stream start. A developer in Xcode. He spills his coffee chasing a bubble. They’re everywhere. He parkours onto the bubble.
GOOD MORNING! Tim at Apple Park’s park. Developers. 15th anniversary of the App Store. Continuing to innovate. Big announcements coming, new products.
John to talk about Mac. Apple Silicon really does great! New possibilities from AS. M2 Air best selling laptop. Now there’s a 15” Air. Kate to talk about it. Thinnest 15” laptop. 3.3 lb. MagSafe, 2 TB4, headphone. Comes in the same colours as 13”. 15.3”, 5mm bezel. 500 nits, 1B colours. 1080p camera, 3 mic array. Six speakers, force-cancelling woofers. M2 runs rings around the Intel predecessors. 18 hours battery life. Less compromise versus 15” laptops in PC space. Fanless.
Back to John. $1299; $1199 with edu discount. Orders today, available next week. 13” Air price drops to $1099; M1 Air remains at $999. Pro products. Mac Studio found a place in studios. Jennifer to talk about the refresh. M2 Max. 25% perf over M1 Max. After Effects perf is up 50%. M2 Ultra. Still two dies welded together. 24-core CPU in the Ultra, up 20% in perf. 76-core GPU 30% faster. ANE is 40% faster. 192 GB of RAM supported, 50% more than M1 Ultra. UMA means it has far more RAM than most dedicated GPUs. 50% perf improvement in Resolve, 3x in Octane X, 22 streams of 8K ProRes possible. Laps Intel machines, of course. 8K at 240 Hz over HDMI. Six Pro Display XDRs supported. Not done yet. Some people need internal PCI cards. Mac Pro with M2 Ultra. 3x improvement over the Intel model. 192 GB max RAM. With 6 video I/O cards, can ingest and encode into ProRes in realtime. 8 built-in TB ports. Six PCIe gen 4 slots. Tower and rack. Dual HDMI and dual 10Gb Ethernet. Mac Studio with M2 Max starts at $2000. Mac Pro at $6999. Order today, available next week. Transition complete.
Back to Tim. It’s the integration of software and hardware. Craig on software. iOS. iOS 17 focuses on sharing, communication, input, and new experiences. Updates to FaceTime/Phone/Messages. Update to the Phone app. Contact posters show how you’re represented on the incoming call screen and in the contact card. Vertical text in CJK languages. It’s like customizing your lock screen. Works with CallKit for third-party apps too. Live Voicemail for screening calls. You see a live transcription as they leave a voicemail. Transcriptions are done on-device. FaceTime updates. Video voicemail support in FaceTime. Kim on Messages. Improved search with filters. Catch-up arrow to jump to the first unseen message. Inline reply gesture. Transcription of audio messages. Location sharing updates inline. Check In to automatically inform people if you’re home or delayed. E2EE of this, obviously. iMessage apps are hidden behind a plus button. Improvements to stickers, now in a custom drawer. All emoji have stickers, and can be rotated/resized. Subjects lifted out of photos can also be turned into stickers, with a history of them available. Just pick from your photo library to automatically make a sticker. Animated stickers can be made from Live Photos. Stickers can be attached to messages. Effects on stickers, based on angle and lighting. Stickers are now system-wide, accessible anywhere like third-party apps, with Markup and emoji support.
Back to Craig. AirDrop updates. Share your phone number via AirDrop… called NameDrop. Just bring the phones close together, and it shows the contact poster. Supports Watch too. Can share things with it too. You can leave AirDrop range and it’ll switch to doing it over the Internet. SharePlay can be initiated too. Automatic for third-party apps too. Text input improvements. Autocorrect uses on-device ML. Now uses a transformer. Sentence-level autocorrect for grammar. Tap underlined words to revert back. The keyboard can be taught to say fuck. Easier to complete sentences. Predictions based on how you write. Dictation is transformer-based. All ANE-based.
New experiences. Adeeti on this. People like Memories in the Photos app. Journal is a new iOS app. Does… journalling. On-device ML to suggest inspiration from photos, location, music, workouts, etc. You control what to include and what to save. Suggestions available as an API for third-party apps. Some are reflection prompts. Example shown: a trip with music, photos, locations, and prompts like “what was the best place?”. Revisit previous entries. Notifications to remind you to write. On-device processing, E2EE, and the ability to lock down entries.
Back to Craig. StandBy when iPhone is on its side while charging. Glanceable view. With AOD on Pro iPhones, always available. Customizable faces by swiping up and down; swipe left and right for a photo frame. Widgets supported, those are swipeable too with smart stacks. Live Activities supported. Siri with large card views. Low light at night, with a red tone. Remembers the preferred view for each place you charge. “Hey Siri” is now just “Siri”, and you don’t need to say Siri’s name again for follow-ups. Offline maps. Improvements to People albums in the Photos app; can recognize the faces of cats and dogs.
Downstairs to talk about iPadOS. They brought Final Cut and Logic to iPad. iPadOS 17. Widgets and the lock screen. Widgets are interactive now. Obviously, an API is available. Customizable lock screen like iPhone. Live Photos can have additional frames synthesized for slo-mo effects. Some live wallpapers like astronomy are optimized for the larger screen. Widgets can be added on the side. Live Activities. Multiple timers. We truly live in an age of wonders. Health app on iPad. Optimized for the bigger screen. Syncs with other devices and third-party services. HealthKit on iPad.
Jenny on PDF improvements. ML identifies the fields in a PDF to fill in, even if the PDF didn’t already have form fields. Even works with PDFs scanned with the camera. Improvements to Notes integration with PDFs. Can put them inline into a note, and annotate. Multiple PDFs in the same note. Collaboration is also supported, with real-time annotation syncing.
Back to Craig. Stage Manager has more flexible window positioning and sizing. External camera support. New tools in Freeform, and follow-along to view someone else’s viewport. Has the new iOS features too.
macOS improvements. The crack marketing team will select a new name. macOS Sonoma. Most of those new iPhone/iPad features are also on Mac. Screensavers, including architecture and nature. Screensavers transition into the wallpaper. Widgets on Mac. Previously only in Notification Centre; now they can be dragged onto the desktop. Can be dragged anywhere. Widgets fade into the background to avoid being distracting, while still being glanceable. Blends into your desktop wallpaper. Widget gallery to select them, with Continuity to select widgets used on your iPhone. (RIP Apollo, it was featured in this. Screwed over by Reddit API changes.) Can use widgets from your iPhone even if they don’t have a Mac app.
Jeremy on gaming. Apple now has hardware that can play video games…. if only they had them. Metal 3 with upscaling. Game Mode to prioritize game processes for CPU and GPU. Lowers Bluetooth audio and input latency. Sample rate on controllers is doubled. Works with any game. Game Porting Toolkit to make it easier to port from e.g. Windows. Converting shader and graphics code made easier and faster. Another big game coming to Mac…. from Hideo Kojima. He loves Apple. New era for gaming on Mac. Death Stranding Director’s Cut on Mac, later this year. Actively bringing future titles to Mac. It’ll be available on the Mac App Store.
Craig, back to productivity. Video conferencing. When screen sharing, your face can get lost. Presenter overlay includes your camera with the screen. The small overlay puts you in a bubble. The large overlay makes you prominent by separating you from the background and putting you on top of the screen. ANE-powered. Reactions video effects. Confetti, and you can do it from any third-party app. Can be triggered by gestures on camera.
Beth on Safari. Fastest browser. WebKit. New web APIs, like web typography. Privacy: they had porn mode first. Private browsing now locks unused windows. Blocks trackers from loading, and removes them from URLs. Password and passkey sharing. E2EE over iCloud Keychain. Profiles. They separate cookies, history, bookmarks, tab groups, extensions, etc. Web apps on Mac. File -> Add to Dock. Web apps have simplified toolbars. Developers don’t need to do anything. Separate from Safari windows, with web APIs like notifications supported.
Back to Craig. That’s macOS Sonoma. Audio and Home. A three-necked guitar. Guitar solo. Ron to tell you more. AirPods are really popular. Adaptive Audio combines ANC and Transparency, based on your surroundings. Distracting noises reduced, important noises highlighted. Personalized Volume works from historical preferences and the current environment. Conversation Awareness reduces volume when you start speaking. Works with calls and people you’re speaking with IRL. Easy to mute and unmute. Improved device switching.
Anne on AirPlay. ML to learn AirPlay habits and suggest playback targets. Siri can start AirPlay sessions. AirPlay in hotels. Makes joining hotel WiFi easier. Apple Music and CarPlay updates. Adding to a car playlist can be a pain in the ass. SharePlay in CarPlay. Passengers get a suggestion to join your CarPlay session. Other devices in the car can control the stereo. Control Centre improvements in tvOS. Siri Remote can be located with Find My. Memories as a tvOS screensaver. FaceTime on tvOS. Uses Continuity Camera for audio/video input. Move calls to Apple TV. Keeps you in frame. Gesture effects are supported. SharePlay on Apple TV. Continuity Camera API for third-party conferencing apps on tvOS.
Kevin on watchOS. watchOS 10. App redesigns. Promo video. Widget stacks. Can use the crown to flip through them. ML to surface important stuff. Long press to add widgets to the stack. Widgets can also hold a complication. Widgets are also interactive. Works with interactive stuff. World Clock shows the time of day easily, and a map. Activity has corner icons to jump to screens, redesigned trophies, and more detailed views for each ring. Other third-party apps are taking advantage of this. Two new faces, including a Snoopy face.
Eric on cycling. Can now connect to Bluetooth bike sensors to add power and cadence to workout metrics, with a power metric view. Estimated functional threshold power (FTP). Zones based on that estimated FTP. Another mode turns cycling workouts into a Live Activity with just iPhone. Shows the workout view on the phone, with new additional screens for cycling. Compass and Maps improvements for hiking. Two waypoints automatically generated: one for the last place with cellular reception, and one for the last place with emergency-call reception. Elevation now visible for waypoints. Topographic maps with contour lines, hill shading, trailheads, etc. Search for trails and hikes. View difficulty, length, etc. of a trail. Workout API improvements, with tennis and golf as examples of high-frequency motion sampling. Analyze your swing! Create a workout regimen from a third-party app.
Sumbul on health. Mental health features. Identify your feelings. The Mindfulness app on Watch/iPhone/iPad can log mood and reasoning. Take standardized assessments for depression and anxiety to determine if you want to talk to someone. Resources like articles in the Health app. Vision and myopia: 30% are impacted, up to 50% in the future. Touch grass to reduce the risk of myopia. Watch can measure time in daylight using the ambient light sensor and show it in the Health app, with health sharing supported. Screen Distance uses the TrueDepth camera to determine how close you’re holding the device to your face, encouraging you to keep it further away.
Back to Kevin. Privacy is important, especially for health. On-device encryption, never shared without permission.
Back to Craig. Those are the OS updates. Lots of new APIs. Platforms State of the Union and 175 sessions to come. Developer betas today. Public beta next month. Fall for GA.
Back to Tim. Already a big day. One more thing…. Years in the making. AR. New platform and product. A headset? Lots of cameras. A crown? Fabric? Apple Vision Pro. AR headset. Interact with things as if they’re in physical space. Use hands, eyes, voice. Infinite canvas. Spatial audio. Blends digital content into your existing physical surroundings. “Spatial computing.”
Alan on Vision Pro. Home view kinda like a Watch. It looks like it’s there in your room. Physicality, dimension; casts shadows and reacts to light. Place apps anywhere in the physical environment, like moving a real object. Multitasking across the full room. Apps can expand fully into a space. Environments to extend a space, fully volumetric. Use the crown to control how immersed you are. A new input model marks the start of a new platform. No external input devices are used, just you. App icons and other things animate when you look at them. Flick to scroll. You shouldn’t have to think about it, and your body should be comfortable doing it. Voice control with dictation. Not being isolated from people around you. You can see them, and they can see your eyes. They’re visible through the headset, “EyeSight”. Shows when you’re focused on apps and when immersed in an experience. People nearby materialize into view even when you’re immersed.
Alessandra to demo the experience. Most apps available, including e.g. Safari. Small text is easy to read. Expands to show all tabs. Easily arrange apps, with layering in a full 3D environment. 3D objects from apps can be materialized. Still keep interacting with the physical world and people around you. Virtual keyboard. Works with normal keyboards and trackpads too. The Mac screen can be materialized into view and manipulated like anything else. Bring your headset with you when you travel. FaceTime with spatial audio. SharePlay too.
En on home usage of it. View photos on it. Panoramas can wrap around you. 3D camera. Just one button to capture a 3D video. EyeSight to make it obvious when recording. Private theatre. Dims surrounding light. Watch a movie in an experience to scale beyond a room’s size. Use it on a plane. 3D movies. 3D entertainment experiences in apps. Games on a screen as big as you want, with controller support. 2D screen games from Arcade should be available day one.
Bob of Disney on stage. New experiences possible on the headset. Watch things in themed places, with multiple overlays; sports as an example. Look at the ocean from inside the ocean. Characters interacting with your physical space. Hard to describe, but Disney does seem to be selling it even more than Apple.
Back to Tim. Required solving a lot of hard problems in design and tech. Richard on that. The front is a 3D-formed piece of glass. Polished to work as a lens for EyeSight and the cameras. Blends into a metal chassis with button and crown. Curves to wrap around the face. Tons of sensors and the SoC. Air-cooled. Textile strap at the back. Should adapt to all sorts of face shapes. Modular eye seals, straps, etc. Spatial audio pods on the strap. Ribbed strap. Dial to secure the device. Zeiss offers optical inserts for people who need vision-correcting glasses, without reducing accuracy. Requires being plugged into a wall or the external battery pack. Two hours per charge on the battery.
Mike on tech. Displays and sensors. Micro-OLED with extreme density, 64x the density of an iPhone panel. 23MP across two postage-stamp-sized panels. Three-element lenses to magnify. Wide colour and HDR. Text is dense enough to be readable even at small point sizes. Audio. Ambient spatial audio, matched to the room and its materials with audio ray tracing. IR, downward, side, LiDAR, all sorts of cameras…. uses IR for eye tracking instead of controllers. To process all of it: M2-based. R1 alongside it does real-time DSP from 12 cameras, five sensors, and six mics. Low latency: 12ms, 8x faster than blinking. Curved OLED on the front with a lenticular lens to make your eyes look right, an illusion of transparency. No camera looking at you. How do you appear to others in a video call? Enroll yourself with a camera to make a 3D model of you for video calls, matching what you’re doing. visionOS. New subsystems, like real-time compute, multi-app 3D, spatial audio, and foveated rendering.
Susan for developer platform. Port existing apps, or make new ones possible. Some early-access users. 3D beating heart for biology students. View CAD models with airflow. Preview a PLC control line. DJ mixing. Planetarium. Microsoft Office apps. Video conferencing apps. Use existing APIs. To make 3D content easier, Reality Composer Pro. Use existing apps with your hands. Unity will support it, including stuff like passthrough. App Store for it, for visionOS and compatible iOS apps. Platform State of the Union will cover DevEx.
Back to Mike. Privacy and security. Optic ID. Iris-based identification. Encrypted and doesn’t leave the device; uses the Secure Enclave. Works everywhere you’d use Touch/Face ID. Where you look stays private. Eye input is isolated from apps for privacy. The system gates camera access. That’s Vision Pro. Most advanced thing. 5000 patents worth of innovation. Replace monitors. New possibilities. $3499. Available early next year. US retail at first. WWDC and the Apple Store will help you figure out what the hell to use it for.
Back to Tim. Recap. Vision Pro ad. Done.
On lobste.rs “ML” means SML/Ocaml. See tag “ml” ( https://lobste.rs/t/ml )
In tagging, sure, but in context here it’s fine.
Also, the person you’re replying to has been around here a bit.
I wonder if they’ll fix the long-standing bug in Terminal.app when dealing with background colors and combining characters.
Terminal.app still does not even support 24-bit color. I wouldn’t hold your breath.
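Easy to check for yourself, too. A minimal sketch (assuming a POSIX-ish shell): emit a background ramp with 24-bit SGR `48;2;r;g;b` escapes. A truecolor terminal renders a smooth red-to-blue gradient; Terminal.app approximates it with its 256-color palette.

```shell
# Print a background-color ramp using 24-bit SGR sequences.
# Truecolor terminals show a smooth red-to-blue gradient;
# terminals without 24-bit support quantize or ignore it.
i=0
while [ "$i" -le 255 ]; do
  printf '\033[48;2;%d;0;%dm ' "$i" $((255 - i))
  i=$((i + 15))
done
printf '\033[0m\n'   # reset attributes
```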
Any software that lasts long enough eventually accumulates some number of bugs which critics will endlessly post “You haven’t fixed that yet?!??!?! And how long has it been open??!?!!???!!??!??!?!?!” comments about.
At this point I take such comments as a sign of successful mature software projects.
To be fair, this makes several widely-spoken languages unusable in Terminal.app…
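The bug class under discussion is easy to demo. A sketch, assuming a UTF-8 terminal: print a base letter plus a combining acute accent (U+0301) on a colored background; the two code points should occupy a single colored cell, and affected renderers break the background around the combined glyph.

```shell
# 'e' followed by combining acute accent U+0301 (UTF-8 bytes 0xCC 0x81)
# on a yellow background (SGR 43), then the precomposed U+00E9 for
# comparison. Both lines should render identically.
printf '\033[43me\314\201\033[0m  combining (e + U+0301)\n'
printf '\033[43m\303\251\033[0m  precomposed (U+00E9)\n'
```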
I never said these bugs don’t have significant user impact.
Just that any sufficiently long-lived/mature software project will have some number of these, to such an extent that I tend to increase my confidence in a piece of software if I see this type of complaint comment about it.
I wish they had gone more into apps as AR-physical objects and new ways to interact. Having tried working in VR before, the multiple screens (or a really big screen but ~ in space ~) were not that compelling to me. From the presentation I couldn’t tell if something like a timer app, one that creates an AR object you place somewhere and remembers where it is, would even work.
I think the floating rectangles UI is necessary to launch visionOS with a significant volume of existing apps plus the Mac desktop. VR/AR devices have generally been bottlenecked by limited software catalogs. This approach may bring in users first and then compel businesses to follow their customers to the platform.
I also think flat sheets work nicely with using your gaze as a pointing device. Even though you can drag objects in full 3D, it’s a less clumsy 2D interaction (pitch and yaw) that casts the ray to the pinch starting point. It helps that windows generally aren’t occluding one another.
A segment of the second talk, the Platforms State of the Union, explains the user interface fundamentals like windows and spaces.