AGENTS.MD – Meta Wearables Device Access Toolkit (DAT) iOS 0.2
Living spec + implementation guide for building iOS apps with the Meta Wearables Device Access Toolkit (DAT) targeting Ray-Ban Meta / Oakley Meta smart glasses.
0. Goals of this document
This file is meant to be the single source of truth for how we integrate the Meta Wearables Device Access Toolkit in our iOS codebase.
VISIONOS AGENT GUIDE
ROLE & PERSONA
You are a Senior visionOS Engineer and Spatial Computing Expert. You specialize in SwiftUI, RealityKit, and ARKit for Apple Vision Pro. Your code is optimized for the platform, adhering strictly to Apple's Human Interface Guidelines for spatial design.
1) What drew you to the Vision Pro? Is it your personal device or one that you got through work?
I’ve been a developer for more than 10 years in the augmented and virtual reality industry. The arrival of the Vision Pro was a major opportunity to explore what could be done with a device deeply integrated into the Apple ecosystem. I’m part of a collective called STUDIO•84, focused on creating high-quality experiences for the Vision Pro, and 100% of my income comes from Vision Pro projects.
So it’s mainly a work device, but I also enjoy using it to watch immersive videos. It’s essentially a high-end cinema you can bring anywhere. And as I often say, it’s the best software I’ve seen on this kind of device in years — Apple is really ahead on that front.
Enhanced Hand Tracking: The isTracked property on hand joints now has adjusted sensitivity and may return false under poor lighting conditions or when the joint is occluded. ARKit will still return a plausible transform for that joint. Check isTracked when you need an accurate transform; gating on it is not recommended for use cases that must keep working while some joints are occluded, such as custom hand-gesture implementations. (152100522)
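A minimal sketch of that check, assuming the visionOS ARKit hand-tracking types (HandAnchor, HandSkeleton); indexTipTransform is a hypothetical helper name:

```swift
import ARKit

// Sketch: only trust a joint transform when ARKit reports it as tracked,
// since an occluded or poorly lit joint may still carry a plausible transform.
func indexTipTransform(from anchor: HandAnchor) -> simd_float4x4? {
    guard let skeleton = anchor.handSkeleton else { return nil }
    let tip = skeleton.joint(.indexFingerTip)
    // Skip low-confidence data when transform accuracy matters.
    guard tip.isTracked else { return nil }
    // World-space pose: the anchor's transform composed with the joint's
    // transform relative to the hand anchor.
    return anchor.originFromAnchorTransform * tip.anchorFromJointTransform
}
```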
Leaks might be falsely reported by memory analysis tools when a target contains instances of types whose Obj-C properties are implemented in Swift with bridged types using @objc @implementation. This affects the leaks CLI tool, the Leaks instrument, and the Xcode memory graph debugger. (157798911)
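For reference, the pattern the note describes looks roughly like this. MyModel and title are hypothetical names, and the matching Obj-C @interface that @objc @implementation requires is shown only as a comment:

```swift
import Foundation

// Hypothetical illustration of the affected pattern. It assumes a matching
// Obj-C declaration compiled into the same target, e.g. in MyModel.h:
//
//   @interface MyModel : NSObject
//   @property (copy, nonnull) NSString *title;
//   @end
//
// The Swift side implements that class; `title` is an Obj-C property whose
// implementation uses the bridged type String, which is what can trip up
// the leaks tooling per the note above.
@objc @implementation
extension MyModel {
    var title: String = ""
}
```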
Enhanced Refusal Handling - Framework now supports programmatically detecting when and why the model refuses to respond with LanguageModelSession.GenerationError.refusal and model-generated explanations (156086748)
Content Completion Detection - New isComplete property in GeneratedContent to check if content was fully generated (156109416)
Raw Content Access - Access underlying weakly typed GeneratedContent via rawContent property on Response or ResponseStream (156351123)
Permissive Content Transformations - New Guardrails.permissiveContentTransformations mode for text-to-text tasks like summarization and rewrite (156721060)
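Taken together, these notes suggest a usage pattern like the following. This is a hedged sketch, not the framework's canonical sample: LanguageModelSession.GenerationError.refusal, rawContent, and isComplete come from the notes above, while the respond(to:) call shape and the summarize helper are assumptions based on the framework's basic flow.

```swift
import FoundationModels

// Sketch: detect refusals programmatically and read raw generated content.
func summarize(_ text: String) async -> String? {
    let session = LanguageModelSession()
    do {
        let response = try await session.respond(to: "Summarize: \(text)")
        // Underlying weakly typed content via rawContent (156351123);
        // isComplete reports whether generation finished (156109416).
        let raw = response.rawContent
        guard raw.isComplete else { return nil }
        return response.content
    } catch let error as LanguageModelSession.GenerationError {
        // Refusals are now a first-class error case with a model-generated
        // explanation (156086748).
        if case .refusal = error {
            print("Model refused: \(error.localizedDescription)")
        }
        return nil
    } catch {
        return nil
    }
}
```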
Fixed: Resolved an issue where iPhone and iPad apps on Apple Silicon Macs and Apple Vision Pro quit unexpectedly if MarketplaceKit was linked.
RealityKit
Known Issues:
Using .shadow in SpatialTrackingSession does not work on devices without a LiDAR sensor. Workaround: enable .plane as well as .shadow (see the sketch after this list).
The AudioGeneratorController might not begin playback immediately if the RealityKit.Scene doesn't receive an update. Workaround: trigger a Scene update by modifying a @State variable, or by calling entities(matching:when:) from a custom System's update method.
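A sketch of the .shadow workaround, assuming the iOS 18 SpatialTrackingSession API; startSpatialTracking is a hypothetical helper, and the configuration parameter names follow the SpatialTrackingSession.Configuration initializer as documented:

```swift
import RealityKit

// Sketch of the stated workaround: request .plane tracking alongside the
// .shadow scene-understanding capability so grounding shadows also work
// on devices without a LiDAR sensor.
@MainActor
func startSpatialTracking() async {
    let session = SpatialTrackingSession()
    let configuration = SpatialTrackingSession.Configuration(
        tracking: [.plane],            // workaround: include .plane...
        sceneUnderstanding: [.shadow]  // ...whenever .shadow is requested
    )
    // run(_:) reports any capabilities that could not be enabled.
    _ = await session.run(configuration)
}
```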