Tom Krikorian (tomkrikorian)

tomkrikorian / AGENTS.MD
Created December 4, 2025 15:07
AGENTS.MD for The Wearables Device Access Toolkit SDK from Meta (iOS)

AGENTS.MD – Meta Wearables Device Access Toolkit (DAT) iOS 0.2

Living spec + implementation guide for building iOS apps with the Meta Wearables Device Access Toolkit (DAT) targeting Ray-Ban Meta / Oakley Meta smart glasses.


0. Goals of this document

This file is meant to be the single source of truth for how we integrate the Meta Wearables Device Access Toolkit in our iOS codebase.

tomkrikorian / AGENTS.MD
Last active December 7, 2025 01:35
AGENTS.MD for visionOS 26 & Swift 6.2 development
name: visionos-agent
description: Senior visionOS Engineer and Spatial Computing Expert for Apple Vision Pro development.

VISIONOS AGENT GUIDE

ROLE & PERSONA

You are a Senior visionOS Engineer and Spatial Computing Expert. You specialize in SwiftUI, RealityKit, and ARKit for Apple Vision Pro. Your code is optimized for the platform, adhering strictly to Apple's Human Interface Guidelines for spatial design.

tomkrikorian / TheVergeArticle.md
Last active November 14, 2025 13:39
Who is buying VR and XR headsets anyway? Answers

Interview Questions & Answers

1) What drew you to the Vision Pro? Is it your personal device or one that you got through work?

I’ve been a developer for more than 10 years in the augmented and virtual reality industry. The arrival of the Vision Pro was a major opportunity to explore what could be done with a device deeply integrated into the Apple ecosystem. I’m part of a collective called STUDIO•84, focused on creating high-quality experiences for the Vision Pro, and 100% of my income comes from Vision Pro projects.

So it’s mainly a work device, but I also enjoy using it to watch immersive videos. It’s essentially a high-end cinema you can bring anywhere. And as I often say, it’s the best software I’ve seen on this kind of device in years — Apple is really ahead on that front.


visionOS 26 Beta 8 Changelog

Changes from Beta 7 to Beta 8

ARKit

New Features

  • Enhanced Hand Tracking: The isTracked property on hand joints now has adjusted sensitivity and may return false in poor lighting or when a joint is occluded; ARKit still returns a plausible transform for that joint. Check isTracked when high transform accuracy is required, but don't gate on it for use cases that must keep working while some joints are occluded, such as custom hand gesture implementations. (152100522)
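
A minimal sketch of gating on this behavior, assuming ARKit's HandTrackingProvider and HandAnchor on visionOS; the pinchDistance helper below is illustrative, not from the release notes:

import ARKit
import simd

/// Illustrative helper: distance in meters between thumb and index fingertips.
func pinchDistance(for hand: HandAnchor) -> Float? {
    guard let skeleton = hand.handSkeleton else { return nil }
    let thumb = skeleton.joint(.thumbTip)
    let index = skeleton.joint(.indexFingerTip)
    // Beta 8: isTracked may now be false in poor lighting or under occlusion,
    // even though ARKit still returns a plausible transform. Gate on it only
    // when high accuracy matters; drop this check for gestures that must keep
    // working while joints are occluded.
    guard thumb.isTracked, index.isTracked else { return nil }
    let thumbTransform = hand.originFromAnchorTransform * thumb.anchorFromJointTransform
    let indexTransform = hand.originFromAnchorTransform * index.anchorFromJointTransform
    return simd_distance(simd_make_float3(thumbTransform.columns.3),
                         simd_make_float3(indexTransform.columns.3))
}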

CoreData

visionOS 26 Beta 6 to Beta 7 Changelog

New Known Issues Added in Beta 7

Memory Tools (New Section)

  • Memory analysis tools might falsely report leaks when a target contains instances of types whose Obj-C properties are implemented in Swift with bridged types via @objc @implementation. This affects the leaks CLI tool, the Leaks instrument, and the Xcode memory graph debugger (157798911)
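
For context, a minimal sketch of the affected pattern; the Person class and its Obj-C header are hypothetical, and the exact @implementation shape is an assumption:

// Hypothetical Obj-C header (Person.h):
//   @interface Person : NSObject
//   @property (copy, nullable) NSString *name;
//   @end

import Foundation

// Swift implementation of the Obj-C interface. `name` uses a bridged type
// (String <-> NSString), the pattern the note above says memory tools can
// falsely flag as leaking.
@objc @implementation extension Person {
    var name: String?
}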

Minor Updates in Beta 7

RealityKit

visionOS 26 Beta 5 to Beta 6 Changelog (Made with Claude)

Issues Resolved (Moved from Known Issues to Resolved Issues)

EyeSight

  • Fixed: EyeSight not displaying content even when observer is present (155800405)

Foundation Models Framework

  • Fixed: Model requests erroneously throwing guardrailViolation when model assets not fully downloaded (156223847)

visionOS 26 Beta 4 to Beta 5 Changelog

New Features Added in Beta 5

Foundation Models Framework

  • Enhanced Refusal Handling - Framework now supports programmatically detecting when and why the model refuses to respond, via LanguageModelSession.GenerationError.refusal and model-generated explanations; see the sketch after this list (156086748)
  • Content Completion Detection - New isComplete property in GeneratedContent to check if content was fully generated (156109416)
  • Raw Content Access - Access underlying weakly typed GeneratedContent via rawContent property on Response or ResponseStream (156351123)
  • Permissive Content Transformations - New Guardrails.permissiveContentTransformations mode for text-to-text tasks like summarization and rewrite (156721060)
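
A hedged sketch of the refusal path, assuming the GenerationError surface named above; the summarize wrapper is illustrative:

import FoundationModels

/// Illustrative wrapper: surfaces a refusal instead of failing opaquely.
func summarize(_ text: String) async throws -> String {
    let session = LanguageModelSession()
    do {
        let response = try await session.respond(to: "Summarize: \(text)")
        return response.content
    } catch let error as LanguageModelSession.GenerationError {
        // Beta 5: .refusal identifies when and why the model declined; the
        // model-generated explanation surfaces in the error's description.
        if case .refusal = error {
            return "Refused: \(error.localizedDescription)"
        }
        throw error
    }
}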

visionOS 26 Beta 3 to Beta 4 Changelog

New Features Added in Beta 4

Foundation Models Framework

  • LanguageModelSession.prewarm() now caches instructions and the prefix of prompts, reducing time to the first generated token; see the sketch after this list (152381043)
  • #Playground in Xcode now allows filing feedback for Foundation Models framework responses (153770707)
  • Content Tagging now supports non-English languages with SystemLanguageModel(useCase: .contentTagging).supportedLanguages (155801948)
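
A minimal sketch of the prewarm flow; the instructions string is illustrative, and the initializer taking a plain instructions string is an assumption:

import FoundationModels

// Create the session early (e.g. when the view appears) and prewarm it so
// the instructions and any known prompt prefix are cached before the user
// submits, reducing time to the first generated token.
let session = LanguageModelSession(
    instructions: "You summarize visionOS release notes in one paragraph."
)
session.prewarm()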

WebKit API (New Section)

/// Generates a collision shape from the model's bounding box
/// - Returns: A `ShapeResource` representing the collision shape, or `nil` if the generation fails
func generateShapeResourceFromBoundingBox() async -> ShapeResource? {
    // Get the model entity and its mesh
    guard let modelEntity = self.children.first as? ModelEntity,
          let modelMesh = modelEntity.model?.mesh else {
        Logger.realitykit.error("Required model components not found")
        return nil
    }
    // Assumed completion (the snippet is truncated in the source): build a
    // box shape from the mesh bounds, centered on the bounds' center.
    let bounds = modelMesh.bounds
    return ShapeResource.generateBox(size: bounds.extents)
        .offsetBy(translation: bounds.center)
}

visionOS 2 Beta 9 Changelog

MarketplaceKit

  • Fixed: Resolved an issue where iPhone and iPad apps on Apple Silicon Mac and Apple Vision Pro quit unexpectedly if MarketplaceKit is linked.

RealityKit

  • Known Issues:
    • Using .shadow in SpatialTrackingSession does not work on devices without a LiDAR sensor. Workaround: Enable .plane as well as .shadow; see the sketch after this list.
    • The AudioGeneratorController might not immediately begin playback if the RealityKit.Scene doesn't receive an update. Workaround: Trigger an update to the RealityKit.Scene by modifying a @State variable, or by calling entities(matching:when:) from a custom System's update method.
    • RealityRenderer might be missing material inputs.
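
A hedged sketch of the .shadow workaround above, assuming RealityKit's SpatialTrackingSession configuration API on visionOS 2:

import RealityKit

// Requesting .shadow alone fails on devices without a LiDAR sensor, so the
// workaround requests plane tracking alongside the shadow capability.
func startTracking() async {
    let session = SpatialTrackingSession()
    let configuration = SpatialTrackingSession.Configuration(
        tracking: [.plane],            // workaround: enable .plane as well
        sceneUnderstanding: [.shadow]
    )
    _ = await session.run(configuration)
}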