I maintain a number of modules that may be helpful if you're coding for visionOS. A good place to start is ARKitVision, which collects some nice starter app projects.

A few of the ones I'd point you at:

  • ARUnderstanding — an easy way to get ARKit sensor data on visionOS and iOS, with a path to visualize and share that data in RealityViews on macOS and tvOS as well.

  • HandGestures — SwiftUI-like syntax for defining your own hand gestures and recognizing them simultaneously in an immersive space.

  • Manipulatives — a visionOS demonstration of ManipulationComponent and the manipulable() SwiftUI modifier.
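For context on the kind of ARKit sensor data these modules build on, here's a minimal hand-tracking loop written directly against Apple's visionOS ARKit API (ARKitSession, HandTrackingProvider). This is an illustrative sketch of the underlying plumbing, not code from ARUnderstanding or HandGestures themselves:

```swift
import ARKit

// Sketch: stream raw hand-tracking data on visionOS.
// Requires a running immersive space and hand-tracking authorization.
func trackHands() async {
    let session = ARKitSession()
    let handTracking = HandTrackingProvider()

    do {
        try await session.run([handTracking])
    } catch {
        print("Failed to start ARKit session: \(error)")
        return
    }

    // Anchor updates arrive as the system tracks each hand.
    for await update in handTracking.anchorUpdates {
        let anchor = update.anchor
        guard anchor.isTracked else { continue }

        // anchor.chirality is .left or .right; the skeleton holds
        // per-joint transforms relative to the hand anchor.
        if let skeleton = anchor.handSkeleton {
            let indexTip = skeleton.joint(.indexFingerTip)
            _ = indexTip.anchorFromJointTransform
        }
    }
}
```

Libraries like ARUnderstanding and HandGestures wrap this session setup and update loop so you can focus on what to do with the data rather than how to receive it.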

The full list lives at github.com/johnhaney.

If you find something you like, please ⭐️ the repo. Feel free to file an issue or open a PR if you find something broken.