Q&A: Building apps for visionOS

Over the past few months, Apple experts have been fielding questions about visionOS in Apple Vision Pro developer labs around the world. Here are answers to some of the most frequently asked questions, including insights into new concepts like entities, immersive spaces, collision shapes, and more.
How can I interact with an entity using gestures?
There are three important pieces to enabling gesture-based interaction with an entity:
- The entity must have an InputTargetComponent. Otherwise, it won't receive gesture input at all.
- The entity must have a CollisionComponent. The collision component's shapes define the regions that gestures can actually hit, so make sure the collision shapes are specified appropriately for interacting with your entity.
- The gesture you're using must be targeted to the entity you're trying to interact with (or to any entity). For example:
private var tapGesture: some Gesture {
    TapGesture()
        .targetedToAnyEntity()
        .onEnded { gestureValue in
            let tappedEntity = gestureValue.entity
            print(tappedEntity.name)
        }
}
It's also a good idea to give an interactive entity a HoverEffectComponent, which enables the system to trigger a standard hover effect when the person looks at the entity.
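Putting the pieces together, here's a minimal sketch of configuring an entity for gesture input (the mesh and shape sizes are arbitrary):

let tappableEntity = ModelEntity(mesh: .generateBox(size: 0.2))
// Receive gesture input.
tappableEntity.components.set(InputTargetComponent())
// Define the region that gestures can actually hit.
tappableEntity.components.set(CollisionComponent(shapes: [.generateBox(size: [0.2, 0.2, 0.2])]))
// Let the system show the standard hover effect when the person looks at the entity.
tappableEntity.components.set(HoverEffectComponent())

You would then attach the gesture shown above to your RealityView with .gesture(tapGesture).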
Should I use a set of windows, an immersive space, or both?
Consider the technical differences between windows, volumes, and immersive spaces when deciding which scene type to use for a particular feature in your app.
Here are some important technical differences to factor into your decision:
- Windows and volumes from other apps the person has open are hidden when a Full Space is opened.
- Windows and volumes clip content that exceeds their bounds.
- People have full control over the placement of windows and volumes. Apps have full control over the placement of content in an immersive space.
- Volumes have a fixed size; windows can be resized.
- ARKit only delivers data to your app if it has an open immersive space.
Explore the Hello World sample code to familiarize yourself with the behaviors of each scene type in visionOS.
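For orientation, here's a minimal sketch of how the three scene types can be declared in an app (the view types and identifiers are placeholders):

import SwiftUI

@main
struct ExampleApp: App {
    @State private var immersionStyle: ImmersionStyle = .full

    var body: some Scene {
        // A regular window: resizable, clips content to its bounds.
        WindowGroup(id: "main") {
            ContentView()
        }

        // A volume: a window with fixed 3D bounds.
        WindowGroup(id: "volume") {
            VolumeView()
        }
        .windowStyle(.volumetric)
        .defaultSize(width: 0.5, height: 0.5, depth: 0.5, in: .meters)

        // An immersive space; other apps' windows hide while a Full Space is open.
        ImmersiveSpace(id: "immersive") {
            ImmersiveView()
        }
        .immersionStyle(selection: $immersionStyle, in: .full)
    }
}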
How can I visualize collision shapes in my scene?
Use the Collision Shapes debug visualization in the Debug Visualizations menu, where you can also find several other useful visualizations for debugging. For information on debug visualizations, see Diagnosing issues in the appearance of a running app.
Can I position SwiftUI views within an immersive space?
Yes! You can position SwiftUI views in an immersive space with the offset(x:y:) and offset(z:) methods. It's important to remember that these offsets are specified in points, not meters. You can use PhysicalMetric to convert meters to points.
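For example, a minimal sketch (the values are arbitrary):

struct FloatingLabel: View {
    // Converts 1.5 meters into points for this view's display context.
    @PhysicalMetric(from: .meters) private var depth = 1.5

    var body: some View {
        Text("Hello, visionOS")
            .offset(x: 200, y: -100) // specified in points
            .offset(z: depth)        // meters, converted to points
    }
}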
What if I want to position my SwiftUI views relative to an entity in a RealityView?
Use the RealityView attachments API to create a SwiftUI view and make it accessible as a ViewAttachmentEntity. This entity can be positioned, oriented, and scaled just like any other entity.
RealityView { content, attachments in
    let attachmentEntity = attachments.entity(for: "uniqueID")!
    content.add(attachmentEntity)
} attachments: {
    Attachment(id: "uniqueID") {
        Text("My Attachment")
    }
}
Can I position windows programmatically?
There is no API available to position windows, but we'd like to know about your use case. Please file an enhancement request. For more information on this topic, see Positioning and sizing windows.
Is there any way to know what the person is looking at?
As noted in Adopting best practices for privacy and user preferences, the system handles camera and sensor inputs without passing the information directly to apps. There is no way to get precise eye movements or an exact line of sight. Instead, create interface elements that people can interact with and let the system manage the interaction. If you have a use case that can't work this way, and as long as it doesn't require explicit eye tracking, please file an enhancement request.
When are the onHover and onContinuousHover actions called on visionOS?
The onHover and onContinuousHover actions are called when a finger hovers over the view, or when the pointer from a connected trackpad hovers over it.
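As a small illustrative sketch:

Text("Hover over me")
    .onContinuousHover { phase in
        switch phase {
        case .active(let location):
            // Called repeatedly with the hover location in the view's coordinate space.
            print("Hovering at \(location)")
        case .ended:
            print("Hover ended")
        }
    }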
Can I show my own immersive environment textures in my app?
If your app has an ImmersiveSpace open, you can create a large sphere with an UnlitMaterial and scale it to have inward-facing geometry, along these lines (the texture name below is a placeholder):
struct ImmersiveView: View {
    var body: some View {
        RealityView { content in
            do {
                // Load your own equirectangular color texture into an unlit material.
                let texture = try TextureResource.load(named: "EnvironmentTexture")
                var material = UnlitMaterial()
                material.color = .init(tint: .white, texture: .init(texture))
                // Create a large sphere, scaled so its geometry faces inward.
                let sphere = ModelEntity(mesh: .generateSphere(radius: 1000), materials: [material])
                sphere.scale *= .init(x: -1, y: 1, z: 1)
                content.add(sphere)
            } catch { print("Failed to load environment texture: \(error)") }
        }
    }
}
I have existing stereo videos. How can I convert them to MV-HEVC?
AVFoundation provides API for writing video in the MV-HEVC format. For a complete example, download the "Converting side-by-side 3D video to multiview HEVC" sample code project.
To convert your videos to MV-HEVC:
- Create an AVAsset for each of the left and right views.
- Use AVOutputSettingsAssistant to get output settings that work for MV-HEVC.
- Specify the horizontal disparity adjustment and field of view (these are asset specific). Here's an example:
var compressionProperties = outputSettings[AVVideoCompressionPropertiesKey] as! [String: Any]
compressionProperties[kVTCompressionPropertyKey_HorizontalDisparityAdjustment as String] = horizontalDisparityAdjustment
compressionProperties[kCMFormatDescriptionExtension_HorizontalFieldOfView as String] = horizontalFOV
Then convert each pair of frames into a tagged buffer array and append it to the asset writer input adaptor:
let taggedBuffers: [CMTaggedBuffer] = [
    .init(tags: [.videoLayerID(0), .stereoView(.leftEye)], pixelBuffer: leftSample.imageBuffer!),
    .init(tags: [.videoLayerID(1), .stereoView(.rightEye)], pixelBuffer: rightSample.imageBuffer!)
]
let didAppend = adaptor.appendTaggedBuffers(taggedBuffers,
                                            withPresentationTime: leftSample.presentationTimeStamp)
How can I light my scene in RealityKit on visionOS?
You can light your scene in RealityKit on visionOS by:
- Using the system-provided automatic lighting environment, which updates based on real-world surroundings.
- Providing your own image-based lighting via an ImageBasedLightComponent. To see an example, create a new visionOS app, select RealityKit as the Immersive Space Renderer, and select Full as the Immersive Space. A minimal sketch of this approach follows below.
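Here's that sketch, assuming an environment resource named "ImageBasedLight" exists in your app bundle (the name and values are placeholders):

RealityView { content in
    // Load the environment resource used for image-based lighting.
    guard let resource = try? await EnvironmentResource(named: "ImageBasedLight") else { return }

    // An entity that provides the image-based light.
    let iblEntity = Entity()
    iblEntity.components.set(ImageBasedLightComponent(source: .single(resource), intensityExponent: 1.0))
    content.add(iblEntity)

    // Entities that should be lit by it opt in with a receiver component.
    let model = ModelEntity(mesh: .generateSphere(radius: 0.1), materials: [SimpleMaterial()])
    model.components.set(ImageBasedLightReceiverComponent(imageBasedLight: iblEntity))
    content.add(model)
}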
I see that CustomMaterial isn't supported on visionOS. Is there a way to create materials with custom shading?
You can create custom shader materials in Reality Composer Pro using the Shader Graph. A material created this way is available to your app as a ShaderGraphMaterial, so you can dynamically change the shader's inputs in your code.
For a detailed introduction to the Shader Graph, see Explore materials in Reality Composer Pro.
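As a rough sketch of changing a shader input at runtime, assuming a Reality Composer Pro package with a scene named "Scene", a material at "/Root/MyMaterial", and a parameter named "Intensity" (all of these names are hypothetical):

// realityKitContentBundle comes from your app's Reality Composer Pro package.
var material = try await ShaderGraphMaterial(named: "/Root/MyMaterial",
                                              from: "Scene.usda",
                                              in: realityKitContentBundle)

// Change a shader input dynamically.
try material.setParameter(name: "Intensity", value: .float(0.8))

// Apply the material to an entity.
let sphere = ModelEntity(mesh: .generateSphere(radius: 0.1), materials: [material])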
How can I position entities relative to the position of the device?
In an ImmersiveSpace, you can get the full device transform using the queryDeviceAnchor(atTimestamp:) method.
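A minimal sketch using ARKit's WorldTrackingProvider (error handling omitted; someEntity is a placeholder for an entity in your scene):

import ARKit
import QuartzCore
import RealityKit

let session = ARKitSession()
let worldTracking = WorldTrackingProvider()
try await session.run([worldTracking])

// Query the device anchor at the current time.
if let deviceAnchor = worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime()) {
    // originFromAnchorTransform is the device's transform relative to the space's origin.
    let deviceTransform = Transform(matrix: deviceAnchor.originFromAnchorTransform)
    // Place an entity (placeholder) at the device's current position.
    someEntity.transform.translation = deviceTransform.translation
}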
Learn more about building apps for visionOS
Q&A: Spatial design for visionOS
View now
Spotlight on: Developing for visionOS
View now
Spotlight on: Developer tools for visionOS
View now
The sample code contained herein is provided under the Apple Sample Code License.


