# compose-ios
j
Hi! Since Apple introduced the Vision Pro last week, developers can technically use SwiftUI to build both 2D and 3D content and make their apps more immersive. What are Compose's options in this area, i.e. for 3D or AR development?
r
Compose UI is not meant for 3D/AR, even on Android. In general, these kinds of applications don't use platform UI but render at a lower level; games are a good example. On Android you would use a SurfaceView or GLSurfaceView to render with OpenGL, for instance.
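For anyone unfamiliar with the Android side of this, here is a minimal sketch of the GLSurfaceView approach r mentions, assuming OpenGL ES 2.0 (the class names `MyRenderer` and `MyGLSurfaceView` are made up for illustration):

```kotlin
import android.content.Context
import android.opengl.GLES20
import android.opengl.GLSurfaceView
import javax.microedition.khronos.egl.EGLConfig
import javax.microedition.khronos.opengles.GL10

// Illustrative renderer: GLSurfaceView calls these on its own GL thread.
class MyRenderer : GLSurfaceView.Renderer {
    override fun onSurfaceCreated(gl: GL10?, config: EGLConfig?) {
        // One-time GL state setup
        GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f)
    }

    override fun onSurfaceChanged(gl: GL10?, width: Int, height: Int) {
        GLES20.glViewport(0, 0, width, height)
    }

    override fun onDrawFrame(gl: GL10?) {
        // Issue draw calls here every frame
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT)
    }
}

// Illustrative view: bypasses the platform UI toolkit entirely.
class MyGLSurfaceView(context: Context) : GLSurfaceView(context) {
    init {
        setEGLContextClientVersion(2) // request an OpenGL ES 2.0 context
        setRenderer(MyRenderer())
    }
}
```

The point is that the 3D content lives in its own rendering surface and loop, not in the Compose UI tree; Compose (or any toolkit) would at most host the surface.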
l
The Compose runtime (NOT Compose UI) could be used to drive a 3D UI. It can also interface with UIKit, which works on Vision Pro AFAIK.
l
K/N would have to be ported to visionOS for this to work.
l
Not sure; apparently all iOS apps already work on visionOS.
But for accessing visionOS-specific APIs, probably. I guess JetBrains will look into it at some point.
r
I'm pretty sure we'll have to wait for the product to actually ship anyway. How do you test that everything works right now without the device? Is there a Vision Pro simulator?
a
Yes, there is a simulator
But with XR devices there is a LOT more to think about than just a single surface on a phone screen
I did jokingly ask this too, knowing the limitations, more in humor since it was just announced
There was an insightful thread for others who want to read it: https://kotlinlang.slack.com/archives/C0346LWVBJ4/p1686078230932529
But realistically, any Compose Multiplatform iOS app should work with back-compat shims; it will just look like a normal app, I would assume
l
From what I gathered, it looked like it could run iOS apps by 'projecting' a flat screen, likely via something like Catalyst. Will the iOS SDK include everything needed for AR/VR, or will we need a separate visionOS SDK for this?
a
A lot of the frameworks that exist for iOS have been carried over to visionOS too
I think iOS has most of everything