# compose
d
💡 Big idea time: We're already seeing the grand success of Compose for Android and Desktop at making UI development a pleasurable (and portable) experience again. Soon, we're likely to see Compose Multiplatform impact iOS and Web development more fully too. It's also been demonstrated that the Compose runtime can be applied to other diverse use cases, e.g. terminal applications.

So, thinking about near-future trends and wondering how the Kotlin/Compose ecosystem can establish itself even further: how about emitting 3D scenes for *metaverse* applications? I would love to be able to build 3D VR/AR clients for my full-stack applications, as 'yet another KMP target', using familiar Compose-style code to declare and manipulate 3D world objects. I think in a few years this is going to be an important space for tooling, with requirements outside of what Unity etc. serve today: the needs of VR/AR productivity apps aren't going to be quite the same as for games.

In early practical terms: could we take the 3D scene-graph API of, say, the Godot engine, which has both Kotlin bindings and VR/AR capabilities, and use the Compose runtime to continually (re)compose a scene graph for display? For a metaverse UI toolkit you'd probably want to support 2D interface panels floating in space as well, so being able to in-line regular 'material Compose' elements with this graph would be ideal too.

I can't be the first to have thought of this - does anyone know if such efforts are in the works? Or is anyone keen to have a dabble at a PoC?
👀 11
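(A minimal sketch of the idea above, not a definitive implementation: the Compose runtime already supports targeting arbitrary node trees through a custom applier, so a 3D scene graph could be composed the same way Compose UI composes its layout nodes - this is also the mechanism terminal-UI experiments like Mosaic build on. `AbstractApplier` is the real Compose runtime API; `SceneNode` below is a hypothetical stand-in for an actual engine node, e.g. a Godot node reached through its Kotlin bindings.)

```kotlin
import androidx.compose.runtime.AbstractApplier

// Hypothetical scene-graph node, standing in for a real engine node
// (e.g. a Godot Node3D reached via its Kotlin bindings).
class SceneNode(val name: String) {
    val children = mutableListOf<SceneNode>()
    var position: Triple<Float, Float, Float> = Triple(0f, 0f, 0f)
}

// Teaches the Compose runtime how to insert/remove/move nodes in that tree,
// so recomposition patches the scene graph instead of an Android View tree.
class SceneNodeApplier(root: SceneNode) : AbstractApplier<SceneNode>(root) {
    override fun insertTopDown(index: Int, instance: SceneNode) {
        current.children.add(index, instance)
    }
    override fun insertBottomUp(index: Int, instance: SceneNode) {
        // Tree is assembled top-down above; nothing extra to do here.
    }
    override fun remove(index: Int, count: Int) {
        current.children.remove(index, count) // helper provided by AbstractApplier
    }
    override fun move(from: Int, to: Int, count: Int) {
        current.children.move(from, to, count) // helper provided by AbstractApplier
    }
    override fun onClear() {
        root.children.clear()
    }
}
```

With an applier like this in place, recomposition would patch only the nodes whose state changed, which is exactly the "continually (re)compose a scene graph" behaviour described above.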
x
I always wanted to work on a Compose version of LibGui (a popular UI library for Minecraft modding over fabricmc). Though I don't know enough about how the Compose toolchain works to pull this off myself 😞
m
Are you talking about things like this: https://twitter.com/JPeredaDnr/status/1017843778697269250 I’d find that super interesting and probably more relevant in the near future.
d
Yes @Michael Paus, that video is close to the 'state of the art' for AR today, but it's also baby steps compared to what's coming. Once hardware catches up with concepts - and it's doing so fast - 5 years? I think we'll see lightweight headsets that are easy to pop on/off throughout our day. For example, showing a complete 'Banking App' as virtual table-top objects, perhaps even while you eat breakfast(!), with grabbable sliders, 'physical' dragging of money between accounts, etc. - just don't knock over the real coffee cup(!) I think it's one of those things that seems of uncertain value given how invested we currently are in 2D interfaces, but once apps really capitalize on a more visceral, high-fidelity 3D experience, the choice will seem obvious.
👍 1
That kind of overlay of apps with our real world is now being loosely termed the metaverse. I know it's a new word for an old idea, one that hasn't yet come to fruition in spite of one or two simpler efforts (Google Glass, anyone?). But sometimes ideas just need to wait a few decades for practical technology to rise up and meet them. I do think it's on the way.
Perhaps rather than Godot, targeting ARCore (Android) is the way to go, with the Compose runtime emitting the 3D scene graph for it. Unfortunately Google dropped support for the Sceneform API, but an up-to-date fork remains alive here.
The major gap to fill here is that augmented reality for productivity apps, beyond games and cute demos, is really lacking a 3D UI kit. Compose seems applicable to this 3D UI problem, and Kotlin Multiplatform could provide a pathway to support additional 3D clients for current applications.
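(Continuing the sketch above, and again hedged: assuming the hypothetical `SceneNode`/`SceneNodeApplier` types from the earlier snippet, the declaration side could be a `Node3D` composable built on the runtime's `ComposeNode`, with a `Recomposer` driven by a frame clock. `ComposeNode`, `Updater.set`, `Recomposer`, `BroadcastFrameClock`, and `Composition` are real Compose runtime APIs; `Node3D`, `BankingScene`, and the engine frame-callback hookup are illustrative assumptions.)

```kotlin
import androidx.compose.runtime.*
import kotlinx.coroutines.CoroutineScope
import kotlinx.coroutines.launch

// Hypothetical node-emitting composable, analogous to how Compose UI's Layout()
// emits layout nodes. SceneNode/SceneNodeApplier are the sketch types from above.
@Composable
fun Node3D(
    name: String,
    position: Triple<Float, Float, Float> = Triple(0f, 0f, 0f),
    content: @Composable () -> Unit = {},
) {
    ComposeNode<SceneNode, SceneNodeApplier>(
        factory = { SceneNode(name) },
        update = { set(position) { this.position = it } },
        content = content,
    )
}

// A tiny 'banking table-top' scene declared in Compose style; when `balance`
// changes, only the affected nodes are recomposed and patched.
@Composable
fun BankingScene(balance: Float) {
    Node3D("table") {
        Node3D("account-card", position = Triple(0f, 1f, 0f))
        Node3D("balance-slider", position = Triple(balance / 500f, 1f, 0f))
    }
}

// Wires a Composition to the custom applier and drives recomposition from a
// frame clock; the host engine's per-frame callback would call sendFrame().
fun runScene(scope: CoroutineScope, root: SceneNode): Composition {
    val frameClock = BroadcastFrameClock()
    val recomposer = Recomposer(scope.coroutineContext + frameClock)
    scope.launch(frameClock) { recomposer.runRecomposeAndApplyChanges() }

    val composition = Composition(SceneNodeApplier(root), recomposer)
    composition.setContent { BankingScene(balance = 100f) }

    // e.g. inside the engine's render loop: frameClock.sendFrame(frameTimeNanos)
    return composition
}
```

The point of the wiring is that ordinary Compose state (`mutableStateOf` and friends) would then drive the 3D scene the same way it drives 2D UI today, which is what would make "yet another KMP target" feel natural.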