💡 Big idea time: We're already seeing the grand success of Compose for Android and Desktop at making UI development a pleasurable (and portable) experience again. Soon, we're likely to see Compose Multiplatform impact iOS and Web development more fully too.
It's also been demonstrated that the Compose runtime can be applied to other, quite diverse use cases, e.g. terminal applications.
So, thinking about near-future trends and wondering how the Kotlin/Compose ecosystem can establish itself even further: how about emitting 3D scenes for ✨ *metaverse* ✨ applications?
I would love to be able to build 3D VR/AR clients for my full-stack applications, as 'yet another KMP target', using familiar Compose-style code to declare and manipulate 3D world objects. I think in a few years this is going to be an important space for tooling, with requirements outside of what Unity etc. serve today: the needs of VR/AR productivity apps aren't going to be quite the same as those of games.
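To make that concrete, here's a purely hypothetical sketch of what declaring a 3D world in Compose style might look like. Every name in it (`Vec3`, `Scene3D`, `Cube`, `FloatingCard`) is invented as a stub just to show the shape such an API could take:

```kotlin
import androidx.compose.runtime.Composable

// Everything here is invented to illustrate the idea: Vec3, Scene3D, Cube
// and FloatingCard are stubs, not any existing library's API.
data class Vec3(val x: Float, val y: Float, val z: Float)
data class Doc(val title: String, val anchor: Vec3)

@Composable fun Scene3D(content: @Composable () -> Unit) = content()
@Composable fun Cube(position: Vec3, onGrab: () -> Unit) { /* stub */ }
@Composable fun FloatingCard(position: Vec3, title: String) { /* stub */ }

// The pay-off: world objects are declared and manipulated like any other
// state-driven Compose UI, with recomposition handling updates for us.
@Composable
fun Workspace(cubePosition: Vec3, documents: List<Doc>, onGrab: () -> Unit) {
    Scene3D {
        Cube(position = cubePosition, onGrab = onGrab)
        documents.forEach { doc ->
            FloatingCard(position = doc.anchor, title = doc.title)
        }
    }
}
```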
In early practical terms: could we take the 3D scene-graph API of, say, the Godot engine, which has both Kotlin bindings and VR/AR capabilities, and use the Compose runtime to continually (re)compose a scene graph for display? For a metaverse UI toolkit you'd probably want to support 2D interface panels floating in space as well, so being able to inline regular 'material Compose' elements within this graph would be ideal too.
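As a rough sketch of the mechanism: the Compose runtime's actual `AbstractApplier` and `ComposeNode` APIs could maintain the node tree, with the invented `SceneNode` type below standing in for whatever the Godot Kotlin bindings really expose:

```kotlin
import androidx.compose.runtime.AbstractApplier
import androidx.compose.runtime.Composable
import androidx.compose.runtime.ComposeNode

// Invented node type standing in for the engine's own scene-graph classes
// (an assumption for the sketch, not the godot-kotlin API).
class SceneNode(val name: String) {
    val children = mutableListOf<SceneNode>()
    var x = 0f; var y = 0f; var z = 0f
}

// A minimal custom Applier: the Compose runtime calls these methods to keep
// our scene graph in sync with the composition as state changes.
class SceneApplier(root: SceneNode) : AbstractApplier<SceneNode>(root) {
    override fun insertTopDown(index: Int, instance: SceneNode) {
        current.children.add(index, instance)
    }
    override fun insertBottomUp(index: Int, instance: SceneNode) {
        // Insertion is handled top-down; nothing to do bottom-up.
    }
    override fun remove(index: Int, count: Int) = current.children.remove(index, count)
    override fun move(from: Int, to: Int, count: Int) = current.children.move(from, to, count)
    override fun onClear() = root.children.clear()
}

// Emits a SceneNode into the graph; on recomposition only changed properties
// are re-applied, so moving one object doesn't rebuild the whole world.
@Composable
fun Spatial(name: String, x: Float = 0f, y: Float = 0f, z: Float = 0f, content: @Composable () -> Unit = {}) {
    ComposeNode<SceneNode, SceneApplier>(
        factory = { SceneNode(name) },
        update = {
            set(x) { this.x = it }
            set(y) { this.y = it }
            set(z) { this.z = it }
        },
        content = content,
    )
}
```

The remaining plumbing, not shown here, would be creating a `Composition` with this applier and a `Recomposer` driven off the engine's frame loop.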
I can't be the first to have thought of this - does anyone know if such efforts are already in the works? Or is anyone keen to have a dabble at a PoC?