# compose-android
s
I just came across the change list 2604366: WIP support for Modifier that converts composable into a SurfaceTexture for GL consumption. I've been working for quite some time on a project involving hardware-accelerated blur effects, and the approach I've been using is very similar to what's described in the change list! I know it's experimental and still a work in progress, so it might not make it into the final Compose release, but I'm really eager to see where it goes. ☺️
👀 1
opengl 1
I'm curious about the differences in implementation. Briefly, I've been using GraphicsLayer to capture the current Compose layout and render it onto a surface-backed canvas, which records everything into an OpenGL texture. I then perform post-processing on that texture. Finally, I render the processed texture onto my SurfaceView and use the same GraphicsLayer to draw it as the content.

This implementation seems a bit different. From what I understand, the TextureModifierNode modifier adds a ComposeTextureView to the root of the current ComposeView when it is attached. This ComposeTextureView extends a standard TextureView to provide enhanced capabilities for rendering the Compose Canvas as a texture. That aspect is quite similar to my approach; however, instead of using a TextureView and a Canvas, I directly create a SurfaceTexture and a Surface, skipping the TextureView step. What puzzles me is that the content of this TextureView is rendered again into the provided canvas within the modifier's drawing scope. I'm not sure why that's necessary; perhaps it's even a better approach than my GraphicsLayer method.

I've run into one strange bug with GraphicsLayer, though. Sometimes the Compose UI stops updating, even though my OpenGL texture continues to render correctly. It's interesting because I can see the texture updating, but the Compose layout doesn't refresh.
Screenshot 2024-10-09 at 14.56.12.png, Screenshot 2024-10-09 at 14.56.48.png
Overall, I'm really excited that this change list exists, and my work aligns with it! Can't wait to see how it develops. Also, I'm curious why the graphics layer isn't being used in this approach. 😄
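For reference, a minimal sketch of the capture path described above, assuming Compose's GraphicsLayer record/drawLayer APIs; `glSurface` is a hypothetical Surface created from the SurfaceTexture and isn't part of the change list:

```kotlin
import android.view.Surface
import androidx.compose.foundation.layout.Box
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.draw.drawWithContent
import androidx.compose.ui.graphics.Canvas
import androidx.compose.ui.graphics.drawscope.CanvasDrawScope
import androidx.compose.ui.graphics.layer.drawLayer
import androidx.compose.ui.graphics.rememberGraphicsLayer

// Sketch: record the content into a GraphicsLayer, replay it into the
// hardware Canvas of a Surface created from a SurfaceTexture (so the frame
// lands in an OpenGL texture), then draw the same layer on screen.
@Composable
fun CapturedToGlTexture(glSurface: Surface, content: @Composable () -> Unit) {
    val layer = rememberGraphicsLayer()
    Box(
        modifier = Modifier.drawWithContent {
            // Record this node's drawing commands instead of emitting them directly.
            layer.record { this@drawWithContent.drawContent() }

            // Replay the recording into the Surface that feeds the GL texture.
            val hwCanvas = glSurface.lockHardwareCanvas()
            try {
                CanvasDrawScope().draw(
                    density = this,
                    layoutDirection = layoutDirection,
                    canvas = Canvas(hwCanvas),
                    size = size
                ) {
                    drawLayer(layer)
                }
            } finally {
                glSurface.unlockCanvasAndPost(hwCanvas)
            }

            // Keep the composable visible in the normal Compose hierarchy too.
            drawLayer(layer)
        }
    ) {
        content()
    }
}
```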
c
Very cool, thanks for the share.
🤗 1
s
Also, I’m genuinely curious: is it possible to do essentially the same thing (render composable layouts into a texture) but for a Vulkan context? Or alternatively, could we render into an OpenGL texture and then share or copy it to a Vulkan context, allowing us to use it from the Vulkan side? 🤔
r
You’ll need `HardwareBuffer` for Vulkan, via `ImageReader`
An `ImageReader` will give you a `Surface` to render into
To consume the results you acquire `Image`s from the `ImageReader`, and the image gives you a `HardwareBuffer` (you can even wrap a `HardwareBuffer` with a `Bitmap` to display the result anywhere a `Bitmap` is accepted)
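A minimal sketch of that route (the buffer format, usage flags, and color space below are illustrative assumptions):

```kotlin
import android.graphics.Bitmap
import android.graphics.ColorSpace
import android.graphics.PixelFormat
import android.hardware.HardwareBuffer
import android.media.ImageReader
import android.view.Surface

// Create an ImageReader whose buffers the GPU can sample; its Surface is the
// render target, and each acquired Image exposes a HardwareBuffer.
fun createReader(width: Int, height: Int): ImageReader =
    ImageReader.newInstance(
        width, height,
        PixelFormat.RGBA_8888,
        /* maxImages = */ 3,
        HardwareBuffer.USAGE_GPU_SAMPLED_IMAGE or HardwareBuffer.USAGE_GPU_COLOR_OUTPUT
    )

// Render into this Surface (e.g. via lockHardwareCanvas or an EGL surface).
fun renderTarget(reader: ImageReader): Surface = reader.surface

fun consumeLatest(reader: ImageReader) {
    val image = reader.acquireLatestImage() ?: return
    try {
        val buffer: HardwareBuffer = image.hardwareBuffer ?: return
        // Import `buffer` into Vulkan (VK_ANDROID_external_memory_android_hardware_buffer),
        // or wrap it in a Bitmap to display it anywhere a Bitmap is accepted:
        val bitmap = Bitmap.wrapHardwareBuffer(buffer, ColorSpace.get(ColorSpace.Named.SRGB))
        // ... use bitmap / the imported Vulkan image while the buffer is valid ...
    } finally {
        image.close() // return the slot to the ImageReader
    }
}
```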
s
Hm, I initially started with ImageReaders and hardware bitmaps. So HardwareBuffer is more versatile than I realized and can be accessed from any graphics API? Thanks for the great suggestion!
r
Yes, SurfaceTexture technically only works for GLES
s
While debugging, I observed that the textures I create using the GLES API have a format that starts with the VK_ prefix. I suspect Android has shifted from using GL rendering to ANGLE, which wraps GL ES API calls and redirects them to a Vulkan backend. However, can I access those textures if they already reside in Vulkan memory, or is that access restricted? Also, working with ANGLE isn’t that straightforward, right?
r
Depends on the device, but yeah you can be on ANGLE
👍 1
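One way to sanity-check this at runtime (assuming a current GL context) is to inspect the renderer string, which on ANGLE-backed devices typically mentions ANGLE and the Vulkan backend; a rough sketch:

```kotlin
import android.opengl.GLES20

// With a current GL context, GL_RENDERER on ANGLE-backed devices typically
// reports something like "ANGLE (..., Vulkan ...)".
fun isRunningOnAngle(): Boolean {
    val renderer = GLES20.glGetString(GLES20.GL_RENDERER) ?: return false
    return renderer.contains("ANGLE", ignoreCase = true)
}
```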