# compose-desktop
s
I'd like to render my UI to a PNG file in order to automate taking screenshots of an app. I found out it is possible for Swing like this: https://stackoverflow.com/questions/5853879/swing-obtain-image-of-jframe. I have not tried it, but I'm presuming this method won't work with Compose, as rendering goes through Skija/Skiko, right? Would there be a similar mechanism that could be employed for Compose's rendering engine?
I also found a post about doing this on Android (https://dev.to/pchmielowski/automate-taking-screenshots-of-android-app-with-jetpack-compose-2950), but I don't think it can be ported to desktop right away.
k
`ImageComposeScene` is your friend
Use it to wrap your content, then call `render()` + `encodeToData().bytes` + `File.writeBytes()`.
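Something along these lines should do it (a minimal sketch; the 1024×768 size and the `Text` content are just placeholders for your real root composable):

```kotlin
import androidx.compose.material.MaterialTheme
import androidx.compose.material.Text
import androidx.compose.ui.ImageComposeScene
import androidx.compose.ui.unit.Density
import java.io.File

fun main() {
    // Compose the content into an offscreen scene instead of a Window.
    val scene = ImageComposeScene(width = 1024, height = 768, density = Density(1f)) {
        MaterialTheme {
            Text("Hello, screenshot!") // swap in your app's root composable here
        }
    }
    // render() returns an org.jetbrains.skia.Image; encodeToData() produces PNG bytes by default.
    val image = scene.render()
    val pngBytes = image.encodeToData()!!.bytes
    File("screenshot.png").writeBytes(pngBytes)
    scene.close()
}
```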
s
Thanks for the hint, will give that a try!
Got a screenshot of my app rendered like that. Basically I replaced my `Window` with `ImageComposeScene` and was able to render and store it to a file the way you suggested. I'm still curious whether it's possible (or planned) to use the testing API the way it's possible on Android. I found the Maven artifact `org.jetbrains.compose.ui:ui-test-junit4:1.1.1` and was able to change most imports in the Android example from the blog post above so that they resolve. What does not seem to be available is `captureToImage()`; I'm assuming it's going to be added eventually:
```kotlin
onRoot()
    .captureToImage()
    .asAndroidBitmap()
    .save(file)
```
Why I'm interested in this: the testing API also supports simulating clicks on UI elements, so I guess it would help in making a series of screenshots from various parts of the UI without manipulating state internally; the UI could instead be driven by simulated mouse and key events.
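Roughly what I have in mind is something like this (just a sketch against that artifact; `MyApp` and the "Settings" node are placeholders, and I haven't verified it runs on desktop):

```kotlin
import androidx.compose.ui.test.junit4.createComposeRule
import androidx.compose.ui.test.onNodeWithText
import androidx.compose.ui.test.performClick
import org.junit.Rule
import org.junit.Test

class ScreenshotFlowTest {
    @get:Rule
    val rule = createComposeRule()

    @Test
    fun openSettingsScreen() {
        // MyApp() stands in for the app's root composable.
        rule.setContent { MyApp() }
        // Drive the UI with simulated input instead of manipulating state directly.
        rule.onNodeWithText("Settings").performClick()
        rule.waitForIdle()
        // Capturing the result to a PNG would still need captureToImage()
        // (apparently missing here) or an ImageComposeScene-based harness.
    }
}
```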
And kind of out of curiosity, I'm wondering if there's still an equivalent of the Swing approach, i.e.
```java
BufferedImage image = new BufferedImage(
        component.getWidth(),
        component.getHeight(),
        BufferedImage.TYPE_INT_RGB);
// Call the Component's paint method, using
// the Graphics object of the image.
component.paint(image.getGraphics());
```
I find it kind of fascinating that you can just call the component's `paint()` on the graphics obtained from a buffered image. Kind of cool, and I'm wondering if there's potentially a way to trick Compose into rendering into an offscreen buffer, too. That would allow capturing screenshots of an app in production (say, with some magic keyboard shortcut) without using external screenshot tools.