# compose-ios
l
On a project for work, I need to be able to render some part of the screen to a bitmap to save to a file. This has been one thing blocking an experimental compose migration on iOS lately. I see a PR for making ImageComposeScene common. Would this be a good solution for what I need? Are there any performance issues I should consider?
o
ImageComposeScene’s content can be encoded into image bytes (a skiko Image, which can then be processed any way that covers your needs). But ImageComposeScene is not rendered on an actual screen.
> render some part of the screen to a bitmap
What’s “some part of the screen” in your case? Is it a real screen? What part do you need? (I think you can trim the resulting bitmap anyway)
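To make the off-screen flow concrete, here is a minimal sketch assuming the skiko-backed `ImageComposeScene` API; the dimensions, file path, and `captureToPng` name are placeholders, not anything from the thread:

```kotlin
import androidx.compose.foundation.layout.Box
import androidx.compose.ui.ImageComposeScene
import org.jetbrains.skia.EncodedImageFormat
import java.io.File

// Sketch: render a composable off-screen and save the result as a PNG.
// The 400x300 size is an arbitrary placeholder.
fun captureToPng(path: String) {
    val scene = ImageComposeScene(width = 400, height = 300) {
        Box { /* the composable you want to capture */ }
    }
    val image = scene.render()                          // skiko Image
    val png = image.encodeToData(EncodedImageFormat.PNG)
        ?: error("PNG encoding failed")
    File(path).writeBytes(png.bytes)
    scene.close()
}
```

Nothing here touches a real screen; the scene is composed, laid out, and rasterized entirely off-screen.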
l
I’ll essentially be passing the same composable function with the same state variables to both ImageComposeScene and Application. That should let me access the pixels that show on the screen.
I suppose a good example of what I’m looking to do is if I have a screen with
```kotlin
Column {
    Header()
    SomeContent(someState)
    Footer()
}
```
and I want to capture an image of SomeContent(), I could do
```kotlin
ImageComposeScene(width = width, height = height).setContent {
    SomeContent(someState)
}
```
and the header/footer will not be visible in the saved image, but will show up on screen.
o
I think ImageComposeScene would cover your needs. Its size and density can be manually configured to match the screen of an actual device, so the resulting image should look exactly as rendered on the screen (unless the real app sets some special CompositionLocals which could make a difference in the UI, but those should be possible to configure as well). I guess you want 100% pixel matching (the Image vs the real screen)? How often do you need to make a screenshot/bitmap?
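For instance, the size and density mentioned above can be passed explicitly when constructing the scene. The pixel values and scale factor below are made-up examples, not real device metrics, and `SomeContent(someState)` refers to the composable from the earlier snippet:

```kotlin
import androidx.compose.ui.ImageComposeScene
import androidx.compose.ui.unit.Density

// Hypothetical device metrics; substitute your target screen's values.
val scene = ImageComposeScene(
    width = 1170,              // physical pixels (placeholder)
    height = 2532,             // physical pixels (placeholder)
    density = Density(3f)      // logical-to-physical scale factor
) {
    SomeContent(someState)
}
```

Matching the density matters for pixel-exact output: text and line metrics are resolved against it during layout, so a mismatched density produces a differently scaled image.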
l
It should be fine if it’s slightly different, but it would be ideal if it were exactly the same. All state is saved in a data class and updates are applied via lambdas, so I’m not really concerned about remember/CompositionLocals. The goal is to record a video like this, so I’d like at least 30fps.
Of course, iOS is still experimental, so if it can’t hit 30fps right now but future improvements make it possible, that’d be fine. It would be nice to have ImageComposeScene on Android so I could start trying this out in a more production-ready environment, but I don’t see ImageComposeScene support for non-skiko targets.
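A 30fps capture loop along these lines might look like the following. This is a speculative sketch, not a measured approach: it assumes the scene's clock can simply be advanced via `render`'s time parameter, and the `captureFrames` name and PNG-per-frame encoding are my own choices, not from the thread:

```kotlin
import androidx.compose.ui.ImageComposeScene
import org.jetbrains.skia.EncodedImageFormat

// Speculative: drive the scene's clock forward one frame at a time
// and collect encoded frames for a video. Performance is unverified.
fun captureFrames(scene: ImageComposeScene, frameCount: Int): List<ByteArray> {
    val frameNanos = 1_000_000_000L / 30          // ~33.3 ms per frame at 30fps
    return (0 until frameCount).map { i ->
        val image = scene.render(nanoTime = i * frameNanos)
        image.encodeToData(EncodedImageFormat.PNG)!!.bytes
    }
}
```

In a real recorder the frames would presumably be fed to a video encoder rather than held in memory as PNGs.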
o
There might be implicit CompositionLocals set specifically for iOS (when running in a real environment). But ImageComposeScene is platform-agnostic by default (at least for now), so there might be some surprises in the beginning.
l
Do y’all have any data on performance? I’d like to think rendering to both ImageComposeScene and Application would double the render time (maybe a bit less, since the ImageComposeScene is smaller, covering just a section of the screen). Is this a good guess?
o
We have no data on performance yet. I think it should be possible at some point to make ImageComposeScene run in a separate thread.
> Is this a good guess?
Yes, I have similar assumptions.