# compose-desktop
l
Hello, is it expected that `ImageComposeScene` renders one frame late, or just renders wrong? I am using this composable where I call `render` repeatedly on the same `ImageComposeScene`, but with ever-increasing `nanoTime` values, and I only get the correct render if I call `render` a second time, which is quite a dirty workaround.
```kotlin
@Composable
fun JustFrameNumberComposable() {
    val frameNumber by produceState(0L) {
        while (true) withFrameMillis { frameTimeMillis ->
            value = frameTimeMillis * 60L / 1000L
        }
    }
    Text("Frame number: $frameNumber")
}
```
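As an aside, the timestamp-to-frame-number mapping used in the snippet can be checked in isolation. This is a plain-Kotlin restatement of that one line, not part of the original message; `frameNumberAt60Fps` is a hypothetical name:

```kotlin
// Maps a frame timestamp in milliseconds to a frame index at 60 fps,
// exactly as the `withFrameMillis` lambda above computes it.
// Integer division truncates, so timestamps within the same 1/60 s
// window map to the same frame number.
fun frameNumberAt60Fps(frameTimeMillis: Long): Long = frameTimeMillis * 60L / 1000L
```

For example, `frameNumberAt60Fps(1000L)` is `60L`, and `frameNumberAt60Fps(1016L)` is still `60L` because of the truncating division.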
a
It’s expected, given this code. The phases of rendering are composition, layout, drawing, effects. In the code you gave, `produceState` runs in an effect, so the first time `render` is called it sets the value, but composition and drawing (with the default value) have already happened.
👍🏼 1
It’s not specific to `ImageComposeScene`; the same thing happens with a regular window: the first frame shows the default value.
l
Is calling `render` twice the correct way to handle this?
It's not the most expensive part of what I'm doing anyway… The bottleneck is saving the frames to WebP in parallel, based on the number of CPU cores, so they can be fed to FFmpeg to generate a transparent .mov with Apple ProRes 4444 encoding.
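For illustration, the parallel-encode stage described here could be sketched in plain Kotlin with a fixed thread pool sized to the CPU core count. This is a hypothetical skeleton, not the author's actual pipeline; `encodeAll` and `encodeFrame` are made-up names, and the real WebP encoding step is abstracted behind the lambda:

```kotlin
import java.util.concurrent.Executors

// Hypothetical sketch: encode all frames on a pool sized to the number
// of CPU cores, blocking until every frame has been processed.
// `encodeFrame` stands in for the real "render + encode to WebP" step.
fun encodeAll(frameCount: Int, encodeFrame: (Int) -> Unit) {
    val pool = Executors.newFixedThreadPool(Runtime.getRuntime().availableProcessors())
    try {
        // Submit every frame, then wait for each task to finish.
        (1..frameCount).map { n -> pool.submit { encodeFrame(n) } }.forEach { it.get() }
    } finally {
        pool.shutdown()
    }
}
```

Note that, per the discussion further down in this thread, the `render` call itself would still need to happen on a single, consistent thread; only the encoding step is safely parallelizable this way.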
a
You don’t have to call it twice, just take into account that after each call, the text will show the previous value.
l
There's no way to do pixel & time perfect animations?
a
Not sure what you mean by that.
l
I'd like that at a given frame time, I know exactly what will be shown
I'm generating overlays based on timecodes for video editing (not live)
a
You do. It’s just always the value of the previous render call.
l
Isn't the issue caused by this in `Application.desktop.kt`?
```kotlin
private object YieldFrameClock : MonotonicFrameClock {
    override suspend fun <R> withFrameNanos(
        onFrame: (frameTimeNanos: Long) -> R
    ): R {
        // We call `yield` to avoid blocking UI thread. If we don't call this then application
        // can be frozen for the user in some cases as it will not receive any input events.
        //
        // Swing dispatcher will process all pending events and resume after `yield`.
        yield()
        return onFrame(System.nanoTime())
    }
}
```
To me, the `yield()` should come after `onFrame(…)`, either in a `finally` block, or afterwards, with the result of `onFrame` stored in a local `val`.
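The restructuring being proposed can be sketched in plain Kotlin, with an ordinary function standing in for the suspend machinery and a `yieldToUi` lambda standing in for `yield()`; `withFrameLike` is a hypothetical name, not Compose API:

```kotlin
// Sketch of the proposed ordering: run the frame callback first, then
// yield to pending UI work, while still returning the callback's result.
// The `finally` block guarantees yielding happens even if `onFrame` throws.
fun <R> withFrameLike(yieldToUi: () -> Unit, onFrame: (Long) -> R): R {
    try {
        return onFrame(System.nanoTime())
    } finally {
        yieldToUi() // runs strictly after onFrame, as proposed
    }
}
```

The observable difference from the original is only the ordering: the frame callback sees the timestamp before any other pending work gets a chance to run.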
a
I’m afk so can’t be sure, but I don’t think that code is relevant when you use ImageComposeScene. You drive the render loop manually, with your own frame times, when you use ImageComposeScene.
👍🏼 1
Hmm, there’s a weird problem with uncomposed frames I’m investigating, but actually it looks like I was wrong here. It seems to work correctly, in the expected way. What happens is that the `setContent` call already runs the 1st composition (the one with frameNumber=0). Then each `ImageComposeScene.render` call correctly calls the `withFrameMillis` lambda and then recomposes. This code seems to work as expected:
```kotlin
// Imports added for completeness.
import androidx.compose.foundation.background
import androidx.compose.foundation.layout.Box
import androidx.compose.foundation.layout.fillMaxSize
import androidx.compose.material.Text
import androidx.compose.runtime.getValue
import androidx.compose.runtime.produceState
import androidx.compose.runtime.withFrameNanos
import androidx.compose.ui.Alignment
import androidx.compose.ui.ImageComposeScene
import androidx.compose.ui.Modifier
import androidx.compose.ui.graphics.Color
import java.io.File

fun main() {
    val scene = ImageComposeScene(640, 480)

    scene.setContent {
        val frame by produceState(0L) {
            while (true) {
                withFrameNanos {
                    value = it
                }
            }
        }
        Box(
            modifier = Modifier
                .fillMaxSize()
                .background(Color.White),
            contentAlignment = Alignment.Center
        ) {
            Text("Frame: $frame")
        }
    }

    for (frameNumber in 1..10) {
        val image = scene.render(frameNumber.toLong())
        image.encodeToData()?.let { data ->
            val file = File("frame-$frameNumber.png")
            file.writeBytes(data.bytes)
        }
    }

    scene.close()
}
```
i.e. each `frame-N.png` file contains a picture with “Frame: N” text.
Ah, and I got the phases wrong. `withFrameNanos`/`withFrameMillis` is actually called first, before recomposition.
It’s possible the problem with uncomposed frames is what you were experiencing. You can try working around it by running all your code on the event dispatching thread (via `MainUiDispatcher`).
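A note on the example above: it passes raw frame indices (1..10) directly as nanosecond timestamps, which works for showing the raw frame time, but a real render loop at a fixed frame rate would likely convert frame indices to nanoseconds first. A hypothetical helper (not from the discussion, the name is made up):

```kotlin
// Nanosecond timestamp for frame `frameIndex` at `fps` frames per second,
// suitable for passing to ImageComposeScene.render(nanoTime).
// Uses truncating integer division, so timestamps accumulate a sub-nanosecond
// rounding error per frame rather than drifting by whole milliseconds.
fun frameTimeNanos(frameIndex: Long, fps: Long): Long =
    frameIndex * 1_000_000_000L / fps
```

For example, at 60 fps, frame 60 lands exactly at one second (`1_000_000_000` ns), and frame 1 at `16_666_666` ns.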
l
Shouldn't this be the default behavior, given that I'm running the code in a `produceState` builder (from a parent, visible composition)?
a
I don’t know how you do it, but the most straightforward way for me is the code I posted above, and there everything is running in the process’s main thread, not the EDT thread.
Running it on the EDT thread does work around the issue with uncomposed frames.
l
So that's wrapping the entire snippet you posted with `runBlocking(MainUiDispatcher) { ... }` or the like?
a
yep
l
I think there's some improperly shared mutable state or graphics layers between `ImageComposeScene` and the Compose machinery, as the `ImageComposeScene.render` function sometimes outputs another composable that is in the app.
It contains 4 images, with one quite obviously incorrect.
(That was with everything running inside `MainUiDispatcher`.)
a
Oh, you can’t use it together with a running Compose app.
Unfortunately.
l
Well, it kinda works, actually, it's just not reliable
a
There is indeed global shared state in Compose that gets corrupted if you try to.
😬 1
l
Most of the time, it actually works perfectly.
What is this global shared state?
Also, I think that would warrant putting a `@RequiresOptIn` guard on `ImageComposeScene`, because I really didn't expect that gotcha, which one only discovers far into the project…
a
I don’t remember now. I discovered it a few years ago when hunting down `ImageComposeScene` threading issues.
l
Is there someone who would know? If it's not too complicated, I'd consider sending a PR to fix those.
I guess it's really doable, especially since it's possible to have multiple windows in Compose Desktop.
a
It’s not about windows, but about threads.
See links above
l
Is it causing the same problem in multi-window apps?
a
No
l
So, if `ImageComposeScene` gets updated to work just like a window, it'd avoid those issues too, wouldn't it?
a
If by “just like a window” you mean using it from the UI thread, then yes.
l
Is it really all about the thread, though? In the case where I had images from the wrong composition, `ImageComposeScene` was used from `MainUiDispatcher`.
I also set its `coroutineContext` to `coroutineContext[ContinuationInterceptor]!!`, which is now `MainUiDispatcher` too.
a
Could be additional bugs. If you have a reproducer, I can take a look.
l
Wait, I just saw I'm calling `render` on `Dispatchers.Default`.
I'm no longer able to reproduce the issue when `ImageComposeScene` is instantiated (and closed) on `MainUiDispatcher` AND `render` is called there too. Maybe just calling `render` there is enough. Anyway, that fixes my issue, and performance doesn't seem to suffer significantly, which makes sense since the call to `render` was the fastest thing in the pipeline (the slowest parts being encoding the images to WebP, and generating a video from them).
Thank you very much! Do you think a PR to have `ImageComposeScene` throw when not invoked from the right thread would have a chance of being merged?
a
Unlikely, because it’s OK to use it from any thread, just not concurrently with other threads that are also composing. I will add a warning comment, though.
l
Could you add a `@RequiresOptIn` that warns about that too? Otherwise, it's very likely to go unnoticed, and you know that in Kotlin, we're not used to these kinds of gotchas from Kotlin-made libraries/primitives, especially first-party ones.
a
I don’t know about that. Is that the appropriate annotation for something like that? @Igor Demin What do you think?
We do have a bit of a strange situation there with the annotations. The main `ImageComposeScene` constructor is marked as `@ExperimentalComposeUiApi` for some reason, but the secondary constructor and `renderComposeScene` aren’t. Not sure why the 2nd constructor is even needed. I added it here.
i
> Is that the appropriate annotation for something like that? @Igor Demin What do you think?

It can be `DelicateComposeUiApi` (similar to `DelicateCoroutinesApi`). But I would rather fix the multi-threading issue, because adding this annotation isn't easy: we would have to add it to AOSP, since ComposeScene is planned to be upstreamed, and it would cause source incompatibility.
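For illustration, an opt-in marker like the one being discussed could be declared as follows. This is a sketch only; `DelicateComposeUiApi` does not actually exist in Compose, and the message text is invented:

```kotlin
// Hypothetical opt-in marker, modeled on kotlinx.coroutines' DelicateCoroutinesApi.
// WARNING level surfaces a compiler warning at every use site unless the caller
// opts in with @OptIn(DelicateComposeUiApi::class).
@RequiresOptIn(
    message = "This API touches global Compose state and must not be used " +
        "concurrently with other compositions.",
    level = RequiresOptIn.Level.WARNING
)
annotation class DelicateComposeUiApi
```

With `Level.WARNING`, adding the marker to an existing API is source-compatible in the sense that existing code still compiles (with warnings), which is the point raised in the next message.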
a
I agree. The API is not inherently delicate; it’s the global mutable state that makes it so, and one could argue that’s a bug.
But unfortunately it’s not a priority; the issues have been known for a few years now. It’s possible that they can’t be fixed without degrading performance in the typical use case.
l
If it's first added with a warning level, it wouldn't cause source incompatibility. If the plan is to fix those multi-threading problems soon, though, maybe `ExperimentalComposeUiApi` would be more accurate than `DelicateComposeUiApi`?
The inconsistency of the experimental annotations across constructors is really weird, and doesn't match the behavior, I agree.
a
Unfortunately those problems are on Google’s end, and they’re (understandably) not a priority for them.
l
I don't understand; to me, `ImageComposeScene` wasn't available on Android, but only on the other platforms that JetBrains supports.
If it's not on Android yet, I'd think JetBrains can definitely add the missing `@ExperimentalComposeUiApi` annotation on the other constructors, no?
a
The problems aren’t in `ImageComposeScene`. They exist because Compose uses (mutable) global state, making it non-thread-safe.
We can add any annotations to `ImageComposeScene`, but the question is which.
And then there are the concerns Igor wrote about.
l
That's about the root cause, but I'm talking about warning future users, so that they don't lose hours or days over these subtle and mostly silent issues that plague the output.
Well, it's always possible to remove the annotations afterwards; that's not going to break anything, neither source nor binary. If `@ExperimentalComposeUiApi` is already used in this file, why not use it for the other constructors, and also for `render`? I mean, that's probably the simplest kind of change one can make to Compose, besides improving the KDoc. Just doing it looks faster than debating the possibility of introducing another annotation, and then asking other people, who have other priorities, about the specifics of what is essentially a warning label.
i
We can't add Experimental to an already stable API, because that would allow another developer to break the API some time later.
l
Oh, so that's why you're looking into having a `@DelicateSomething` annotation instead?
i
I mentioned it only as a possible annotation for something that needs additional understanding of the limitations. I prefer changing the KDoc as the easiest solution for something that is not a popular API. I don't prefer marking it `ExperimentalComposeUiApi`, because it doesn't tell much (users will still miss the threading issue if they don't read the doc), and it requires deprecating the old API. I don't prefer `DelicateSomething`, because in the end the issue is just bugs we need to fix someday.