# compose-desktop
u
Hey folks! 👋 Although I already posted this in #feed, I felt it was important to share here too since it's especially relevant to Compose Desktop devs: 🎉 Compose Media Player v0.6.0 is out! This release brings full Compose Canvas rendering support on all platforms, including Windows, so there is no more Swing panel. That means you can finally overlay UI elements directly on top of video content 🎥 We also replaced mfplay with Media Foundation on Windows for better flexibility. More details here: https://github.com/kdroidFilter/ComposeMediaPlayer/releases/tag/v0.6.0 Would love your feedback or ideas! 🙌
🎉 2
m
Which technique do you use to show the video frames in Compose? It may be interesting for other use cases too.
s
From exploring the library’s code, it looks like video playback on JVM desktop platforms (Linux, macOS, Windows) is implemented purely in software, extracting frames as bitmaps and drawing them onto the canvas. In my opinion, this approach isn’t efficient.
https://github.com/kdroidFilter/ComposeMediaPlayer/blob/master/mediaplayer/src/jvmMain/kotlin/io/github/kdroidfilter/composemediaplayer/windows/WindowsVideoPlayerSurface.kt#L32
https://github.com/kdroidFilter/ComposeMediaPlayer/blob/master/mediaplayer/src/jvmMain/kotlin/io/github/kdroidfilter/composemediaplayer/linux/LinuxVideoPlayerSurface.jvm.kt#L45
https://github.com/kdroidFilter/ComposeMediaPlayer/blob/master/mediaplayer/src/jvmMain/kotlin/io/github/kdroidfilter/composemediaplayer/mac/compose/MacVideoPlayerSurface.kt#L33
However, on WebAssembly, Android, and iOS, video is handled using the respective native players, which are hardware accelerated.
u
@Sergey Y. That's not entirely true: the offscreen rendering is done on the GPU, but the final rendering is managed in software. It's indeed not the most efficient approach, but according to my tests it's largely acceptable in the majority of cases, and above all it's the only way to get a real composable that can actually be manipulated with Compose.
@Michael Paus I render frames offscreen via the platform's native APIs in RGB and display them in a Compose canvas.
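For anyone curious, here is a minimal sketch of that idea on Compose Desktop: raw frame bytes from a native decoder are wrapped in a Skiko bitmap and drawn on a plain Canvas. The `VideoFrame` class and the exact byte layout are assumptions for illustration, not the library's actual API; only the Skiko and Compose calls are real.

```kotlin
import androidx.compose.foundation.Canvas
import androidx.compose.foundation.layout.fillMaxSize
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.graphics.ImageBitmap
import androidx.compose.ui.graphics.asComposeImageBitmap
import org.jetbrains.skia.Bitmap
import org.jetbrains.skia.ColorAlphaType
import org.jetbrains.skia.ImageInfo

// Hypothetical frame holder: 32-bit pixel bytes as filled by some native decoder.
class VideoFrame(val width: Int, val height: Int, val pixelBytes: ByteArray)

// Wrap the decoded bytes in a Skiko bitmap and expose it as a Compose ImageBitmap.
fun VideoFrame.toImageBitmap(): ImageBitmap {
    val bitmap = Bitmap()
    bitmap.allocPixels(ImageInfo.makeS32(width, height, ColorAlphaType.OPAQUE))
    bitmap.installPixels(pixelBytes)
    return bitmap.asComposeImageBitmap()
}

// Drawing the frame on a plain Canvas is what makes the player a real composable:
// any other Compose content can simply be stacked on top of it.
@Composable
fun VideoSurface(frame: ImageBitmap?, modifier: Modifier = Modifier) {
    Canvas(modifier.fillMaxSize()) {
        frame?.let { drawImage(it) }
    }
}
```

Because the video ends up as ordinary composable drawing, overlaying controls or applying modifiers on top of it works like with any other Compose content.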
@Michael Paus Indeed, one could imagine using the same method to display a native webview, without any external dependency.
🙏 1
@Sergey Y. On my i5-14600, with a 4K video, the GPU is used at about 30% and the CPU at 7-8%; with standard 1080p, CPU usage fluctuates between 1 and 2%.
s
Rendering offscreen on the GPU is fast, I agree. But reading pixels back from GPU memory to the CPU is hundreds of times slower, and drawing on the canvas then typically requires uploading those same pixels back to the GPU. I’d be glad if this turns out to work fast enough in practice.
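To make that round-trip concrete, this is roughly what the readback step looks like with OpenGL through LWJGL (chosen only for illustration; the library itself goes through platform-native APIs). It assumes a current GL context; for a 4K RGBA frame this is a blocking copy of roughly 33 MB per frame before the pixels can be re-uploaded for drawing.

```kotlin
import java.nio.ByteBuffer
import org.lwjgl.BufferUtils
import org.lwjgl.opengl.GL11.GL_RGBA
import org.lwjgl.opengl.GL11.GL_UNSIGNED_BYTE
import org.lwjgl.opengl.GL11.glReadPixels

// Copy the current framebuffer from GPU memory into a CPU-side buffer.
// glReadPixels stalls until the GPU has finished the frame and then moves the
// whole image across the bus; this is the expensive round-trip being discussed,
// after which the pixels still have to be uploaded again to reach the canvas.
fun readBackFrame(width: Int, height: Int): ByteBuffer {
    val pixels = BufferUtils.createByteBuffer(width * height * 4)
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels)
    return pixels
}
```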
u
I think it's still pretty fast for most uses.
s
Interesting. Could JetBrains provide something similar to SurfaceView on Android, a direct cut into the window framebuffer where we can render custom GPU content? That would dramatically improve performance.
> I think it's still pretty fast for most uses.
That's the sad truth of modern software.
u
It seems that it is not possible, according to https://github.com/JetBrains/compose-multiplatform-core/pull/915:
> On DirectX, it cannot overlay another DirectX component (due to OS blending limitation)
🤔 1
My first implementation used full GPU rendering, but it was impossible to overlay anything on top of it or even apply a transparency filter, making it almost unusable, and thus not composable.
s
I understand, I had similar problems. I wasn’t blaming you at all.
👍 2
u
Yes, I know, I was just explaining why I chose to do part of the rendering in software.
❤️ 2
z
I've been planning for a while to make a video player backed by mpv, using its render API. The goal is to eventually have it render directly into a composable without copying to a texture. It's been difficult to figure out exactly how to make it render in Compose, though, and I haven't got there yet. Just thought I'd share.
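For context, here is a very rough sketch of driving libmpv from Kotlin via JNA, assuming a hand-written `Libmpv` interface that mirrors a few real libmpv C functions (it is not an existing binding). The render-API wiring that would hand frames to a composable without a texture copy is exactly the open problem described above and is only hinted at in the comments.

```kotlin
import com.sun.jna.Library
import com.sun.jna.Native
import com.sun.jna.Pointer

// Hypothetical minimal JNA mapping of a few real libmpv entry points.
interface Libmpv : Library {
    fun mpv_create(): Pointer?
    fun mpv_initialize(handle: Pointer): Int
    fun mpv_command(handle: Pointer, args: Array<String>): Int
}

fun main() {
    val mpv = Native.load("mpv", Libmpv::class.java)
    val handle = mpv.mpv_create() ?: error("mpv_create failed")
    check(mpv.mpv_initialize(handle) == 0) { "mpv_initialize failed" }
    // Start playback; in the C API the argv is NULL-terminated, which JNA's
    // String[] mapping takes care of.
    mpv.mpv_command(handle, arrayOf("loadfile", "video.mp4"))
    // The hard part mentioned above is the render API: mpv_render_context_create
    // with an OpenGL FBO (or software) target, and then getting that surface into
    // a composable without an extra texture copy.
}
```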
u
https://github.com/open-ani/mediamp @zt I came across this last night; it seems to be a lot further along than what I have, so it looks like my library is going to be out of a job.
They are developing this for a big Chinese anime app, unlike me who is developing this for fun 😅
z
Oh thanks for showing me this. I didn't think anyone else was working on mpv rendering