# compose
l
We have Compose UI. With it, it's technically possible to record videos… without sound. What about Compose Sound?
z
You mean for producing sound?
I’ve wanted to build something like that, but I figure it would be most compelling for live audio programming, and Kotlin is not good at that. And the existing tools are so much more mature that I’m not sure what value Compose would bring over them anyway.
👆🏼 1
👆🏾 1
👆 2
l
Yes, I mean for producing sound. At least, being able to control sound effects to play (or be recorded) in sync with visual UI made in Compose too.
t
IMO the main barrier is audio processing on Android - it's not the most fun 😅 So maybe an easier, declarative wrapper/abstraction around something like Oboe would be cool. I was trying to look into building something like this. With that in hand, creating an A/V framework shouldn't be too hard. Still, there are several Android synth apps out there which do a great job. And also https://github.com/google/oboe/wiki/AppsUsingOboe
l
I was thinking about something for the JVM (Compose Desktop). My goal is to generate videos with sound effects
m
Have you tried recording videos with Compose Desktop? What is the max fps that you can get? I tried Java Robot but I wasn't able to get past 30 fps.
l
I can reach any fps
I'm currently doing 60fps
m
Nice, so you are creating frames from composables directly, right? For sound effects you can use ffmpeg; you will need to bundle it with your desktop app.
I'm not sure exactly what your use case is, but let's say you want to add a sound effect when you click a button. I think the best option is to save the type of sound and the exact time to start that sound effect. Then, after finishing recording the video, you can run an ffmpeg command to add the sound effects in the correct positions.
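A minimal sketch of that idea, assuming you record (timestamp, file) pairs during capture. `SoundEvent` and `buildFfmpegCommand` are hypothetical names for illustration; `adelay` and `amix` are real ffmpeg filters (delay each effect to its timestamp, then mix them into one track):

```kotlin
// Hypothetical helper: build an ffmpeg command that overlays recorded
// sound effects onto an already-rendered video at the right timestamps.
data class SoundEvent(val timestampMs: Long, val file: String)

fun buildFfmpegCommand(
    video: String,
    events: List<SoundEvent>,
    output: String,
): List<String> {
    // One extra -i input per sound effect (the video is input 0).
    val inputs = events.flatMap { listOf("-i", it.file) }
    // Delay every channel of each effect to its recorded timestamp...
    val delays = events.mapIndexed { i, e ->
        "[${i + 1}:a]adelay=${e.timestampMs}:all=1[s$i]"
    }
    // ...then mix all delayed effects into a single audio stream.
    val mixed = events.indices.joinToString("") { "[s$it]" } +
        "amix=inputs=${events.size}[aout]"
    val filter = (delays + mixed).joinToString(";")
    return listOf("ffmpeg", "-i", video) + inputs + listOf(
        "-filter_complex", filter,
        "-map", "0:v", "-map", "[aout]",
        "-c:v", "copy", output,
    )
}
```

You could then run the resulting command with `ProcessBuilder`; copying the video stream (`-c:v copy`) avoids re-encoding it.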
👍 1
l
The short version is that I do concurrent image rendering, concurrent WebP encoding, and concurrent saving, before feeding it all to ffmpeg
#coroutines
Yup, I was thinking exactly doing that, saving the timestamps where I want to play a given sound
BTW, I'm using WebP because it's much faster to encode than PNG, and ffmpeg doesn't seem to slow down when fed WebP instead of PNG
And WebP supports the alpha channel too 🙂
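The pipeline described above could be sketched roughly like this, assuming kotlinx.coroutines. `renderFrame` and `encodeWebp` are hypothetical stand-ins for the real Compose rendering and WebP encoding calls; the key point is that rendering and encoding/saving run concurrently while indexed file names keep the sequence ordered for ffmpeg:

```kotlin
import kotlinx.coroutines.*
import kotlinx.coroutines.channels.Channel
import java.io.File

// Placeholders for the real rendering and encoding steps.
fun renderFrame(index: Int): ByteArray = ByteArray(0)
fun encodeWebp(raw: ByteArray): ByteArray = raw

fun renderAll(frameCount: Int, outDir: File) = runBlocking(Dispatchers.Default) {
    val rendered = Channel<Pair<Int, ByteArray>>(capacity = 16)
    // Stage 1: render frames concurrently, tagging each with its index.
    launch {
        coroutineScope {
            repeat(frameCount) { i ->
                launch { rendered.send(i to renderFrame(i)) }
            }
        }
        rendered.close()
    }
    // Stage 2: encode and save concurrently; the index in the file name
    // keeps frames ordered so ffmpeg can read them as frame_%05d.webp.
    coroutineScope {
        for ((i, raw) in rendered) {
            launch {
                File(outDir, "frame_%05d.webp".format(i)).writeBytes(encodeWebp(raw))
            }
        }
    }
}
```

After that, something like `ffmpeg -framerate 60 -i frame_%05d.webp out.mp4` would assemble the video (assuming an ffmpeg build with WebP decoding).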
m
Interesting! I still don't understand what you are using for capturing those frames.
l
ImageComposeScene
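For context, a hedged sketch of how `ImageComposeScene` can be used for offscreen frame capture; `MyAnimation` is a hypothetical composable, and the 1280x720 size, 60 fps, and WebP choice are illustrative. Because `render(nanoTime)` drives the animation clock deterministically, the achievable fps is limited only by encoding speed, not by real time:

```kotlin
import androidx.compose.runtime.Composable
import androidx.compose.ui.ImageComposeScene
import org.jetbrains.skia.EncodedImageFormat

@Composable
fun MyAnimation() { /* your animated content */ }

// Render frameCount frames at a fixed fps, returning WebP-encoded bytes.
fun captureFrames(fps: Int, frameCount: Int): List<ByteArray> {
    val scene = ImageComposeScene(width = 1280, height = 720) { MyAnimation() }
    try {
        return (0 until frameCount).map { i ->
            // Advance the scene clock to this frame's timestamp.
            val nanos = i * 1_000_000_000L / fps
            scene.render(nanos)                          // org.jetbrains.skia.Image
                .encodeToData(EncodedImageFormat.WEBP)!! // Skia-side WebP encoding
                .bytes
        }
    } finally {
        scene.close()
    }
}
```

Each frame's bytes could then be handed to the concurrent saving stage and ultimately to ffmpeg.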
:thank-you-color: 1