# compose
m
How can I create a bottomsheet with glassmorphic background like this.
s
a
That looks like a great implementation, thanks @Stylianos Gakis
🙏 1
s
Not me, I am just a messenger, thank the library author 😅
m
This doesn't work on Android 12... I think this is for Android 13 and above
a
Are you sure? I think the lib has a fallback
m
Bit late to the conversation... Here's a screen recording of the sample app for haze. It doesn't work on Android 12.
Works flawlessly on A13, btw.
Is there any other solution you guys are aware of that supports lower API levels too?
nono 1
s
No, there isn't a library available for real-time blurring. Older versions of Android lack the necessary API, making it challenging to achieve the same effect efficiently. For more in-depth discussion, check out this thread: https://kotlinlang.slack.com/archives/CJLTWPH7S/p1651845634528419?thread_ts=1651845634.528419&cid=CJLTWPH7S
a
Nah, we did it with older APIs in our project; it's completely doable if you're willing to accept some tradeoffs. For example, quality will not be the same, since performance relies on heavy downscaling before handing the bitmap to RenderScript for blurring. Another trick is to blur a larger portion and move it in sync with scrolling; that way you only have to blur every X frames instead of every frame. It worked fine, actually.
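The downscale-then-blur trick described above can be sketched platform-agnostically. This is not code from the project being discussed: a raw `IntArray` of ARGB pixels stands in for an Android `Bitmap`, and a naive box blur stands in for RenderScript's `ScriptIntrinsicBlur`; both function names are illustrative.

```kotlin
// Sketch of "downscale heavily, then blur the small image" (assumption:
// plain Kotlin stand-ins for Bitmap + ScriptIntrinsicBlur, not real APIs).

fun downscale(src: IntArray, w: Int, h: Int, factor: Int): IntArray {
    val dw = w / factor
    val dh = h / factor
    val dst = IntArray(dw * dh)
    for (y in 0 until dh) {
        for (x in 0 until dw) {
            // Nearest-neighbour sampling is good enough here, since the
            // result gets blurred anyway.
            dst[y * dw + x] = src[(y * factor) * w + (x * factor)]
        }
    }
    return dst
}

fun boxBlur(src: IntArray, w: Int, h: Int, radius: Int): IntArray {
    val dst = IntArray(src.size)
    for (y in 0 until h) {
        for (x in 0 until w) {
            var r = 0; var g = 0; var b = 0; var n = 0
            for (dy in -radius..radius) {
                for (dx in -radius..radius) {
                    val yy = y + dy; val xx = x + dx
                    if (yy in 0 until h && xx in 0 until w) {
                        val p = src[yy * w + xx]
                        r += (p shr 16) and 0xFF
                        g += (p shr 8) and 0xFF
                        b += p and 0xFF
                        n++
                    }
                }
            }
            dst[y * w + x] =
                (0xFF shl 24) or ((r / n) shl 16) or ((g / n) shl 8) or (b / n)
        }
    }
    return dst
}
```

The point of the downscale step is that the expensive blur then runs on (factor squared) fewer pixels; because the result is blurry by definition, the lost resolution is mostly invisible.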
s
I didn't say it's impossible to implement. The question is about quality. In other words: because it's done in software, it's janky, slow, and it warms up the user's device. 🙂
a
It was not janky and not slow; that's solvable. Warming up the device, not really. It worked fine, it just looked a little less good than on modern APIs.
s
sure 😃
m
@Alex could you share some reference to the implementation that worked for you?
a
Wanna bet me a crate of beer @Sergey Y. ? :D
s
I believe it will be more efficient and feasible with the recently released GraphicsLayer#toImageBitmap. This is because it utilizes hardware rendering into a texture, encapsulated within a Hardware Bitmap.
Everything done in software Canvas may not be suitable for real-time applications, in my opinion.
a
The heavy lifting (blurring) is done in RenderScript, though, which is on a different level of performance than Kotlin
s
It doesn't matter if it's on the CPU as well; the way you're creating a bitmap of the UI is the key issue. It's incredibly inefficient, and that's the real problem. Believe me, I've done similar things as well. This issue can be masked when you're only blurring small parts of the UI.
Anyway, we're going in circles here. Romain did a great job explaining in the thread above why this approach is flawed.
a
You create the bitmap only once; everything else is just a write to the same bitmap instance (bit flipping), until the bitmap size changes, which it doesn't do often
The approach is flawed, but it's the only one available on older APIs, so it doesn't matter; it's either that or not doing it
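The "create the bitmap once, then keep overwriting it" idea can be sketched as a tiny holder that only reallocates when the dimensions change. A plain `IntArray` again stands in for an Android `Bitmap`; the class name is illustrative.

```kotlin
// Sketch: reuse one pixel buffer across frames, reallocating only on resize
// (assumption: IntArray stands in for Bitmap; not a real Android API).
class ReusablePixelBuffer {
    private var buffer = IntArray(0)
    private var width = 0
    private var height = 0

    /**
     * Returns a buffer of exactly w*h pixels, reusing the previous
     * allocation whenever the dimensions have not changed.
     */
    fun obtain(w: Int, h: Int): IntArray {
        if (w != width || h != height) {
            buffer = IntArray(w * h)
            width = w
            height = h
        }
        return buffer
    }
}
```

Per-frame work then reduces to writing pixels into the existing allocation, which avoids per-frame allocation and GC pressure.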
s
So then we are not talking about real-time blurring
a
Yes we are
s
Real-time means every frame; otherwise the blurred image will lag behind
a
you overwrite the content of the same instance of the bitmap every frame
s
It's horrible to see that on high refresh rate screens
a
It will lag one frame, but that's not avoidable
And no, since it's blurred you won't notice
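The "blur every X frames instead of every frame" throttling mentioned earlier in the thread can be sketched as a simple per-frame gate. The class and method names here are illustrative, not from any library.

```kotlin
// Sketch: run the expensive blur only on every N-th frame
// (assumption: illustrative names, not a real Android/Compose API).
class FrameGate(private val every: Int) {
    init { require(every > 0) }
    private var frame = 0

    /** Returns true on frames 0, every, 2*every, ... */
    fun shouldBlur(): Boolean = (frame++ % every == 0)
}
```

Between blurred frames, the previously blurred bitmap keeps being drawn (translated in sync with scrolling), which is why the lag is hard to notice under a blur.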
s
It would be great to show a video recording of this, Alex, to show what you really mean. It might help clear up any misunderstandings here.
a
I will prepare something, but it might take a little bit of time, I need to extract it into a sample
s
But you still need to rasterize the UI into a bitmap
s
I was thinking just the behavior in the app itself, so that you don’t need to spend too much time doing this
a
the app is not for public release, I will create a sample
Quick sample. In this case it's blurring the whole screen of a Pixel 6 Pro (1440 x 3120 px), with 10x downsizing. This is a bit unfair since the Pixel 6 Pro is a rather recent device, but my OnePlus 7 Pro is charging (haven't used it for a while). This is actually a legacy Android View, wrapped in Compose, blurring with RenderScript; it will be faster if you only blur smaller parts, sure.
❤️ 1
It’s quick and dirty so don’t hate me for bad git commit / history
s
Let's break it down by technique:
1. Old Rendering Method: You're likely using the old approach of rendering UI into a bitmap via a software-layered View wrapper. This method is inherently slow.
2. Newer Composable to Bitmap: While writing the contents of a composable to a bitmap is hardware accelerated, your mention of using RenderScript for blurring and resizing indicates data is being moved to the CPU side, which slows things down and is also inefficient.
3. OpenGL and Shaders: Utilizing OpenGL and shaders to sample and blur the hardware bitmap represents a more efficient method. This approach allows for direct GPU-driven processing, enhancing both efficiency and power consumption by treating the bitmap as a texture through the conversion of its hardwareBuffer into a khr_image. However, based on your description, it appears this might not be the method you're using.

This process, especially if repeated every other frame to mask performance issues, is far from optimal. I understand the implementation, but compared to genuine GPU-driven solutions, it's less efficient in both performance and power consumption.
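The GPU path in option 3 typically runs a separable Gaussian blur in a fragment shader. The shader itself would be GLSL, but the 1D kernel weights it samples with can be precomputed on the Kotlin side; this is a generic sketch (the function name and the default sigma choice are assumptions, not tied to any particular renderer).

```kotlin
import kotlin.math.exp

// Sketch: normalised 1D Gaussian weights for a separable two-pass blur
// (horizontal pass, then vertical pass, each sampling 2*radius+1 texels).
fun gaussianWeights(radius: Int, sigma: Float = radius / 2f): FloatArray {
    require(radius > 0)
    val weights = FloatArray(2 * radius + 1)
    var sum = 0f
    for (i in -radius..radius) {
        val w = exp(-(i * i) / (2f * sigma * sigma))
        weights[i + radius] = w
        sum += w
    }
    // Normalise so the blurred image keeps its overall brightness.
    for (i in weights.indices) weights[i] /= sum
    return weights
}
```

Running two 1D passes instead of one 2D kernel drops the per-pixel cost from O(radius^2) texture samples to O(radius), which is a large part of why the shader approach is cheaper than CPU-side blurring.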
a
Of course its less efficient
But you don't have options below Android 12
s
I do: just fall back to a simple scrim.
a
such as?
yeah not blurring is best, of course
I didn't say do it, I said it's possible
s
I'll take a look at your code later. BTW, consider implementing the third option. Maybe I'll take a try at it sometime later.
a
I have done OpenGL and shaders for the page-turning effect you have in the iOS book reader app
it’s a different can of worms
My advice: Do not do it, it will explode your head 😛
s
but this are sugar one
a
OpenGL is pain
but if you can make it work, great! Show us your code later