# compose
a
If this becomes mature enough, I'd love to see this work upstreamed 🤤🤩
c
a
I have haha 😛
🤩 1
a
It seems that it's actually using the classic `View.draw()` + RenderScript method, which I think is unlikely to get mature due to the severe performance and compatibility issues (from my real experience 🙃).
2
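For context, a minimal sketch of that classic approach: render the view hierarchy into a software bitmap, then blur it with the (now deprecated) RenderScript blur intrinsic. Function and parameter names here are illustrative, not taken from any particular library.

```kotlin
import android.content.Context
import android.graphics.Bitmap
import android.graphics.Canvas
import android.renderscript.Allocation
import android.renderscript.Element
import android.renderscript.RenderScript
import android.renderscript.ScriptIntrinsicBlur
import android.view.View

// Hedged sketch of the "View.draw() + RenderScript" technique discussed above.
fun classicViewBlur(context: Context, root: View, radius: Float): Bitmap {
    // 1) Software capture: runs all of the View drawing code on the CPU.
    val bitmap = Bitmap.createBitmap(root.width, root.height, Bitmap.Config.ARGB_8888)
    root.draw(Canvas(bitmap))

    // 2) Blur with the RenderScript intrinsic (deprecated since API 31).
    val rs = RenderScript.create(context)
    val input = Allocation.createFromBitmap(rs, bitmap)
    val output = Allocation.createTyped(rs, input.type)
    ScriptIntrinsicBlur.create(rs, Element.U8_4(rs)).apply {
        setRadius(radius.coerceIn(0.1f, 25f)) // the intrinsic supports at most 25
        setInput(input)
        forEach(output)
    }
    output.copyTo(bitmap) // copy the blurred pixels back into the software bitmap
    rs.destroy()
    return bitmap
}
```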
a
Are there A) Any other ways to go about this B) The possibility of a flag to disable it on really low end devices?
Or at least using this background render effects for 12+
a
Not only on low end devices. Even on high end devices I doubt if it’ll be able to run at 60fps, not to mention that most high end devices today have higher refresh rates. IMO real time blur below Android 12 is definitely not worth the effort and the resource consumption. If you really want it, window-wise background blur on Android 12 is your best bet.
c
I'm fine with 12+. But yeah, the modifier we have right now doesn't blur the background unlike iOS and we just get sooo many designs that look like that.
a
Yeah, exactly this ^
Like I cannot do the toolbar blur without something like this
1
r
I have to second what @Albert Chang said; this approach works and is fine for static blurs, but it’s very expensive for dynamic updates
And one of the reasons blurs are not backported
a
If this were A12+ only?
☝️ 1
I wouldn't care if there was a platform limitation there if I could use it on modern 12+ devices
Think of it as progressive enhancement as the webdev folks say 😛
c
@Albert Chang is correct, for Android 12+ you should use window blur or `RenderEffect`. @Nader Jawad made a demo before but I’m not sure when it will be available 😶
a
In theory, could this also allow for processing of composables with AGSL in A13 with either this change, or your lib?
r
All that’s needed for that is to expose a `RenderEffect` type like in Compose Desktop
1
Which is neither what the change above nor this library enables
a
And that would allow for effects such as blur behind within the same view?
r
Not directly no
😭 1
Such a shader would need access to what’s behind
A shader only accesses the current pixel, but for a blur you need what’s around the pixel
a
Yeah, that makes sense
r
So it has to be a multi-pass effect
or what’s behind needs to be exposed as a shader itself via render nodes
which brings its own complexity
a
Would that also be tied to the platform?
r
to be efficient, yes
otherwise you have to go through an intermediate bitmap
that said
you could at least use hardware bitmaps to make this faster
@Chachako could do the same btw
🎉 1
c
So what you're saying is that this still has time to make it before Google I/O? 😂😂
😄 1
n
Hardware bitmaps unfortunately do not support all graphics operations so YMMV even if you render into a hardware bitmap to be used as input for a blur algorithm
1
r
Create a bitmap with a `HARDWARE` config, get a `Surface` for it via an `ImageReader`, create a hardware accelerated `Canvas` from that surface, and at least you can use GPU rendering and stay on the GPU for the blur effect
@Nader Jawad Yeah but software bitmaps don’t support all graphics operations either 😄
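A rough sketch of the pipeline described above, assuming API 29+ for `Bitmap.wrapHardwareBuffer`; the exact usage flags, synchronization, and error handling are my guesses, not a vetted implementation.

```kotlin
import android.graphics.Bitmap
import android.graphics.PixelFormat
import android.hardware.HardwareBuffer
import android.media.ImageReader
import android.view.View

// Hedged sketch of the ImageReader/Surface/hardware Canvas pipeline.
fun captureToHardwareBitmap(view: View): Bitmap? {
    val reader = ImageReader.newInstance(
        view.width, view.height, PixelFormat.RGBA_8888, /* maxImages = */ 1,
        HardwareBuffer.USAGE_GPU_SAMPLED_IMAGE or HardwareBuffer.USAGE_GPU_COLOR_OUTPUT
    )
    try {
        val surface = reader.surface
        // lockHardwareCanvas() gives a GPU-backed Canvas, so the view's drawing
        // commands are rasterized on the GPU (with the caveat above: not every
        // operation is supported on hardware canvases/bitmaps).
        val canvas = surface.lockHardwareCanvas()
        try {
            view.draw(canvas)
        } finally {
            surface.unlockCanvasAndPost(canvas)
        }
        // In real code you'd wait for OnImageAvailableListener instead of
        // assuming the frame is immediately available.
        val image = reader.acquireLatestImage() ?: return null
        val buffer = image.hardwareBuffer ?: return null
        // The wrapped Bitmap keeps its own reference to the buffer, so the pixels
        // stay on the GPU and can feed a RenderEffect/shader for the blur.
        return Bitmap.wrapHardwareBuffer(buffer, null).also {
            buffer.close()
            image.close()
        }
    } finally {
        reader.close()
    }
}
```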
c
Reasonable. But I need to clarify that none of the current real-time blur libraries implemented with views work well with Compose. Even on Android 12+ there is no good practice for getting Compose to work with window blur; the Compose `blur` modifier only blurs the content it’s applied to. So all I’m doing is verifying that background blur can be implemented on Compose for older versions, and blurring the background with window blur on 12+ is something I’ll be working on later. To put it simply, compatibility is necessary because many old view-system applications use things like RealtimeBlur (Telegram, QQ) and they have always worked fine, but migrating them to Compose is a challenge, and that’s what I’m doing 😄
r
I would strongly challenge the “always worked fine” argument
From a power/performance perspective, they don’t
And because it goes through software bitmaps, it comes with limitations and compatibility issues
a
Even with small regions being blurred?
r
It also relies on RenderScript which is deprecated (which in itself is fine) but might be losing hardware acceleration support on some devices
c
OK you’re right, but at least they don’t look too bad 🤣
a
I guess it does capture a bitmap of everything before it draws the blur
r
Looking good is not the problem
@andrew The problem is how that capture is done
a
ah
r
Drawing views means losing things like elevation shadows, support for surface views, etc. + it means running all View drawing code in software
Which is slow
Then because you end up with a software bitmap, that bitmap must be uploaded as a texture for final rendering, and that’s expensive too
Even worse, the RenderScript filter may mean CPU->GPU/DSP->CPU->GPU transitions
(depending on how RS works on the current device)
(and a software bitmap means 2x the memory btw, the CPU copy and the GPU copy)
Anyway, it works and it’s really not bad for static content, but for dynamic content… I’m not a fan 🙂
Oh and of course properly catching when to update the bitmap is in itself an issue
You certainly don’t want to do it on every vsync for instance
c
Yes, but sometimes the expense isn’t the problem, because these blurs aren’t everywhere. In apps like Telegram they’re only used in very small areas that don’t update frequently, like the BottomBar, which is why I said they look good. 🙂
a
But I guess not updating often enough or async can lead to that "jelly" effect
where it lags behind
That or you see choppiness
r
@andrew but you don’t want to update when nothing is changing on screen; if you just use vsync you’ll do useless work
a
True
r
@Chachako again, looking good is not the issue I have 🙂
For us to add an effect like this in Compose (or Jetpack) we must worry about performance, battery usage, and memory usage
❤️ 1
plus1 1
(and all of the above also applies to elevation shadows and why they were limited to convex shapes for instance)
a
So realistically, will something like this see the light of day officially, even if only on modern devices?
☝️ 1
2
c
Yes, I see what you mean, but for some people this is necessary. All I mean is that just providing the possibility for third parties to implement it in Compose would be enough; any performance considerations should be left to the developer. I don’t mean that something like this should be built in, just that it should be made possible for them
1
a
Exactly, stuff like this is kinda high demand for some users, even if it's niche
r
I have no problem with 3p libraries doing this
I’m only explaining why it’s not trivial for us to “just” add it to Jetpack 🙂
a
I've had designers/product up my ass asking before "where's the blur" etc
And it's always kind of been a painpoint
r
And I would still ask you to have conversations with design/product so they understand the cost of what they’re asking for
(low-end devices, power impact, etc.)
1
a
They always kinda have their way of getting their way lol
Like I could understand having a flag for low end devices
Like there's one in build.prop for SF blurs iirc, or used to be anyway
c
That makes sense, so I just wanted to say that it would be nice if Compose upstream would support dynamic blur, even if it’s only 12+, because currently it’s not easy to use the window blur provided by 12+ systems on Compose
a
Especially when a lot of times, we don't have as much access to some of the internals as well
1
c
That’s the problem
r
Well… you can use blur on Android 12
Using a `graphicsLayer` and setting its `renderEffect` to the blur effect
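For illustration, a minimal sketch of that approach, assuming API 31+; the `blurLayer` modifier name and the fallback behaviour are made up:

```kotlin
import android.graphics.RenderEffect
import android.graphics.Shader
import android.os.Build
import androidx.compose.ui.Modifier
import androidx.compose.ui.graphics.asComposeRenderEffect
import androidx.compose.ui.graphics.graphicsLayer

// Blurs the layer's own content (not what's behind it) on Android 12+.
fun Modifier.blurLayer(radius: Float): Modifier =
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.S) {
        graphicsLayer {
            renderEffect = RenderEffect
                .createBlurEffect(radius, radius, Shader.TileMode.CLAMP)
                .asComposeRenderEffect()
        }
    } else {
        this // no-op below API 31; a real app might fall back to a scrim
    }
```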
a
But would that do background?
No, right?
r
No, but that’s not available with Views either
You can use a window blur at the window level as well
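Roughly, the window-level route looks like this (platform APIs added in Android 12; whether any blur actually renders depends on the device enabling cross-window blur):

```kotlin
import android.app.Activity
import android.os.Build
import android.view.WindowManager

// Hedged sketch of window blur on Android 12+ (API 31).
fun Activity.enableWindowBlur(radiusPx: Int) {
    if (Build.VERSION.SDK_INT < Build.VERSION_CODES.S) return
    // Blur everything behind this window (useful for dialog-style windows).
    window.addFlags(WindowManager.LayoutParams.FLAG_BLUR_BEHIND)
    window.attributes = window.attributes.apply { blurBehindRadius = radiusPx }
    // Blur the region covered by the window's own (translucent) background drawable.
    window.setBackgroundBlurRadius(radiusPx)
}
```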
a
I get that, yeah
r
I was responding to “it’s not easy to use the window blur provided by 12+ systems on Compose”
👌 1
a
ah
When I was working with AOSP a bit, it was easier because we were just blurring the behind window
Or the window rather
c
Just wondering, is it possible to embed a window like the one in the picture in Compose?
a
And we'd clip it
r
@Chachako that’s just a separate window, you can create one and put Compose stuff in it
👀 1
c
Also wondering why Compose doesn’t use SurfaceView to render the UI? With a SurfaceView, blur and shadow effects shouldn’t be a problem; at least on Flutter that works well. Any explanation for this?
a
That would require its own renderer
c
Skia?
a
Yeah
Which the view system uses internally
c
So it’s strange that Android has so many limitations. I can’t figure out why these things work well on Flutter; in theory they both use Skia, so there shouldn’t be this difference
a
Flutter is janky as hell tbh
2
r
Compose doesn't need a surface view because it already renders into a Surface
And by reusing the existing Canvas Compose can provide full interoperability with views which helps apps transition
👌 1
c
@andrew 😁 Agree that Flutter isn’t good, which is why I chose Compose. But there are some effects there that are hard to achieve in Android.
r
It's the cost of interoperability and keeping the library size down
c
Does that mean fundamentally this is caused by it being a JVM language?
a
This background blur stuff seems like it might be a good candidate for something like accompanist
r
That has nothing to do with being a JVM language (and Android doesn’t use a JVM anyway)
c
@andrew I don’t think accompanist would accept something like that either, but it’s possible
r
@andrew Not so sure, because again all the reasons listed above
c
Exactly, stuff like this is kinda high demand for some users, even if it’s niche
I’ve had designers/product up my ass asking before “where’s the blur” etc
In my humble designer opinion, in most cases with blur, it’s purely an aesthetic decision/approach which happens to be the hot trend due to iOS. From a design POV, one can achieve the same meaning of depth/hierarchy with scrims/transparency with no blur (which is arguably why Material doesn’t necessarily have blur in their design guidelines). I would argue this is really iOS design creeping into Android, versus designing to Android. I realize many have tried to fight this battle with designers/product in the past to no avail so I’m probably not being helpful.
Don’t get me wrong - I love blur effects like every designer likely does, but I also know we designers must make compromises based on ROI and I’m not sure getting blur to work without something from the framework/Jetpack is really worth the implementation cost, especially if it’s motivated by making the designs look exactly like the iOS ones.
a
On the other hand, as a dev, I've always kinda prided myself on being as pixel perfect as possible
K 1
c
Of course! And I always love working with Eng that strive for pixel perfection. But, I also appreciate design feedback around what makes sense to design for the platform, since sometimes designers can have blind spots because we like to be “unconstrained” 🙂
1
a
I mean, realistically, this feels very Android-y, but still has those blurs, etc
😍 1
Like these were based on Android designs
Same here, this is done in Compose, the toolbar would have a minimal blur in designs
It's nice that compose lets us make our own design systems and what not, I guess having something like background blurs would enable even more flexibility
1
f
Those blurs can just be PNGs of the blur with the transparency adjusted, fastest blur hack ever 😅 For pre-Android 12 🤷‍♂️
a
Not necessarily, if it needs to be dynamic
And even then, that adds bloat to the APK, and if you compress the image, it'll then look dithered or have compression artifacts
f
You can shift the position of the blur image, rotate it, etc. It's not the perfect solution, since Android has struggled with this and the only real solution now only works on 12+, but for us it worked even with a static image; one image is a WebP of roughly 100 KB, and we haven't had UI issues so far 🤷‍♂️ Blur on low-end devices is another story if you use RenderScript, try it and you'll find out ;)
a
Yeah, unfortunately the content in most apps is dynamic, or not image-based, like text, etc
f
You can add a text background, or a scrim under anything 🤷‍♂️
c
It’s nice that compose lets us make our own design systems and what not, I guess having something like background blurs would enable even more flexibility
Yeah I don’t disagree (we have the same issue with shadows not being flexible enough 🙂). But to Romain’s points above, it will require much more investment and changes in the lower levels of the Android graphics stack to really pull off blur for both Views and Compose (same with shadows).
a
I have custom shadows in compose
r
@Chris Sinco [G] Shadows are built as specified by UX 😛
🙃 1
a
setShadowLayer does the job, never noticed any slowdowns with it
👍 1
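For context, the setShadowLayer trick in Compose usually looks something like this (a sketch; the `softShadow` name and defaults are made up, and with hardware rendering it relies on the API level caveat discussed just below):

```kotlin
import androidx.compose.ui.Modifier
import androidx.compose.ui.draw.drawBehind
import androidx.compose.ui.graphics.Color
import androidx.compose.ui.graphics.Paint
import androidx.compose.ui.graphics.drawscope.drawIntoCanvas
import androidx.compose.ui.graphics.toArgb

// Hedged sketch: draw a rect with a transparent paint whose shadow layer is set,
// so only the blurred shadow shows behind the content.
fun Modifier.softShadow(
    color: Color = Color.Black.copy(alpha = 0.25f),
    blurRadius: Float = 12f,
    offsetX: Float = 0f,
    offsetY: Float = 4f,
): Modifier = drawBehind {
    drawIntoCanvas { canvas ->
        val paint = Paint()
        val frameworkPaint = paint.asFrameworkPaint()
        // The primitive itself is transparent; only the shadow layer is visible.
        frameworkPaint.color = Color.Transparent.toArgb()
        frameworkPaint.setShadowLayer(blurRadius, offsetX, offsetY, color.toArgb())
        canvas.drawRect(0f, 0f, size.width, size.height, paint)
    }
}
```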
r
It won’t work on all API levels though
At least not with hardware rendering
👆 2
a
Ours is API 26+
HW is 28+, right?
r
Something like that yeah
When we switched back to Skia
Also, aren’t outlines convex only, or did we lift that limitation?
Because if it’s convex only… just use elevation?
a
I thought it may have been lifted
I think it was added with colored shadows
In elevation
r
@Nader Jawad do we support concave outlines now?
n
Yes the requirement that paths configured on Outlines must be convex has been removed since Android Q
r
Nice
c
@Chris Sinco [G] Also speaking as a design enthusiast, I just wanted to make a point, not to refute you 🙂
in most cases with blur, it’s purely an aesthetic decision/approach which happens to be the hot trend due to iOS
This is true, everyone has a different aesthetic, so it makes sense that many people like blurred designs, but such designs shouldn’t be ruled out (not a technical argument). In many design communities nowadays (such as Dribbble) you’ll find most designs leaning towards the iOS style, but I don’t think this is iOS infiltration; it’s just that certain people are more accepting of such things, they think the iOS style suits them better, and their design inspiration derives from it. Also, blur isn’t proprietary to iOS; iOS just brought that kind of design to mobile (maybe? I’m not sure), much like Neumorphism UI. Everyone has different preferences, just like you may like dogs but I like cats 😜; there’s nothing wrong with someone liking iOS’s design. In short, the style of third-party apps shouldn’t be dictated by the system either, just like I can easily implement Material Design on iOS (BTW, I hate that huge FAB in the material.io promo picture)
a
No one is saying that blur should be banned, but it's a fact that real-time blur isn't smooth or battery-friendly, so it's simply a choice between the design you prefer and a better user experience.
2
a
I think the compromise here would be making it easier to do things such as this by providing the internals required for not only realtime blur but other render effects to be implemented as efficiently as possible
1
That way you aren't providing an implementation that feels too dirty or inefficient inside compose, but you still enable others to do it if they so choose, like with the view system
It seems like everyone wants something like that, but with differing requirements like the efficiency of it, etc
r
That’s why we’ve added AGSL in Android 13
to enable apps to implement their own high performance real-time effect in the future without requiring platform intervention
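To make that concrete, a toy sketch of the AGSL path on Android 13+ (not a blur, just a per-pixel tint, since a real blur needs neighbouring pixels as discussed earlier); the uniform name `content` and the modifier name are my choice:

```kotlin
import android.graphics.RenderEffect
import android.graphics.RuntimeShader
import android.os.Build
import androidx.annotation.RequiresApi
import androidx.compose.ui.Modifier
import androidx.compose.ui.graphics.asComposeRenderEffect
import androidx.compose.ui.graphics.graphicsLayer

// AGSL source: reads the layer's own content and applies a simple warm tint.
private const val AGSL_TINT = """
    uniform shader content;
    half4 main(float2 coord) {
        half4 c = content.eval(coord);
        return half4(c.r, c.g * 0.8, c.b * 0.8, c.a);
    }
"""

// Hedged sketch: wrap a RuntimeShader in a RenderEffect and apply it to a layer.
@RequiresApi(Build.VERSION_CODES.TIRAMISU)
fun Modifier.agslTint(): Modifier = graphicsLayer {
    val shader = RuntimeShader(AGSL_TINT)
    renderEffect = RenderEffect
        .createRuntimeShaderEffect(shader, "content")
        .asComposeRenderEffect()
}
```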
a
From what I saw it looks very fascinating, curious if there's a formal spec to AGSL online at all
it’s basically GLSL
a
Okay, that's what I gathered initially, just wasn't sure how different it was, awesome!
Makes tons of shaders pretty portable oob
c
Yeah. For example, providing at least one easy way to save the various composable nodes into a bitmap, so that it won't be too difficult for any third-party library that wants to do something fancy 🙈
r
Nah you don’t want to go through bitmaps
In AGSL we can expose render nodes directly as shader inputs
a
Would it let us do the whole screen as is?
Or a clip of it?
c
I think the compromise here would be making it easier to do things such as this by providing the internals required for not only realtime blur but other render effects to be implemented as efficiently as possible
I was a little slow. I was answering that 🤣
c
I'm just happy that @Chachako library exists so now I can show my design/product team and they can make a decision if the perf is crappy enough. they all have like galaxy s7s anyway so if it lags for them they'll just say to use a solid color.
damn iOS for having such an easy impl of this and making my team think that this is easy on android
😄 4
when they ask "why is this so hard" I'll just tell them to DM @romainguy on twitter for the technical answer. 😂
😂 1
c
@Colton Idle and I have to say that it is not actually a library yet and there is still a lot of work to be done in order to fully migrate my existing app from the old View architecture to Compose 🤭
1
m
soldiers, are we dead?
a
I would hope not
m

https://c.tenor.com/OB8Djjv6BLkAAAAC/300.gif

Is there a book of content related to render effects on Android? The best advanced material I can find is https://jorgecastillo.dev/book/ , focused on the intrinsic details of compose itself, which I believe will help me understand this universe better. But if I'm willing to tackle the performance problem, where should I start? What should I be reading? What examples should I be looking for?
a
Look into SKSL, skia shaders are the underpinning of render effects and AGSL
m
Thanks, i will
a
I know this is an older thread, but a colleague of mine found this little nugget: https://android-review.googlesource.com/c/platform/frameworks/base/+/2033223 If this patchset becomes upstreamed, would this then enable backdrop blurs? 🤔
m
(This thread is gold, never old)
r
That’s… not efficient for the GPU
a
🙈
m
(Sad naruto song)
k
After reading this long thread: How does iOS make background blur efficient enough to use everywhere?
m
they own every hardware design like a boss
jokes aside, the work on Android is more complex because of the wide range of devices it has to be compatible with, along with security and performance concerns
c
How does iOS make background blur efficient enough to use everywhere?
My take is that blur as a design concept was decided to be used everywhere in their apps and design language, and so the platform team prioritized making that possible so every app could do it efficiently
That is not the case with Android (see latest Material3 and Android 12+ designs). So if blur was really a key design element of Android design that Google would want all their apps and every app in the ecosystem to take advantage of, it’d likely be prioritized higher.
But to the point above also, Android hardware fragmentation is a reality, and so it’s challenging to push this kind of aesthetic even on Google apps, if your apps and platform need to work well on a larger spectrum of devices. 🤷
k
I guess I'm curious what they do differently in their hardware that makes this kind of stuff efficient 🙂 I think the demand is more from designers who are used to using iOS devices (which are inevitably their primary devices) which have this background blur feature, so they tend to use it everywhere because it's "normal" to them 😂
r
iOS is also designed differently wrt the window composer, etc.
the hardware itself isn’t really an issue nowadays, but Android avoids using the GPU for window composition and it’s a core part of the design
m
The number of hours I've suffered trying to implement a background blur is beyond counting. But I did learn a lot tho
a
Yeah, Apple uses their Quartz compositor for the graphics layer
c
"Android avoids using the GPU for window composition" TIL
r
We use the hardware overlays
It uses less power and leaves the gpu to the apps
Still hardware accelerated, just different hardware
👍 1
a
I don’t want to create a necrothread, but figured that some folks here might want to check out: https://androiddev.social/@cb/111324976097218908
👍 1
r
It uses the APIs we've exposed
❤️ 1
a
I’m glad a solution exists now, it’s very exciting!
r
Since Android 12 for blurs
a
It sounds like more APIs within compose exist to make this easier too
r
It's just platform APIs exposed via Compose
a
Gotcha!
I'm fixing up OTA for a lil community project, already starting to use haze in my code.
We’ve started switching some portions of the codebase to compose 🙂
Figured I’d give haze a try here too
Anyway, y’all are rockstars getting all of this to work up to this point. My inner UI designer is so excited!