# compose
a
Hey all, using Chris Banes’ Haze library, is it possible to mask the blur?
Modifier.graphicsLayer {
  // Render into an offscreen layer so the DstIn blend below only
  // affects this content, not everything behind it
  compositingStrategy = CompositingStrategy.Offscreen
}.drawWithContent {
  val brush = Brush.verticalGradient(listOf(Color.Black, Color.Black.copy(alpha = 0f)))
  drawContent()
  // Keep the content only where the gradient is opaque, fading out the bottom edge
  drawRect(brush, blendMode = BlendMode.DstIn)
}
Using this chain of modifiers on any other composable masks it with a gradient, effectively fading out an edge. iOS has a nice blur look where a gradient mask gradually decreases/increases the blur. Is this achievable with Haze?
This is the iOS effect
s
Hi, I'm not a developer of Haze and might be wrong, but I've reviewed its source code. It doesn't provide APIs for proper masking. Masking needs to be applied inside the Haze Modifier itself, where it renders and combines the foreground and background, so applying a mask to your views with modifiers might not work as intended. Haze needs to support this functionality directly. I've learned this while developing similar features for my own library.
I believe it's definitely doable with render nodes. This is how Haze is built for Android. If someone needs this functionality, it's better to file a feature request; without it, the functionality might never be added. 🙂
r
Changing the blur radius based on a mask is not part of the blur provided by the system. It can, however, be faked by fading the blurry image on top of the original content (though this only works with opaque content).
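Sketched in Compose terms (illustrative only; `sharp` and `blurred` are assumed to be two `ImageBitmap`s of the same content, one pre-blurred, not any real library API):
```kotlin
// Fake a "variable" blur: draw the sharp original, then the blurred copy
// on top with a gradient alpha, so the apparent blur strength ramps even
// though the actual radius is fixed.
fun DrawScope.drawFadedBlur(sharp: ImageBitmap, blurred: ImageBitmap) {
    // 1. The original, unblurred image underneath.
    drawImage(sharp)
    // 2. The blurred copy on top, faded out with a gradient alpha mask.
    drawIntoCanvas { canvas ->
        canvas.saveLayer(size.toRect(), Paint())
        drawImage(blurred)
        drawRect(
            brush = Brush.verticalGradient(listOf(Color.Black, Color.Transparent)),
            blendMode = BlendMode.DstIn,
        )
        canvas.restore()
    }
    // Only valid for opaque content: where the image is semi-transparent,
    // the sharp layer shows through and the two copies double-composite.
}
```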
s
Yeah, the trick with the mask should work, but Haze should handle this. And yes, the content being blurred must be opaque.
a
I don't expect to be able to change the blur radius based on a gradient or anything, just to mask the resulting blurred image. But what y'all are saying is that the library's internal Modifier node would currently prevent this?
s
Yeah, a gradient mask needs to be applied to the blurred background RenderNode. The steps are as follows: first, draw the original, non-blurred background. Next, create another node for the blurred background and draw it on top of the original, applying the gradient mask. This creates a nice effect of the blur gradually increasing over the crisp image. Finally, draw the content of the child that will have this background effect on top. This is how I handle it in my OpenGL-based blurring with gradient masks, but I'm not 100% confident it's the right approach; it's just what I'm doing in my case (and I'm starting to hate OpenGL blending functions 🫠). https://github.com/chrisbanes/haze/blob/main/haze/src/androidMain/kotlin/dev/chrisbanes/haze/AndroidHazeNode.kt#L298
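Those three steps could be sketched in Compose like this (a sketch using Compose 1.7's `GraphicsLayer` recording API rather than raw RenderNodes, and assuming `sharpLayer` and `blurredLayer` are pre-recorded layers of the same background; this is not Haze's actual code):
```kotlin
fun DrawScope.drawGradientBlurBackground(
    sharpLayer: GraphicsLayer,
    blurredLayer: GraphicsLayer,
) {
    // 1. Draw the original, non-blurred background.
    drawLayer(sharpLayer)

    // 2. Draw the blurred copy on top, masked by a vertical gradient so
    //    the blur fades in gradually over the crisp image. The saveLayer
    //    keeps DstIn from punching through the sharp background below.
    drawIntoCanvas { canvas ->
        canvas.saveLayer(size.toRect(), Paint())
        drawLayer(blurredLayer)
        drawRect(
            brush = Brush.verticalGradient(listOf(Color.Transparent, Color.Black)),
            blendMode = BlendMode.DstIn,
        )
        canvas.restore()
    }

    // 3. The child content that owns this background effect is then
    //    drawn on top by the caller (e.g. via drawContent()).
}
```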
a
That makes sense 🙂
👍 1
s
It's a bit buggy, but it's working! 😄
❤️ 2
a
And that's using your own library, correct? I saw the work you posted to X, which looks very cool!
❤️ 1
I see what you mean though, with the seam appearing every so often
s
The seam occurs because all OpenGL rendering happens on a separate thread, causing it to be a bit out of sync. I need to speed up the rendering process, as it was written quickly and not optimized. I plan to improve it before making it public.
👍🏻 1
a
Ah, that would make sense. And it wouldn't be possible to mask the resulting image, huh
s
Basically this is a mask: a simple gradient from white to black rendered into an R8 texture. It works like this:
finalColor.rgb = mix(backgroundColor.rgb, blurColor.rgb, maskColor.r);
In the case of Haze, instead of shaders and mixing, it would be blend modes
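A blend-mode version of that `mix()` could look like this (a sketch, not Haze's code): inside an offscreen layer, draw the blurred image, then multiply its alpha by the gradient with `DstIn`, so the sharp background shows through wherever the mask is transparent.
```kotlin
// DstIn keeps the destination (the blur) only where the source (the mask)
// is opaque; composited over the sharp background, this matches
//   finalColor.rgb = mix(backgroundColor.rgb, blurColor.rgb, maskColor.r)
drawRect(
    brush = Brush.verticalGradient(
        0f to Color.Black,        // mask = 1 → full blur
        1f to Color.Transparent,  // mask = 0 → sharp background shows
    ),
    blendMode = BlendMode.DstIn,
)
```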
👍🏻 1
a
Gotcha, and I'd imagine the blend modes would be more performant, since they're still part of the same render pipeline as Compose
s
At least it will be in sync with the UI because it is within the same pipeline.
👍🏻 1
I've had some tough times with OpenGL blending functions, brrrr