# compose-wear
t
Is there already a standard way to add haptics to `LazyColumn`s? My workaround solution that I implemented in the past doesn't work very well.
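For context, a common workaround (a sketch, not an official API) is to accumulate scroll deltas and fire a short haptic "tick" each time a threshold is crossed. The threshold value and the class name below are assumptions for illustration; the triggering logic is kept as plain Kotlin so it stays device-independent:

```kotlin
import kotlin.math.abs

// Sketch: threshold-based haptic ticks while scrolling.
// The 48px threshold is an arbitrary assumption; tune it per device.
class ScrollTickCounter(private val thresholdPx: Float = 48f) {
    private var accumulated = 0f

    // Feed each scroll delta in pixels; returns how many ticks to emit.
    fun onScroll(deltaPx: Float): Int {
        accumulated += abs(deltaPx)
        val ticks = (accumulated / thresholdPx).toInt()
        accumulated -= ticks * thresholdPx
        return ticks
    }
}
```

In Compose you could feed this from a `snapshotFlow` observing `LazyListState.firstVisibleItemScrollOffset` (or from a `Modifier.scrollable` delta), and for each tick call `performHapticFeedback(HapticFeedbackType.TextHandleMove)` on `LocalHapticFeedback.current`. The exact wiring varies, and how the tick actually feels still differs per device, which is the problem discussed below.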
Found this issue, but there haven't been any updates since July.
y
I don't have enough clarity on either a) the specific UX design we should implement, or b) the proper device-agnostic code to make it work across devices. So I don't think that ticket will be moving soon. I think we'll follow Wear Compose's lead.
However, we do implement haptics for the volume controls in Horologist, so it's workable for specific cases.
j
We are working on adding device-independent haptic support to scrollable surfaces like `ScalingLazyColumn` in a future release (no timeframe yet, as it will need new APIs at various levels in the Wear platform, Core, and Wear Compose).
@yschimke can you link to the haptics in Horologist? @Michail Kulaga is there any basic generic haptic sample code we might share? I'm guessing it's too early and all we have is device-specific hacks.
t
I was taking a look at Horologist before asking, I imagine @yschimke is talking about this: https://github.com/google/horologist/pull/64
Yep, same.
t
@John Nichol By "Wear Platform", do you mean we'll depend on future OS versions having some API for this? No clue how it works for native Samsung apps on their watches. Could it be something they implemented themselves that's not present in AOSP Wear? I don't have another watch, so I don't really know if all Wear devices have these haptics when scrolling through system settings.
m
I’m currently working on a standard API for haptics across all devices. Some device-specific hacks will probably be required in the meantime. I can share some drafts in Horologist soon (this week or next).
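To illustrate what such a device-specific hack can look like (everything here is hypothetical: the model prefixes, the durations, and the function name are made up for illustration), a simple per-model lookup for the tick length:

```kotlin
// Hypothetical per-device tuning table for a scroll "tick".
// On Android you would pass in Build.MODEL at runtime; the model
// strings and durations below are illustrative guesses only.
fun tickDurationMs(model: String): Long = when {
    model.contains("ZenWatch") -> 1L  // very strong motor (assumed)
    model.startsWith("SM-R8") -> 5L   // e.g. a Samsung watch model line (assumed)
    else -> 20L                       // conservative default
}
```

The returned duration would then go into `Vibrator.vibrate(VibrationEffect.createOneShot(durationMs, VibrationEffect.DEFAULT_AMPLITUDE))` on API 26+. This is exactly the kind of hack a standard API should make unnecessary.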
t
That would be lovely @Michail Kulaga 🙏🏽 It doesn't need to be a perfect solution, I'm sure these things are tricky; it's just that scrolling on a digital bezel without any feedback feels really weird.
l
Across all the Wear OS watches I've used or tried, I noticed that some react very differently to the same instructions. Some have a vibration motor so strong that even when turned on for just 1 ms it can be heard from 8 meters away (that was the Asus ZenWatch 2, "Zen" 🤣), while some need at least 30 ms before you can feel anything on the wrist. So I'm really looking forward to an API that gives predictable results, as I'd love to fine-tune vibration for my apps, including for notifications, so that users know what kind of event just happened without having to look, or feel the feedback from their input without having to squint. Ideally, there would be something in the Compatibility Test Suite to enforce predictable haptic motor behavior, so that we can deliver a UX that is fine-tuned for the hardware, like one can for the Apple Watch.
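One way apps paper over that variance today (a sketch, under the assumption that pulse length roughly tracks perceived strength on motors without amplitude control) is to branch on whether the hardware supports amplitude: use a fixed short pulse with scaled amplitude where possible, and fall back to scaling the duration elsewhere. The ranges below are guesses, not calibrated values; on Android the capability check would be `Vibrator.hasAmplitudeControl()` (API 26+), passed in here so the logic stays testable:

```kotlin
// Sketch: map a desired intensity in [0, 1] to vibration parameters.
// amplitude is 1..255 when supported, or -1 for DEFAULT_AMPLITUDE.
data class Pulse(val durationMs: Long, val amplitude: Int)

fun pulseFor(intensity: Float, hasAmplitudeControl: Boolean): Pulse {
    val clamped = intensity.coerceIn(0f, 1f)
    return if (hasAmplitudeControl) {
        // Fixed short pulse, scaled amplitude (1..255).
        Pulse(durationMs = 10L, amplitude = (1 + clamped * 254).toInt())
    } else {
        // No amplitude control: scale duration instead (10..60 ms, a guess).
        Pulse(durationMs = (10 + clamped * 50).toLong(), amplitude = -1)
    }
}
```

Without a CTS-enforced baseline, though, both branches still need per-device tuning, which is the point being made above.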