- cross-posted to:
- [email protected]
- [email protected]
In-display fingerprint sensors have become commonplace in virtually all Android smartphones, for better or for worse, and five years later…
There is a one-handed mode gesture that you can enable. It allows you to swipe straight down on the gesture bar to pull the entire top of the screen down.
I use that, but it only works from the home screen. If I use the gesture from within an app, it just interacts with the app.
That’s odd; I can use that gesture from any app. I wonder if it’s phone-specific.
I’m using a Pixel 8 with GrapheneOS. If I try to use that gesture while browsing Lemmy with the Sync app, for example, it just scrolls the feed back towards the top.
I’m only explaining because I’m hoping I’m just using it incorrectly.
I am using a stock Pixel 6a. From the home screen, I can swipe down anywhere to pull down the notification shade.
The one-handed mode gesture (and function) is different though. Settings → System → Gestures → One-handed mode:
Usage:
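For anyone wanting to check whether the swipe-down gesture is actually wired to one-handed mode (rather than to opening the notification shade), the relevant toggles can be inspected over adb. This is just a sketch assuming Android 12+ with AOSP setting names (`one_handed_mode_enabled` and `swipe_bottom_to_notification_enabled` are the stock keys; OEM or GrapheneOS builds may differ):

```shell
# Check whether one-handed mode is enabled (1 = on, 0 or null = off).
adb shell settings get secure one_handed_mode_enabled

# If this is 1, swiping down on the gesture bar opens the notification
# shade instead of pulling the screen down — which could explain why
# the gesture behaves differently across phones.
adb shell settings get secure swipe_bottom_to_notification_enabled

# Enable one-handed mode and have the swipe pull the screen into reach.
adb shell settings put secure one_handed_mode_enabled 1
adb shell settings put secure swipe_bottom_to_notification_enabled 0
```

The same two options appear in the UI under Settings → System → Gestures → One-handed mode, where the second setting corresponds to the "Pull screen into reach" vs. "Show notifications" choice.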