Touch input on phones is built around tapping, swiping and, increasingly, the long press. Google defines a long press as a time-based gesture "where a user's finger must remain stationary for 400-500ms." That time threshold has its own negative impact on user experience, since every long press carries that built-in response delay. An alternative is sensing the force of a touch, which was integrated into the Pixel 4 and, Google says, delivers a more refined user experience.
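For context, here is a minimal sketch of how a conventional time-based long press works on Android; the APIs shown (Handler, ViewConfiguration.getLongPressTimeout()) are standard framework calls, though real views rely on GestureDetector rather than a hand-rolled timer like this. It shows why the gesture cannot respond faster than the timeout.

```kotlin
import android.os.Handler
import android.os.Looper
import android.view.MotionEvent
import android.view.View
import android.view.ViewConfiguration

// Minimal sketch of a conventional time-based long press: the callback
// cannot fire before the system timeout elapses, which is exactly the
// latency the force-based press gesture is meant to avoid.
class TimeBasedLongPress(private val onLongPress: () -> Unit) : View.OnTouchListener {
    private val handler = Handler(Looper.getMainLooper())
    private val timeoutMs = ViewConfiguration.getLongPressTimeout().toLong() // ~400ms by default
    private val fire = Runnable { onLongPress() }

    override fun onTouch(v: View, event: MotionEvent): Boolean {
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> handler.postDelayed(fire, timeoutMs)
            // Lifting the finger before the timeout cancels the gesture
            // (a real detector also cancels on movement beyond the touch slop).
            MotionEvent.ACTION_UP, MotionEvent.ACTION_CANCEL -> handler.removeCallbacks(fire)
        }
        return true
    }
}
```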
Now Google AI has detailed the machine learning algorithm behind this firm press gesture, a swifter alternative to the long press. The explanation starts with the idea behind turning to machine learning in the first place: when a user's finger presses against the display, its soft tissue "deforms and spreads out."
How that spread unfolds depends on the person's finger size and shape and the angle at which it meets the display. In short: the wider the contact on the touch sensor, the more force is being applied. But with so many possible finger sizes and ways of pressing a screen, it is not feasible to "encode these observations with heuristic rules," which is why machine learning is the way forward.
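To illustrate why fixed rules break down, here is a hypothetical heuristic of the kind the post argues against. MotionEvent.getTouchMajor() is a real Android API reporting the major axis of the contact ellipse; the threshold and the reasoning in comments are illustrative only, not Google's code.

```kotlin
import android.view.MotionEvent

// Hypothetical heuristic: treat a wide contact patch as a firm press.
// getTouchMajor() reports the major axis of the contact area, which
// does grow as soft tissue spreads under pressure...
const val FIRM_PRESS_TOUCH_MAJOR = 40f // arbitrary, device-specific units

fun looksLikeFirmPress(event: MotionEvent): Boolean =
    event.touchMajor > FIRM_PRESS_TOUCH_MAJOR

// ...but the same reading means different things for different users:
// a light touch from a large thumb can exceed a threshold that a firm
// press from a small fingertip never reaches, and the contact shape
// also shifts with the finger's angle. A fixed rule cannot cover all
// of these cases, which is where a learned model comes in.
```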
Users also tend to confuse a long press with a force press, because we cannot consciously calibrate how much force we apply during contact. Google therefore decided against predicting the finger's "force or contact spread" directly, and instead trained the system to sense a "press gesture," the deliberate motion a user makes when engaging "a button or a switch."
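A sketch of that reframing, under stated assumptions: PressClassifier, SensorFrame, and the confidence threshold are all invented here for illustration, since Google does not publish the model's interface. The point is the shape of the problem, which is binary detection of a deliberate press rather than regression of a physical force value.

```kotlin
// Hypothetical interface for the reframed problem: binary detection of
// a deliberate press gesture, not regression of physical force.
// SensorFrame stands in for whatever low-level touch-sensor data the
// real model consumes.
class SensorFrame(val capacitanceImage: FloatArray)

interface PressClassifier {
    /** Returns P(deliberate press) for a short window of recent frames. */
    fun predict(window: List<SensorFrame>): Float
}

fun isDeliberatePress(model: PressClassifier, window: List<SensorFrame>): Boolean =
    model.predict(window) > 0.85f // illustrative confidence threshold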
The idea was to live up to users' high expectations of touch: we want an application to respond in real time, the moment we touch the screen, and that is exactly what app developers want the system to deliver. For the "press gesture to occur in real-time," Google made changes at the component level and tested against "datasets of press gesture" alongside all the other interactions the model must not confuse with it (scrolling, dragging, tapping, and long presses without force), yielding a quicker press-gesture response time.
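One way such a system could reconcile the faster signal with the existing timeout, sketched under the same assumptions (PressClassifier is the hypothetical interface above, and the fallback behavior is our inference, not a description of Google's implementation): fire the handler as soon as the model is confident, and keep the timer as a fallback so the gesture is never slower than a classic long press.

```kotlin
import android.os.Handler
import android.os.Looper
import android.view.ViewConfiguration

// Hypothetical arbiter: the long-press handler fires early if the model
// detects a deliberate press; otherwise the classic timeout still fires.
class PressArbiter(
    private val model: PressClassifier,
    private val onPress: () -> Unit,
) {
    private val handler = Handler(Looper.getMainLooper())
    private val timeoutFire = Runnable { fireOnce() }
    private var fired = false

    fun onFingerDown() {
        fired = false
        handler.postDelayed(timeoutFire, ViewConfiguration.getLongPressTimeout().toLong())
    }

    // Called for each new window of touch-sensor frames while the finger is down.
    fun onSensorWindow(window: List<SensorFrame>) {
        if (!fired && model.predict(window) > 0.85f) fireOnce() // early, force-based path
    }

    fun onFingerUp() = handler.removeCallbacks(timeoutFire)

    private fun fireOnce() {
        if (fired) return
        fired = true
        handler.removeCallbacks(timeoutFire)
        onPress()
    }
}
```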
Google notes that "applications that use Android's GestureDetector or View APIs will automatically get these press signals through their existing long-press handlers." In other words, users get the faster gesture in such apps without installing an update, and developers don't have to do any extra tinkering.
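In practice that means ordinary long-press code keeps working unchanged; per Google's note, the same callback simply arrives sooner on a supported device. A minimal example using the standard GestureDetector API:

```kotlin
import android.content.Context
import android.view.GestureDetector
import android.view.MotionEvent
import android.view.View

// Standard long-press handling via GestureDetector. Per Google's note,
// this same onLongPress callback receives the faster, force-based press
// signal on supported devices, with no code changes required.
fun attachLongPress(context: Context, view: View, onPress: () -> Unit) {
    val detector = GestureDetector(context, object : GestureDetector.SimpleOnGestureListener() {
        override fun onLongPress(e: MotionEvent) {
            onPress()
        }
    })
    view.setOnTouchListener { _, event -> detector.onTouchEvent(event) }
}
```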