Tablet Prototype Senses Context to Provide Posture-Aware UI

[Image: a collage of seven pictures showing various display options from the posture-aware UI prototype]

The research paper “Sensing Posture-Aware Pen+Touch Interaction on Tablets” received an Honorable Mention award at CHI 2019. 

Lead author Yang Zhang, a fourth-year HCI Ph.D. student, conducted the research while interning at Microsoft Research in Redmond, Washington, last summer.

What is the idea of posture-aware tablet interaction mentioned in the paper's title, and why is it novel?

Computers weren’t always as mobile as tablets, and early machines allowed little flexibility in user posture: the computers were huge, the user interface (UI) was fixed, and users adapted to where things were on the screen.

As computing has continued to evolve, however, users can now work on a tablet while sitting at a desk or lying in repose, hold it with one hand or two, and interact on screen by touch or by pen, to name just a few options in this new era of computing.

While most tablet interfaces today still force users to react to the computer’s device-centric UI, this posture-aware research explores adapting the tablet’s behaviors and controls to how the user is actually holding and using the device. For example, menu options and panels appear in convenient, hand-centric locations instead of in a predetermined, faraway corner of the screen.

The system automatically senses and makes adjustments based on a variety of reference frames, both egocentric (such as body, arm, hand, and grip positions) and exocentric (world- and device-centric).
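
To make this concrete, the sketch below models a posture context that combines egocentric and exocentric frames. It is an illustrative data model only; the type and field names are assumptions for this article, not taken from the paper's prototype.

```typescript
// Illustrative data model only; these names are assumptions, not the
// paper's actual code. Egocentric frames describe the user's body and
// hands; exocentric frames describe the device relative to the world.
type Handedness = "left" | "right" | "unknown";

interface EgocentricFrame {
  gripSide: Handedness;       // which bezel the holding hand grips
  reachDirection: number;     // radians: direction of the arm reaching onto the screen
  palmOnScreen: boolean;      // palm resting on the display while writing or sketching
  palmOrientation?: number;   // radians: orientation of the resting palm, if present
}

interface ExocentricFrame {
  tiltDegrees: number;        // tablet angle relative to the world (flat, propped, held up)
  onFlatSurface: boolean;     // lying on a desk vs. held in the hands
}

interface PostureContext {
  ego: EgocentricFrame;
  exo: ExocentricFrame;
}
```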

The posture-sensing prototype was built on a Microsoft Surface Book augmented with sensors that capture the background context of each interaction. These added sensing capabilities measure the nuances of the user’s grip, the angle of the tablet, the presence and orientation of the user’s palm on the screen while writing or sketching, and the direction from which the user reaches toward the screen.
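
As one hedged example of how such signals could drive the hand-centric placement described above, the sketch below (reusing the PostureContext type from the previous sketch) anchors a menu within reach of the sensed gripping hand rather than at a fixed corner. The logic and constants here are assumptions for illustration, not the paper's actual placement behavior.

```typescript
// Hypothetical placement logic (not the paper's implementation):
// anchor a menu near the gripping thumb instead of a fixed corner.
const EDGE_MARGIN = 120; // px from the gripped edge; an assumed constant

function menuAnchor(
  ctx: PostureContext,
  screenWidth: number,
  screenHeight: number
): { x: number; y: number } {
  switch (ctx.ego.gripSide) {
    case "left":
      // Left-hand grip: keep controls near the left bezel, at thumb height.
      return { x: EDGE_MARGIN, y: screenHeight / 2 };
    case "right":
      return { x: screenWidth - EDGE_MARGIN, y: screenHeight / 2 };
    default:
      // No grip sensed: fall back to a conventional device-centric corner.
      return { x: screenWidth - EDGE_MARGIN, y: EDGE_MARGIN };
  }
}
```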

“This project was a really fun and exciting summation of some ideas about tablet interaction that we'd had brewing for a long time,” said Ken Hinckley, Principal Researcher & Manager of the EPIC (Extended Perception, Interaction, and Cognition) Research Group at Microsoft Research. “I could even point to sketches of some of the concepts that appeared in my notebooks over 10 years ago. But we could never realize them until Yang showed up, banged out some amazing hardware within just a couple of weeks, and then got going on some equally amazing software that showed just how much latent potential there is in a few simple sensors that fill in the critical missing information to make interaction with devices truly more natural: the context of use. Because the nature of people's intent, activity, and expectations changes as they shift between one posture of use and another.”

Zhang said he had a great experience during his first internship with Microsoft Research and enjoyed the summer there.

“Most of their research focuses on domains where research has great potential to be applied to products and affect millions of users. I learned quite a lot from working with their researchers. Also, since Microsoft Research is one of the top places where HCI students would go for internships, I made a lot of friends there,” said Zhang.

Other contributors to this research paper include fellow Carnegie Mellon University HCI Ph.D. student Gierad Laput, as well as Michel Pahud, Christian Holz, Haijun Xia, Michael McGuffin, Xiao Tu, Andrew Mittereder, Fei Su, William Buxton, and Ken Hinckley.

This work will be presented at CHI 2019 on Wednesday, May 8 at 11:00 a.m. in the Paper Session: “Mobile Interactions.”