DUBAI: Soon, smartwatches could detect and capture the staggering range of human hand motions and store that information for a host of uses.
###
Researchers from the US-based Carnegie Mellon University, after making a few changes to the smartwatch’s operating system, were able to use its accelerometer (a built-in sensor that measures acceleration forces) to recognise hand motions across 25 activities (see the list at the end of this article).
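###
How such recognition might work is not detailed in the article, but a common pattern in accelerometer-based activity sensing is to slice the motion stream into short windows, compute simple statistical and frequency features, and feed them to a classifier. The Python sketch below is only an illustration of that pattern under assumed parameters (100Hz sampling, two-second windows, synthetic data, a random-forest model); it is not the CMU pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# All parameters here are assumptions for illustration: 100 Hz sampling,
# 2-second windows, simple features, a random forest. The CMU pipeline
# is not described in this article.
SAMPLE_RATE_HZ = 100
WINDOW = 2 * SAMPLE_RATE_HZ          # 200 samples of (x, y, z) acceleration

def featurize(window: np.ndarray) -> np.ndarray:
    """Turn one (200, 3) accelerometer window into a flat feature vector."""
    spectrum = np.abs(np.fft.rfft(window, axis=0))[1:9]   # low-frequency bands
    return np.concatenate([window.mean(axis=0),
                           window.std(axis=0),
                           spectrum.ravel()])

# Synthetic stand-in data for two hypothetical classes.
rng = np.random.default_rng(0)
still = rng.normal(0.0, 0.05, size=(50, WINDOW, 3))       # hands at rest
claps = rng.normal(0.0, 0.05, size=(50, WINDOW, 3))
claps[:, ::20, :] += 2.0                                   # periodic impact spikes
X = np.array([featurize(w) for w in np.concatenate([still, claps])])
y = ["hands_still"] * 50 + ["clapping"] * 50

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(clf.predict([featurize(claps[0])]))                  # -> ['clapping']
```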
###
This is just the beginning of what might be possible to detect in terms of hand movements and what they could signify [about your habits, skills and health], according to the researchers.
###
“We envision smartwatches as a unique beachhead on the body for capturing rich, everyday activities,” said Chris Harrison, Assistant Professor at the Human-Computer Interaction Institute (HCII), Carnegie Mellon, in an exclusive email interview with Gulf News.
###
If the watch can know that you haven’t been washing your hands frequently enough, it can remind you to do so. Same with brushing teeth.
– Gierad Laput | PhD student, Human Computer Interaction Institute, Carnegie Mellon University###
Harrison and HCII PhD student Gierad Laput presented their findings at CHI 2019, a leading international conference on human-computer interaction, in Glasgow, Scotland, this month.
###
Research in activity sensing is a long-standing area, but much of it has “been largely stuck at ambulatory states (walking, standing, sleeping, etc.) for decades,” said Harrison.
###
There are a host of apps that track all these activities.
###
But full-body movements are not the only movements that engage us, he said. Think about it: you could be standing perfectly still, yet your hands could be chopping vegetables.
###
What if computing systems could know the activity of both the body and the hands? Could we then look forward to more advanced applications?
###
Yes, said Laput.
###
Techwear with a purpose
###
Hands central to human experience
###
Hands, said Harrison, are central to the human experience and, as such, have been the focus of inquiry across many fields, including paleontology, anatomy, linguistics and neuroscience, to name just a few.
###
“Ethnographic work has studied how hands are employed in everything from domestic life to industrial settings. Unfortunately, much of this seminal research was completed before computing was common,” Harrison said.
###
“Thus, as a starting point to our research, we asked two key questions: 1) What activities do humans perform with their hands in the modern world? Armed with such a list, we hoped to focus our technical efforts and better understand how recognition of these activities could be valuable in a computationally-enhanced setting. 2) Do different hand activities generate characteristic signals? In other words, are hand activities distinct and separable?”
###
Hand action and hand activity: differences
###
“We draw an important distinction between hand actions versus hand activities,” said Harrison. “Specifically, we define a hand activity as a sustained series of related hand actions, typically lasting seconds or minutes. For example, a single clap would be a hand action, whereas a series of claps would be the [hand] activity of clapping.”
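###
In software terms, that distinction can be handled by smoothing a run of short, per-window “action” predictions into a single sustained “activity” label, for instance by majority vote over a longer span. The snippet below is a generic sketch; the label names and the 60 per cent threshold are assumptions, not details from the paper.

```python
from collections import Counter

def to_activity(window_predictions, min_share=0.6):
    """Collapse a run of per-window hand-action labels into one hand activity.

    A single clap window is just an action; only when claps dominate a longer
    span do we report the sustained activity of clapping. The 60% threshold
    is an arbitrary choice for illustration.
    """
    label, count = Counter(window_predictions).most_common(1)[0]
    return label if count / len(window_predictions) >= min_share else "unknown"

print(to_activity(["clap"] * 7 + ["hands_still"] * 3))      # -> clap (clapping)
print(to_activity(["clap", "typing", "writing", "clap"]))   # -> unknown
```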
###
The interesting thing about such data, according to Harrison, is that hand activity is often independent of body activity. For example, one can type on their smartphone (hand activity) while walking (body activity); or sip water from a bottle while jogging.
###
Don’t interrupt me now!
###
The ability to interpret hand activities, Harrison said, could be valuable in augmenting methods that gauge ‘human interruptibility’. For example, a system that knows what your hands are doing can intelligently avoid interruptions.
###
Just as smartphones now can block text messages while a user is driving, future devices that sense hand activity might learn not to interrupt someone while they are doing certain work, he said.
“We believe there is great value in knowing what activities the hands are engaged in to support assistive computational experiences,” he said. “In this paper, we investigated the feasibility of such sensing using commodity smartwatches, which are an immediately practical means for achieving this vision.”
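###
As a toy illustration of such an interruptibility rule (the activity names and the policy below are assumptions, not part of the CMU system), a notification manager could simply hold non-urgent alerts while the detected hand activity is on a “busy” list:

```python
# Hypothetical do-not-interrupt rule, for illustration only.
BUSY_HAND_ACTIVITIES = {"typing_on_keyboard", "writing_with_pen",
                        "chopping_with_knife", "operating_tool"}

def should_defer_notification(current_hand_activity: str) -> bool:
    """Hold non-urgent notifications while the hands are engaged in focused work."""
    return current_hand_activity in BUSY_HAND_ACTIVITIES

print(should_defer_notification("chopping_with_knife"))  # True
print(should_defer_notification("hands_still"))          # False
```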
###
To reach this conclusion, Harrison and his team began their exploration of hand activity detection by recruiting 50 people to wear specially programmed smartwatches while going about their daily activities, capturing almost 1,000 hours of data and yielding a unique dataset.
###
How will it help?
###
The importance of hand activity detection [through a smartwatch], said Harrison, lies in the fact that it can be used for a host of purposes. The ability to detect, for example, hand washing and toothbrushing, can be used to nudge people towards healthier habits, said Laput. “If the watch can know that you haven’t been washing your hands frequently enough, it can remind you to do so. Same with brushing teeth, or other types of health-related activities.”
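###
A reminder of that kind needs little more than the timestamp of the last detected hand-washing activity and a threshold. The sketch below is a minimal illustration; the two-hour interval is an arbitrary assumption.

```python
from datetime import datetime, timedelta

REMIND_AFTER = timedelta(hours=2)   # assumed interval, purely illustrative

def needs_handwash_reminder(last_wash: datetime, now: datetime) -> bool:
    """Nudge the wearer if no 'washing hands' activity was detected recently."""
    return now - last_wash > REMIND_AFTER

last_detected_wash = datetime(2019, 5, 20, 9, 0)
print(needs_handwash_reminder(last_detected_wash, datetime(2019, 5, 20, 12, 0)))  # True
```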
###
The data can even be used to “identify the onset of harmful health patterns (e.g., Repetitive Strain Injury (RSI)),” said Harrison. “Apps might alert users to … assess the onset of motor impairments such as those associated with Parkinson’s disease.
###
“For example, [sequential] movements such as filling a kettle, turning on the stove, and then pouring the kettle, can be indicative of the user making a cup of tea. However, if these hand activities occur out of order (e.g., pouring water into the cup before boiling it), it could suggest, for example, the onset of dementia,” said Harrison.
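###
One simple way to flag such out-of-order routines is to compare the order in which expected steps are detected against the expected sequence. The sketch below uses hypothetical step names for the tea-making example; real screening for conditions such as dementia would of course require far more than this.

```python
def follows_expected_order(detected, expected):
    """Check whether the expected steps that were detected appear in the right order."""
    order = {step: i for i, step in enumerate(expected)}
    ranks = [order[step] for step in detected if step in order]
    return ranks == sorted(ranks)

expected_tea = ["fill_kettle", "turn_on_stove", "pour_kettle"]
print(follows_expected_order(["fill_kettle", "turn_on_stove", "pour_kettle"], expected_tea))  # True
print(follows_expected_order(["pour_kettle", "fill_kettle", "turn_on_stove"], expected_tea))  # False
```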
###
Hand-activity sensing also might be used by apps that provide feedback to users who are learning a new skill, such as playing a musical instrument, the study noted.
###
For example, watch data capturing an expert’s hand movements (a piano maestro’s, say) could be compared with watch data from a learner. “This way, you can build an app to evaluate whether your piano performance is at a ‘pro-level’,” said Laput.
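###
The article does not say how that comparison would be made. One straightforward option is to align the two motion traces with dynamic time warping, so that differences in tempo between the expert and the learner do not dominate the score. The sketch below assumes pre-computed per-second motion-intensity values and is purely illustrative; a lower score means the learner’s movement is closer to the expert’s.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic-time-warping distance between two 1-D motion-intensity traces.

    Tempo differences are absorbed by the warping, so the score reflects
    the shape of the movement rather than its speed.
    """
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

# Per-second accelerometer "energy" traces (synthetic, illustrative only).
expert = np.array([0.1, 0.8, 0.9, 0.2, 0.7, 0.1])
learner = np.array([0.1, 0.1, 0.7, 0.9, 0.3, 0.6, 0.2])
print(f"expert-vs-learner DTW score: {dtw_distance(expert, learner):.2f}")
```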
###
Terming this research “just a small window”, Harrison believes that much remains to be done, both in Human-Computer Interaction (HCI) and beyond.
###
Laput said they are grateful for the positive response the work has been receiving. However, “research like this requires partnership with companies that have the engineering manpower to fully bring it to reality,” he said.
###
Research: The to-do list
###
The 25 human hand activities studied by Harrison and Laput for their research paper:
###
- A) Hands still.
- B) Scrolling on touch screen.
- C) Typing on keyboard.
- D) Using mouse.
- E) Tapping on touchscreen.
- F) Playing the piano.
- G) Brushing hair.
- H) Swaying.
- I) Writing with pen.
- J) Cutting with scissors.
- K) Operating tool (drill).
- L) Operating remote.
- M) Petting.
- N) Scratching.
- O) Clapping.
- P) Opening door.
- Q) Twisting.
- R) Pouring.
- S) Drinking.
- T) Grating.
- U) Chopping with knife.
- V) Wiping with rag.
- W) Washing utensils.
- X) Washing hands.
- Y) Brushing teeth (not in the graph).
###
— (With inputs from IANS)