Hap2Gest: An Eyes-Free Interaction Concept with Smartphones Using Gestures and Haptic Feedback
Distributed under a Creative Commons Attribution 4.0 International License.
Author:
Jamalzadeh, Milad; Rekik, Yosra; Dancu, Alexandru; Grisoni, Laurent
Subjects:
Computer Science
Description:
Smartphones are used in different contexts, including scenarios where the visual and auditory modalities are limited (e.g., walking or driving). In this context, we introduce a new interaction concept, called Hap2Gest, with which users can issue commands and retrieve information, both eyes-free. First, it uses a gesture as input for command invocation; then, output information is retrieved through haptic feedback perceived while the user draws an output gesture. We conducted an elicitation study with 12 participants to determine users' preferences for these gestures and the vibration patterns for 25 referents. Our findings indicate that users tend to use the same gesture for input and output, and that there is a clear relationship between the type of gestures and vibration patterns users suggest and the type of output information. We show that the agreement rate for the gesture's speed profile is significantly higher than the agreement rate for the gesture's shape, and that the speed profile can be used by the recognizer when the shape agreement rate is low. Finally, we present a complete set of user-defined gestures and vibration patterns and address the gesture recognition problem.
Creation Date:
2023
Language:
English
Identifier:
DOI: 10.1007/978-3-031-42280-5_31
Source:
Hyper Article en Ligne (HAL) (Open Access)
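The abstract compares agreement rates across gesture proposals. In gesture elicitation studies this metric is conventionally computed per referent from the groups of identical proposals; the sketch below assumes the standard AR formula (Vatavu and Wobbrock's formulation) and uses hypothetical proposal data, since the paper's exact variant and data are not given here:

```python
from collections import Counter

def agreement_rate(proposals):
    """Agreement rate AR for a single referent:
        AR = sum_i |P_i|*(|P_i|-1) / (|P|*(|P|-1))
    where P is the list of all proposals for the referent and
    each P_i is a group of identical proposals within P."""
    n = len(proposals)
    if n < 2:
        # Degenerate cases: one proposal agrees trivially; none agree at all.
        return 1.0 if n == 1 else 0.0
    groups = Counter(proposals)
    return sum(k * (k - 1) for k in groups.values()) / (n * (n - 1))

# Hypothetical data: 12 participants propose a gesture for one referent.
proposals = ["circle"] * 8 + ["swipe"] * 3 + ["tap"]
print(round(agreement_rate(proposals), 2))  # → 0.47
```

A referent whose proposals all fall into one large group scores near 1.0, while fully disjoint proposals score 0.0, which is why a low shape agreement rate (as reported in the abstract) pushes the recognizer toward other features such as the speed profile.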