Metaphoraction: Support Gesture-based Interaction Design with Metaphorical Meanings

The Hong Kong University of Science and Technology

Paper · Code · Presentation

[Teaser figure: tochi_22_teaser]

We present Metaphoraction, a creativity support tool that formulates design ideas for gesture-based interactions. The tool assists designers in adding metaphorical meanings to interaction designs by interconnecting four components: gesture, action, object, and meaning. To represent interaction design ideas with these four components, Metaphoraction links interactive gestures to actions based on the similarity of appearances, movements, and experiences; relates actions to objects through immediate association; and bridges objects to meanings by leveraging metaphor TARGET-SOURCE mappings.

Abstract

Previous user experience research emphasizes meaning in interaction design beyond conventional interactive gestures. However, existing exemplars that successfully reify abstract meanings through interactions are usually case-specific, and it is currently unclear how to systematically create or extend meanings for general gesture-based interactions. We present Metaphoraction, a creativity support tool that formulates design ideas for gesture-based interactions to show metaphorical meanings with four interconnected components: gesture, action, object, and meaning. To represent interaction design ideas with these four components, Metaphoraction links interactive gestures to actions based on the similarity of appearances, movements, and experiences; relates actions to objects through immediate association; and bridges objects to meanings by leveraging metaphor TARGET-SOURCE mappings. We build a dataset containing 588,770 unique design idea candidates through surveying related research and conducting two crowdsourced studies to support meaningful gesture-based interaction design ideation. Five design experts validate that Metaphoraction can effectively support creativity and productivity during the ideation process. The paper concludes by presenting insights into meaningful gesture-based interaction design and discussing potential future uses of the tool.

The components of Metaphoraction

[Figure: tochi_22_components]

An overview of Metaphoraction construction with four components: gesture, action, object, and meaning.

Specifically, we define:

  • Gesture: an operation people perform to activate functions or effects in a digital system;
  • Action: a process or movement to achieve a particular thing in everyday life;
  • Object: a thing to which a specified action is directed; and
  • Meaning: a message that is intended, expressed, or signified.
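To make the four-component representation concrete, here is a minimal sketch of how one design idea candidate could be modeled as a record. The class name, field names, and example values are our own illustration, not the tool's actual implementation.

from dataclasses import dataclass

@dataclass(frozen=True)
class DesignIdea:
    """One design idea candidate: a gesture reinterpreted as an everyday
    action on an object, which in turn signifies a metaphorical meaning."""
    gesture: str  # e.g., "pinch": an operation performed in a digital system
    action: str   # e.g., "squeeze": an everyday movement resembling the gesture
    obj: str      # e.g., "sponge": the thing the action is directed at
    meaning: str  # e.g., "absorb": the message signified by the object

# Illustrative candidate (values are hypothetical, not taken from the dataset)
idea = DesignIdea(gesture="pinch", action="squeeze", obj="sponge", meaning="absorb")
print(idea)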

Research pipeline

[Figure: tochi_21_pipeline]

An overall research pipeline of Metaphoraction.

RP1: We conduct a literature survey to compile a list of 52 widely accepted, lightweight interactive gestures supported by commercially available devices or off-the-shelf wearable/mobile sensors.

RP2: We carry out a crowdsourced study to translate each interactive gesture into a set of everyday actions performed on associated objects, as if the digital system were not present.

RP3: We exploit a large metaphor dataset mined from web resources to infer the possible meanings conveyed by those objects.
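As a rough sketch of how the mappings built in RP1-RP3 could be chained to enumerate design idea candidates, the code below walks gesture -> action -> object -> meaning over toy mappings. The dictionary contents are invented for illustration and do not come from the actual dataset.

# Toy fragments of the three mappings built in RP1-RP3 (invented for illustration).
gesture_to_actions = {"pinch": ["squeeze", "pick up"]}
action_to_objects = {"squeeze": ["sponge", "lemon"], "pick up": ["coin"]}
object_to_meanings = {"sponge": ["absorb"], "lemon": ["freshness"], "coin": ["value"]}

def enumerate_candidates(gesture):
    """Chain gesture -> action -> object -> meaning to yield idea candidates."""
    for action in gesture_to_actions.get(gesture, []):
        for obj in action_to_objects.get(action, []):
            for meaning in object_to_meanings.get(obj, []):
                yield (gesture, action, obj, meaning)

for candidate in enumerate_candidates("pinch"):
    print(candidate)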

RP4: We validate the interface design of Metaphoraction with university students who have design backgrounds, and conduct a design workshop with design experts from a local company.


Application

[Figure: tochi_21_interface]

The interface of Metaphoraction. ➊ is the search box; ➋ is the Sankey diagram visualizing the possible design idea candidates; ➌ is the control panel; ➍ is the node operation panel; ➎ is the description view.

This creativity support tool aims to fulfill the following tasks:

T1: Support searching for arbitrary keywords in any component;

T2: Demonstrate the relationship between different components based on query results;

T3: Enable multi-directional and multifaceted exploration of all the components; and

T4: Track browsing history for user reference.
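As a hedged sketch of T1 and T2, the snippet below searches candidates by keyword in any component and aggregates the matches into weighted component-to-component links that a Sankey diagram could draw. The data and function names are assumptions for illustration only, not the tool's actual code.

from collections import Counter

# Candidates as (gesture, action, object, meaning) tuples; illustrative data only.
candidates = [
    ("pinch", "squeeze", "sponge", "absorb"),
    ("pinch", "squeeze", "lemon", "freshness"),
    ("swipe", "wipe", "window", "clarity"),
]

def search(keyword, ideas):
    """T1: return candidates in which any component contains the keyword."""
    keyword = keyword.lower()
    return [idea for idea in ideas if any(keyword in part.lower() for part in idea)]

def sankey_links(ideas):
    """T2: count gesture->action, action->object, and object->meaning links
    so they can be rendered as weighted flows in a Sankey diagram."""
    links = Counter()
    for gesture, action, obj, meaning in ideas:
        links[(gesture, action)] += 1
        links[(action, obj)] += 1
        links[(obj, meaning)] += 1
    return links

matches = search("pinch", candidates)
print(matches)
print(sankey_links(matches))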

The data and code are available on GitHub.


Reflections

This work presents Metaphoraction, a creativity support tool for gesture-based interaction design with metaphorical meanings. The tool supports designers in systematically exploring the metaphorical meanings of gesture-based interactions: it provides a web interface for exploring design idea candidates composed of four distinct components, i.e., gesture, action, object, and meaning. We connect these four components in three steps. First, we conduct a literature survey of 71 papers on commonly adopted upper-body mobile/wearable interactive gestures. Second, we invite crowd workers to translate those interactive gestures into daily actions and associated objects, based on similar appearances, movements, and experiences when performing those actions. Third, we explore the potential extended meanings of those objects through metaphorical mappings, with probabilities predicted from crowdsourced ratings. Experts from our design workshop suggest that Metaphoraction can support the ideation of meaningful gesture-based interactions with improved productivity and creativity. We conclude with insights into meaningful interaction design and future work that could further empower interaction designers.
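Since the mappings carry probabilities predicted from crowdsourced ratings, one plausible way to rank candidates is to combine the per-link strengths. The multiplication rule below is our assumption for illustration, not the method described in the paper.

def candidate_score(gesture_action_sim, action_object_assoc, object_meaning_prob):
    """Combine the three link strengths into a single ranking score.
    Inputs are assumed to be normalized to [0, 1]; multiplying them favors
    candidates that are strong on every link in the chain."""
    return gesture_action_sim * action_object_assoc * object_meaning_prob

# Illustrative ratings only, e.g., similarity, association, and metaphor probability.
print(candidate_score(0.8, 0.9, 0.6))  # 0.432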


Gesture Survey

The survey results of 52 common mobile/wearable interactive gestures, grouped by their trajectories. Names in "()" are alternative names used in the surveyed papers. Notes in "[]" of the Name column indicate how gestures sharing the same name differ when performed.

Moreover, we annotate the 52 common mobile/wearable interactive gestures with their type and available sensor categorization, as identified from previous research.

[Table: each gesture is listed with its ID, Name, Bodyparts, Front View, Side View, Type (T/V/D), and Sensor Categorization (1-7).]

BibTeX


@article{10.1145/3511892,
  author = {Sun, Zhida and Wang, Sitong and Liu, Chengzhong and Ma, Xiaojuan},
  title = {Metaphoraction: Support Gesture-based Interaction Design with Metaphorical Meanings},
  year = {2022},
  issue_date = {October 2022},
  publisher = {Association for Computing Machinery},
  address = {New York, NY, USA},
  volume = {29},
  number = {5},
  issn = {1073-0516},
  url = {https://doi.org/10.1145/3511892},
  doi = {10.1145/3511892},
  journal = {ACM Trans. Comput.-Hum. Interact.},
  month = {oct},
  articleno = {45},
  numpages = {33},
  keywords = {user experience, wearable gesture, mobile gesture, metaphor, creative support tool, Gesture-based interaction design}
}