The components of Metaphoraction
An overview of Metaphoraction construction with four components: gesture, action, object, and meaning.
Specifically, we define the four components as follows (a minimal data sketch follows this list):
- Gesture: an operation people perform to activate functions or effects in a digital system;
- Action: a process or movement performed to accomplish something in everyday life;
- Object: a thing to which a specified action is directed; and
- Meaning: a message that is intended, expressed, or signified.
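To make the relationship between these components concrete, here is a minimal data sketch in Python. The class name, fields, and example values are our own illustration, not Metaphoraction's actual schema.

```python
from dataclasses import dataclass

@dataclass
class DesignIdeaCandidate:
    """Illustrative record linking the four components; not the tool's actual data model."""
    gesture: str   # operation on the digital system, e.g., "pinch"
    action: str    # everyday movement resembling the gesture, e.g., "squeezing"
    obj: str       # the thing the action is directed at, e.g., "a lemon"
    meaning: str   # the metaphorical message signified, e.g., "freshness"

# One hypothetical candidate chaining gesture -> action -> object -> meaning.
candidate = DesignIdeaCandidate("pinch", "squeezing", "a lemon", "freshness")
```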
Research pipeline
The overall research pipeline of Metaphoraction.
RP1: We conduct a literature survey to compile a list of 52 widely accepted, lightweight interactive gestures supported by commercially available devices or off-the-shelf wearable/mobile sensors.
RP2: We carry out a crowdsourced study to translate each interactive gesture into a set of daily actions performed on associated objects, imagining the gesture without the digital system present.
RP3: We exploit a large metaphor dataset mined from web resources to infer the possible messages conveyed by those objects, i.e., their metaphorical meanings (see the sketch after this list).
RP4: We validate the interface design of Metaphoraction with university students who have design backgrounds and conduct a design workshop with design experts from a local company.
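As a rough sketch of how RP3 could work in code, the snippet below ranks candidate meanings for an object using a metaphor dataset and crowdsourced ratings. The data structures, the 1-to-5 rating scale, and the normalization are assumptions for illustration, not the paper's exact method.

```python
def infer_meanings(obj, metaphor_dataset, ratings):
    """Rank candidate meanings for an object (illustrative only; not the paper's exact method)."""
    # metaphor_dataset: dict mapping an object (source domain) to candidate meanings (target domains)
    # ratings: dict mapping (object, meaning) pairs to lists of crowd ratings on a 1-5 scale (assumed)
    candidates = metaphor_dataset.get(obj, [])
    scored = []
    for meaning in candidates:
        votes = ratings.get((obj, meaning), [])
        # Normalize the mean rating into a rough probability-like score in [0, 1].
        score = (sum(votes) / len(votes)) / 5.0 if votes else 0.0
        scored.append((meaning, score))
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Example with made-up data: "lemon" might map to "freshness" or "sourness".
metaphors = {"lemon": ["freshness", "sourness", "defectiveness"]}
crowd = {("lemon", "freshness"): [5, 4, 4], ("lemon", "sourness"): [3, 3]}
print(infer_meanings("lemon", metaphors, crowd))
```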
Application
The interface of Metaphoraction. ➊ is the search box; ➋ is the Sankey diagram visualizing the possible design idea candidates; ➌ is the control panel; ➍ is the node operation panel; ➎ is the description view.
This creativity support tool aims to fulfill the following tasks:
T1: Support searching for arbitrary keywords within any of the four components;
T2: Demonstrate the relationships among components based on query results (see the sketch after this list);
T3: Enable multi-directional and multifaceted exploration of all the components; and
T4: Track browsing history for user reference.
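The Sankey diagram in ➋ essentially visualizes chains of gesture, action, object, and meaning returned by a query (T2). Below is a hedged sketch of how query results might be converted into node/link data for such a diagram; the dictionary keys and the node/link format are our assumptions, not the tool's actual implementation.

```python
def candidates_to_sankey(candidates):
    """Convert gesture-action-object-meaning candidates into node/link lists
    for a Sankey diagram (illustrative structure; the tool's format may differ)."""
    nodes, links = [], []
    index = {}

    def node_id(label, component):
        # Reuse a node if the same label already appears in the same component column.
        key = (component, label)
        if key not in index:
            index[key] = len(nodes)
            nodes.append({"label": label, "component": component})
        return index[key]

    for c in candidates:
        chain = [("gesture", c["gesture"]), ("action", c["action"]),
                 ("object", c["object"]), ("meaning", c["meaning"])]
        # Add one link per adjacent pair in the gesture -> action -> object -> meaning chain.
        for (comp_a, label_a), (comp_b, label_b) in zip(chain, chain[1:]):
            links.append({"source": node_id(label_a, comp_a),
                          "target": node_id(label_b, comp_b),
                          "value": c.get("score", 1)})
    return nodes, links

# Example query result (made-up): one candidate chain.
nodes, links = candidates_to_sankey([
    {"gesture": "pinch", "action": "squeezing", "object": "lemon",
     "meaning": "freshness", "score": 0.8},
])
```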
The data and code are available on GitHub.
Reflections
This work presents Metaphoraction, a creativity support tool for gesture-based interaction design with metaphorical meanings. The tool supports designers in systematically exploring the metaphorical meanings of gesture-based interactions: it provides a web interface for exploring design idea candidates composed of four distinct components, i.e., gesture, action, object, and meaning. We connect these four components through three steps. First, we conduct a literature survey of 71 papers on commonly adopted upper-body mobile/wearable interactive gestures. Second, we invite crowd workers to translate those interactive gestures into daily actions on associated objects, based on similarities in appearance, movement, and the experience of performing them. Third, we explore the potential extended meanings of those objects through metaphorical mappings, with probabilities predicted from crowdsourced ratings. Experts from our design workshop suggest that Metaphoraction supports the ideation of meaningful gesture-based interactions with improved productivity and creativity. We discuss how our insights evolved into a meaningful interaction design and present future work that could further empower interaction designers.