Browsing by Subject "Gesture"
Now showing 1–5 of 5
Item: Application of single and multi-touch gestures in a WebGL molecule viewer (2011-08)
Slininger, Andrew David; Bajaj, Chandrajit; Gaither, Kelly
The number of devices with touch input, such as smart phones, computers, and tablets, has grown extensively in recent years. Native applications on these devices have access to this touch and gesture information and can provide a rich, interactive experience. Web applications, however, lack a consistent and uniform way to retrieve touch and gesture input. With the quality and robustness of web applications continually growing and replacing native applications in many areas, a way to access and harness touch input is critical. This paper proposes two JavaScript libraries that provide a reliable and easy way for web applications to use touch efficiently and effectively. First, getTjs abstracts the gathering of touch events for most mobile and desktop touch devices. GenGesjs, the second library, receives this information and identifies gestures based on the touch input. Web applications can have this gesture information pushed to them as it is received, or instead request the most recent gestures when desired. An example of interfacing with both libraries is provided in the form of WebMol, a web application that allows for three-dimensional viewing of molecules using WebGL. Gestures from GenGesjs are translated to interactions with the molecules, providing an intuitive interface for users. Using both of these libraries, web applications can easily tap into touch input, resulting in an improved user experience regardless of the device.

Item: Arm-Hand-Finger Video Game Interaction (2012-02-14)
Logsdon, Drew Anthony
Despite the growing popularity and expansion of video game interaction techniques and research in the area of hand gesture recognition, the application of hand gesture video game interaction using arm, hand, and finger motion has not been extensively explored.
Most current gesture-based approaches to video game interaction neglect the use of the fingers, but including the fingers allows for more natural and unique interaction and merits further research. To implement arm-, hand-, and finger-based interaction for the video game domain, several problems must be solved, including gesture recognition, segmentation, hand visualization, and video game interaction that responds to arm, hand, and finger input. Solutions to each of these problems have been implemented. The potential of this interaction style is illustrated through the introduction of an arm-, hand-, and finger-controlled video game system that responds to players' hand gestures. It includes a finger-gesture recognizer as well as a video game system employing various interaction styles, consisting of a first-person shooter game, a driving game, and a menu interaction system. Several users interacted with and played these games, and this form of interaction proved especially suitable for real-time interaction in first-person games. This is perhaps the first implementation of its kind for video game interaction. Based on test results, arm, hand, and finger interaction is a viable form of interaction that deserves further research. This implementation bridges the gap between existing gesture interaction methods and more advanced virtual reality techniques. It successfully combines the solutions to each problem mentioned above into a single, working video game system. This type of interaction has proved more intuitive than existing gesture controls in many situations, and less complex to implement than a full virtual reality setup. It allows more control by using the hands' natural motion and allows each hand to interact independently. It can also be reliably implemented using today's technology. This implementation is a base system that can be greatly expanded upon.
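Both of the abstracts above turn on a gesture-identification step: raw touch coordinates come in, and a named gesture comes out. As a minimal illustration of that idea, the sketch below classifies a single touch stroke as a tap or a directional swipe. The function name and pixel thresholds are hypothetical; this is not the actual API of getTjs, GenGesjs, or the finger-gesture recognizer described here.

```javascript
// Hypothetical sketch of single-stroke gesture classification.
// Thresholds are illustrative, not taken from the works above.
const TAP_RADIUS = 10; // px: movement below this counts as a tap
const SWIPE_MIN = 50;  // px: minimum displacement to count as a swipe

function classifyGesture(start, end) {
  const dx = end.x - start.x;
  const dy = end.y - start.y;
  const dist = Math.hypot(dx, dy);
  if (dist < TAP_RADIUS) return 'tap';
  if (dist < SWIPE_MIN) return 'none'; // too far for a tap, too short for a swipe
  // The dominant axis decides the swipe direction
  if (Math.abs(dx) >= Math.abs(dy)) return dx > 0 ? 'swipe-right' : 'swipe-left';
  return dy > 0 ? 'swipe-down' : 'swipe-up';
}
```

In a browser, the `start` and `end` points would typically come from the standard `touchstart` and `touchend` DOM events; a library layered on top, as described above, can then push the resulting gesture names to the application.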
Many possibilities for future work can be applied to this form of interaction.

Item: One Butterfly: understanding interface and interaction design for multitouch environments in museum contexts (2010-05)
Whitworth, Erin Casey; Geisler, Gary; Francisco-Revilla, Luis
Museums can be perceived as stuffy and forbidding; web technologies can enable museums to expand access to their collections and counterbalance these perceptions. Museums are searching for new ways to communicate with the public to better make a case for their continued relevance in the digital information age. The emergence of multitouch computing, other diverse forms of digital access, and the popularization of the user experience challenge museum design professionals to synthesize the information-seeking experience that occurs across multiple computing platforms. As a means of addressing these issues, this Master's Report summarizes the One Butterfly design project. The project's goal was to create a design for a multitouch interface for federated search of Smithsonian collections. This report describes the project's three major phases. First, an idea for an interface was developed, and designs based on that idea were captured and clarified. Second, a formal review of related research was undertaken to ground these designs in the museum informatics, user interface design, and multitouch interaction design literatures. Finally, the report concludes with a review of and reflection on the designs and their underlying ideas in light of what was learned in the previous phases.

Item: "A thousand nuances of movement": the intersection of gesture, narrative, and temporality in selected mazurkas of Chopin (2012-05)
Fons, Margaret Ann; Hatten, Robert S.; Wheeldon, Marianne
It is no secret that Frédéric Chopin was fond of dance music. Dance genres—including the mazurka, polonaise, and waltz—dominate his oeuvre.
According to the Henle Urtext edition, Chopin penned fifty-seven mazurkas during his lifetime, writing in this genre more than any other. It is interesting, then, that the mazurkas seem to be one of Chopin's most historically misunderstood genres. In their haste to point out the mazurkas' seeming irregularities of rhythm, harmony, mode, accent pattern, and the like, critics both in Chopin's time and in more recent history often ignore two equally fundamental issues: (1) the relationship between Chopin's mazurkas and the dance of the same name, and (2) the manner in which that relationship might inform hermeneutic readings of the mazurkas. Surely the perceived "irregularities" were not employed haphazardly, but rather for specific expressive purposes. This essay aims to construct a model for embodied musical meaning as it pertains to Chopin's mazurkas by examining the intersection of gesture, narrative, and temporal theories. Drawing on Robert S. Hatten's (2004) and Alexandra Pierce's (2007) work on musical gesture, I will relate the steps of the danced mazurka to their abstract musical counterparts in Chopin's solo piano works and examine the affective connection between the physical steps and the musical gestures. I will then call upon the narrative theories of Michael Klein (2004) and Byron Almén (2008) and the temporal theories put forth by Jonathan D. Kramer (1973, 1996) and Judy Lochhead (1979) to construct a framework in which the musical gestures (and the expressive states they imply) interact to produce emergent meanings. Finally, I will present a gestural/narrative reading of Chopin's Mazurka in C# minor, op. 50, no. 3, which aims to demonstrate both the utility of my proposed theoretical model and the necessity of going back to the dance to grapple with issues of musical meaning in the mazurkas.

Item: The virtual observing agent in music: a theory of agential perspective as implied by indexical gesture (2015-08)
Gerg, Ian Wyatt; Hatten, Robert S.; Almén, Byron; Drott, Eric; Pearsall, Edward; Erk, Katrin
The human body is inseparable from our understanding of music. Through embodied cognition, listeners conceptualize music as performed action. We find evidence of this in our most fundamental musical language: "high" pitches resonate high in a singer's head, while "fast" rhythms resemble fast bodily movement. Scholars have followed the entailments of these metaphors in recent decades, developing theories of bodily gesture (Hatten 2004, Lidov 2005) and physical mimesis (Cox 2011). These hold that the bodily movement we hear in music can imitate the physical gestures we use in everyday communication (e.g., waving, nodding, bowing, or sighing). This has its own entailments; most fundamentally, it implies the presence of a virtual, human-like agent within the music, similar to the "virtual persona" theorized by Edward T. Cone (1974). In other words, in perceiving musical sounds as imitative of physical movement and gesture, we infer the presence of a virtual agent who enacts them. This dissertation extends these theories, demonstrating that musical gestures can be mimetic of indexical somatic movements—that is, bodily movements of pointing, looking, striving, and reaching. These indexical gestures suggest the presence of a virtual observing agent, which acts as a lens through which we, the listeners, can experience the interior world (diegesis) of a work. This leads us to embody a single, more individualized perspective on the musical representation.
I explore the implications of indexical gesture and perspective with an examination of music from the common practice period. Moreover, I bring the theory of virtual observing agency together with theories of musical narrative and emotion.