Joining Perspectives on Haptics
Online Documentation of Interdisciplinary Cluster Workshop
Why is the topic of haptics important to us humans, and what does this mean for the multitude of scientific disciplines represented in the Cluster? Over three dense workshop days, the participants explored the topic of haptics together through the lenses of design practice, history of knowledge, cultural studies, philosophy, and computer science. By experiencing demos, creating rapid prototypes, and conducting reflection experiments, they collectively developed a practice-based mode of exploration. Through this deliberately broad approach, the participants sought to unfold the layers that make up haptic experiences in order to develop a shared understanding and vocabulary of the term. This is the beginning of a critical exchange on haptics, with the goal of establishing a dynamic platform for MoA researchers around the haptic sense(s).
In this online documentation, the group shares their abstracts, the workshop findings and questions, and invites readers to join future collaborative formats and projects.
Participants
Mario Cypko
Dominic Eger Domingos
Johann Habakuk Israel
Sebastian Keppler
Sabine Marienberg
Thomas Ness
Mattis Obermann
Felix Rasehorn
Lucas Rodrigues
Robert Stock
Mareike Stoll
Kristin Werner
Hanna Wiesener
List of Topics
-> Haptic Taxonomy. How to Talk about Tactile and Haptic Dimensions Through Different Perspectives
-> Substitution and Transduction Matters: Filtering and Distributing the Senses
-> Sketching Tactile Experience - A Hands-On Design Workshop
-> Perspectives on Information Physicalisation and Tangible User Interfaces
-> Virtual Dissection
-> Substitution of Haptic Sensations by Visual Stimuli
You can find all abstracts below.
Haptic Taxonomy. How to Talk about Tactile and Haptic Dimensions Through Different Perspectives
Dominic Eger Domingos, Thomas Ness, Felix Rasehorn, Hanna Wiesener
Haptic experiences and haptic technologies are an integral part of our lives, yet it is challenging to agree on terms and words that describe or differentiate our experiences. The perception of haptic feedback depends on individual tactile experiences, mood, temperature, environment, and many more parameters, which makes haptics a difficult field to conduct experiments in.
Coming together from very different disciplines and cultures of research practice, we invited all participants to bring an object with them. This object should incorporate or motivate a certain interest in haptics and introduce the researchers themselves. With this object-based introduction round, we tried to leap over disciplinary boundaries, cultivate a discursive zone, and motivate new alignments, shortcuts, and perspectives.
But the object became more than just a narrative starting point: exploring it by hand, we shared reflections, tests, and experiences, and tried to formulate a shared vocabulary in the group, informed by each discipline.
During presentation and reflection, we recognized four layers of description that were commonly referenced.
Through methods of design thinking we started to cluster and document these reflections in a matrix. We defined four zones to channel certain properties, practices and experiences of haptics.
Materiality: What is tactile feedback, and how do we perceive the materiality of the object? What is its physical or digital materiality?
Feedback: Combining different senses, are there amplifications or disturbances of the tactile experience? What kind of action or feedback is triggered when interacting with the object?
Extensions: What kind of practice is motivated by touching or interacting with the object? Which body part is being extended, and how? Does the object enable a different sensorial outreach?
Symbolic: What is the symbolic dimension of the gesture or object? How do we differentiate the sense of touch from other senses, and how do we gain knowledge by combining all the senses? What experiences do we refer to when we try to describe tactile experiences? What kind of tacit knowledge is necessary to interpret the object?
By moving the object into one zone of reflection, new properties and perspectives were highlighted and made visible, which we shared and documented in the group. The video demonstrates how these categories can be used similarly to a board game: according to the position of the object in the diagram, the layer and frame of description is clearly defined. The schema functioned as a communication tool, and an aggregation of haptic references gradually emerged.
At the same time, a vocabulary was formed to describe and qualify haptic experiences and their character and interplay with the sense of touch - a »working vocabulary« for the interdisciplinary group and the next workshop phase. This can help to structure discussions and to collaboratively develop haptic experiences.
Substitution and Transduction Matters: Filtering and Distributing the Senses
In my presentation, I discuss how the senses – and in particular the sense of touch – are decentered and denaturalized through their integration into larger technological frameworks. To understand the ways in which haptics and touch are re-contextualized and technologically shaped, I describe their distributed character with regard to socio-technical arrangements and the processes of filtering and translation enacted by specific – consumer and medical – devices. To begin, I map the state of the art of research scrutinizing haptics and touch, emphasizing a shift from historical and anthropological research to a thorough consideration of the senses in their intimate association with analogue and digital technologies. Considering the senses as integral to and shaped by socio-technical arrangements is key in such a perspective. This diagnosis also arises in relation to our present predicament: the technologies, programs, and devices accompanying us are becoming increasingly sensory and active. In a word: they touch us.
Since the 1990s, and with the advent of novel developments in human-computer interaction, we have witnessed the proliferation and ubiquitous availability of haptic technologies that frame our contemporary condition as a »haptic age« (Jütte: 238), as some scholars would put it. Against this background, I am interested in the historical relations of haptics and vision (or rather the non-visual), which in the second half of the 20th century – and up to the present – were informed by cybernetic concepts of the body and sensory processes. How, for instance, is visual information translated, e.g. through haptic displays? To address this question, I describe a history of the sensory substitution devices (SSDs) introduced by the neuroscientist Paul Bach-y-Rita (1934-2006). From a history of science and knowledge perspective, such systems raise several questions. First, SSDs are crucial in rethinking the brain as plastic, because blind test subjects of a »tactile television« system (1969) or a »tongue display« (1998) learned to translate haptic input as visual impressions, thus indicating the plasticity of the brain and the general possibility of redirecting sensory and neural processes (Eagleman). Second, SSDs raise the question of what a sense modality actually is and how the different senses are interconnected in complex ways (Paterson). Third, SSDs show in an intriguing way how sensory processes are programmed and distributed in socio-technical systems through acts of non-/human delegation. Hence, they indicate the inextricable entanglement of the technological, the social, and the organic. By doing so, they point to Barad’s insight that matter is active: »touching, sensing, is what matter does, or rather, what matter is: matter is condensations of response-ability. Touching is a matter of response.« (Barad 2014: 161)
However, it seems rather reductionist to consider only experimental lab settings – that is, a very limited set of sensescapes and environments. With Haraway we can acknowledge that by enacting touch through medical or ubiquitous haptic technologies, we are also getting in contact with larger social, political, economic, and technological debates situated in a frictioned and endangered world: »[…] we are inside the histories of IT engineering, electronic product assembly-line labor, mining and IT waste disposal, plastics research and manufacturing, transnational markets, communications systems, and technocultural consumer habits« (Haraway: 6).
References
Bach-y-Rita, P., C. C. Collins, F. A. Saunders, B. White, and L. Scadden. 1969. »Vision Substitution by Tactile Image Projection.« Nature 221 (5184): 963–64. https://doi.org/10.1038/221963a0.
Bach-y-Rita, Paul, Kurt A. Kaczmarek, Mitchell E. Tyler, and Jorge Garcia-Lara. 1998. »Form Perception with a 49-Point Electrotactile Stimulus Array on the Tongue: A Technical Note.« Journal of Rehabilitation Research and Development 35: 427–30.
Bach-y-Rita, Paul, Mitchell E. Tyler, and Kurt A. Kaczmarek. 2003. »Seeing with the Brain.« International Journal of Human–Computer Interaction 15 (2): 285–95. https://doi.org/10.1207/S15327590IJHC1502_6.
Bach-y-Rita, Paul. 1972. Brain Mechanisms in Sensory Substitution. New York: Academic Press.
Barad, Karen. 2017. »The Inhuman That Therefore I Am (V1.1).« In Power of Material - Politics of Materiality, edited by Kerstin Stakemeier and Susanne Witzgall, 153–61. Berlin: Diaphanes.
Eagleman, David. 2020. Livewired: The Inside Story of the Ever-Changing Brain. Edinburgh: Canongate.
Haraway, Donna. 2007. When Species Meet. Minnesota: University of Minnesota Press.
Jütte, Robert. 2005. A History of the Senses: From Antiquity to Cyberspace. Cambridge, UK, Malden, MA: Polity.
Parisi, David, Mark Paterson, and Jason Edward Archer. 2017. »Haptic Media Studies.« New Media & Society 19 (10): 1513–22. https://doi.org/10.1177/1461444817717518.
Paterson, Mark. 2016. Seeing with the Hands: Blindness, Vision, and Touch After Descartes. Edinburgh: Edinburgh University Press.
Sketching Tactile Experience - A Hands-On Design Workshop
Dominic Eger Domingos, Thomas Ness, Felix Rasehorn, Hanna Wiesener
Designing interactions requires research into human sensory perception, interpretation, and processing, as well as into the use of (symbolic) material to motivate certain actions. To bring this interplay into a meaningful relation, it is fundamental to investigate how human perception works and what we can derive from it for the design of analogue and digital systems.
In preparation for the workshop, we developed a prototype consisting of a hardware device with a computer mouse, a microcontroller, two vibration motors, custom 3D-printed parts, and simple, quickly modifiable software snippets. On a technical level, the setup maps simple grayscale images (graphics, heat maps, photos, drawings) to a virtual haptic landscape that can be experienced via a pen interface. The size of the landscape, the intensity of the feedback, the rhythm pattern, and all other parameters can be easily adjusted, and a wide range of tools can be used to quickly create a »landscape«.
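The core mapping of such a setup can be sketched as follows. This is a minimal illustrative reconstruction, not the actual workshop code: the function name, the plain-list image representation, and the gain parameter are our assumptions.

```python
# Sketch of the prototype's core idea (assumed reconstruction): a grayscale
# "landscape" is sampled at the pen position, and the pixel value is scaled
# to a PWM duty cycle for a vibration motor.

def vibration_duty(landscape, x, y, gain=1.0, max_duty=255):
    """Map the grayscale value under the pen (0-255) to a PWM duty cycle.

    landscape -- 2D list of grayscale rows (ints, 0 = black, 255 = white)
    x, y      -- pen position in pixel coordinates
    gain      -- feedback intensity, easily adjustable as in the setup
    """
    rows, cols = len(landscape), len(landscape[0])
    # Clamp the pen position so moving past the image edge causes no error.
    xi = min(max(int(x), 0), cols - 1)
    yi = min(max(int(y), 0), rows - 1)
    value = landscape[yi][xi]                  # brighter pixel -> stronger buzz
    return min(int(value * gain), max_duty)    # duty cycle sent to the motor

# A tiny 2x3 landscape with a bright ridge in the middle column:
landscape = [
    [0, 200, 0],
    [0, 255, 0],
]
print(vibration_duty(landscape, 1, 1))   # full vibration on the ridge
print(vibration_duty(landscape, 0, 0))   # no vibration off the ridge
```

In the physical setup, the returned duty cycle would be sent over serial to the microcontroller driving the motors; here the mapping itself is the point.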
For the participants, the workshop started with a short demonstration of the prototype's possibilities using sample programs and sample content, after which each group could directly start their own experiments, gain experience, and explore the space of possibilities together. In the ongoing discussions, ideas could then be made directly tangible and verifiable.
Despite the bugginess of the vibration motors and software, the lo-fi experimental setup invited participants to quickly reconfigure, adjust, or iterate on upcoming questions. The advantage of always having a physical output/demonstrator at hand in the discussion, as well as the possibility of such low-threshold and rapid development cycles for idea generation, enabled an inspiring atmosphere and lively discourse across very different disciplines.
In different experiments the workshop groups tested questions about sensorial capabilities, recognition or collaboration. The combination of tactile feedback with matching or diverging visual feedback led to very different results. Aspects of distributed perception and collaboration were explored by dividing sensory feedback between different parts of the body (left hand, right hand) or between different group members giving commands (symbolic). The experience with these setups led to a fruitful discussion about the inner hierarchy of sensory perception (tactile, visual, auditory) and sensemaking, and about the interplay between analogue and digital.
Perspectives on Information Physicalisation and Tangible User Interfaces
The domain of information visualisation is essentially based on the work of Edward Tufte (2001), who identified optimisation criteria for information visualisations, among them the effort to present as much information and as many ideas as possible in the smallest possible space, so that viewers can perceive them as quickly as possible. It has been shown that computer-assisted interactive visual representations of abstract data can assist human cognition in understanding them, for example by reducing search times and supporting pattern recognition (Card, Mackinlay, and Shneiderman 1999).
The literature on data physicalisation sees it primarily as a form of information visualisation using physical objects. The definition of Jansen et al. (2013) is central here, according to which a data physicalisation exists as soon as a physical artefact encodes data through its geometry or its material. Data physicalisations have a long history that goes back several thousand years. Examples include clay objects into which data was engraved by carving, knots tied in cords, and physical models of molecular structures. There is also a body of empirical evidence in the field of data physicalisation that supports the thesis that it aids human cognition. In terms of perception, one salient feature of data physicalisation is that it supports active perception, in which we use our motor skills to explore objects. Additionally, spatial depth perception is natively supported. Furthermore, it is precisely the non-visual senses that play an essential role; touching objects allows us to detect textures, stiffness, and temperatures that are not accessible in purely visual representations. Data physicalisation makes it possible to combine sensory modalities in a sensorially consistent manner that cannot be achieved in any other way; other media, such as virtual reality, must make an enormous technical effort to approximate it. Advantages of data physicalisations also arise in terms of availability and manipulability, which lead to higher user engagement.
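Jansen et al.'s definition — data encoded in an artefact's geometry or material — can be illustrated with a minimal sketch that maps a data series to the heights of bars for a 3D-printable physical bar chart. The function name and all dimensions are illustrative assumptions, not part of the cited work.

```python
# Illustrative encoding of data into geometry: each value becomes the
# height (in mm) of a bar in a physical bar chart. A minimum height keeps
# even the smallest bar printable and touchable.

def bar_heights_mm(values, max_height_mm=60.0, min_height_mm=2.0):
    """Linearly encode a data series as printable bar heights."""
    lo, hi = min(values), max(values)
    if hi == lo:                                   # flat data: mid-height bars
        return [max_height_mm / 2.0] * len(values)
    span = max_height_mm - min_height_mm
    return [min_height_mm + (v - lo) / (hi - lo) * span for v in values]

# Three data points become three touchable bars between 2 mm and 60 mm:
print(bar_heights_mm([10, 20, 40]))
```

Once such heights are realised in clay, wood, or a 3D print, the artefact's geometry itself carries the data — which is exactly the threshold at which, per Jansen et al., a physicalisation exists.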
Tangible user interfaces (Fitzmaurice, Ishii, and Buxton 1995; Hornecker and Buur 2006; Ishii and Ullmer 1997), which not only represent information but also implement application logic, go one step further. Their characteristic feature is that the representation of the data also serves to control it. Typical applications include token-based applications such as schedulers, musical instruments, and games.
References
Card, Stuart K., Jock D. Mackinlay, and Ben Shneiderman, eds. 1999. Readings in Information Visualization: Using Vision to Think. San Francisco, CA, USA: Morgan Kaufmann Publishers Inc.
Fitzmaurice, George, Hiroshi Ishii, and William Buxton. 1995. »Bricks: Laying the Foundations for Graspable User Interfaces«. Pp. 442–49 in SIGCHI Conference on Human Factors in Computing Systems (CHI ’95). Denver, Colorado, United States: ACM Press.
Hornecker, Eva, and Jacob Buur. 2006. »Getting a Grip on Tangible Interaction: A Framework on Physical Space and Social Interaction«. Pp. 437–46 in CHI 2006. ACM Press.
Ishii, Hiroshi, and Brygg Ullmer. 1997. »Tangible Bits: Towards Seamless Interfaces between People, Bits, and Atoms«. Pp. 234–41 in CHI ’97. Atlanta, Georgia.
Jansen, Yvonne, Pierre Dragicevic, and Jean-Daniel Fekete. 2013. »Evaluating the Efficiency of Physical Visualizations«. P. 2593 in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’13). New York, New York, USA: ACM Press.
Tufte, Edward R. 2001. The Visual Display of Quantitative Information. 2nd ed. Cheshire, Conn: Bertrams.
Virtual Dissection
Virtual Dissection investigates the creation of novel visuo-haptic interaction techniques in the exploration of volumetric datasets. We hypothesize that haptics might enhance the perception and comprehension of visually-undecipherable volumes. Our project aims to integrate haptics into data analysis platforms that are commonly used by tomographic imaging practitioners. An important preliminary step to the creation and integration of haptic rendering into the aforementioned platforms is to study the human haptic perception of particular digital material properties in the given context [1]. We aim to create a haptics design space that enables us to make informed decisions regarding the optimal encoding of radiodensity values as elucidative haptic stimuli. Therefore, Virtual Dissection is planning a series of psychophysics experiments to expand existing knowledge about haptic perception [2].
During the Haptics Workshop, we seized the opportunity to conduct a brief pilot in preparation for an upcoming study comparing different haptic rendering modalities and their associated haptic discrimination potential. Our pilot study was adapted to the workshop's time constraints and participants. Since our audience was composed of haptics-savvy researchers, we inferred that none of the participants would be naïve to our brief experiment's hypotheses. Thus, we focused on obtaining expert feedback and informed oral observations about the presented stimuli. Our experiment presented four unnamed semi-translucent container boxes of stochastic colors at randomized positions. We employed a grounded force-feedback device to track the movement of a physical stylus to guide a virtual probe in 3D space and receive appropriate forces originating from our simulation [3].
Whenever a participant moved the virtual probe into a box, it became opaque and contained a randomly-positioned object, which was occluded by the box container. While touching the box, the physical probe would communicate one haptic feedback effect, namely friction, vibration, or viscosity. The object within the box communicated a higher intensity of this same haptic effect, the disparity between inner and outer areas being consistently above the just-noticeable difference threshold [4]. One of the four boxes served as a control stimulus and did not contain an inside object, communicating the same haptic effect intensity throughout its volume. We asked participants to rate the stimuli boxes regarding haptic discrimination, agency, and ergonomics. Seven researchers participated in our pilot for an average time nearing ten minutes. Although our participant sessions were brief, the valuable feedback that we received revealed important flaws and points for improvement in our design. For example, we identified position effects that randomization did not address. In this case, haptic discrimination was impacted by stimulus position as it affected the probe's pose, which is partially due to our device's limited workspace. As another example, the need for stereoscopic rendering was evidenced as participants were unable to infer the depth of our stimuli cubes, which made them uncertain about the presence of haptic feedback when active touch was necessary to perceive forces [5].
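The criterion that the inner object's haptic intensity must lie above the just-noticeable difference relative to the surrounding box can be sketched with Weber's law [4]. This is a hedged simplification; the Weber fraction used here is illustrative, not the value from the pilot.

```python
# Weber's law sketch of the JND criterion: a change in stimulus intensity
# is discriminable once it exceeds a constant fraction k of the baseline
# intensity (delta_I > k * I). The fraction k below is an assumed example.

def above_jnd(baseline, target, weber_fraction=0.2):
    """True if the target intensity differs from the baseline by more
    than the just-noticeable difference."""
    jnd = weber_fraction * baseline          # Weber's law: delta-I = k * I
    return abs(target - baseline) > jnd

# A box vibrating at intensity 0.5 contains an object at a higher intensity.
# Which inner intensities would a participant reliably discriminate?
print(above_jnd(0.5, 0.55))   # within the JND: likely indistinguishable
print(above_jnd(0.5, 0.75))   # clearly above threshold: discriminable
```

Choosing inner/outer intensity pairs with such a check is one way to ensure, as in the pilot, that every non-control box contains a discriminable object.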
We also learned that the optimal encoding of objects through haptic rendering modalities is task-dependent, which implies that we should be seeking different ways to study haptics. The feedback we received surpassed our expectations and benefited us beyond our intended research questions. Since our participants stemmed from a variety of disciplinary fields, we received insights that deeply enriched our understanding of haptics. Given the productive results from this short workshop, we are looking forward to future events and interdisciplinary collaborations that might arise from this common research interest.
References
[1] Grunwald, M. (Ed.). (2008). Human haptic perception: Basics and applications. Springer Science & Business Media.
[2] Gescheider, G. A. (2013). Psychophysics: the fundamentals. Psychology Press.
[3] Massie, T. H., & Salisbury, J. K. (1994, November). The phantom haptic interface: A device for probing virtual objects. In Proceedings of the ASME winter annual meeting, symposium on haptic interfaces for virtual environment and teleoperator systems (Vol. 55, No. 1, pp. 295-300).
[4] Ekman, G. (1959). Weber's law and related functions. The Journal of Psychology, 47(2), 343-352.
[5] Gibson, J. J. (1962). Observations on active touch. Psychological review, 69(6), 477.
Substitution of Haptic Sensations by Visual Stimuli
Sensory substitution is the ability to compensate for a stimulus of one sensory modality through another modality. When a sensory modality is lost due to disability or disease, usually only the ability to perceive information is lost or restricted. The transmission channels for signal processing in the brain remain unaffected and continue to work. Thus, subjective mental images can still be »seen« through other sensory stimuli. Visuo-haptic research is primarily concerned with the interaction between visual and haptic perception. Research by James et al. has shown that both sensory modalities often share the same information in the exploration and identification of objects (James et al., 2002). Interestingly, the visual sensory modality is always the primary one. Even though certain object properties such as shape, size, and orientation are acquired during haptic exploration, processing still occurs primarily via visual perception. This observation shows that a substitution between both senses is possible. In digital immersive virtual environments, haptic stimuli can only be partially simulated by complex technology. Common devices include special gloves that can stimulate a haptic sensation with built-in vibration. Vibration motors provide vibrotactile feedback, the most common form of haptic interaction. Other possibilities for haptic feedback include force-feedback methods, where motors apply an opposing force to block further movement. However, such technology and devices are usually difficult to obtain in the XR area, expensive, or undesirable because, for example, users have to carry additional devices on their bodies. This is where visual stimuli come in as a substitute for haptic feedback.
That visual feedback can improve user experience and performance has already been demonstrated (Argelaguet & Andújar, 2013). Specifically, users prefer to receive visual feedback rather than no feedback at all (Vosinakis & Koutsabasis, 2018). Prachyabrued and Borst, and Canales et al., have investigated several different visual cues for substitution when manually grasping virtual objects (Canales et al., 2019; Prachyabrued & Borst, 2016). Here, users experienced a virtual environment in which hand movements were transferred to a virtual hand. The visual cues included grasping a virtual object in which the fingers could optionally penetrate the object or stick to its surface. Coloring of the object and fingers was also used to visually represent successful grasping feedback. The results showed that users preferred the non-penetration of the fingers, which was modeled after reality. Interestingly, however, this variant gave the worst performance, while the non-realistic representation with finger penetration into the object gave the best performance. The additional coloring as visual feedback resulted in a good compromise between user experience and performance. This demonstrates the effectiveness of visual feedback in increasing efficiency in the absence of haptic feedback.
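The penetration and non-penetration cues compared in these studies can be sketched, reduced to one dimension, as follows. This is a hedged simplification for illustration; the function name and coordinate convention are our assumptions, not the cited implementations.

```python
# Sketch of the two visual grasping cues: the tracked finger may sink into
# the virtual object (there is no physical resistance), and the rendered
# finger either follows it exactly (penetration cue) or is clamped to the
# object's surface (non-penetration / "sticking" cue).

def rendered_finger_pos(tracked_pos, surface_pos, allow_penetration):
    """1D position of the visual finger relative to an object surface.

    Positions increase toward the object's interior; surface_pos marks
    the boundary. With penetration allowed, vision matches tracking
    exactly; otherwise the visual finger sticks to the surface.
    """
    if allow_penetration:
        return tracked_pos                   # show the true (penetrating) pose
    return min(tracked_pos, surface_pos)     # clamp: finger "sticks" to surface

# Tracked finger 2 cm past a surface located at 10 cm:
print(rendered_finger_pos(12.0, 10.0, True))    # penetration shown
print(rendered_finger_pos(12.0, 10.0, False))   # clamped to the surface
```

The clamp mirrors the "realistic" cue that users preferred, while returning the tracked pose unchanged mirrors the penetration cue that performed best in the cited studies.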
References
Argelaguet, F., & Andújar, C. (2013). A survey of 3D object selection techniques for virtual environments. Comput. Graph. https://doi.org/10.1016/j.cag.2012.12.003
Canales, R., Normoyle, A., Sun, Y., Ye, Y., Luca, M. D., & Jörg, S. (2019). Virtual Grasping Feedback and Virtual Hand Ownership. ACM Symposium on Applied Perception 2019, 1–9. https://doi.org/10.1145/3343036.3343132
James, T., Humphrey, G., Gati, S., Servos, P., Menon, R., & Goodale, M. (2002). Haptic study of three-dimensional objects activates extrastriate visual areas. Neuropsychologia, 40, 1706–1714. https://doi.org/10.1016/S0028-3932(02)00017-9
Prachyabrued, M., & Borst, C. W. (2016). Design and Evaluation of Visual Interpenetration Cues in Virtual Grasping. IEEE Transactions on Visualization and Computer Graphics, 22(6), 1718–1731. https://doi.org/10.1109/TVCG.2015.2456917
Vosinakis, S., & Koutsabasis, P. (2018). Evaluation of visual feedback techniques for virtual grasping with bare hands using Leap Motion and Oculus Rift. Virtual Reality, 22(1), 47–62. https://doi.org/10.1007/s10055-017-0313-4
Central Laboratory
Sophienstraße 22a
10178 Berlin