Project Details

Content-Based Haptic Texture Retrieval

Subject Area Electronic Semiconductors, Components and Circuits, Integrated Systems, Sensor Technology, Theoretical Electrical Engineering
Image and Language Processing, Computer Graphics and Visualisation, Human Computer Interaction, Ubiquitous and Wearable Computing
Term from 2015 to 2018
Project identifier Deutsche Forschungsgemeinschaft (DFG) - Project number 290848755
 
Final Report Year 2018

Final Report Abstract

Visual and auditory information are predominant in modern multimedia systems. The acquisition, storage, transmission and display of visual content have reached a quality level typically referred to as high definition (HD) and beyond, and comparable HD technology is available for audio. Technical solutions addressing the sense of touch (also referred to as haptics), in contrast, have not yet reached the same level of sophistication. In the context of haptic interaction, kinesthetic and tactile interactions are typically considered separately, as different perceptual mechanisms are involved. While the kinesthetic modality has been studied extensively in the context of teleoperation systems, the analysis, processing and reproduction of tactile touch impressions have received comparatively little attention so far. This is surprising given that we as humans rely heavily on the tactile modality to interact with our environment. In a Virtual Reality application, for example, a typical intention of a user is to interact physically with the objects in the virtual scene and to experience their material and surface properties. Many challenges have to be overcome before tactile solutions reach the same level of sophistication as corresponding HD video or audio solutions. With recent advances in Virtual Reality (VR), Augmented Reality (AR) and Telepresence, however, the topic is rapidly gaining in relevance and is becoming an enabling technology for novel fields of application, such as E-Commerce with tactile feedback (T-Commerce) or touch-augmented VR systems (T-VR).

This project investigated important aspects of enabling remote touch experiences. Similar to a camera capturing images under various viewing conditions, the developed Texplorer device captures haptic properties of object surfaces. Based on the recorded sensor signals, mathematical features describing major perceptually relevant dimensions are defined and combined into a feature vector representation of the surface material. Beyond the classification of materials, the retrieval of perceptually similar materials was successfully realized in this project. Additionally, the available sensor data can be used to represent and render the materials in a virtual environment, which could enable the display of materials in future virtual shopping malls or online stores.

Several challenges were faced during the execution of this project. Given the vast range of possible surface materials and textures, only a subset could be examined. Moreover, users employ unpredictable exploration patterns while interacting with material surfaces, which results in highly variant sensor signals across repeated scans that depend heavily on the exerted scan speed and force. We observed, however, that combining different sensors and designing features that are strongly invariant to scan-time conditions can mitigate this dependency.

We see two major future applications for the outcome of this project. First, we envision a low-cost system capable of identifying materials, similar to content-based retrieval systems for images or video. Second, the recorded sensor information and the derived features can form a compact model of the object surface properties, which is particularly interesting for haptic experiences in virtual environments.
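
To illustrate the retrieval step described above, the following minimal Python sketch shows how feature vectors extracted from scan signals could be compared to find perceptually similar materials. The feature definitions (a spectral centroid of the acceleration signal and a mean friction level) and all function names are illustrative assumptions, not the project's actual feature set or pipeline.

import numpy as np

def extract_features(accel_trace, friction_trace):
    # Toy scan-time-invariant features (illustrative only): spectral
    # centroid of the acceleration signal and mean friction level.
    spectrum = np.abs(np.fft.rfft(accel_trace))
    freqs = np.fft.rfftfreq(len(accel_trace))
    centroid = np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12)
    return np.array([centroid, np.mean(friction_trace)])

def retrieve_similar(query_vector, database_vectors, k=3):
    # Rank database materials by Euclidean distance to the query
    # feature vector and return the indices of the k closest ones.
    distances = np.linalg.norm(database_vectors - query_vector, axis=1)
    return np.argsort(distances)[:k]

# Example with random stand-in data for ten database materials.
rng = np.random.default_rng(0)
database = np.stack([extract_features(rng.standard_normal(1024), rng.random(1024))
                     for _ in range(10)])
query = extract_features(rng.standard_normal(1024), rng.random(1024))
print(retrieve_similar(query, database))

The project itself relies on a richer, multimodal feature set (see the publications below); the sketch only conveys the feature-vector retrieval principle.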

Publications

  • “Deep learning for surface material classification using haptic and visual information,” IEEE Transactions on Multimedia, vol. 18, no. 12, pp. 2407-2416, December 2016
    H. Zheng, L. Fang, M. Ji, M. Strese, Y. Özer and E. Steinbach
    (See online at https://doi.org/10.1109/TMM.2016.2598140)
  • “Content-based Surface Material Retrieval,” IEEE World Haptics Conference (WHC 2017), Munich, Germany, June 2017
    M. Strese, Y. Böck, and E. Steinbach
    (See online at https://doi.org/10.1109/WHC.2017.7989927)
  • “Multimodal Feature-based Surface Material Classification,” IEEE Transactions on Haptics, vol. 10, no. 2, pp. 226-239, April 2017
    M. Strese, C. Schuwerk, A. Iepure, and E. Steinbach
    (See online at https://doi.org/10.1109/TOH.2016.2625787)
  • “Toward High-Fidelity Haptic Interaction with Virtual Materials: A Robotic Material Scanning, Modelling, and Display System,” IEEE Haptics Symposium, San Francisco, USA, March 2018
    M. Strese and E. Steinbach
    (See online at https://doi.org/10.1109/HAPTICS.2018.8357184)
 
 
