When we view digital content such as pictures or 3D virtual scenes on a 2D display, the visual information must first be projected onto a physical screen, such as tablet glass or a projection screen.
When interaction takes place on such a surface, so that the user's interaction space is restricted to 2D, the haptic interaction point (HIP) and the proxy point cannot reach the virtual geometry; they are blocked by the physical screen. Because the conventional notion of penetration depth no longer applies to surface interaction, the interaction force between the HIP and the virtual geometry must be remodeled.
To resolve this issue, we define a force-computation model for surface haptic interaction based on how humans perceive surface geometry while actively scanning real objects: users tend to rely on the gradient (or slope) of the geometry when exploring it manually.
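As an illustrative sketch of such a gradient-based force law (the height map, stiffness gain `k`, and grid are assumptions for illustration, not values from this work), the lateral force can be taken to oppose the uphill slope of a sampled height field:

```python
import numpy as np

def lateral_force(height_map, x, y, k=1.0, spacing=1.0):
    """Gradient-based lateral force at integer grid position (x, y).

    Returns a 2D force opposing the uphill slope of the height map,
    F = -k * grad h, as in classic lateral-force haptic rendering.
    """
    gy, gx = np.gradient(height_map, spacing)  # derivatives along rows, cols
    return np.array([-k * gx[y, x], -k * gy[y, x]])

# Illustrative Gaussian bump: the force points away from the peak.
xs, ys = np.meshgrid(np.linspace(-1, 1, 64), np.linspace(-1, 1, 64))
h = np.exp(-(xs**2 + ys**2) / 0.1)
f = lateral_force(h, 40, 32)  # sample point to the right of the bump's center
```

Because the sample point lies on the downhill side to the right of the peak, the computed force pushes further to the right, which is the sensation a finger sliding off a real bump would feel.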
Building on earlier work on lateral haptics, we first propose and validate gradient-based haptic rendering using an active force display in the context of interaction with 2D digital images. To obtain the geometric configuration of the object with respect to the image or display plane, we employed a computer vision technique to estimate the homography that maps object coordinates to target image coordinates. Experimental results showed that participants could understand the presented virtual scene through their tactual experience.
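The text does not specify how the homography is estimated; a common choice is the direct linear transform (DLT) from four or more point correspondences, sketched below with NumPy (the translation example is purely illustrative):

```python
import numpy as np

def estimate_homography(src, dst):
    """Estimate a 3x3 homography H with dst ~ H @ src via the DLT.

    src, dst: (N, 2) arrays of corresponding points, N >= 4.
    """
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    A = np.asarray(rows, dtype=float)
    _, _, vt = np.linalg.svd(A)        # null-space vector = last row of V^T
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]                 # fix scale so H[2,2] == 1

def project(H, pt):
    """Map a 2D point through H with homogeneous normalization."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]

# Recover a known mapping (pure translation by (5, 3)) from 4 points.
src = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], float)
dst = src + np.array([5.0, 3.0])
H = estimate_homography(src, dst)
```

In practice, a library routine such as OpenCV's `cv2.findHomography` with RANSAC would be used on noisy image features; the DLT above shows only the underlying linear estimation.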
We then propose a novel tactile rendering algorithm for simulating 3D geometric features, such as bumps, on flat touch surfaces using a friction-based electrovibration display, a new type of lateral haptic display. Relating the gradient of the geometry to friction, we hypothesized and validated that object geometry could be simulated on flat surfaces through friction variation. From a scalability perspective, we propose a generalized model from a ps...
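One plausible reading of the gradient-to-friction mapping is sketched below; the baseline friction `mu0`, gain `k`, and height map are illustrative assumptions, not parameters from this work. Since electrovibration can only add resistive friction, the command raises friction in proportion to the uphill slope along the finger's motion direction and leaves it at the baseline on downhill slopes:

```python
import numpy as np

def friction_command(height_map, x, y, vel, mu0=0.3, k=0.5, spacing=1.0):
    """Friction level for a finger at grid cell (x, y) moving with velocity vel.

    Electrovibration only increases resistive friction, so the baseline mu0
    is raised in proportion to the uphill slope along the motion direction
    and clamped at mu0 when moving downhill.
    """
    gy, gx = np.gradient(height_map, spacing)
    v = np.asarray(vel, float)
    v = v / (np.linalg.norm(v) + 1e-9)           # unit motion direction
    slope = gx[y, x] * v[0] + gy[y, x] * v[1]    # directional derivative of h
    return mu0 + k * max(slope, 0.0)             # added friction is never negative

# Illustrative Gaussian bump crossed left to right.
xs, ys = np.meshgrid(np.linspace(-1, 1, 64), np.linspace(-1, 1, 64))
h = np.exp(-(xs**2 + ys**2) / 0.1)
mu_up = friction_command(h, 20, 32, vel=(1.0, 0.0))    # approaching the bump
mu_down = friction_command(h, 44, 32, vel=(1.0, 0.0))  # leaving the bump
```

With this sketch, a finger sliding toward the bump feels increased drag, and the drag releases past the peak, which is consistent with the perceived resistance of a real bump.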