HapLearn Iteration 1

Ahmed Anwar
5 min read · Mar 7, 2021

Objective

Our project is aimed at Haptic Data Visualization (HDV) using force feedback and vibrotactile feedback. We wanted to explore different ways in which trends and data in graphs can be experienced through haptic interaction, in a way that can assist visually impaired individuals.

Motivation

The motivation behind the first iteration was to explore multimodal approaches to representing graphical data. Three members of the group did not have a Haply, so they decided to focus on vibrotactile feedback while the fourth member experimented with the Haply.

Hardware Setup

Since three members of the group did not have a Haply, we decided to use vibrotactile feedback with simple coin actuators and an Arduino Uno board. For the development environment, we used the Arduino IDE and the Processing IDE.

Coin actuator
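
To give a sense of how the two environments talk to each other, here is a minimal Processing sketch that opens a serial connection to the Arduino Uno. The port index and baud rate are assumptions for illustration and depend on the actual setup.

    import processing.serial.*;

    Serial arduino;  // connection to the Arduino Uno driving the coin actuator

    void setup() {
      size(600, 400);
      // Assumption: the Uno is the first listed port and listens at 9600 baud
      printArray(Serial.list());
      arduino = new Serial(this, Serial.list()[0], 9600);
    }

    void draw() {
      // Later, single bytes sent over this connection set the vibration
      // intensity and pulse pattern of the coin actuator
    }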

Contribution

In the first iteration, a lot of time was spent on literature review and deciding on encoding mechanisms. I personally worked on the multimodal aspect of the project, where I integrated the auditory feedback with the vibrotactile feedback. A major challenge was keeping both types of feedback in sync.

Approach

For the first iteration, we decided to start with a simple bar chart with dummy data for the populations of different countries. The bar chart generated by Processing is shown below. We decided to use a multimodal approach, combining audio and vibrotactile feedback, to represent the graphical data.

Bar chart rendered by Processing
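
For illustration, a minimal Processing sketch along these lines could look like the following. The country names and values are placeholder dummy data, not the exact figures we used.

    // Placeholder dummy data; the real countries and values differed
    String[] countries = {"England", "France", "Spain", "Italy", "Germany"};
    int[] population = {50, 40, 30, 20, 10};

    void setup() {
      size(600, 400);
    }

    void draw() {
      background(255);
      int barWidth = width / countries.length;
      for (int i = 0; i < countries.length; i++) {
        // Scale each value so the largest bar fills most of the window
        float barHeight = map(population[i], 0, 50, 0, height - 50);
        fill(100, 150, 255);
        rect(i * barWidth + 10, height - barHeight, barWidth - 20, barHeight);
        fill(0);
        text(countries[i], i * barWidth + 10, height - 5);
      }
    }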

Audio feedback

The auditory feedback is meant in particular to assist a blind user while navigating the graph. We felt that auditory feedback was necessary because:

  • The user can move outside the sketch window and get lost on the computer screen.
  • Vibrotactile feedback alone was not enough to easily decode the y-axis values.

To incorporate audio feedback, we found two libraries for Processing, namely “Minim” and “ttslib”. We integrated these libraries into our bar chart sketch. One of the challenges was keeping the vibrotactile and auditory feedback in sync, as otherwise there would have been a lot of noise in the sketch. This was achieved through several rounds of iteration.
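
As a rough sketch of how the two libraries fit together, the snippet below uses ttslib to announce a bar when the cursor enters it and Minim to play a short non-speech cue. The sound file name, the data arrays, and the mouse-based navigation are illustrative assumptions rather than our exact implementation.

    import ddf.minim.*;
    import guru.ttslib.*;

    Minim minim;
    AudioPlayer cue;   // short non-speech cue loaded with Minim
    TTS tts;           // text-to-speech from ttslib
    String[] countries = {"England", "France", "Spain", "Italy", "Germany"};
    int[] population = {50, 40, 30, 20, 10};
    int lastBar = -1;

    void setup() {
      size(600, 400);
      minim = new Minim(this);
      cue = minim.loadFile("beep.wav");  // illustrative file name
      tts = new TTS();
    }

    void draw() {
      int bar = constrain(mouseX / (width / countries.length), 0, countries.length - 1);
      if (bar != lastBar) {
        cue.rewind();
        cue.play();
        // The speech call can hold up the draw loop, which is one reason
        // keeping the audio and the vibrations in sync took several tries
        tts.speak(countries[bar] + ", " + population[bar]);
        lastBar = bar;
      }
    }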

Vibrotactile feedback

We also had to decide on an encoding mechanism for the vibrotactile feedback to represent the y-axis values. Through the literature review, we found five parameters that can be tweaked to render vibrotactile feedback: amplitude, frequency, waveform, duration, and pattern. We decided to focus on amplitude and pattern for the first iteration.

Amplitude Encoding

While experimenting, we started with a constant vibration amplitude for each category. However, after reviewing the literature and brainstorming, we realized that the amplitude of the vibration should reflect the values on the y-axis. Therefore, we experimented with varying vibration amplitudes. Some of the issues we came across were:

  1. How to map y-axis values onto the amplitude of vibrations.
  2. The y-axis can have negative and decimal values as well.
  3. How significantly should we distinguish between different y-axis values?

After considering these issues, we decided to restrict the graph for the first iteration to a simple bar chart of the populations of different countries. For the sake of simplicity, we:

  • Used population values in multiples of 10
  • Restricted our graph to only 5 categories
  • Did not include any population greater than 10.

We then encoded the y-axis values, i.e. the population values, by tweaking the signal sent to the Arduino. It was a trial-and-error process where we played around with the intensity of the signal until we felt we were able to distinguish between two bars. For instance, when we were in the bin for England, we felt the strongest vibration since its corresponding value was the largest. In contrast, when we were in the bin for Germany, we felt a subtle vibration since its corresponding value was the lowest. One issue we faced was that the amplitude was often so low that it was barely noticeable. Hence, we had to re-adjust the damping factor associated with the amplitude.
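
As a rough illustration of the idea, the helper below maps a bar's value to a single intensity byte for the Arduino, assuming the Arduino side treats that byte as the PWM duty cycle for the coin actuator. The minimum level stands in for the damping factor we kept re-tuning; the exact numbers are illustrative.

    import processing.serial.*;

    // Map a bar's value to a vibration intensity byte (0-255) for the
    // Arduino, which is assumed to use it as the PWM duty cycle
    int valueToAmplitude(int value, int maxValue) {
      // Raising the minimum keeps small values perceptible; this constant
      // plays the role of the damping factor we re-adjusted by trial and error
      int minLevel = 80;  // illustrative
      return (int) map(value, 0, maxValue, minLevel, 255);
    }

    void sendAmplitude(Serial arduino, int value, int maxValue) {
      arduino.write(valueToAmplitude(value, maxValue));
    }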

Pattern Encoding

In addition to amplitude, we also utilized the number of vibrations to map the y-axis values. We had to decide on an encoding mechanism for the vibrations. Since the y-axis values were multiples of 10, we mapped each value to its multiple of 10: a value of 10 corresponds to one vibration pulse, 20 to two pulses, and so on.
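
A minimal sketch of that mapping is shown below, under the assumption that the Arduino simply treats each received byte as an on/off intensity for the actuator; the pulse timings are illustrative.

    import processing.serial.*;

    // Encode a y-axis value as a number of pulses (value / 10), so e.g. a
    // value of 30 produces three short buzzes on the coin actuator
    void sendPulses(Serial arduino, int value) {
      int pulses = value / 10;
      for (int i = 0; i < pulses; i++) {
        arduino.write(255);  // actuator on at full intensity (assumed protocol)
        delay(150);          // illustrative pulse length in ms
        arduino.write(0);    // actuator off
        delay(100);          // illustrative gap between pulses
      }
    }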

To test out the sketch, I tied the vibrotactile actuator to my index finger using a Velcro strap. Then I navigated the sketch and experienced the vibrotactile and auditory feedback.

Demo of vibrotactile and auditory feedback

Reflection

In this iteration, we were able to achieve our objective of creating and experimenting with a low-fidelity sketch that represents graphical data using audio and vibrotactile feedback. We believe that a bar chart is a great starting point, and we would like to extend this to different types of graph data. However, a major reflection point was determining the efficacy of the vibrotactile feedback. Although we successfully rendered a multimodal representation of a bar chart, in hindsight I believe that we could have made vibrotactile feedback the crux of the sketch.

During this iteration, some of the challenges we faced were hardware limitations, since not all members had the same hardware. We also had to come up with a way of incorporating force feedback, as only one member had a Haply. Finally, collaborating across different time zones was challenging, but we were able to build trust and organize meetings.

Next Iteration

For the next iteration, we are planning to utilize the haptic/vibrotactile feedback in our sketch more effectively. To achieve this, we will experiment with multi-dimensional data. Moreover, since we have hardware limitations, we will try to incorporate mobile phone vibrations into our sketch.
