CanHaptics Project Iteration 2: Compiling the tools

Goal

We are building a haptic-assisted coloring platform for novice artists and
enthusiasts based on the Haply 2-DoF device. The goal is to create a tool that people can use for coloring 2D shapes with haptic assistance to (1) improve fine motor skills and (2) reduce anxiety and increase mindfulness.

In Iteration 1, we did a wide exploration in three spaces: Haptic Guidance (Marco), Haptic Richness (Preeti), and UI Design (Linnea). During Iteration 2, we narrowed down our design exploration space to focus on the mindfulness application. Our focus is to integrate the best design from each of the three spaces together and make one good coloring experience.

How did we get started on Iteration 2?

We faced several coding issues in Iteration 1 and realized our code could have been more effective. Each of us had approached the coding differently, so we held a co-coding session to reflect on and learn from each other's approaches. Before diving deep into the design phase, we also identified our main focus. We crossed off some items from our pre-planned list — adding non-geometric shapes, generating automatic 2D stencils — as they would have diverted our attention from the main goal of creating and refining an engrossing haptic coloring experience.

Who did what?

We again divided our work into three parts:

  1. I integrated all three elements from Iteration 1, did a thorough round of UI/UX testing, worked with Marco to create a texture bank for gathering feedback on which haptic textures people find pleasing and suitable for a coloring application, reviewed the final code and tweaked its parameters, and finally did another round of UI/UX and bug testing.
  2. Linnea improved the interface to create a UI almost completely controlled through the Haply, working through the issues that came out of the first round of UI/UX testing.
  3. Marco added a haptic guidance element to the walls and haptic feedback to the UI elements, generated textures for the texture bank, and created a questionnaire to collect feedback on the textures.

The first round of UI/UX testing

Since we are aiming for a mindfulness application, even a small glitch in the program can break the pleasant experience. So I did an initial round of testing to verify that everything worked properly after integration and nit-picked the UI/UX details that could be improved. Shown here are some of the identified issues:

Linnea figured out fixes for most of these issues and also added new exciting UI features!!! 👩‍💻 Check out her blog here!

Increasing the workspace dimensions

While working with the textures, I realized that if you move the Haply's end-effector in a short stroke, you might not feel the texture that you feel during a longer stroke. I thought that increasing the workspace dimensions might let the user make a longer stroke on the Haply while the on-screen cursor moves a shorter distance. That exploration didn't work out so well; I might go back and revisit it. If you have any suggestions, feel free to comment! 🕵️‍♀️
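My reasoning here can be sketched with a simple mapping. This is an illustrative back-of-the-envelope sketch, not our actual code; the workspace and window sizes below are made-up numbers:

```java
// Illustrative sketch: mapping a physical stroke length to on-screen pixels.
// Widening the virtual workspace lowers the pixels-per-cm ratio, so the same
// physical stroke moves the cursor a shorter distance on screen — and the
// user must move farther to cross the same on-screen texture.
public class WorkspaceScaling {
    // strokeCm: physical end-effector stroke length (cm)
    // workspaceCm: width of the virtual workspace rendered on screen (cm)
    // windowPx: fixed window width (pixels)
    static double strokePixels(double strokeCm, double workspaceCm, double windowPx) {
        double pixelsPerCm = windowPx / workspaceCm;
        return strokeCm * pixelsPerCm;
    }

    public static void main(String[] args) {
        double windowPx = 1000;  // hypothetical window width
        // The same 2 cm stroke covers fewer pixels in a wider workspace:
        System.out.println(strokePixels(2.0, 25, windowPx)); // smaller workspace
        System.out.println(strokePixels(2.0, 40, windowPx)); // larger workspace
    }
}
```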

Integrating textures

The textures I created used the basic damping wall element: each wall acts as a tactile pixel, or taxel. To create variations in texture, I played around with horizontal (gray square) and vertical grids (purple and brown squares). I also tried a criss-cross pattern, but it didn't provide a unique haptic sensation; it just felt like a normal damping square. Increasing the width of each taxel created a smoother texture (like a brush), while decreasing the width made it rough (something like a crayon). The textures created by Marco used a variable force field on the cursor (more info here).
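One rough way to think about the width knob is the fraction of each grid cell that the damping wall occupies. This framing is mine, not from any library documentation; the numbers come from the texture parameters used in our code:

```java
// Back-of-the-envelope sketch: treating each grid cell as one taxel, the
// fraction of the cell occupied by the damping wall gives a rough "duty
// cycle" for the texture.
public class TaxelDutyCycle {
    // space: distance between taxel centers (world units)
    // wallW: width of each damping wall (world units)
    static double dutyCycle(double space, double wallW) {
        return wallW / space;
    }

    public static void main(String[] args) {
        // Narrow walls leave wide gaps between taxels, so the avatar feels
        // distinct on/off damping bumps: rough, crayon-like.
        System.out.println(dutyCycle(1.2, 0.1));
        // Walls almost as wide as the spacing give nearly continuous
        // damping: smooth, brush-like.
        System.out.println(dutyCycle(1.2, 1.0));
    }
}
```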

The attached code shows how a vertical grid is created. For the brush-like pattern (purple square above), the wall width was increased to wall_w = 1. For horizontal patterns, the x and y coordinates were switched. You can find the Texture Bank code here! 👀

/* texture-specific code *******************************************/
space  = 1.2;               // distance between taxel centers
wall_h = 1.3;               // taxel height
wall_w = 0.1;               // taxel width; wider walls feel smoother
jloop  = int(ydim / 2);     // number of taxel rows
iloop  = int(xdim / space); // number of taxels per row
if (HIDE_TEXTURE) {
  opacity = 0;              // texture is felt but not seen
}
for (int j = 0; j < jloop; j++) {
  for (int i = 0; i < iloop; i++) {
    tgrid[j][i] = new FBox(wall_w, wall_h);
    tgrid[j][i].setPosition((i + 1) * space, (j + 0.8) * 2);
    tgrid[j][i].setFill(40, 62, 102, opacity);
    tgrid[j][i].setDensity(100);
    tgrid[j][i].setSensor(true);  // detect touch without a collision response
    tgrid[j][i].setNoStroke();
    tgrid[j][i].setStatic(true);
    world.add(tgrid[j][i]);
  }
}
/*******************************************************************/

The following code integrates all the damping effects coming from the walls, the UI elements, and the grid texture.

if (playerToken.h_avatar.isTouchingBody(colorSwatch[0])
    || playerToken.h_avatar.isTouchingBody(colorSwatch[1])
    || playerToken.h_avatar.isTouchingBody(colorSwatch[2])
    || playerToken.h_avatar.isTouchingBody(colorSwatch[3])
    || playerToken.h_avatar.isTouchingBody(colorSwatch[4])
    || playerToken.h_avatar.isTouchingBody(colorSwatch[5])) {
  playerToken.h_avatar.setDamping(850); // 850
} else if (drawingModeEngaged) {
  playerToken.h_avatar.setDamping(680);
  textureUpdate();
} else {
  playerToken.h_avatar.setDamping(0);
}

FBox touchWall;
for (Wall item : wallList) {
  touchWall = wallToWorldList.get(item);
  if (C.isTouchingBody(touchWall)) {
    playerToken.h_avatar.setDamping(damp); // 820
  }
}

The textureUpdate function:

void textureUpdate() {
  for (int j = 0; j < jloop; j++) {
    for (int i = 0; i < iloop; i++) {
      // indices are [row][column], matching how tgrid was built above
      if (playerToken.h_avatar.isTouchingBody(tgrid[j][i])) {
        playerToken.h_avatar.setDamping(damp);
        tvar = 1;
      }
    }
  }
}
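Distilled, the integration above boils down to a priority rule. Because the wall loop runs last, a wall touch wins; then the color swatches; then the texture grid (checked only in drawing mode); then the drawing-mode baseline. The helper function below is my own distillation for illustration, not code from our sketch; the damping values 850, 680, and ~820 come from the code above:

```java
// Standalone distillation of the damping-priority logic above.
// Inputs are boolean touch flags; the return value is the damping that
// would end up applied to the avatar after all the checks have run.
public class DampingPriority {
    static final int SWATCH_DAMPING = 850; // color swatch damping
    static final int DRAW_DAMPING   = 680; // baseline drag while coloring

    // wallDamping plays the role of `damp` in the sketch (roughly 820).
    static int pickDamping(boolean onWall, boolean onSwatch, boolean onTexture,
                           boolean drawingModeEngaged, int wallDamping) {
        if (onWall)   return wallDamping;    // wall loop runs last, so it wins
        if (onSwatch) return SWATCH_DAMPING; // UI swatches next
        if (drawingModeEngaged) {
            if (onTexture) return wallDamping; // textureUpdate also uses `damp`
            return DRAW_DAMPING;               // plain drawing drag
        }
        return 0;                              // free motion otherwise
    }

    public static void main(String[] args) {
        System.out.println(pickDamping(true,  true,  false, false, 820));
        System.out.println(pickDamping(false, true,  false, false, 820));
        System.out.println(pickDamping(false, false, true,  true,  820));
        System.out.println(pickDamping(false, false, false, true,  820));
        System.out.println(pickDamping(false, false, false, false, 820));
    }
}
```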

Hapticolour Iteration 2

After integrating the different components, I tweaked the world gravity (100) and avatar damping (680) parameters again just to make sure everything worked coherently. Marco added the damping-halo wall guidance effect (more here). I added the vertical taxel grid here, as it felt the best to me and to our survey respondent. Linnea added several new buttons and functionalities — buttons to increase and decrease the brush size, save and clear the sketch, an eraser to clean a particular area, and buttons for palette selection (more here).

Here is the final code for you to try out! Let us know how you felt!!!🎨

Testing for Iteration 3 already!

One of the strange things I noticed is that the texture granularity is more prominent on the left side of the canvas than on the right. That is something I need to dig into. Some of the UI functionalities also need further improvement in the next round.

What’s next?

We have integrated a promising texture and haptic guidance effect into our interface for now. We might add more texture options based on feedback from the texture bank. Linnea has started exploring visual effects to pair with the tooltip. I found several resources that could help us add an enriching multi-modal aspect to the current version — brush texture, pixel flow, generative watercolor, brush strokes using Bezier curves, paint sounds, the Processing sound library. For this iteration we used the same 2D sketch stencil as a base; we might explore more intricate designs that make the coloring experience more enjoyable and rewarding.

What are the designs that you would like to color?
And what are the tooltips that you would like on a haptic coloring tool?
