CanHaptics Project Iteration 1: Setting up the Canvas


We are building a haptic-assisted coloring platform for novice artists and
enthusiasts based on the Haply 2-DoF device. The goal is to create a tool that people can use for coloring 2D shapes with haptic assistance to (1) improve fine motor skills and (2) reduce anxiety and increase mindfulness.

For project iteration one, our goal was to explore the design space, first widely and then converging toward feasible alternatives.


We started our first design iteration by sketching ideas about how we imagine a haptic-guided coloring platform. Sketching is a fun grounding practice when working in a team: it makes it easy to externalize the concepts in our heads and reflect on them. Here are some of the sketches made by Linnea, Marco, and me.

As we shared our sketches and discussed them out loud, several themes emerged (as shown in the Miro board). We pinned down the three major ones, and each of us took one theme for design and technical exploration:

1) Haptic Guidance (Marco): How will the user be guided during the coloring exercise? What type of feedback should be provided? Can a user customize the intensity and type of feedback?

2) Haptic Richness (Preeti): Can we design different haptic textures for different tooltips? What types of texture can we render using the Haply? Can we generate haptic textures similar to physical tooltips like brushes, crayons, etc.?

3) User Interface Design (Linnea): How will the user interact with the interface? How can they move between different coloring spaces? How will they select tooltips and colors?

Haptic Guidance

We did another, narrowly focused brainstorming session and came up with theme-specific ideas. For Haptic Guidance, we mostly focused on bounding-wall feedback: if you color within the lines, there is no force feedback, but as you get close to the lines, an opposing force pushes back so that you don’t cross them. We further scoped this down to: what will the user experience (1) when they get close to the wall, and (2) while crossing the wall?
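A minimal sketch of the bounding-wall idea in plain Java (class and parameter names are my own, not our actual code): the opposing force is zero while coloring inside the lines and ramps up proportionally once the tooltip enters a band around the boundary.

```java
// Hypothetical sketch of proportional wall repulsion, not our final implementation.
// dist: distance from the end effector to the nearest boundary line.
// band: distance at which the opposing force starts to ramp up.
// k:    stiffness gain; the force grows linearly as the tooltip nears the wall.
public class WallForce {
    public static double repel(double dist, double band, double k) {
        if (dist >= band) {
            return 0.0;              // well inside the lines: no feedback
        }
        double penetration = band - dist;
        return k * penetration;      // opposing force, pushing away from the line
    }
}
```

Tuning `k` would control how firm the "don't cross" feeling is, which ties into the customizable-intensity question above.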

We tagged these ideas with colors denoting implementation difficulty (green is easy, yellow is moderate, red is difficult). Marco played around with some of these ideas (marked with green ticks). Check his blog for more details!

Haptic Richness

When you use a physical art medium, you experience a contact texture based on the type of tooltip. For example, a brush soaked in water-based color has a smooth contact texture, and a brush dipped in paint has slight friction; crayons, on the other hand, have high friction and irregular roughness. It would be great to have these contact-texture effects in a digital coloring platform to enrich the experience. We brainstormed several ideas for rendering texture through the Haply. Viscosity-based damping was the fundamental element we used in these designs.
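At its core, viscosity-based damping is just a force that opposes the tooltip's velocity; a rough plain-Java sketch (names are illustrative, and in practice Fisica applies damping to bodies for us):

```java
// Viscous damping sketch: the resistive force opposes the tooltip's velocity.
// b is the damping coefficient; a larger b reads as a "thicker" medium
// (paint or crayon wax), a smaller b as a watery, smooth one.
public class Viscosity {
    public static double dampingForce(double b, double velocity) {
        return -b * velocity;  // always opposes the direction of motion
    }
}
```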

I created a basic playground to test different textures. Changing simple elements like line width, spacing, and damping parameters leads to different texture perceptions.

In the demo below, the top-left box sequence feels like the contact texture of a brush, whereas the bottom-left sequence feels rough, more like a crayon. The top right feels like a dip and the bottom right feels like a bump. Playing with the damping factor (topmost slider), I noted that values between 450 and 850 work best. The middle horizontal line here was for testing a push-through wall texture; values between 500 and 970 work best, and increasing the value beyond 970 leads to instability.
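Given those empirically stable ranges, one simple safeguard (a sketch, not our actual slider code) is to clamp whatever the slider reports before applying it, so the rendering never enters the unstable region:

```java
// Clamp slider values into the empirically stable damping ranges noted above:
// roughly 450-850 for texture damping and 500-970 for the push-through wall;
// beyond 970 the rendering became unstable, so we never let values through.
public class DampingRange {
    public static int clamp(int value, int lo, int hi) {
        return Math.max(lo, Math.min(hi, value));
    }
}
```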

Further, I am trying to dynamically vary the texture on screen by controlling the line width, spacing, and damping. I am in the process of writing modular code for this; it took some time to understand the Fisica objects and the world parameters.
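One way I am thinking about the modular structure (a hypothetical sketch, not the finished code): bundle the three parameters that jointly produce one perceived texture into a small record, so each on-screen region can carry its own brush-, crayon-, dip-, or bump-like feel.

```java
// Hypothetical modular texture description: each region on the canvas
// bundles the line width, spacing, and damping that together produce
// one perceived texture (brush, crayon, dip, bump, ...).
public class TextureRegion {
    public final String name;
    public final float lineWidth;  // stroke width of the rendered lines
    public final float spacing;    // gap between adjacent lines
    public final float damping;    // viscous damping applied inside the region

    public TextureRegion(String name, float lineWidth, float spacing, float damping) {
        this.name = name;
        this.lineWidth = lineWidth;
        this.spacing = spacing;
        this.damping = damping;
    }
}
```

Swapping textures would then mean swapping `TextureRegion` instances rather than editing scattered constants.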

Setting Up First Sketching Prototype

In addition to exploring the individual components, we also worked on a base prototype into which we can integrate all the elements after testing. To keep it simple and quick, I started from Linnea’s Maze code and built a coloring sheet made out of rectangular boxes.

Next, I tested tooltip traces on the screen. Technically, since we already know the Haply’s x, y coordinates, it should be easy. I tried this Processing example and overlaid pulsating ellipses at the Haply’s end-effector coordinates, but they weren’t leaving any traces.

if (true) {
  angle += 5;
  float val = cos(radians(angle)) * 12.0;
  for (int a = 0; a < 360; a += 75) {
    float xoff = cos(radians(a)) * val;
    float yoff = sin(radians(a)) * val;
    // pulsating ellipses offset around the end-effector position
    ellipse((corr_posEE.x)*40 + xoff, (corr_posEE.y)*40 + yoff, val, val);
  }
  // small dot at the end effector itself
  ellipse((corr_posEE.x)*40, (corr_posEE.y)*40, 2, 2);
}

Later on, I realized the background() function was constantly redrawing white layers and hiding the traces. Commenting out that line was a low-key “aha” moment. I was unfamiliar with the layering concept in Processing, which Linnea figured out later. Kudos to her!!!

void draw() {
  // graphical code goes here; runs repeatedly at the framerate defined in
  // setup(), or at the default 60 fps
  if (renderingForce == false) {
    // draw only while the haptics thread is not computing forces
  }
}
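The lesson generalizes beyond background(): a trace exists only if marks accumulate across frames instead of being cleared. A stripped-down plain-Java analogue (no Processing calls; names are mine) of what commenting out background() achieves:

```java
import java.util.ArrayList;
import java.util.List;

// Analogue of the background() lesson: draw() runs every frame, and calling
// background() is like calling clear() each time -- no marks survive.
// Keeping the list across frames is what lets the tooltip leave a trace.
public class TraceBuffer {
    private final List<double[]> points = new ArrayList<>();

    // Called once per frame with the end-effector position.
    public void addMark(double x, double y) {
        points.add(new double[]{x, y});
    }

    // What background() was doing implicitly every frame, hiding the traces.
    public void clear() {
        points.clear();
    }

    public int markCount() {
        return points.size();
    }
}
```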

I tried some fun tooltips and added them as images to haplyAvatar. The idea is to add relevant haptic textures to suit the visuals. Playing with some of the color swatches and experiencing the visual effects already felt relaxing to me, especially the multicolor swatch (6th from the left) and the rainbow swatch.

public void b1(int theValue) {
  tooltip = button_img[0];
  haplyAvatar = loadImage(tooltip);
  // scale the avatar image to the tooltip's size in screen units
  haplyAvatar.resize((int)(hAPI_Fisica.worldToScreen(tooltipsize)), (int)(hAPI_Fisica.worldToScreen(tooltipsize)));
}

User Interface Designing

Linnea took this code forward and explored an interface design that allows the user to engage and disengage a coloring mode, move between bounded areas, switch colors, and change brush tips. She encountered several challenges while trying to create two independent cursors: one that takes the form of a tooltip, and one that appears as a plain cursor when coloring mode is disabled. Eventually, she figured out how layering works in Processing, which was super exciting! Check her blog for more details!

What didn’t go quite right?

As discussed before, the process of making the traces took some time to figure out. Additionally, making the code modular is really challenging, as I am not yet fully familiar with the Fisica library functions and how updates work in Processing (here is a code in process). Very often, Processing throws non-descriptive errors, and sometimes I am unsure whether it is because of low memory on my computer or a legitimate error. I haven’t worked extensively with GUI elements in Processing, and I am yet to find a good solution so that I don’t have to copy-paste button functions a gazillion times.
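One direction I am considering for the copy-paste problem (a sketch under the assumption that, as in the b1() snippet above, the button images live in an array): route every button through one function that takes an index, instead of b1(), b2(), b3(), ... each loading its own image.

```java
// Hypothetical single parameterized handler replacing b1(), b2(), ...:
// each button just calls selectTooltip with its index into the image array.
public class TooltipSelector {
    private final String[] buttonImages;  // stands in for button_img
    private String currentTooltip;

    public TooltipSelector(String[] buttonImages) {
        this.buttonImages = buttonImages;
    }

    // One function for all buttons; in the real sketch this is where
    // loadImage() and resize() would run on the selected file.
    public String selectTooltip(int index) {
        currentTooltip = buttonImages[index];
        return currentTooltip;
    }
}
```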

What are the next steps?

So far, we have explored three different dimensions of the process. Although we explored some initial ideas, we still have a lot more to try and test. We played with modular elements but haven’t delved deeper into the Fisica library and its functions; that is something we will check in further iterations.

Our individual components are yet to be integrated into a prototype. We are planning to have a prototype in which we can plug and play different concepts, for example, different haptic feedback for movements along the wall, types of texture and their mapping with physical mediums, and different sketch stencils.

As the project is aimed at improving fine motor skills and reducing anxiety, we will read relevant work in the literature to inform our design decisions and evaluation metrics: How should we design the sketching stencils to allow for improvement in fine motor skills? Should we also adapt tooltips for different fine-motor exercises? Which textures give a sense of pleasure and reduce anxiety? What types of tooltips can be included? Should we use tooltips similar to physical media, or can we also explore unrealistic tooltips like fireworks, rustling leaves, firewood, or rain that might serve hedonic goals?

Lots to look forward to!
Meanwhile, feel free to explore some of the color swatches and textures.


