About the HaptiComm Project









Introduction to the HaptiComm Project

A non-technical explanation of the HaptiComm system.

by Sven Topp (Revision 1, 4th August, 2018)

The HaptiComm – Haptic Communicator – was recently introduced to the Haptics, Deafblind and broader industry communities. It won the best "Hands-on Demonstration" category at EuroHaptics 2018 (Pisa, Italy), was demonstrated during the Helen Keller International Conference (Benidorm, Spain), and won the "Research Applications" category of the University of Sydney Student Innovation Awards (Sydney, Australia). The question we are most commonly asked is "What is HaptiComm?"

Before answering that question directly, it is important to introduce the concept that drives HaptiComm. We use a key phrase: "HaptiComm embodies a paradigm shift in accessibility, adaptive technology and disability-centric design principles". The paradigm shift lies in recognising how much personal preference exists within the Deafblind community, both in communication techniques and in the kinds of tactile sensations each person uses within those methods. Accordingly, the overall approach to HaptiComm development is to design a platform that is as flexible as feasibly possible within the constraints of tactile perception.

For this reason, "What is HaptiComm?" is perhaps not the right question to ask. Rather, "What do I want HaptiComm to be?" is the question you should be posing to yourself.

The HaptiComm prototype currently being demonstrated is one application of a much broader system. In its current form the device has an array of 24 electromagnetic actuators, each of which can deliver a tap sensation or a vibration to the skin of the palm and fingers. Each actuator can be controlled individually, simultaneously with all other actuators (all producing the same tactile sensation), simultaneously but distinctively (at the same time but with different sensations), or as part of a distinctive tactile pattern played back across the array.

Each device is individually 3D printed to match the user's preferred height, shape and actuator layout. A mould of the hand is taken, then laser scanned to generate a 3D model of the surface of the hand and fingers. From that model, the outer shell on which the user rests their hand is printed. The actuators are mounted vertically below the user's hand (not in direct, continual contact with the skin). Actuators then rise and/or vibrate to reproduce a tactile sensation representing a letter, word or concept (at present the device generates letters only). The HaptiComm is "live" and performs this translation in real time. At its very core, this makes HaptiComm a character-to-tactile-sensation converter.
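Conceptually, the character-to-tactile conversion described above is a lookup from each incoming character to a sequence of actuator events. The sketch below illustrates that idea only; HaptiComm's actual software is not public, so every name, pattern and value here is hypothetical and invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Tap:
    """One actuator event: which actuator fires, how, and for how long."""
    actuator: int      # index into the 24-actuator array (illustrative)
    kind: str          # "tap" or "vibrate"
    duration_ms: int

# Hypothetical pattern table: each character maps to a sequence of events.
# Real HaptiComm patterns are user-configurable; these values are invented.
PATTERNS = {
    "a": [Tap(actuator=0, kind="tap", duration_ms=60)],
    "b": [Tap(actuator=1, kind="tap", duration_ms=60),
          Tap(actuator=2, kind="vibrate", duration_ms=120)],
}

def translate(text: str) -> list[Tap]:
    """Convert incoming text to a flat playback sequence, one character at a time."""
    sequence = []
    for ch in text.lower():
        sequence.extend(PATTERNS.get(ch, []))
    return sequence
```

Because the table is just data, swapping in a different user's preferred patterns changes the device's behaviour without touching the translation logic.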

So what can you change in HaptiComm? Almost every aspect of the design can be modified, provided we can 3D print it and it stays within the constraints of tactile perception (the skin has limits on what it can and cannot perceive well). On the physical device itself you can change:


- Size and overall shape of the device

- Size and shape of the actuator tip (the part that meets the skin)

- Materials, so long as they can be 3D printed. At present this predominantly means various types of plastic and silicone, but the options will grow as advances in 3D printing progress.

- Layout, density and overall number of actuators in the device (currently a maximum of 31)

From the perspective of reprogramming the device you can change:

- The audio waveform patterns that drive the actuators (any audio file in WAV format can be used)

- The patterns associated with each letter, word or concept

- Duration of playback on each actuator

- Timing between each playback within a pattern, and between patterns

- Overall timing (e.g. the rhythm of the sentence)

- Intensity of a vibration

- Intensity of a tap sensation, up to approximately 1 newton of force
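The programmable parameters above can be thought of as a per-user playback configuration from which the rhythm of a sentence follows arithmetically. The sketch below is a minimal illustration of that idea; the field names and default values are assumptions, not HaptiComm's actual settings.

```python
from dataclasses import dataclass

@dataclass
class PlaybackConfig:
    """Per-user tuning knobs; names and defaults are illustrative only."""
    waveform_file: str = "tap.wav"    # any WAV file may drive the actuators
    tap_duration_ms: int = 60         # duration of playback on each actuator
    inter_tap_gap_ms: int = 20        # timing between playbacks within a pattern
    inter_pattern_gap_ms: int = 80    # timing between patterns
    vibration_intensity: float = 0.5  # fraction of maximum amplitude
    tap_force_newtons: float = 0.8    # up to ~1 N of force

def sentence_duration_ms(cfg: PlaybackConfig, taps_per_letter: int, letters: int) -> int:
    """Rough rhythm estimate for a sentence under a given configuration."""
    per_letter = (taps_per_letter * cfg.tap_duration_ms
                  + (taps_per_letter - 1) * cfg.inter_tap_gap_ms)
    return letters * per_letter + (letters - 1) * cfg.inter_pattern_gap_ms
```

Shortening the gaps speeds up the whole sentence without changing any individual sensation, which is one way such a configuration could accommodate readers of different speeds.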

What this means is that HaptiComm gives us a great deal of flexibility to personalise the device to anyone's preferences for type of sensation, shape, size, layout, materials, language or communication technique. We can also convey elements such as emotional content through the natural form of the communication itself (e.g. no smileys or special emotional signals are required).

Applications of HaptiComm are not particularly limited. Some of the possibilities are:

- Speech-to-tactile communication (already available)

- Taking a phone call from a hearing person and receiving it in your chosen format

- Reading a book or article through Optical Character Recognition (OCR)

- Linking with your screen reader/magnifier, acting both as a visual cue (for pointer positioning) and as a way of reading what is on your screen

- Receiving input from a braille keyboard

- Real-time voice-activated captioning

- Tactile music playback and/or musical instrument emulation

- Essentially any other application, so long as you can send a character to HaptiComm

For further information or questions please contact HaptiComm:








Copyright © HaptiComm

All rights reserved







Last Modified: August 8th, 2018