Empowering artists and designers with hand impairments to draw digitally


9 months
Sep 2022 - May 2023


Michael Tranquilli - SME
Eric Forman - Advisor
Liz Danzico - Chair


User Research
User Testing
UX/UI Design


Artists and designers with motor impairments affecting their arm, wrist, or hand who wish to create digitally must adapt to tools not designed for them. Many current solutions focus only on hardware, while software and user experience have been overlooked.

Although there are many talented artists and designers with motor impairments, current adaptive tools are not accessible to many who do not have precise control. Additionally, drawing software interfaces do not focus on the needs of this community.

As a result, even if people are interested in visual arts or design, professionally or as a hobbyist, they may get discouraged.


  • Conducted rigorous design research with an SME to inform design decisions
  • Co-designed prototypes with users from the beginning of the research
  • Designed and tested low- and high-fidelity prototypes with users
  • Presented research and design outcome to the thesis committee

Final Design

Draw With Voice

Introducing VoCreate, a drawing app for the iPad that empowers artists and designers with hand impairments to draw with their voice. The entire experience was designed meticulously with artists with hand impairments, empowering them to express themselves.

Demo Video

Get your precision back

VoCreate is designed to be used completely hands-free. Users interact with different sounds: vowel sounds move and paint with the brush tool, "ck" and "ch" click, and "pop" opens the tools menu.
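As a rough sketch of how such a sound-to-action mapping could work (the token names, step size, and `Cursor` class here are illustrative assumptions, not VoCreate's actual implementation):

```python
from dataclasses import dataclass

STEP = 5  # pixels moved per recognised vowel sound (assumed value)

# Vowel Joystick directions as described in the study
VOWEL_DIRECTIONS = {
    "ae": (0, -STEP),   # up
    "aw": (STEP, 0),    # right
    "oo": (0, STEP),    # down
    "ee": (-STEP, 0),   # left
}

@dataclass
class Cursor:
    x: int = 0
    y: int = 0
    menu_open: bool = False
    clicks: int = 0

    def handle_sound(self, token: str) -> None:
        """Dispatch a recognised sound token to a drawing action."""
        if token in VOWEL_DIRECTIONS:
            dx, dy = VOWEL_DIRECTIONS[token]
            self.x += dx
            self.y += dy
        elif token in ("ck", "ch"):   # click sounds
            self.clicks += 1
        elif token == "pop":          # open the tools menu
            self.menu_open = True

# Example sequence: move up twice, right once, click, open the menu
cursor = Cursor()
for sound in ["ae", "ae", "aw", "ck", "pop"]:
    cursor.handle_sound(sound)
```

In a real system the tokens would come from a continuous speech-recognition pipeline; the dispatcher above only shows the mapping layer.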

Tools that "pop" out

The tool menu was redesigned to help the user access different drawing tools from anywhere on the screen. This is more accessible than having a fixed location for the toolbar.

Save, view, and share

An important addition to the menu was placing the new-canvas option at the top of the Gallery so it is accessible at all times. The menu also uses the same design language as the rest of the application to keep the experience consistent.

Research and Discovery

User Interviews

I conducted 6 remote user interviews with people with varying degrees of motor impairments affecting their ability to control their hands, wrists, or arms. Users ranged from temporary disability due to minor injuries to permanent disability from spinal cord injury. Key insights were:

  • The trauma recovery journey is an isolating experience
  • Current adaptive tools were too difficult to set up
  • Users have strong command over voice and speech
  • Even the smallest errors cost a lot of time to recover from

Contextual Inquiry

For the scope of the research, I focused on users with Spinal Cord Injury (SCI). Through a literature review and desk research, I identified the problem space. Briefly, the spinal cord is a column of nerve tissue running down the back, protected by layers of tissue and surrounded by vertebrae. Injuries to the cervical (neck) region of the spinal cord can impair control of neck, shoulder, and arm movements. Such injuries cause quadriplegia, affecting everything below the injury level. The study focused on injury levels C4–T1, which affect a person's ability to control their hands accurately.

Image of human spinal cord injury

Subject Matter Expert Advisory

To become trauma-informed about SCI, I worked with a subject matter expert, Michael Tranquilli, an occupational therapist and accessibility professor at NYU with a decade of experience working with people with SCI.

Co-Designing Workshops

Once the primary users were identified, I set a goal to involve people with lived experience from the start so I could design with them. To achieve this, I ran co-design workshop sessions with 2 users with SCI for the duration of the research. The workshops were divided into four sessions:

  1. User Interview and Behavioural Studies
  2. Ideation and conceptualisation
  3. Wizard of Oz Testing
  4. Usability Testing

Competitive Analysis

To understand what adaptive tools artists with hand impairments are using, I analysed the user experiences of 6 products and services. Splints were the most used adaptive tool: a wooden or plastic appendage that allows artists to hold a pencil, stylus, or brush. However, they are cumbersome, as they require frequent assistance and do not assist with finer motor movements. Through this analysis I identified key opportunities for innovation. Users wanted to draw more frequently, so they desired an experience that is easy to set up, assistance-free, affordable, and allows finer motor control.

🟢 Feature Available | 🟠 Feature Unavailable | 🟡 Key Market Gap

User Journey Map

One of the key outcomes of the research and discovery phase was the user journey map, which was designed to empathise with the user's experience during trauma recovery. At the initial stages, it is difficult to introduce any non-medical solutions as users need medical assistance. After surgery, when users are in the recovery phase, they feel lonely. Art therapy has proven beneficial in rehabilitation, but the tools used in therapy are frustrating to use and learn.

User Journey Map identifying where in the user's journey the experience can be improved


If I design a voice-activated drawing tool, users with hand impairments can draw more accurately and easily

Assumption Mapping

Once the research hypothesis was derived, key assumptions were identified and categorized based on their importance. Assumptions that were critical to test were the important and unknown assumptions. The key assumption to be tested was the ability to draw with voice.

Assumption Mapping highlighting key assumptions to be tested

User Personas

Problem Statement

HMW empower artists and designers with motor impairments to draw assistance-free?

Ideation and Concept

Brainstorming Sessions

To test the hypothesis, we needed to test a primary assumption: are users comfortable drawing with their voice? To find out, I conducted 4 experiments with users, stakeholders, and experts. The goal was to determine how users wanted to draw using their voice. Key insights from these tests included:

  1. Visual stimuli (illustration cards) helped users focus on the interaction being tested
  2. Users were more accurate when given the ability to draw micro adjustments
  3. Users made more mistakes with conversation style interaction due to cognitive overload
Image of me testing out a concept using illustrated cards. The user could look at the card and give me prompts about what to draw based on what they were seeing, as I acted as a voice-operated drawing tool.

Rejected Concept - AI Drawing App

While our users are still accustomed to contemporary art and design tools, which rely heavily on complete control over what is created, it was interesting to see where AI tools can help generate more ideas in less time at the cost of losing control. It was important to focus on the user's goal.

Drawing with AI transforms the entire concept of drawing from making to thinking. The basic interaction is having conversations with the system, and based on those conversations the AI generates images. Instead of mapping voice onto computer interactions, this concept questions what it really means to think about creating something.

Prompt-based AI drawing concept with editing capabilities

Promising Concept - Voice-Controlled Painting

Through user testing and observation, it became clear that users desired a high degree of control. I tested two main methods of moving the paint brush on the screen:

  1. Directional method: users say commands like "move mouse up 20 pixels" to move the cursor
  2. Vowel Joystick: an interaction developed at the University of Washington where users move the paint brush, or cursor, by saying different vowel sounds: "ae" to move up, "aw" to move right, "oo" to move down, and "ee" to move left.
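To illustrate the contrast, the directional method maps each spoken command to one discrete jump, which could be parsed roughly like this (the command grammar, names, and function are hypothetical, not the tested implementation):

```python
import re

# Unit vectors for each spoken direction; screen y grows downward
DIRECTIONS = {"up": (0, -1), "down": (0, 1), "left": (-1, 0), "right": (1, 0)}
COMMAND = re.compile(r"move mouse (up|down|left|right) (\d+) pixels?")

def parse_move(command: str) -> tuple[int, int]:
    """Return the (dx, dy) offset for a spoken directional command."""
    match = COMMAND.fullmatch(command.strip().lower())
    if match is None:
        raise ValueError(f"unrecognised command: {command!r}")
    (ux, uy), n = DIRECTIONS[match.group(1)], int(match.group(2))
    return ux * n, uy * n

print(parse_move("move mouse up 20 pixels"))   # (0, -20)
```

The Vowel Joystick avoids this command grammar entirely: a sustained vowel sound drives continuous movement, which is why it proved more fluid in testing.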

After several rounds of user testing, the Vowel Joystick was selected due to the following advantages:

  1. More control over the movement direction and speed of the cursor
  2. More accessible to people who do not speak English
  3. More fluid despite users showing initial skepticism
  4. Lower respiratory demands, benefitting users with SCI whose breathing is affected by the injury
Illustration of the two main methods of moving the cursor I experimented with

Initial Sketches

Once the interaction method was decided, sketches of the main screen were designed based on users' mental models of drawing tools. This was important to highlight the key features needed to provide a user-friendly experience. Every interaction was modified for voice-only input.

First sketches of the interface and features


Based on the sketches, wireframes were designed in Figma. The goal was to test whether the information and tools available met users' expectations and mental models, and whether this was a viable solution to pursue.

Low-fidelity prototype mocked up on MacBook used for testing interactions

User Testing

The prototype was tested in semi-moderated remote tests with 4 users. The goal was to learn whether the interaction was easy to understand and met users' expectations. As the entire system was meant to run on voice, users were only allowed to speak to interact with the wireframes while I performed the actions. The key results were:

  1. The application met user's mental model for drawing and was easy to use
  2. Users found all tools and features helpful and worked as expected
  3. The top menu was difficult to reach when the cursor was on a different side of the screen
  4. Users preferred the application to be on an iPad (tablet) instead of a laptop
  5. Users were confused about whether the system was listening due to lack of feedback

User Journey

Another key insight gained from the research was the user's trauma journey. To emphasise the user's lived experience, illustrations were used to depict the stages of the user's journey instead of a map. As a result, stakeholders were able to empathise at a deeper level.

Sketches showing user's journey about when and where the user will encounter VoCreate

Design Iteration

Prototype (Version 1)

Based on the feedback on the wireframes, the first prototype was designed. The idea was to further test the concept in a more finished form. The key improvements were:

  1. Changing device from desktop/laptop to iPad
  2. Reorganising tools and features based on users' mental models
  3. Improved user interface for better usability and feedback
Second iteration based on feedback mocked-up on an iPad application

Usability Testing

With the prototype, I conducted remote usability tests with 2 users during the co-design workshops. Users were given tasks to operate the system with as little help as possible, and iterations were made based on their feedback. Key results of the usability tests were as follows:

  1. Inconsistent user experience with hybrid vowel joystick and word recognition
  2. Confusion due to weak feedback of using vowel joystick
  3. Mismatches between users' expectations of the application and the UI decisions

Prototype (Version 2)

Version 2 was the most important iteration, reflecting the most critical feedback from users. This included considering users' reduced lung capacity due to the injury, their drawing environment with limited manoeuvrability, and an immersive interface that allows users to have a delightful drawing experience.

Prototype Version 2 after incorporating design changes for better heuristics

Branding and Visual Language

Along with the interaction design, it was important to identify a suitable visual language that would be consistent throughout the product. The key attributes driving the decisions were: simple, fluid, and playful.

Outcome and Impact

As a result of this research, the concept achieved vastly better results than the splint, the standard tool currently used by artists and designers with hand impairments. In the comparison chart below, I compare the key results achieved by VoCreate that encourage users to draw more with less effort.

Some things our users said

“I can imagine using this product during the recovery time when I was unable to move my hands at all... this would've brought much joy to my then isolated life.”

User with C4 Spinal Cord Injury

“I've been injured for 10 years and no one ever involved me in designing products for my own needs. I really enjoyed the design we did together, I think it will help so many people with the same trauma experiences.”

User with C5 Spinal Cord Injury for 10 years

Key Learnings

As the most important design project of my career so far, this journey taught me many valuable lessons. The first is that I do not propose VoCreate as a solution that will solve the problem I started with, but as a concept built with some of the people in the community. Some other impactful lessons are:

Design led by user research

During the research, I devoted most of my energy to understanding the behaviours, needs, and barriers of people with hand impairments. To do this, I had to research extensively to understand the complexities of the problem space and make trauma-informed decisions. I achieved this by defining hypotheses and design methods to test key assumptions, which helped me streamline my research goals and provided a benchmark to measure research success.

Designer's role in Accessible Design

As I designed for accessibility, I questioned my role as a designer in the whole process of user-centered design. The most critical stance I encountered was by Liz Jackson, who warns about designing "innovative" solutions that end up being disability dongles the community is not interested in. While it will take me many more years to improve on this, for the thesis I designed with my users as closely as possible given the resources, to make sure their voices and perspectives led the key design and experience decisions. As a result, I was able to foster a genuine connection with the people and community and prototype interesting concepts that otherwise would not have been possible.

Future applications of voice-first interactions

I believe voice interactions, or Conversation Design (CxD), are one of the most natural ways of interacting with machines. With the advances in Artificial Intelligence and Large Language Models, we might be closer to interacting with machines the way we interact with each other: through conversation. Through this project, I was able to conceptualize how one aspect of a person's language, vowel sounds, can perform one action, drawing, and how it can help marginalized communities prosper. In the future, this could mean artists with hand impairments regaining their creative careers and becoming more financially independent after trauma. Additionally, similar interactions could enable such users to perform other complex tasks, like driving their wheelchair or interacting with computers. The idea is to work with people in these communities to design better solutions with them, in whatever capacity they wish to use them.

Next Project

Truck It In
National Logistics Start-up
Web and Android Application, MVP