From voice to canvas

Empowering artists with motor impairments to draw using voice interface

Role

UI/UX Designer
Voice Interface Prototyping
Co-Design Facilitation
Usability Testing

Advisors

Eric Foreman, Thesis Advisor
Michael Tranquilli, Subject Matter Expert

Key Achievements

Awarded the Paula Rhodes Award for most innovative concept.
Recognized by the American Institute of Graphic Arts (AIGA) NY as one of the top 10 most innovative designers.

Overview

A voice-controlled drawing application for artists with motor impairments that uses AI to translate human sounds into computer interactions

VoCreate offers a voice-user interface specifically for art creation, revolutionizing how we approach artistic expression. VoCreate empowers individuals with motor impairments by providing a novel way to bring their ideas and creativity to life through drawing.

My research involved interviewing and prototyping with people with spinal cord injuries that cause severe motor impairments, limiting their ability to control their bodies from the shoulders down.

The Problem

How might we help artists with motor impairments draw digitally with more control over their expression?

Challenge 1 - Current methods are limiting and cause frustration

Hand splints, robotic prosthetics, and eye tracking let users interact with physical and digital devices, but they do not allow accurate control and often depend on caretakers.

Opportunity
A method that is easier to learn and use

Challenge 2 - Motor impairments limit hand control significantly

A spinal cord injury can impair a person's ability to control their arms, hands, and legs, making it difficult to use common objects such as a pen or stylus, mouse, or keyboard.

Opportunity
Empower artists to draw with more control

The Solution

VoCreate is a voice-controlled drawing experience that allows artists with hand motor impairments to draw

VoCreate is meticulously designed with the input of artists with hand impairments to ensure that it meets their needs and allows them to express themselves freely.

Voice-First Interface

Users control the app entirely with their voice, using vowel sounds to move the brush and paint. Making the "EE" sound moves the paintbrush left, while the "AW" sound moves it right. It's fun, and also useful for drawing longer strokes while maintaining artistic control.
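To make the vowel-to-movement mapping concrete, here is a minimal sketch of how sustained vowel sounds could drive a brush cursor. VoCreate's actual implementation is not public; the vowel labels, direction vectors, and the `classify_vowel` step this assumes are illustrative only.

```python
# Illustrative sketch only -- not VoCreate's actual code.
# Assumes some upstream audio classifier labels each audio frame with a
# vowel ("EE", "AW", "AE", "OO") or None for silence/unrecognized sound.

VOWEL_DIRECTIONS = {
    "EE": (-1, 0),  # move brush left
    "AW": (1, 0),   # move brush right
    "AE": (0, -1),  # move brush up
    "OO": (0, 1),   # move brush down
}

def step_brush(position, vowel, speed=2):
    """Return the new brush position after one audio frame.

    Holding a vowel keeps the brush moving frame after frame, which is
    what makes long continuous strokes possible without repeated commands.
    """
    if vowel not in VOWEL_DIRECTIONS:
        return position  # brush stays put on silence or unknown sounds
    dx, dy = VOWEL_DIRECTIONS[vowel]
    x, y = position
    return (x + dx * speed, y + dy * speed)
```

Because the mapping is continuous rather than command-based, stroke length is controlled simply by how long the user sustains the sound.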

Accessible Navigation

The redesigned tool menu allows users to access drawing tools from anywhere on the screen, making it more accessible than a fixed toolbar. This reduces the need to constantly reach for frequently used tools, saving users energy and breath.

Ideation and Concept

Brainstorming Sessions

To test the hypothesis, we needed to test a primary assumption: are users comfortable drawing with their voice? To find out, I conducted 4 experiments with users, stakeholders, and experts. The goal was to determine how users wanted to draw using their voice. Key insights from these tests included:

  1. Visual stimuli (illustration cards) helped users focus on the interaction being tested
  2. Users were more accurate when given the ability to draw micro adjustments
  3. Users made more mistakes with conversation style interaction due to cognitive overload
User testing the first concept of drawing with voice instructions

Rejected Concept - AI Drawing App

Our users are accustomed to contemporary art and design tools, which rely heavily on complete control over what is created. It was interesting to see how AI tools could help generate more ideas in less time, at the cost of that control. The key question was: what is the user's goal?

Drawing with AI transforms the concept of drawing from making into thinking. The basic interaction is a conversation with the system, which generates images based on those conversations. Instead of mapping voice sounds to computer interactions, this concept questions what it really means to think about creating something.

Prompt-based AI drawing concept with editing capabilities

Promising Concept - Controlling the brush using voice interaction

User testing and observation revealed that a high degree of control was desired. I tested two main methods of moving the paintbrush on the screen:

  1. Directional method: users say commands like "move mouse up 20 pixels" to move the cursor
  2. Vowel Joystick: an interaction technique developed at the University of Washington where users move the paintbrush, or cursor, by making sustained vowel sounds: "AE" to move up, "AW" to move right, "OO" to move down, and "EE" to move left
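The directional method can be sketched as a simple command parser: each utterance is a discrete instruction that must be fully spoken and recognized before the cursor moves. The command grammar and function below are assumptions for illustration, not the study's actual implementation.

```python
import re

# Hypothetical parser for the directional method, where each spoken
# utterance is one discrete move command like "move mouse up 20 pixels".
# The exact grammar used in the study is an assumption here.

DIRECTIONS = {"up": (0, -1), "down": (0, 1), "left": (-1, 0), "right": (1, 0)}

def parse_move_command(utterance):
    """Return an (dx, dy) cursor offset for a spoken move command, or None."""
    match = re.match(
        r"move (?:mouse|brush) (up|down|left|right) (\d+) pixels?",
        utterance.strip().lower(),
    )
    if not match:
        return None  # utterance is not a recognized move command
    direction, amount = match.group(1), int(match.group(2))
    dx, dy = DIRECTIONS[direction]
    return (dx * amount, dy * amount)
```

Contrasting this with the Vowel Joystick makes the trade-off visible: here every adjustment costs a full sentence of speech, whereas a sustained vowel moves the cursor continuously, which helps explain the usability results below.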

After several rounds of user testing, the Vowel Joystick was selected for the following advantages:

  1. More control over the movement direction and speed of the cursor
  2. More accessible to people who do not speak English
  3. More fluid despite users showing initial skepticism
  4. Lower respiratory demands, which benefits users with SCI whose breathing is often affected
Illustration of the two main methods of moving the cursor that I experimented with

Initial Sketches

Once the interaction method was decided, I sketched the main screen based on users' mental models of drawing tools. This helped highlight the key features needed to provide a user-friendly experience. Every interaction was adapted for voice-only input.

First sketches of the interface and features

Wireframes

Based on the sketches, wireframes were designed in Figma. The goal was to test whether the information and tools available met users' expectations and mental models, and whether the concept was a viable solution to pursue.

Low-fidelity prototype mocked up on MacBook used for testing interactions

User Testing

The prototype was tested in semi-moderated remote sessions with 4 users. The goal was to learn whether the interaction was easy to understand and met users' expectations. Since the entire system was meant to run on voice, users were only allowed to speak to interact with the wireframes while I performed the actions on their behalf. The key results were:

  1. The application met users' mental model for drawing and was easy to use
  2. Users found all tools and features helpful, and each worked as expected
  3. The top menu was difficult to reach when the cursor was on the other side of the screen
  4. Users preferred the application on an iPad (tablet) instead of a laptop
  5. Users were confused about whether the system was listening due to a lack of feedback

Comparing Assistive Technologies

To understand what adaptive tools artists with hand impairments are using, I analysed user experiences of 6 products and services. Splints were the most commonly used adaptive tools: wooden or plastic appendages that allow artists to hold a pencil, stylus, or brush. However, they are cumbersome, requiring frequent assistance, and do not help with finer motor movements. Through this analysis I identified key opportunities for innovation:

  1. Can be set up without any assistance
  2. Allows finer motor control
  3. Affordable (under $50)
🟢 Feature Available | 🟠 Feature Unavailable | 🟡 Key Market Gap

Designing a more intuitive user interface

Before
The v1 UI was built on top of current design patterns

The first user interface was based on standard design patterns in the art software industry, with a modern look and feel. However, these patterns were not highly compatible with voice interaction.

After
Redefined UX patterns for native voice user interface

I redesigned each component to be more compatible with voice UX, improving navigation accessibility and feedback loops and simplifying the UI, resulting in a more usable and desirable experience.

Impact

We user tested the concept with multiple artists with spinal cord injuries, which revealed that VoCreate was 20x more usable than existing methods.

Users provided highly positive feedback and recommended VoCreate over common existing solutions such as hand splints and eye tracking, saying it was quicker, easier, and allowed more independence.