
Talking to Your Self-Driving Car

Enabling people 65+ to regain their independence via emerging technology 

Aaron Faucher, Ketki Jadhav, Zohaib Khan
Conversation Design, Video Directing & Editing
Adobe Premiere, After Effects
3 Weeks

By 2050, the senior population is expected to double. Around that time, autonomous vehicles (AVs) are expected to be a primary way we get around. How can we enable senior independence using autonomous vehicles?


Designed a user experience for commanding an autonomous vehicle using conversation, not controls. 



Experience Prototyping

Conversation Diagramming

Video Production

The senior population is expected to double by 2050 (Census)
More than 2 out of 3 people fear losing their independence in their old age (Telegraph)
Seniors can find it overwhelming to adapt to new technology 

A driverless car can help elderly people regain their independence, and a conversational user interface (CUI) makes it easy for them to say what they want the car to do rather than learn a new technology and interface.

A Focus on Senior Needs

CONVERSATIONAL TONE The CUI transforms the AV from an inaccessible, intimidating piece of technology into a friendly, responsive assistant. For example, the car says "Where are we going?" instead of "Please set the destination."

VOLUME SENSITIVITY Many older users are hard of hearing, and the CUI includes an interaction to support them. It anticipates error states where the user may not clearly hear the conversation, and adjusts the volume accordingly.
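As an illustrative sketch only (the function, phrase list, and volume values are hypothetical, not taken from our prototype), the volume-recovery interaction might look like:

```python
# Hypothetical sketch: raise the CUI's output volume when the user signals
# they could not hear the last prompt (e.g. says "what?" or "pardon?").

MISHEARD_PHRASES = {"what", "pardon", "say that again", "huh"}
MAX_VOLUME = 1.0
VOLUME_STEP = 0.15

def respond(utterance: str, volume: float, last_prompt: str) -> tuple[str, float]:
    """Return the next prompt and the volume to speak it at."""
    if utterance.strip().lower().rstrip("?") in MISHEARD_PHRASES:
        # The user likely missed the prompt: repeat it, a step louder.
        volume = min(MAX_VOLUME, volume + VOLUME_STEP)
        return last_prompt, volume
    return "Okay, heading to the grocery store.", volume
```

The key design choice this illustrates is that the error state is recovered silently through volume, rather than by blaming the user ("I couldn't understand you").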


PREDICTIVE MEMORY ASSISTANCE Older users may benefit from the interconnectivity the car can provide with other devices. In our scenario, the car connects to a smart fridge to remind our user that she may want to pick up milk on her way to the grocery store.


SMART DROP-OFF / PICK-UP Many older users have limited mobility. The CUI supports this need by letting the car drop users off closer to destination entrances and then self-park, reducing the distance seniors have to walk.

Exploring Earcons

We used earcons as pre-attentive cues. We predicted that most riders would be engaged in heads-down activities while riding in the AV. If the CUI began talking immediately and without warning, it would be too easy for a rider, especially one who is hard of hearing, to miss the first part of the utterance. We opted for subtle but attention-grabbing earcons to signal that the CUI was about to speak. Earcons played a similar role in providing auditory feedback when the user prompted the CUI.

Error earcon

Pre-attentive earcon

Volume increase earcon
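As a hypothetical sketch of how a pre-attentive earcon could precede speech (the file names and the `play`/`speak` functions are illustrative stand-ins, not real audio APIs):

```python
import time

# Hypothetical sketch: play a short earcon, pause briefly so the rider
# can shift attention from a heads-down activity, then speak.

EARCONS = {"pre_attentive": "chime.wav", "error": "buzz.wav", "volume_up": "rise.wav"}

def play(sound_file: str) -> None:
    print(f"[earcon] {sound_file}")  # stand-in for an actual audio call

def speak(utterance: str, earcon: str = "pre_attentive") -> None:
    play(EARCONS[earcon])
    time.sleep(0.4)  # give the rider a moment to tune in
    print(f"[CUI] {utterance}")
```

The short pause between earcon and utterance is the point: the cue buys the rider time, so the first words of the prompt are not lost.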

Conversation Design

Designing a CUI is very different from designing a visual interface. Affordance, feedback, feedforward: everything that informs the user about how to use the system must now be conveyed through intangible, sound-based interactions. We designed around a few key interactions:

  1. Pre-attentive prompts

  2. Recognition of utterances

  3. Prediction of intention

  4. Feedback response

  5. Session termination

  6. Errors

  7. Error recovery 
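Purely as an illustration of how these interactions chain together (this keyword-matching sketch is hypothetical, not our actual design), one turn of the dialogue could be modeled as:

```python
# Hypothetical sketch of the key interactions as a minimal dialogue turn.
# Intent matching here is keyword-based for illustration only.

INTENTS = {
    "grocery": "Okay, heading to the grocery store.",
    "home": "Taking you home.",
}

def handle_turn(utterance: str, retries: int = 0, max_retries: int = 2):
    """Return (response, session_over) for one user turn."""
    # 2. Recognition of utterances + 3. Prediction of intention
    for keyword, response in INTENTS.items():
        if keyword in utterance.lower():
            # 4. Feedback response + 5. Session termination
            return response, True
    # 6. Errors + 7. Error recovery: reprompt, then back off gracefully
    if retries < max_retries:
        return "Sorry, I didn't catch that. Where would you like to go?", False
    return "I'll wait until you're ready.", True
```

Even in this toy form, the structure shows why error recovery needs a cap: reprompting forever would frustrate exactly the users the design is meant to support.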


Diagramming key interactions between the CUI and the user. 


The full diagram can be viewed at 


© Ketki Jadhav

Feature Analysis Matrix - CUIs
Scenario Building 

In an autonomous vehicle, where the user is relieved of the typical responsibility of driving, should the conversational interface take on a more active role, potentially as an entertainer, a conversationalist, or a source of content?

Unlike a traditional interface, a CUI gives designers no visuals to work with, which makes designing a solution challenging. Scenarios solve this problem by letting designers immerse themselves fully in the situation and imagine a natural dialogue flow.

Experience Prototyping 
  1. Voice activation vs. Gesture activation

  2. There is no natural conversation between the user and the CUI; the CUI only repeats the user's commands.

  3. How will the CUI know which entrance/exit she'll take? How does it know when her errand is complete, especially if she takes longer buying more items?

Experience prototyping allows designers to show and test scenarios with users through active participation. 


We got great feedback from conducting this exercise. Many of our interactions fell into the category of command-and-response; we were encouraged to think of ways to make the interaction more conversational. In response to this, we expanded our palette of possible interactions, thinking about other features or user intents that might lend themselves to more back-and-forth.
