TEAM

Elleyce Pahang, Erica Lee, Netty Lim, Rachel Connelly, Timothy Kwan

ROLE

UX, Research, Storyboarding, Video Filming/Editing, Music

DURATION

10 Weeks

OVERVIEW

This project addressed the Microsoft Expo's 2019 theme: empathy at scale. Ponder is a conversational AI that helps users understand their own biases, process their emotions, and encounter new perspectives. It's meant to be an introduction to empathetic thinking, designed as training wheels rather than a replacement for or supplement to the social support in a user's life. Our main deliverables were a presentation and a video, and I worked on filming and editing the video with Tim.

FULL VIDEO

WHO IS PONDER FOR?

People who can't process their emotions in a productive way.

In the video, Anna is often cut off or ignored by her male coworkers during meetings. Upset, she tries to talk to her friend about it, but her friend is busy, or just doesn't care deeply enough to talk. Some people don't have the space, the social connection, or even the nerve to express their feelings in a positive or productive manner. This hinders their ability to empathize and leaves them in a dangerous place of negativity and loneliness.

People who won't process their emotions in a productive way.

In the video, Cody is an angry road rager who does not have the time or patience to show empathy for the slow driver in front of him. In his rage, he does not recognize his own faults or biases, and he is a far cry from being able to empathize. Realistically, not everyone is self-aware, not everyone wants to empathize, and not everyone wants to become a better person. But to target this product at a wide range of users, we had to account for the people who simply won't want to take the necessary steps to empathize.

HOW PONDER WORKS

Understanding the user.

Ponder is a very personal experience; it needs to understand the user's existing habits, trains of thought, and personality in order to provide the most productive conversations. Though the AI learns the user's common trains of thought over time, there is no data to draw on when the user first opens the app. Thus, an intimate onboarding introduces users to the app. We based our questions on the Myers-Briggs personality test.
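To make this concrete, here is a minimal sketch of how onboarding answers might seed an initial profile. The trait names, question wording, and score shifts are my own illustrative assumptions, not our final design.

```python
# Hypothetical sketch: onboarding answers seed a profile before Ponder
# has any conversation history to learn from.
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    """Myers-Briggs-style trait scores, each in [-1.0, 1.0]."""
    traits: dict = field(default_factory=lambda: {
        "energy": 0.0,       # introversion (-1) vs. extraversion (+1)
        "information": 0.0,  # sensing vs. intuition
        "decisions": 0.0,    # thinking vs. feeling
        "structure": 0.0,    # judging vs. perceiving
    })

# (question, trait it informs, score shift if the user agrees)
ONBOARDING_QUESTIONS = [
    ("After a hard day, does talking it out recharge you?", "energy", 0.5),
    ("Do you usually decide with your head over your heart?", "decisions", -0.5),
]

def apply_answer(profile: UserProfile, trait: str, delta: float) -> None:
    """Nudge a trait score, clamped to [-1, 1]."""
    profile.traits[trait] = max(-1.0, min(1.0, profile.traits[trait] + delta))
```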

Listening to the user.

To address empathy at scale, we realized that we needed to build on a widely accessible platform, like a smartphone, and to give the people who might not care about becoming more empathetic a reason to use our product. Thus, Ponder is initially used as a space to vent or reflect, and it guides emotional reflection in a productive manner by asking open-ended, positively framed questions. Without a presence like Ponder, people are often unable to process their emotions productively, which leads to misunderstanding and hinders empathy. When the user vents or reflects to Ponder, Ponder detects keywords and tones to better understand the context of the user's emotions and biases. This helps Ponder ask better clarifying or reflective questions.
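Here is a minimal sketch of how keyword detection might drive the choice of follow-up question. A real build would use speech-emotion and language-understanding models; the word lists, thresholds, and question wording below are illustrative assumptions.

```python
# Hypothetical sketch: a rough emotional read of a venting message,
# used to pick an open-ended follow-up question.
ANGER_WORDS = {"furious", "hate", "unfair", "ignored", "interrupted"}
SADNESS_WORDS = {"alone", "hopeless", "exhausted", "hurt"}

def detect_context(vent: str) -> dict:
    """Count emotion keywords and estimate overall intensity."""
    words = [w.strip(".,!?") for w in vent.lower().split()]
    anger = sum(w in ANGER_WORDS for w in words)
    sadness = sum(w in SADNESS_WORDS for w in words)
    intensity = min(1.0, 10 * (anger + sadness) / max(len(words), 1))
    return {"anger": anger, "sadness": sadness, "intensity": intensity}

def next_question(ctx: dict) -> str:
    """Pick a follow-up suited to the detected state."""
    if ctx["intensity"] > 0.6:
        return "That sounds really hard. What happened next?"  # just listen
    if ctx["anger"] > ctx["sadness"]:
        return "What about the situation felt most unfair to you?"
    return "Looking back, how are you feeling about it now?"
```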


Guiding the user.

Ponder helps teach empathetic imagination by telling stories from other possible perspectives relevant to the user's situation. The AI pulls anecdotes from articles and lets the user dive into the sources. Over time, we hope that our users will start to imagine alternate perspectives on their own and build a habit of empathetic thinking. We hope that Ponder is a start, not a supplement, and that users will eventually not need Ponder to help them reflect and empathize.
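As a rough illustration, the retrieval step might look like the sketch below. The article store, topic tags, and URLs are hypothetical stand-ins for whatever backend Ponder would actually use.

```python
# Hypothetical sketch: surface real anecdotes, with sources the user
# can dive into, that relate to the user's situation.
ARTICLES = [
    {"topic": "work", "anecdote": "An engineer on being talked over in meetings",
     "source": "https://example.com/story-1"},
    {"topic": "driving", "anecdote": "A new driver on why they stay under the limit",
     "source": "https://example.com/story-2"},
]

def find_perspectives(topic: str, limit: int = 2) -> list:
    """Return anecdotes matching the user's situation, with their sources."""
    return [a for a in ARTICLES if a["topic"] == topic][:limit]
```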

How did we get here?

UNDERSTANDING EMPATHY

Here's a paraphrase of most online dictionary definitions: “Empathetic understanding is understanding another person by placing oneself imaginatively in her or his experiential world.”

I was intrigued by how large a part imagination plays in empathy. Further preliminary online research showed that the ability to empathize is developed through exposure to and use of storytelling, whether we hear other people's stories or imagine them ourselves.


UNDERSTANDING THE PROBLEM SPACE

Our research showed us that empathy is a very human-to-human connection. This posed a problem, because when we looked at technology and empathy, we didn't exactly see harmonious or positive results. Technology often detaches us from real events and real people. The lack of physicality allows us to dehumanize the people we disagree with, or numbs us to the situations real people go through. Thus, our team was adamant that our technological 'solution' never replace or supplement human connection, only aid it. We decided to design our product as something that wouldn't stick around in a user's life, but would only be used as a tool to get started if the user didn't have anywhere else to start. This is what allowed us to narrow down our two target audiences.

But you might ask, how is a conversational AI NOT replacing human conversation? Good question. In our Wizard of Oz testing and preliminary research, we found that reflecting or venting to your friends often leads to an echo chamber of validation. Most friends will just agree with you and validate even the most negative of emotions. This never leaves space for empathy or change. Our AI, however, would gently guide conversations toward productive, positive discussion instead of reiterating negativity or intense emotion. Instead of replacing a human interaction, it creates its own interaction, and it helps users eventually empathize with other humans on their own.

 


INITIAL DESIGN

Initially, we wanted to heavily emphasize empathetic storytelling, discussion, and imagination in our product. We ideated a product that would simply counter users' vents with alternate stories. If a user ranted about a rude customer, the product would say something like, "What if this customer recently lost his job, and is having a stressful day?" The goal was to help the user begin to consider other perspectives.

However, we quickly realized that no one would want to use our product if all it did was directly challenge the user. This could further upset or infuriate them. Users want to feel validated and listened to, especially because situations that require empathy are often emotional ones. We learned this through Wizard of Oz testing (which I didn't conduct or help with), and we also learned that our product needed to build rapport with the user.

With further research, we found that empathy is really only possible once we have processed our initial emotions and can think rationally. We realized that tackling the problem at the root was a more effective way to promote empathy at scale. Thus, we renamed our product from Debait to Ponder, and broke down Ponder's user process into three parts (a rough sketch of how they might chain together follows the list):


01 VENTING

This gives both our target audiences, those who can't work through their emotions and those who won't, a reason to first use our product. People like to vent, especially at the peak of their emotions. Ponder's interface is designed to be a responsive, calming, active-listening companion.

02 REFLECTING

Instead of directly countering the user, Ponder asks clarifying questions that implicitly challenge the user's original perception. This allows the user to properly reflect on the situation or event with a calmer, less emotional mind.

03 IMAGINING

The last step brings back empathetic imagination. Instead of making up stories, Ponder pulls from real stories found in the media. 
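Under the hood, these three steps could chain together as a simple state machine, as in the sketch below. The transition rules (the intensity threshold and turn count) are assumptions for illustration, not tuned values.

```python
# Hypothetical sketch: advance through Ponder's three steps only once
# the user seems calm enough for the next one.
from enum import Enum, auto

class Stage(Enum):
    VENTING = auto()
    REFLECTING = auto()
    IMAGINING = auto()

def next_stage(stage: Stage, intensity: float, turns_in_stage: int) -> Stage:
    """Move forward cautiously; otherwise stay and keep listening."""
    if stage is Stage.VENTING and intensity < 0.4:
        return Stage.REFLECTING  # cooled down enough to examine the event
    if stage is Stage.REFLECTING and turns_in_stage >= 3:
        return Stage.IMAGINING   # ready to consider other perspectives
    return stage
```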

MY ROLE

Screen Shot 2019-03-20 at 12.30.17 AM

Designing Ponder's information architecture.

Along with the team, I helped design Ponder's information architecture, and I was responsible for forming Ponder's Question Guidelines. For each step of the user process listed above, I researched and thought through suitable examples of questions Ponder would ask. Here are Ponder's Guidelines:

01 Always ask neutral, open-ended questions.

02 Never blame the user or make it seem like it’s their fault (even if it is).

03 Based on the tone, pitch, and volume of the user's voice, or the language of their texts, determine the difficulty or complexity of the next question asked. (i.e., if the user is spewing straight fire and chaos, don't make them recall a memory just yet; continue to ask questions that help them walk through their current feelings.) A sketch of this guideline follows the list.

04 Give the user the space they need to talk. Visually cue that Ponder is listening, and wait a brief period of time after the user has finished talking before speaking.

05 Always have an escape route. Don’t force the user to stay on a certain path.
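Here is a minimal sketch of how guideline 03 might work in practice. The inputs (tone, pitch, and volume normalized to [0, 1]) and the question tiers are illustrative assumptions, not our final question set.

```python
# Hypothetical sketch: scale question depth to the user's emotional state.
QUESTION_TIERS = [
    # (intensity ceiling, example question), easiest and safest first
    (1.0, "What's going through your mind right now?"),
    (0.6, "What do you think set those feelings off?"),
    (0.3, "Can you remember a time you were on the other side?"),
]

def choose_question(tone: float, pitch: float, volume: float) -> str:
    """The calmer the user, the deeper the question we can ask."""
    intensity = (tone + pitch + volume) / 3
    candidate = QUESTION_TIERS[0][1]  # always safe at peak intensity
    for ceiling, question in QUESTION_TIERS:
        if intensity <= ceiling:
            candidate = question      # keep the deepest question still allowed
    return candidate
```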

Visualizing the story.

We brainstormed a broad storyboard structure and direction together as a team. When we had a script in place, it was my responsibility to think through each scene in detail, describe camera angles, and visualize the scene's acting and aesthetic.


Bringing Ponder to life.

I helped film the final video with Tim and edited the rough cut, which essentially established the flow and pace of the story. I then handed the video to Tim to color correct, add in Rachel's animations, and polish the cuts and scenes. I was also in charge of finding royalty-free music that was emotionally inspiring and appropriate, which was challenging but rewarding, and of narrating the script in a serious but positive manner. The music I chose is intentionally sadder in the beginning, then picks up pace and positivity once Ponder is introduced.

Thank you!

