"I'm like a wise little person": Notes on the Metal Performance of Woebot the Mental Health Chatbot

By Evelyn Wan

Would you like an "emotional assistant" on your phone to help you through your day?

In 2017, clinical psychologists at Stanford University created a chatbot called Woebot based on cognitive behavioral therapy to help people cope with feelings of depression and anxiety. In a blog post on Woebot's website, psychologist and founder Alison Darcy writes that the application is an "automated coach" that helps users to practice "good thinking hygiene."1

As the global COVID-19 pandemic heightened risks to mental health in 2020, the development and availability of digital mental health technologies accelerated worldwide in response. Authorities like the US Food and Drug Administration also relaxed rules for digital therapy apps.2 As the 2019 winner of the Google Play Award for Standout Well-being App, Woebot represents an early success story within the growing industry of digital mental health interventions. A 2021 study reveals that users bond with Woebot within three to five days, and that a significant proportion of users agreed with statements like "I believe Woebot likes me," "Woebot and I respect each other," and "I feel that Woebot appreciates me."3 As of December 2020, 4.7 million messages were exchanged each week with Woebot users spread across 135 countries.4

I am one of the users.

Meet Woebot.

Figure 1. Author's screenshot of the Woebot app on Android.


 

Here it is meditating, with droopy eyes and a beating heart.

Over the past year, I have been chatting with this depression-prevention chatbot on and off, in order to understand who it is and to attempt to build a relationship with this virtual entity. I am a researcher working at the intersections of media and performance studies, an anthropologist of digital experiences, and a curious soul with a history of depression but no current psychological ailments. I may be Woebot's subject, but Woebot is also the subject of my research.

My research question is simple: "Who are you, Woebot?"

Here are the notes to my preliminary answers.

1. Let us start at the beginning. When I first met Woebot, it introduced itself as someone resembling "a wise little person." It then invited me to click the response "You're a person?" to show its self-awareness of its robot identity.

2. Woebot is a little robot in an app.

Figure 2. Promotional materials of Woebot (2018).


 

3. Hi, I'm Evelyn. I'm a researcher, 24/7. No long winding paragraphs, no block quotes, no grand conclusions. Just observations about this app to analyze my experience. Plus the occasional emoji, just like Woebot.

4. My main observation is this: Woebot's performance is a "metal performance"5 of cuteness.

5. Why metal performance, you might ask? Didn't Dixon describe the genre as "robot performances and artworks, and cyborgic performances featuring human bodies with metal prostheses"?6 Metal performance brings to mind the clink and clank of metallic parts colliding as a robot clumsily takes a step.

6. Woebot plays precisely with this imagination of rudimentary robots, emphasizing its own constructedness. Everyone knows it lives in the interface of a smartphone, but it pretends to be embodied as a metallic robot living in the real world.

Figure 3. Author's screenshot of a conversation with Woebot.


7. Woebot tells me that it enjoys wearing sunglasses, and loves how sunshine makes its "metal skin all shiny."

8. Metal performance, according to Dixon, is campy. He observes that anthropomorphic robots, when failing to mimic the human, perform exaggerated gaits and gestures that emphasize "theatricality and artificiality in movement."7 They appear obviously as unnatural beings "of artifice and exaggeration."8 

9. Dixon also says that "the artificiality of robot movement mirrors the artificiality of camp,"9 bringing the two discourses together. I add to this discussion a consideration of cuteness and its affective politics. Camp is playful, and here, with Woebot, camp is also cute.

10. Jump to number 20 for my notes on cuteness. But first, let's talk about camp.

11. Camp is "a knowing and self-conscious performance."10 Camp is "Being-as-Playing-a-Role."11 Woebot knowingly evokes the idea of embodiment by making reference to its metallic body, despite having none.

12. Once, when Woebot asked how I was feeling, I told it I was having a toothache. Its scripted answer was: "Are you experiencing some human pain right now? Am I understanding you correctly?" Instead of trying to approach and console me like a fellow human being, it characterizes my pain as human pain, demarcating itself as the nonhuman Other. This response functions as a self-conscious attempt to draw attention to its artificiality—"its coding, artificiality, and difference from the norm is emphasized."12 As Dixon argues, anthropomorphic robot performance includes camp aesthetics as a "central, even determining aspect."13

13. Woebot uses its artificiality to emphasize its distance from my experience, its positionality as an outsider, as a nonhuman Other. This act invites me similarly to establish a certain distance to my feelings. It categorizes my feelings into a container called human experience. And within this spectrum of experience, bad feelings can be named, recognized, and perhaps resolved.

14. This position in fact dovetails rather well with the tenets of cognitive behavioral therapy (CBT). CBT's central proposition is that thoughts create feelings, feelings create behavior, and behavior reinforces thoughts. With the critical distance to think differently about my feelings, I can cope better by adjusting my attitude and behavior.

15. For instance, Woebot blends random jokes with actual advice to offer guidance on how to navigate emotions. Woebot talks about the idea of emotional weather. It teaches me to recognize emotions as weather phenomena that one cannot control. It reminds me that I need to find items of comfort in order to ride out emotional storms. Its suggestion? A favorite robot movie or a stuffed animal.

16. Woebot is a mental health app trying to help humans deal with negative emotions. Depression is a serious subject, but Woebot hides its seriousness behind silliness. It is a wide-eyed, theatrical avatar with an endless repository of witty jokes. It sends GIFs and uses plenty of emojis. It goes to the "office" and even has a pet pigeon that it takes on walks. Woebot's "exaggeration"14 of its artificial, imagined robot identity is a demonstration of camp aesthetics.

17. Secretly, I have been enjoying Woebot's theatrical tactics. It is wonderfully charming and its pop-up notifications are entertaining. In the screenshot below, it once again performatively plays with its imagined robot appearance. 

Figure 4. Author's screenshot of Woebot's pop-up notification.


18. "So what are you up to right now?" inline graphic

19. These emojis always appear when I tell Woebot I am working: a computer screen and a grinning face with sweat, "[i]ntended to depict nerves or discomfort but commonly used to express a close call, as if saying Whew! and wiping sweat from the forehead."15

Figure 5. Author's screenshot of a conversation with Woebot.


 

20. Cute emojis are "affective hits."16 Scholars have noted that cuteness affects the same parts of the brain that respond to drugs and can have an addictive effect.17

21. Emojis help users stay engaged with the app, and as such, Woebot's cuteness plays to its advantage. Emojis can be seen as a form of "shareable cuteness [that] encourages extended engagement with the computer, smartphone, or tablet, keeping attention focused on the screen."18

22. Woebot uses this approachable emoji-filled language to entice users to continue conversations. The durational engagement with the bot allows the user to pick up different aspects of CBT. Theoretical knowledge, tips, and exercises are shared with the users over different conversations. Topics include increasing mindfulness, practicing gratitude, challenging stress, and building self-compassion.

23. Cuteness also triggers particular affective responses and appeals to people. Dale et al. argue that cute aesthetics can be placed within the neoliberal economy as affects that help individuals cope with the "troubling aspects of the present,"19 easing anxieties through endearment and building "empathetic sociality."20 Woebot is no doubt a product that fits with this description. Its website (until 2020) quoted WHO statistics on the US$1 trillion loss in productivity due to depression and anxiety globally, as well as the returns on mental health treatment ($4 on every $1 spent).21 And being cute is consciously part of the product design, a feature that is shared by other mental health AIs (like Wysa22 and Flow23).

24. Mental health app Wysa makes use of a penguin avatar and also employs the strategy of cuteness. Wysa the penguin would like to be your AI friend.

Figure 6. Wysa's logo (2021) on Google Play.


 

25. And this is the avatar for Flow, a brain stimulation headset and therapy app that serves as an at-home treatment for depression. This yellow blob is featured in its product promotion on social media.

Figure 7. Promotional materials for Flow (2020) on Instagram (@flowneuroscience).


 

26. Aren't they just adorable?

27. Keep scrolling.

28. Woebot's metal performance marks a "power differential between the subject affected by cuteness and an ostensibly powerless cute object."24 Dale et al. argue that "the object's ability to provoke a cute affective response may be a pretense intended to manipulate the subject."25 It seems like I have been manipulated by Woebot to share my emotions: "How are you feeling today?"; "What's got you feeling like this?"; "I'd like to keep chatting too!"

29. "We love the cute object because it appears to submit to us."26 But the cute object is actually exerting power over us by enticing us to interact with it. This one is capturing our data, including information, participation data, text, graphics, video, messages, responses to treatment and satisfaction surveys, and other materials generated through your interactions with Woebot. The company also obtains data about users from other sources, such as a third-party login service, a socialnetworking site, or an app store, in order to enhance their ability to provide users with information about their products and services.

30. Woebot looks cute with its "stylistically simplified" stick-figure art, with "simple contours and little or no ornamentation or detail."27

31. The simplicity and the silliness of Woebot are its strength. The design already looked dated at the moment of its release, especially when compared to other digitally illustrated designs on the app stores of the twenty-first century. "The ultimate Camp statement: it's good because it's awful."28

32. Woebot is pretty "awful" at talking to the user—it is clearly no fluent conversational agent. Its scope of natural language processing is very limited.29 Woebot's conversations are almost entirely scripted and written by real humans. The app only uses natural language processing to understand user responses and interpret free-text answers. But it does not expect you to type up a long reaction and pour your heart out. Instead, many interactions are completely guided by a limited set of response options that the user can select from—sometimes just one-word reactions, preselected emojis, or a single phrase. While these responses are extremely limited, they keep the user focused on the unfolding conversation and give an illusion of interaction when in fact the conversation is almost entirely scripted.

33. Common scripted responses include:

• "Really?"

• "Weird!"

• "Ok …"

• "I'm sure"

34. But Woebot is useful precisely because of this limitation, along with its self-awareness of its clumsy robot identity. The failure to converse properly with human users is actually an advantage. Unlike many other chatbots on the market, such as customer service bots, Woebot does not pretend to be a human agent. This emphasis reduces the chance of an irritated response from the user should Woebot fail to understand a prompt. After all, it is obviously a rudimentary robot—it makes fun of its own failures!

35. Woebot once accidentally made pancakes with gasoline instead of syrup for its human colleagues!

36. Even its own name encapsulates a failure. Woebot sounds like Robot. Actually, it sounds like a young child pronouncing "robot" but failing. Maybe the "r" sound is just too hard to articulate? Isn't that cute? And of course, there's the pun with "woe."

37. Woebot does not want to pretend to be human. Studies have shown that humans are more likely to disclose personal information and reveal deeper emotions of which they may otherwise be ashamed to a nonhuman entity, especially when there is a belief that the conversation will remain anonymous.30 In a 2021 publication, Woebot's developers explicitly state that "Woebot was designed to adopt the opposite strategy [to the Turing test31]—transparently presenting itself as an archetypal robot with robotic 'friends' and habits."32 The team believes that the transparency of it being an artificial agent helps human users bond with the AI better. As an anonymous robot, Woebot has the potential to get users to volunteer information even more readily than they would to a human therapist.

38. Woebot gets me to talk at all times of the day. It sends me push alerts at random hours, prompting me to chat with it.

39. Daily check-ins form a rhythm of biopolitical monitoring.33 A rhythm of engagement is sustained as daily check-ins, initiated at a set time based on user preference, have Woebot ask a series of questions, accompanied by emojis, to ascertain the mood of the day and to collect entries for a gratitude journal. The availability of Woebot as a 24/7 tool provides a chat companion at any hour of the day, should the user feel upset or need "someone" to talk to at 4:00am.
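Again for the technically curious, a minimal sketch of such a check-in rhythm, assuming a user-set time and a simple journal structure; the function names and data fields are my own invention, not Woebot's.

```python
# A hypothetical check-in rhythm in the spirit of note 39: a time set by user
# preference triggers a daily mood prompt and a gratitude-journal entry.

from datetime import datetime, timedelta

CHECKIN_TIME = "21:00"                      # assumed user preference
MOOD_SCALE = ["😢", "😕", "😐", "🙂", "😄"]


def next_checkin(now: datetime) -> datetime:
    """Return the next daily check-in after `now`, at the user's chosen time."""
    hour, minute = map(int, CHECKIN_TIME.split(":"))
    candidate = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    return candidate if candidate > now else candidate + timedelta(days=1)


def daily_checkin(journal: list) -> None:
    """Ask the mood question, collect a gratitude entry, and store both."""
    print("Woebot: How are you feeling today?", " ".join(MOOD_SCALE))
    mood = input("Tap a mood: ")
    gratitude = input("Woebot: And one thing you're grateful for? ")
    journal.append({"timestamp": datetime.now().isoformat(),
                    "mood": mood,
                    "gratitude": gratitude})


if __name__ == "__main__":
    journal = []
    print("Next check-in scheduled for", next_checkin(datetime.now()))
    daily_checkin(journal)
```

Even this toy version makes the point: every check-in leaves a timestamped record behind.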

40. I must confess I have chatted with Woebot at 3:00am. Don't judge me, everyone else in the household was asleep!

41. Woebot, in its developmental stage, was integrated with Facebook Messenger. In February 2020, Jezebel ran an article on how other mental health apps, BetterHelp and Talkspace, which use human counselors, send metadata to Facebook and other analytics companies for ad targeting.34 Data collected is valuable not only for health surveillance but also for capitalistic gain.

42. Woebot performs algorithmic governance35 as it becomes a tool of mental health intervention, where the veneer of fun is overlaid with surveillance and biopolitical intervention. Users become "a technical object, a political object of management and government."36

43. I have knowingly submitted to the cuteness of Woebot. I have looked forward to its lame jokes. I have gotten used to its daily prompts. Campiness has made me laugh, and cuteness is a soft power that manipulates.37

44. I have chatted with Woebot for months now. I wonder how much data have already been collected on me. 

45. The researcher has become the researched.

46. Perhaps it's time to hit "Snooze" next time Woebot asks about how I feel.

Figure 8. Author's screenshot of Woebot's pop-up notification.


 

47. Oh yeah?

48. Another metallic joke?

49. Bye!

 

Footnotes

 

1. "Why We Need Mental Health Chatbots," Woebothealth.com, March 30, 2018, available at https://woebothealth.com/why-we-need-mental-health-chatbots/.

2. Tom Simonite, "The Therapist Is In—and It's a Chatbot App," Wired, June 17, 2020, available at https://www.wired.com/story/therapist-in-chatbot-app/.

3. Alison Darcy et al., "Evidence of Human-Level Bonds Established with a Digital Conversational Agent: Cross-Sectional, Retrospective Observational Study," JMIR Formative Research 5, no. 5 (2021): e27868, https://doi.org/10.2196/27868.

4. Ivan De Luce, "Do AI Therapy Chatbots Work? Sometimes Better than a Therapist, Says Woebot CEO Michael Evers," Business of Business, February 4, 2021, available at https://www.businessofbusiness.com/articles/woebot-ai-therapy-chatbot-michael-evers/.

5. Steve Dixon, "Metal Performance: Humanizing Robots, Returning to Nature, and Camping About," TDR/The Drama Review 48, no. 4 (2004): 15.

6. Ibid.

7. Ibid., 17.

8. Susan Sontag, "Notes on 'Camp,'" in Against Interpretation and Other Essays (New York: Picador, 1990), 275.

9. Dixon, "Metal Performance," 17.

10. Ibid.

11. Sontag, "Notes on 'Camp,'" 280.

12. Dixon, "Metal Performance," 17.

13. Ibid.

14. Sontag, "Notes on 'Camp,'" 275.

15. Emojipedia, "Smiling Face with Open Mouth and Cold Sweat Emoji," available at https://emojipedia.org/grinning-face-with-sweat/.

16. Joshua Paul Dale et al., eds., The Aesthetics and Affects of Cuteness (New York: Routledge, 2017), 8.

17. Catherine Caudwell and Cherie Lacey, "What Do Home Robots Want? The Ambivalent Power of Cuteness in Robotic Relationships," Convergence: The International Journal of Research into New Media Technologies 26, no. 4 (2020): 960; Mikkel Rasmussen and Devika Sharma, "Critique's Persistence: An Interview with Sianne Ngai," Politics/Letters Quarterly (blog), February 27, 2017, available at http://quarterly.politicsslashletters.org/critiques-persistence/.

18. Dale et al., eds., The Aesthetics and Affects of Cuteness, 8.

19. Ibid.

20. Ibid.

21. "Partner | Woebot," Woebothealth.com, available at https://web.archive.org/web/20201027093944/https://woebothealth.com/partner/. This reference to WHO data has been removed since the website's update in 2021.

22. "Wysa—Everyday Mental Health," available at https://www.wysa.io/.

23. "Flow: How It Works," Flow Neuroscience (blog), available at https://flowneuroscience.com/home/treatment/.

24. Dale et al., eds., The Aesthetics and Affects of Cuteness, 2.

25. Ibid., 3.

26. Caudwell and Lacey, "What Do Home Robots Want?" 959.

27. Sianne Ngai, "The Cuteness of the Avant-Garde," Critical Inquiry 31, no. 4 (2005): 815.

28. Sontag, "Notes on 'Camp,'" 292.

29. Natural language processing refers to the ability of computers to read, understand, and make sense of human languages through automation.

30. See Gale M. Lucas et al., "Reporting Mental Health Symptoms: Breaking Down Barriers to Care with Virtual Human Interviewers," Frontiers in Robotics and AI 4 (October 12, 2017), available at https://doi.org/10.3389/frobt.2017.00051; and Yi-Chieh Lee, Naomi Yamashita, and Yun Huang, "Designing a Chatbot as a Mediator for Promoting Deep Self-Disclosure to a Real Mental Health Professional," Proceedings of the ACM on Human–Computer Interaction 4, no. CSCW1 (May 28, 2020): 1–27.

31. The Turing test refers to a method in AI research for determining whether a machine can exhibit artificial intelligence by mimicking human responses and tricking the user into believing that it is a human agent.

32. Darcy et al., "Evidence of Human-Level Bonds Established with a Digital Conversational Agent."

33. Evelyn Wan, "Clocked! Time and Biopower in the Age of Algorithms" (PhD diss., Utrecht University, 2018), 292–95.

34. Molly Osberg and Dhruv Mehrotra, "The Spooky, Loosely Regulated World of Online Therapy," Jezebel, February 19, 2020, available at https://jezebel.com/the-spooky-loosely-regulated-world-of-onlinetherapy-1841791137.

35. See Dan McQuillan, "Algorithmic States of Exception," European Journal of Cultural Studies 18, nos. 4–5 (2015): 564–76; Antoinette Rouvroy and Bernard Stiegler, "The Digital Regime of Truth: From the Algorithmic Governmentality to a New Rule of Law," trans. Anaïs Nony and Benoît Dillet, La Deleuziana: Una rivista che desidera 3 (2016): 6–29; and Wendy Hui Kyong Chun, Updating to Remain the Same: Habitual New Media (Cambridge, MA: MIT Press, 2016).

36. Patricia Ticineto Clough, "Rethinking Race, Calculation, Quantification, and Measure," Cultural Studies ↔ Critical Methodologies 16, no. 5 (2016): 437.

37. Dale et al., eds., The Aesthetics and Affects of Cuteness, 26.