The Virtues (and Limitations) of Virtual Therapists

Users of the new app Sleepio receive treatment for insomnia from an animated shrink.

Recently, a San Francisco-based start-up called Big Health launched an iOS app, Sleepio, that features an animated psychotherapist named the Prof. The avuncular, grey-templed Scotsman guides users through a course, based on the principles of cognitive-behavioral therapy (C.B.T.), that is designed to treat insomnia. C.B.T.'s basic principles are simple enough that patients can follow them with a book, but the company hopes to capture some of the dynamics of in-person therapy, which means encouraging its users to bond emotionally with its digital shrink.

When the Prof first appears, he’s waving hello in silhouette against the sun. He has a stocky body and skinny legs—proportions that suit an iPhone screen—and he’s dressed in a natty red sports jacket and a black tie. Users eventually learn that he rides in a hot-air balloon and that he has a narcoleptic hound named Pavlov. His gestures—for example, swinging his arms and grabbing his lapels—project an appealing vigor. Like a sportscaster, the Prof is frequently offscreen, so his voice is probably the most crucial aspect of the experience.

Big Health tested regional British accents before settling on an optimistic but gentle brogue. “It combined authority with approachability, but yet with a sort of a no-nonsense streak that can nudge you towards doing a bit better,” the company’s C.E.O., Peter Hames, told me. The role ultimately went to a voice actor who lives in Glasgow. Hames added that the company is open to creating a figure with a more familiar voice—perhaps a woman. Though he says that the app is equally effective in treating insomnia in both men and women, anecdotal evidence suggests that the sexes don’t respond to the Prof in the same way. Men tend to absorb lessons without noticing him, while women are more inclined to form a bond. According to Hames, one woman reported, “When he congratulates me I feel great.”

Sleepio costs a hundred and forty-nine dollars for twelve weeks—expensive by app-store standards, but a bargain compared with traditional therapy. In face-to-face therapy, Hames says, “there’s some kind of dark-matter effect that people really like, and that has a really positive effect on outcomes.” Though the Prof is animated, Hames has found that the character keeps users coming back. A paper he co-authored in the medical journal Sleep found that a placebo group encountering a version of the Prof who spoke “convincing nonsense” stuck with the course at almost the same rate as subjects who received the Prof’s real C.B.T. course.

The idea that people can become invested in a non-human figure like the Prof or Apple's Siri is not radically new; sometimes, at least in Hollywood, they even fall in love. But, outside of the entertainment industry, fully animated guides have struggled to take hold. Timothy Bickmore, a professor of computer science at Northeastern University, studies the use of digital agents (his preferred term) in health care. His research has shown that some groups who are traditionally less familiar with technology—older people and those on the lower end of the socioeconomic ladder—will adapt to agents. For instance, he conducted a study revealing that some hospital patients prefer to have a virtual nurse discharge them, and he has co-written papers showing that a virtual coach can increase how much seniors exercise in the short term and can reduce their loneliness. Bickmore identified agents' slowness of speech and repetition of gestures as features that tend to turn users off.

The larger issue, of course, is that digital agents simply aren’t as smart as humans. The Prof, for example, mostly learns about users’ sleep habits and goals by asking them multiple-choice questions. (It can also sync with several wearable sleep trackers on the market.) To improve the diagnostic abilities and responsiveness of digital agents, researchers are starting to build far more sensitive data-gathering tools. The Institute for Creative Technologies at the University of Southern California has developed a platform, known as SimSensei, that is equipped with a microphone, a webcam, sensors, and a screen showing an image of a woman named Ellie. As Ellie asks questions of subjects and they respond, she accumulates data on their speech patterns and motions in order to assess their mental condition. Albert (Skip) Rizzo, a psychologist who’s co-leading the project, told me, for example, that people with post-traumatic stress disorder often touch their faces. In one ongoing study, soldiers in the Colorado National Guard met with Ellie before combat deployment to Afghanistan, and will meet with her at least once more when they return. The goal is to determine whether the data Ellie gathers can be used to predict mental-health difficulties. (The project has received funding from the Defense Advanced Research Projects Agency.)

Although this kind of therapeutic interaction raises privacy concerns and the potential for abuse, there are reasons why people might prefer the help of a properly calibrated digital agent. A study that ran earlier this year in the journal Computers in Human Behavior, using Ellie, found that subjects talking to a "virtual human" felt more at ease disclosing sensitive information if they trusted that it was acting autonomously rather than masking a real person—because, they said, they believed that the machine was less likely to judge them.

Mind you, that may change as agents become more intelligent and responsive. Using the desktop version of Sleepio, I rushed through a few sessions without doing the required homework. The Prof betrayed an edge of irritation—“Agaiiin, we’re missing diaries for most days”—before reverting to his upbeat tone. The mild scolding left me feeling inexplicably ashamed.