An "Uber, but for Mental Health" App Is an Unsettling Therapist in 'Eliza'

Eliza at first comes across as a near-future dystopia in the vein of Black Mirror: it’s a visual novel where you play the part of a young “proxy” for a virtual psychotherapist—Uber, but for mental health and less money-hemorrhaging. Proxies are simply there to provide a human, empathetic face for a therapeutic algorithm that generates responses to patients as they unburden themselves about their depression, stress, family strife, or whatever else might make someone seek counseling. Whatever they do, they must not deviate from the script the algorithm feeds them. The stage is set for a story of automation gone wrong.


This is a companion discussion topic for the original entry at https://www.vice.com/en_us/article/9kewbp/eliza-review-zachtronics-visual-novel-ai-therapist

When I opened this article, I was terrified that it might be actual news about a new app. The “gig economy” is going to turn sour sometime, y’all. Might have to pick this game up at some point.

I thought this was going to be about ELIZA, which I would ask for help with my homework on my System 7 in middle school.

ELIZA itself and its creator are actually incredibly interesting to read up on. Joseph Weizenbaum is considered by many to be one of the fathers of AI, and he also became one of its harshest critics (and a critic of computer science in general) after he saw how people were reacting to ELIZA and where computers were headed.
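For anyone who's never looked under the hood: ELIZA's famous DOCTOR script was remarkably simple, just keyword spotting, pronoun reflection, and canned response templates. Here's a minimal sketch of that idea in Python; the rules below are simplified stand-ins, not Weizenbaum's actual script.

```python
import random
import re

# Pronoun reflection: the trick that turns "I hate my job"
# into "Why do you hate your job?"
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "I", "your": "my", "are": "am",
}

# Simplified keyword -> template rules in the spirit of the
# DOCTOR script (illustrative only, not the original rule set).
RULES = [
    (r"i need (.*)", ["Why do you need {0}?", "Would it really help you to get {0}?"]),
    (r"i feel (.*)", ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (r".*\bmother\b.*", ["Tell me more about your family."]),
    (r"(.*)", ["Please go on.", "How does that make you feel?"]),
]

def reflect(fragment):
    """Swap first- and second-person words in a captured fragment."""
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.split())

def respond(statement):
    """Return a templated response for the first matching keyword rule."""
    cleaned = statement.lower().strip(" .!?")
    for pattern, templates in RULES:
        match = re.match(pattern, cleaned)
        if match:
            template = random.choice(templates)
            return template.format(*(reflect(g) for g in match.groups()))

print(respond("I feel trapped by my job"))
# -> "Why do you feel trapped by your job?" (or the other template)
```

Everything the "therapist" says is a reshuffled echo of what you typed, which is exactly why Weizenbaum was so alarmed when people confided in it anyway.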

1985 interview

Q: How can people continue to do this, knowing that the things they build will be involved in killing people?

A: People have a series of rationalizations. People say for example that science and technology have their own logic, that they are in fact autonomous. This particular rationalization is profoundly false. It is not true that science marches on in defiance of human will, independent of human will, that just is not the case. But it is comfortable, as I said: it leads to the position that “if I don’t do it, someone else will.”

Of course if one takes that as an ethical principle then obviously it can serve as a license to do anything at all. “People will be murdered; if I don’t do it, someone else will.” “Women will be *****; if I don’t do it, someone else will.” That is just a license for violence.

Other people say, and I think this is a widely used rationalization, that fundamentally the tools we work on are “mere” tools; this means that whether they get used for good or evil depends on the person who ultimately buys them, and so on.

There’s nothing bad about working in computer vision, for example. Computer vision may very well some day be used to heal people who would otherwise die. Of course, it could also be used to guide missiles, cruise missiles for example, to their destination, and all that. You see, the technology itself is neutral and value-free and it just depends how one uses it. And besides – consistent with that – we can’t know, we scientists cannot know how it is going to be used. So therefore we have no responsibility.

Well, that is false. It is true that a computer, for example, can be used for good or evil. It is true that a helicopter can be used as a gunship and it can also be used to rescue people from a mountain pass. And if the question arises of how a specific device is going to be used, in what I call an abstract ideal society, then one might very well say one cannot know.

But we live in a concrete society, [and] with concrete social and historical circumstances and political realities in this society, it is perfectly obvious that when something like a computer is invented, it is going to be adopted for military purposes. It follows from the concrete realities in which we live; it does not follow from pure logic. But we’re not living in an abstract society, we’re living in the society in which we in fact live.

If you look at the enormous fruits of human genius that mankind has developed in the last 50 years, atomic energy and rocketry and flying to the moon and coherent light, and it goes on and on and on – and then it turns out that every one of these triumphs is used primarily in military terms. So it is not reasonable for a scientist or technologist to insist that he or she does not know – or cannot know – how it is going to be used.


I can’t help but read the title of this game as “el-liiiii-zaa” as in the Schuyler Sisters song from Hamilton. Help me.


This game is fantastic. But also now I want a really good solitaire app. Does anyone have any Android recs?

I use the Microsoft one. It’s mostly just the solitaire from Windows PCs ten years ago, but in an app.

I would second the Microsoft Solitaire app. I was hooked on it immediately, and they always throw in daily challenges and a rotation of “events” that keep you entertained, at least in my experience.

Loved this game.
Bought it purely on the crew’s recommendation and I’m very glad I did.

I’m only two chapters into this, but I really want to thank the Waypoint folks for putting it on my radar, because it’s good as hell so far.

Finally got around to finishing this! Wow, I’m really surprised how much this game grew on me. I started out liking it a good bit, but by the end it had shifted into one of my favorites of the year!

I really loved the writing, and the subjects explored throughout were timely while feeling very much a part of the game’s world. The characters and voice work were phenomenal, as Rob says in the review. All fairly dynamic and layered.

Finally, I loved how things started opening up in the final third. The moment when you are (finally) given an option to go off the Eliza script felt amazing and was so well earned. Up to that point I had already gotten into the groove of rolling as a proxy. In the first couple of sessions I was expecting deviation options to jump out, and for that to be where the main tension of the game would lie, but we end up getting way deeper into the main character’s history with the program and diving into so many other conflicts or potential conflicts. All of which really sucked me into this world and these characters. So when we get to empower Maya and nudge her to shift her perspective on things, I was so there for that. At that point I knew not only Maya better, but also the Eliza program (its strengths and weaknesses) and Evelyn’s character as well. The things I had been saying in my head a little were given a voice.

I also appreciated all the ending options. Not just who you work with/share ideology with/have a relationship with, but also the many, many therapy options for what your character wants out of all this. This was another moment that felt very earned, and where I think I was moved the most. If I had been given a choice I would’ve avoided having a session with Eliza, but it ended up containing the best moment in the game: Evelyn finally giving voice to what she wanted in all this, when up until that point it had been people talking with her (or with her as a proxy) about what they wanted.

For my ending I chose to work with Nora. I really loved that character: even though they, like all the others, were a bit too into their own technical world, they truly wanted to share with Evelyn the freedom of self they had found, and personally I resonated with the idea of creating something truly for myself to share with the world, without outside pressures or influences. There was a weird moment with this ending where you get e-mailed an anonymous harassment note that never gets addressed, which felt frustrating and triggering, but otherwise it really was the ending I wanted. If there had been an option to be with/support Nora while becoming a therapist, I probably would’ve been down for that, but Rae’s loyalty to Skandha was unsettling to me.

Anyone else have thoughts on their ending? I tried looking up other published thoughts online, but only found a click-bait type piece about a “shocking” ending that wasn’t the one I chose, and was even a bit dismissive of my route, so that was a bummer.

Just finished this game. The questions it asks are extremely interesting to me and it’s very well written, but I find myself wanting more, which is probably unreasonable given all the big questions it raises!
A specific example: none of the ending choices appealed to me; I felt like I was being locked into extremes.

I initially chose to work on Eliza with Rainer, because this was a disaster waiting to happen, one that my stepping away three years ago made much worse, and the project needed guidance out of Rainer’s hands…but apparently that choice means drinking ALL the Kool-Aid and sitting back to wait for the Singularity…so…unsatisfying for me.

Then there was Nora’s ending, in which I remember there being discussion of stealing the Eliza code and making it open source…but it never happens! Just the start of a DJ career.