Review of the computer game Eliza, by Zachtronics (2019)

Given that AI has become such a popular topic 'round these parts, I thought I'd share a review of a game that seems almost prescient in dealing with today's issues.
The game is called Eliza, and it was released in 2019, before the dawn of ChatGPT. The game's name is a reference to ELIZA, the first AI chatbot. You can still chat with a modern version of ELIZA here.
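For context, the original ELIZA's trick was remarkably simple: match keyword patterns in the user's input and reflect their own words back as a question. Here's a minimal sketch of that technique in Python; the specific rules and wording are my own illustrative inventions, not the actual DOCTOR script:

```python
import re

# First/second-person swaps so the echoed phrase reads naturally.
REFLECTIONS = {"i": "you", "me": "you", "my": "your",
               "am": "are", "you": "I", "your": "my"}

# A few illustrative keyword rules; the real ELIZA had many more.
RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]

def reflect(phrase: str) -> str:
    """Swap pronouns word by word: 'my job' -> 'your job'."""
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in phrase.split())

def respond(utterance: str) -> str:
    """Return the first matching rule's template, filled with the
    reflected captured phrase; fall back to a stock prompt."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please, go on."

print(respond("I feel lost in my job"))  # -> Why do you feel lost in your job?
```

There's no understanding anywhere in this loop, which is exactly why Weizenbaum was unsettled by how readily people confided in it.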
Warning: Review contains spoilers.
Eliza (the game, not the machine) is a visual novel about what happens when AI takes the role of therapist / counselor. Fittingly, the name of the AI counselor is Eliza. You play as Evelyn Ishino-Aubrey, a disaffected and slightly depressed young woman who works as an Eliza "proxy". As a proxy, your job consists solely of reading out AI-generated responses during counseling sessions, essentially presenting a human face for the AI therapist. There is nothing hidden about this: the clients know that they're talking to an AI, but "research shows" that customers are more satisfied when the AI's responses are delivered through face-to-face interaction, at least according to your company superiors.

The corporatization of AI therapy is one of the main themes the game explores. The impersonal corporate vibe of your job is thick. Your job assignments and all communications are handled through a company app. You get rated and paid based on how well you stick to the AI script (strictly no deviating!) and how well the customer rates the experience. The company itself, Skhanda, is a mega conglomerate, like Amazon or Apple, of which Eliza is just the latest product. The CEO is a Jeff Bezos/Steve Jobs/Sam Altman-like figure. When you first meet him, you can't help but be impressed by his talent and vision. Yet there is a darkness to him: it's clear that he has big ambitions and doesn't much care about the consequences of what he's doing for humanity.

The second theme of the game is the medicalization of therapy. Over the course of the game, you meet many clients who have all sorts of problems. There's a man who's having trouble managing the responsibilities of being a husband and father. There's an artist who is having trouble dealing with her lack of success. There's a university graduate student who's head over heels in love with one of his classmates. As they share their stories with you, you, as the player, have the freedom to control how Evelyn responds. Do you stick to the AI-generated script, which almost always suggests some sort of drug or relaxation app (built by Skhanda, of course) as treatment? Or do you go off script and give a more human response? If you do, you'll get docked pay and may even get fired! But the more you play, the more you realize that Eliza is little more than a front for getting people hooked on products that are supposed to solve their problems, but really just blunt their emotions or distract them from those problems. Eventually, some clients express their frustration that Eliza's recommendations aren't working, and you'll be forced to choose how to respond.

A third theme the game explores is the responsibility of AI developers. Over the course of the game, you learn that Evelyn was actually one of the core engineers who built Eliza. She left the company because she became uncomfortable with how development was going, and she became a proxy to try to understand how Eliza was being used on the ground. Eventually, you're given a choice of how to move forward. You can rejoin Skhanda as an AI engineer, resuming your former role, and help steer the direction development takes, though the implication is that you'll be helping the CEO pursue his more nefarious aims. You can join a new virtual reality startup, which takes the medicalization of therapy to a whole new level: the startup's idea is that people won't need AI therapists at all if they can spend their lives in simulated bliss (this is the Matrix outcome). Lastly, you can go your own way, abandoning the industry completely and instead reconnecting with an old friend to make music together. Interestingly, there is no option to sabotage the AI or steer therapy in a more humane direction, which may not be wholly satisfying but is likely realistic. That was a good choice by the game's developers, in my opinion.

Eliza is a thought-provoking game that deals directly with issues we're facing now. I don't know whether Zachtronics anticipated how much the capabilities of LLMs would grow, or how soon we would have to grapple with these issues. It takes about four hours to play, and I'd recommend it to anyone who enjoys visual novels and is interested in AI & society.