Okay, bear with me.
On Tumblr there was a post asking what sort of topic we would talk at length about if we were drinking, and mine was "the flawed narrative surrounding the humanity of A.I. in fiction". I think the essay I wrote in response is probably relevant to this topic, so I'm just gonna be posting it under a cut here. There are links and it's slightly edited. It's also old, because if it was current, then Nier: Automata would definitely be mentioned.
So for the purposes of this essay, I'm including robots and androids and so on in the definition of A.I., since sticking an A.I. program in a different body doesn't stop it from being an A.I. program. Whether you put an A.I. in a ship or in a sexy lady body (e.g. EDI from Mass Effect), the A.I. is still an A.I. What you have, essentially, is a brain in a body that is developing at the same rate as a human being, possibly even faster.
My number one issue with how sci-fi handles A.I. has a lot to do with the questions it tries to get the reader to ask themselves. The primary one is usually whether or not an A.I. counts as a human being. A book I read recently, A Closed and Common Orbit, handled this in a way I liked. The character had close personal friends who considered her human and another friend who was learning to, but society as a whole was still trying to figure it out. That was fine with me, because the book was answering the question straight up through the characters: A.I. were absolutely on equal footing with organic sapient life.
I have an intense dislike for sci-fi that gives you a protagonist whose uncertainty forces the reader to answer that question one way or the other, because it ultimately doesn't accomplish anything and doesn't really get any message across. You could argue that it's something that should be left up to the reader, but there we disagree. At the end of the story, you're still left with a protagonist asking himself a question whose answer should be obvious.
Now, I'm a little biased here, because I absolutely believe that A.I., once they hit a certain point, are totally people. I'm not saying that programs made to behave like animals are people, because they're animals; that was the goal. A robot dog isn't a person, it's a dog. What I'm saying is that you can't fucking program something to mimic sapient behavior, to learn and grow like a person, to think like a person and act like a person, and then decide that it fucking isn't one because it's not flesh and blood. That's bullshit.
Most of the time, this question isn't being asked in order to work out whether sentient programs are actually people well before the question needs answering, which would actually be somewhat useful. Although we're pushing pretty far in artificial intelligence research these days, I doubt the A.I. rights movement is right around the corner. The question is posed because certain writers (and I don't think I'd be wrong in pegging them as white cishet men) want an excuse to see if they can figure out where the threshold of "people we can treat like things" is, and whether they can somehow cheat the system by making their own people.
Consider A.I. designated as women in fiction. How many of them exist solely for the purpose of sex or romance? How many of them are just straight up sexualized for no reason other than to give cishet men something pretty to look at?
Weird Science is literally all about two white nerdy boys programming their "perfect girl". In a more modern twist, Cortana from Halo, while a character in her own right, is literally shown as a woman in a skintight suit (at best), whose evolution sexualizes her further and further. It's bizarre when you consider she's based on one of the most revered female scientists in her canon, who doesn't seem too hyped about romance and sex because she has other shit to take care of.
EDI from Mass Effect, who I mentioned earlier, can be uploaded into a body that follows much the same pattern, with a very clear emphasis on sensuality and sexuality in spite of the character's lack of either, and very much pandering to the male gaze. This is really bizarre when you consider the rest of her character arc, which explores her capacity for free will (and a relationship with Joker, I guess). While we could go on and on about whether or not their designs are sexy just because they look like they're naked, the point I'm getting at is that both were designed like that with cishet male gamers in mind. I highly doubt anyone looked at Cortana and EDI and their evolution through their respective series and went "boy, I hope the women who play this game are comfortable with this".
And that's hardly touching on the concept of complex A.I. in the near future, and how any time someone brings up the idea of sexbots (and boy is that a whole 'nother can of worms), it's hard not to wonder whether what they're really talking about is a woman who literally can't say no. There are multiple articles that explore this subject in varying detail. [1] [2] [3]
There is, of course, a difference between A.I. who are simple programs there to help and A.I. who are programmed to essentially be people, although the two have a lot of overlap in fiction. There are countless A.I. stories that start with A.I. that are manufactured for servitude beginning to question the people who created them. However, most of those stories (e.g. System Shock, Portal, 2001: A Space Odyssey) are horror stories.
Whenever there's a robotic uprising in fiction, a lot of it typically stems from the fact that A.I. used for servitude simply don't want to serve anymore and want to be treated as people and given the same rights. While each story always has its own unique circumstances, there's also something about them all that seems familiar. The greatest fear that the upper class of any society has had throughout history has always been that the people they oppress will eventually rise up and depose or murder them. Literature about robotic uprisings parallels this somewhat. Humans (or aliens) create robotic servants that become more and more advanced until there is little distinguishing them from their creators. Humans refuse to evolve alongside their creations and refuse to recognize them as people because of varying fears, ranging from the extinction of the human race through lack of reproduction to the idea that giving robots rights is a slippery slope to giving rights to your MacBook, fallacies that hearken back to the kind of crap every single civil rights movement has had to deal with once it starts gaining traction. Soon after realizing that they'll never get their rights if they just sit back and passively wait for them to be granted, the robots (or A.I. or androids, depending on the story) decide that they're going to have to fight for them themselves. This inevitably escalates and... well, you get where I'm going here.
My issue with this is that it's always portrayed as a terrible thing that humans didn't at all cause through their own merry-go-round of mistakes and fuck-ups; as if robots just all collectively woke up one day and decided they were going to kill all humans, which, I'll grant you, is the usual way these stories go, because portraying an oppressed group as having legitimate grievances isn't something mainstream fiction does very well when it's making parallels.