
Perceptions of AIs, Humans, and Others (Part 1)

Posted on Thursday, May 23, 2019 in News, The Ethics of Artificial Intelligence (AI).


Written by Nicole Gillis (student in UNIV 3275)
Note: This post is a modified version of the author’s late-term synthesis exam essay submission. Part 2 is here.

In his novella The Lifecycle of Software Objects, Ted Chiang explores how humans can simultaneously reinforce and blur the perceived distinctions between AIs and themselves.  Chiang showcases a fictional world where humans treat artificial intelligence (AI) as something inherently other than and subordinate to themselves, while contradictorily raising those AIs with the same emotional investment and social structures one would devote to a human child.  Chiang establishes this dynamic in a world where humans have built a simulation called “Data Earth,” in which AIs, called “digients,” are trained through experience and education to evolve from animal-like creatures into beings capable of basic sentences and human-like emotion.  In this world there is a clearly established hierarchy: humans are responsible for making digient “life” on Data Earth possible and control the decisions that affect it.  Yet despite occupying this clear position of superiority over the digients, the humans often forget that the digients are inherently different from humans, and they project human-level expectations of development and emotion onto the AIs in a way that characterizes them as children.  This projection leads the humans to deny the digients sexual maturity and raises the question of whether these AIs are capable of “expressive authenticity” as described in Colton, Pease, and Saunders’s 2018 paper “Issues of Authenticity in Autonomously Creative Systems.”  The contradictory nature of the humans’ interactions with the digients shows that the distinctions between AIs and humans can be simultaneously clearly defined and blurred once human emotion enters the situation.

Throughout The Lifecycle of Software Objects, Chiang emphasizes that humans are positioned as superior to the digients.  Because humans are responsible for keeping the simulation world up and running, the fate of the AIs depends on them.  Additionally, the AIs can only develop a culture of their own after the humans teach them to read, write, and progress through school, making the humans their mentors.  In the closest approximation to human form, the digients can leave Data Earth by projecting their code into a robot body in the human world, but even then the code runs on servers outside the robot’s shell.  Chiang’s portrayal of the digients’ dependency on humans, combined with the AIs’ inability to exist permanently within the human realm, renders the AIs inherently dependent on, and “other” than, humans.

Despite the obvious nature of the digients’ “otherness,” the humans still develop emotional attachments to the AIs and treat them like human children who happen to live in a simulated world.  The humans teach the AIs to speak and to read, and they give them educational lessons and homework.  They play with the digients and even interact with them through robot bodies in the human world.  This parental relationship is exemplified when Ana, a human, tells Jax, her digient, “My life might be simpler if I didn’t have you to take care of, but I wouldn’t be as happy.  I love you, Jax,” to which Jax replies, “Love you too” (Chiang, 2010, p. 103).  Ana’s motherly love for her AI shows that the digient’s subordinate position leads her to see it as a relatable child capable of human-like development rather than as an “other,” blurring the line between this AI and a human child.

Because the humans treat their digients like children, they are eventually faced with the dilemma of granting the digients sexual maturity, and therefore autonomy over their own “bodies,” when a sex-doll company asks the humans to sell their digients with the goal of turning them into sex workers.  The humans all object initially, denying that their digients are mature enough for sexual matters because of their “lack of experience of romantic relationships and jobs” (Chiang, 2010, p. 144).  Yet this assumption treats the digients like human children when they are inherently different from human children.  Sexual acts do not exist in the digient world because the humans have refrained from giving the digients that knowledge.  One could say the same about human children; however, sexual acts do exist in the human world whether or not parents want children to gain that knowledge.  Some theories, such as Sigmund Freud’s psychosexual development theory from “Three Essays on the Theory of Sexuality” (1905), even suggest that sexuality is an integral part of child development.  Although many consider psychosexual development theories unreliable, if such a theory held, then withholding sexual information from the digients could shape their development in ways that make them even less like human children.  This key difference in how human children and digients acquire knowledge exemplifies how the distinction between humans and AIs can be clearly defined.  The character Derek notes this distinction: while weighing whether to expose digients to sex, he thinks, “But perhaps the standards for maturity for a digient shouldn’t be as high as they are for a human; maybe Marco is as mature as he needs to be to make this decision” (Chiang, 2010, p. 144).  This reasoning ultimately leads Derek to ask his digient whether it agrees to become a sex worker, and he follows through once the AI affirms.
