Science Fiction and Ethics II: Robots and Artificial Intelligence

(A collection of famous robots by Daniel Nayari)

This unit’s readings by Isaac Asimov and Ray Bradbury engage the impact of robotics and artificial intelligence on our culture. Although these stories were written in the 1940s and 50s, as you read I want you to think about how some of their visions of the future of robotics and artificial intelligence have come true in our present-day society.

As evidenced by the illustration of several famous robots from TV and film, robots have been depicted in many popular Science Fiction stories and have played a variety of roles, ranging from villains like the Terminator and HAL 9000 to heroes like Astro Boy and C-3PO. Like the aliens from the previous unit of stories, the robot is a versatile figure and can be created to realize the “what if” scenarios that Science Fiction likes to engage. A robot can be conceived and designed in any form and to serve any purpose, from lifelike androids to simple machinery.

Robots and Artificial Intelligence come with their own ethical concerns as well. Often, the figure of the robot is used to question basic assumptions made about humanity. For example, the idea of Artificial Intelligence questions our most basic understanding of what it means to think, be conscious, and be a person. Scientists and philosophers have debated whether or not machines can truly “think” and whether or not Artificial Intelligence is equivalent to human intelligence. After all, according to Descartes, “I think, therefore I am,” so if thinking is what gives man his humanity, then what happens to our definition of humanity if we develop machines that can also think? In addressing this question, we are also forced to think deeply about humanity and to put into words what exactly it is about human thought, consciousness, and emotions that is unique to us and would separate us from machines that can also rapidly process information.

British mathematician and father of modern Computer Science Alan Turing famously devised the “Turing Test” for determining machine intelligence. He proposed that if a human and a computer were both given questions by a third party, and that third party could not consistently tell whose answers were the computer’s and whose were the person’s, then one may conclude that the computer could be described as intelligent.
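If it helps to see the structure of the test, here is a minimal sketch in Python (my own illustration, not anything Turing wrote, and all the names in it are hypothetical). A judge poses questions to two hidden respondents, one human and one machine, and tries to guess which is which; if the judge cannot do better than chance, the machine passes.

```python
import random

def imitation_game(judge, human, machine, questions):
    """A toy version of Turing's imitation game.

    The judge sees only anonymous labels and the answers, then guesses
    which label belongs to the machine. Returns the fraction of rounds
    in which the judge identified the machine correctly.
    """
    correct = 0
    for question in questions:
        # Hide the respondents behind labels, in a random order each round.
        respondents = [("A", human), ("B", machine)]
        random.shuffle(respondents)
        answers = {label: respond(question) for label, respond in respondents}
        guess = judge(question, answers)  # the label the judge thinks is the machine
        truth = next(label for label, respond in respondents if respond is machine)
        correct += (guess == truth)
    return correct / len(questions)

# Hypothetical stand-ins for illustration only.
human_player = lambda q: "Let me think about that for a moment..."
machine_player = lambda q: "Let me think about that for a moment..."
guessing_judge = lambda q, answers: random.choice(list(answers))

# A score near 0.5 means the judge cannot consistently tell them apart.
print(imitation_game(guessing_judge, human_player, machine_player, ["What is love?"] * 100))
```

The interesting part, of course, is everything the sketch leaves out: what a clever judge would ask, and what kinds of answers would give a machine away.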

To further explore the question of ethics in robotics and Artificial Intelligence, I want you to read the short article “Morals and the Machine.” I will cover it on the quiz and use it to analyze the stories you will read.

In particular, I want you to think about how our definition of humanity and ethical systems are perhaps altered or impacted by the rise of Artificial Intelligence and robotics. If we deem a robot intelligent and program it to have emotions, are those emotions genuine? Should robots have rights and personal interests if we agree that they are intelligent?

Furthermore, there is the question of the ethical use of robots. What happens when robots become so advanced that they could do all human labor? Would it be right to allow them to put all of the world’s workers out of jobs? When we program robots as workers, how do we make sure that they make ethical choices? This forces a serious reconsideration of ethics because we would have to distill these complicated ideas into a set of laws and procedures for the robot to follow. After all, when you make an ethical decision, do you always go through the complicated philosophical reasoning, or do you usually just do what “feels” right? If a robot cannot “feel” what’s right, then how might its ethical system be different from ours? Is it possible to learn to become more ethical from a robot that is not clouded by emotions, or will that only cause more issues?

.

Isaac Asimov’s Robotic World

(Isaac Asimov)

Ethical Questions

1. How do intelligent robots complicate our understanding of what it means to be human?

2. Is it possible to program robots to understand and operate within a human ethical system?

3. How are robotics and Artificial Intelligence changing human civilization, including culture, industry, and personal relationships?

4. Are human relationships with robots possible and can they be as genuine as relationships with other humans?

.

Isaac Asimov’s Biography (Biography.com)

Isaac Asimov was born Isaak Yudovick Ozimov on January 2, 1920, in Petrovichi, Russia, to Anna Rachel Berman and Judah Ozimov. The family immigrated to the United States when Asimov was a toddler, settling into the East New York section of Brooklyn. (Around this time, the family name was changed to Asimov.) Judah owned a series of candy shops and called upon his son to work in the stores as a youngster. Isaac Asimov was fond of learning at a young age, having taught himself to read by the age of 5; he learned Yiddish soon after, and graduated from high school at 15 to enter Columbia University. He earned his Bachelor of Science degree in 1939 and went on to get his M.A. and Ph.D. from the same institution. In 1942, he wed Gertrude Blugerman.

In 1949, Asimov began a stint at Boston University School of Medicine, where he was hired as an associate professor of biochemistry in 1955. He eventually became a professor at the university by the late 1970s, though by that time he’d given up full-time teaching to do occasional lectures.

Yet even with his impeccable academic credentials, writing for general readers was to be the professor’s passion. Asimov’s first short story to be sold, “Marooned Off Vesta,” was published in Amazing Stories in 1938. Years later, he published his first book in 1950, the sci-fi novel Pebble in the Sky—the first in a line of titles that would mark a highly prolific writing career.

An influential vision came with another 1950 release, the story collection I, Robot, which looked at human/construct relationships and featured the Three Laws of Robotics. (The narrative would be adapted for a blockbuster starring Will Smith decades later.) Asimov would later be credited with coming up with the term “robotics.”

Asimov was also known for writing books on a wide variety of subjects outside of science fiction, taking on topics like astronomy, biology, math, religion and literary biography. A small sample of notable titles includes The Human Body (1963), Asimov’s Guide to the Bible (1969), the mystery Murder at the ABA (1976) and his 1979 autobiography, In Memory Yet Green. He spent most of his time in solitude, working on manuscripts and having to be persuaded by family to take breaks and vacations. By December 1984, he had written 300 books, ultimately writing nearly 500.

Asimov died in New York City on April 6, 1992, at the age of 72, from heart and kidney failure. He had dealt privately with a diagnosis of AIDS, which he’d contracted from a blood transfusion during bypass surgery. He was survived by two children and his second wife, Janet Jeppson.

Over the course of his career, Asimov won several Hugo and Nebula Awards and received accolades from science institutions. He stated during a televised interview that he hoped his ideas would live on past his death; his wish has come to fruition, with the world continuing to contemplate his literary and scientific legacies.

.

Asimov’s Three Laws of Robotics

Asimov’s collection of robot stories from the 40s and 50s included some of the first ever discussions about robots and ethics. Asimov was a visionary who saw that the primitive robots and artificial intelligence of his era would soon advance into the amazing technologies we have today. For example, think about a program like Apple’s Siri, which uses a vast library of information to instantly cater to our needs and questions as if she were a human personal assistant. When Apple programmed Siri, they had to consider ethical issues. For example, what if we asked Siri how to make a bomb? Would it be ethical to allow the program to give us that info? What about all the personal information that Siri has about us when we ask questions? Should Siri store that information and allow others to access it? Should she keep it secret like our friends (hopefully) do? How would she know what to keep secret?

Asimov anticipated these concerns about robotics and Artificial Intelligence in the 40s and thus created his now famous Three Laws of Robotics, which are spelled out in the story “Evidence”.

The Three Laws are:

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given it by human beings, except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.

These laws are the ethical system that is supposed to guide how the robots are programmed to interact with humans. Yet, as you will see in his stories, these laws cannot always be applied smoothly. Often ethical decisions have no perfect solutions. Sometimes we have to violate one ethical belief for the sake of another. The question is, then, how do we program a robot to make these decisions? Can ethics be distilled down to a simple formula, or does ethics rely on the uniquely human instinct toward ethical conduct?
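To make the programming problem concrete, here is a minimal sketch in Python (my own illustration, not anything from Asimov, and the action flags are entirely hypothetical) of the Three Laws as a strict order of checks, where the First Law overrides the Second and the Second overrides the Third.

```python
def permitted(action):
    """A toy reading of Asimov's Three Laws as an ordered series of checks.

    `action` is a hypothetical dictionary of flags describing what the robot
    is about to do, e.g. {"harms_human": False, "disobeys_order": False,
    "destroys_self": False}.
    """
    # First Law: a robot may not injure a human being or, through inaction,
    # allow a human being to come to harm.
    if action.get("harms_human") or action.get("allows_harm_by_inaction"):
        return False

    # Second Law: a robot must obey orders given by human beings, except
    # where obeying would conflict with the First Law (checked above).
    if action.get("disobeys_order"):
        return False

    # Third Law: a robot must protect its own existence, so long as doing so
    # does not conflict with the First or Second Law.
    if action.get("destroys_self") and not action.get("required_by_order"):
        return False

    return True

# Asimov's stories live in the gap this sketch hides: deciding whether an
# action "harms a human" is the hard part, and real dilemmas rarely reduce
# to clean True/False flags.
print(permitted({"harms_human": False, "disobeys_order": False, "destroys_self": False}))
```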

.

“Robbie”

(“Robbie” illustrated on the cover of a French translation)

Technophobia

“Robbie” was originally published in the September 1940 edition of Super Science Stories. It was the first of Asimov’s many robot stories that would eventually be compiled in the book I, Robot. “Robbie” is especially notable for its unique take on the human/robot relationship. Most previous stories about robots presented them as villains. Typically, those stories centered on the dangers of giving superhuman abilities to machines that cannot reason or possess ethics. Eventually, the robot revolts against its creator, and the moral message that science has its ethical limits is established.

Asimov’s stories about robots sought to complicate this simplistic view of the robot. Sure, the robot could be programmed poorly and revolt, but he also believed that the robot could be capable of good. The robot, like all technology, is not inherently good or bad. There are only good or bad uses of that technology. One should be concerned about the ethics of applying new technology and consider the impact before unleashing it, but one shouldn’t dismiss it altogether out of fear.

In “Robbie,” Mrs. Weston represents this technophobia while little Gloria represents what we might term “technophilia” in that her love of Robbie is the opposite of her mother’s fear. As you read this story, think about why Mrs. Weston is so concerned about her daughter’s attachment to Robbie. Is this attachment healthy for Gloria? Will it hurt her maturation as a young girl or is she capable of building the kinds of social and thinking skills a person needs via a relationship with a robot?

It is clear that robots in a factory can replace laborers, but can a robot take over domestic duties like raising a child and providing for a family? “Robbie” raises the question of whether robots could someday be seen as members of the family. If robots can be programmed to simulate everything a child needs from its family, then what does this say about the nature of human relationships?

Think about the role empathy plays in this story. Do you feel bad for Robbie when he is sent away? Did you have an emotional reaction at the end of the story? Remember that Robbie is programmed according to Asimov’s Three Laws of Robotics, which means that he is programmed to serve humans. Normally we have empathy for those that can feel, and when someone capable of helping us is harmed, we feel empathy for them. But does Robbie “feel”? And if he is programmed to sacrifice his well-being for humans, does he really merit our sympathy when he doesn’t really “choose” his actions? Does a good deed depend on choice, or does it matter what motivates the deed as long as it is “good”?

.

“August 2026: There Will Come Soft Rains”

By Ray Bradbury

(A Comic Adaptation of Bradbury’s Story)

Ethical Questions:

1. What are the dangers to humanity in the Atomic Age?

2. Is there value to the continued existence of the Earth after mankind has vanished?

3. Will robots be able to continue their existence after mankind has vanished?

The Robotic Home

Unlike the previous stories, in which humans are in conflict with technology, Bradbury’s story does not contain any human characters at all. Throughout the short story, we see the daily chores of a fully automated, robotic home and its eventual demise when a tree branch falls and causes a fire. Yet just because there are no humans acting in the story, that does not mean that the commentary on the dangers of technological dependence is absent. As you read the story, I want you to compare it to the other Bradbury story, “The Pedestrian”. How do both stories depict the possible future of a dependency on technology?

Historical Context

You might be asking: where is this technological dependence? The house does not seem sinister at all, and you would be right; the dangers of technology are only hinted at. You may also be asking: where is the family? Both questions are answered in the following passage:

“The entire west face of the house was black, save for five places. Here the silhouette in paint of a man mowing a lawn. Here, as in a photograph, a woman bent to pick flowers. Still farther over, their images burned on wood in one titanic instant, a small boy, hands flung into the air; higher up, the image of a thrown ball, and opposite him a girl, hands raised to catch a ball which never came down.”

Just what is going on in this description? These silhouettes are a reference to nuclear war, and in particular, the bombing of Hiroshima and Nagasaki.

In the image above, you can see the shadow of a man burned onto a wall. When the bombs went off in Hiroshima and Nagasaki, the blast burned the shadows of objects into walls and the ground. Thus, the reference to these “permanent shadows” clues us in to the implication that the family that owned the house was killed in a nuclear war. It is possible that the entirety of humanity was killed by nuclear war.

Bradbury wrote this story in 1950, only five years after the bombings. While humanity has been fortunate to avoid another nuclear attack, in the 1950s, with the arms race against the Soviets underway, many people predicted that nuclear war would become the new norm. Thus, we could read this story as a warning about becoming reliant on nuclear technology to solve global conflicts.

Think about the poem that the house recites to the now long-dead family. It’s called “There Will Come Soft Rains” by Sara Teasdale.

There will come soft rains and the smell of the ground, 
And swallows circling with their shimmering sound;

And frogs in the pools singing at night,
And wild plum trees in tremulous white,

Robins will wear their feathery fire
Whistling their whims on a low fence-wire;

And not one will know of the war, not one
Will care at last when it is done.

Not one would mind, neither bird nor tree
If mankind perished utterly;

And Spring herself, when she woke at dawn,
Would scarcely know that we were gone.

As you read the poem, consider its message about war and its aftermath. What is Bradbury trying to express by including this poem? What does it mean if neither bird nor tree would care if man perished utterly? What does this imply for the existence and future of human civilization? Will the world simply go on without us?

Finally, did you feel sympathy for the house? If so, why? What could be the roots of mourning the demise of a house, which, in and of itself, has no feelings of its own?

.

“All Watched Over by Machines of Loving Grace”

By Richard Brautigan 

I like to think (and
the sooner the better!)
of a cybernetic meadow
where mammals and computers
live together in mutually
programming harmony
like pure water
touching clear sky.

I like to think
(right now, please!)
of a cybernetic forest
filled with pines and electronics
where deer stroll peacefully
past computers
as if they were flowers
with spinning blossoms.

I like to think
(it has to be!)
of a cybernetic ecology
where we are free of our labors
and joined back to nature,
returned to our mammal
brothers and sisters,
and all watched over
by machines of loving grace.

.

As you read this poem over, I want you to compare its vision of a future meeting of man and machine to those of Bradbury and Asimov. Consider how Brautigan’s combination of images from nature and technological language depicts the future. Is this a utopian or dystopian view of the future? How can you tell?

Brautigan was a popular counterculture writer from the 60s. He lived in the epicenter of the hippie movement in San Francisco, and many of the flower power generation read his work. Like many from his generation, he was an environmentalist, and he was critical of man’s alienation from nature in the civilized world. Yet in this poem he does not seem to be against technology; rather, he seems to see how it can serve his goals for nature.

One final idea to consider when analyzing this poem is the cyber-futurist theory of “the singularity.” The singularity will occur when technology and artificial intelligence have become so powerful that they bond with and eventually take over natural intelligence. There will no longer be a divide between the two at all, and individual intelligences will all be interconnected with each other. Written in the 60s, before the personal computer, Brautigan’s poem may have seemed light and romantic, but given how technology has advanced since his time, his “cybernetic ecology” may be approaching whether we like it or not.

 
