Version 3 2023-03-14, 23:27
Version 2 2023-03-13, 23:55
Version 1 2021-11-10, 23:08
Thesis
Posted on 2023-03-14, 23:27, authored by Vize, Brendan
<p>Consider Lt. Commander Data from Star Trek: The Next Generation, the droid C-3PO from Star Wars, or the Replicants of Blade Runner: they can use language (or many languages), they are rational, they form relationships, and they use language suggesting that they have a concept of self, and even that they have “feelings” or emotional experience. In the films and TV shows in which they appear, they are depicted in frequent social interaction with human beings; but would we have any moral obligations to such a being if it really existed? What would we be permitted to do, or not to do, to them? On the one hand, a robot like Data has many of the attributes that we currently associate with a person. On the other hand, he has many of the attributes of the machines that we currently use as tools. He (and other science-fiction machines like him) closely resembles one of the things we value most (a person) and, at the same time, one of the things we value least (an artefact), leading to an apparent ethical paradox. What is its solution?</p>
History
Copyright Date
2011-01-01
Date of Award
2011-01-01
Publisher
Te Herenga Waka—Victoria University of Wellington
Rights License
Author Retains Copyright
Degree Discipline
Philosophy
Degree Grantor
Te Herenga Waka—Victoria University of Wellington
Degree Level
Masters
Degree Name
Master of Arts
Victoria University of Wellington Item Type
Awarded Research Masters Thesis
Language
en_NZ
Victoria University of Wellington School
School of History, Philosophy, Political Science and International Relations