I don't think I'm giving too much away if I say that the HBO series Westworld raises ethical questions about artificial intelligence. The series is set in an amusement park where people can live out fantasies of the Wild West, fantasies that to a striking extent involve murder and rape. But no one actually gets murdered, because the characters that populate the world are (very lifelike) robots. They feel no real pain, despite appearances to the contrary, and can be repaired relatively easily. And the customers are paying high prices for the privilege, so what could they have to feel guilty about?
I assume no one believes Jesus' idea that committing a sin in one's heart is just as bad as committing it in the flesh, but it also seems about as clear as can be that much of what Westworld's clients are paying to do is very bad (even though it is, in a sense, not really done in the flesh). Standard ethical theories seem incapable of handling this fact. Or, if not incapable, not at all well positioned to do so plausibly or simply. The problem is similar to the well-known one about Kantian ethics and the mistreatment of animals: if ethics is all about respect for reason and the creatures that embody it, then (why) is tormenting animals wrong? The Kantian answer is that it is bad because it makes tormenting people more likely, but this is fairly plainly inadequate. A dying man could spend his last moments torturing bunnies and, on this view, do nothing unethical at all.
Shooting at (robots that look and behave just like) people for fun is cruel. Perhaps part of why cruelty is bad has to do with its effects in the world, but, as Kant saw, a bad will is bad in itself, regardless of whether it turns out to have bad consequences in any particular instance. It seems to me that Westworld therefore shows that consequentialism and textbook Kantianism are wrong, or at least incomplete, as moral theories. I'm sure others see it differently, though.
On a related note, in Avengers: Infinity War the baddie is a consequentialist and the goodies "don't trade lives" (cf. Kant and Romans 3:8). I usually don't like science-fiction-inspired philosophy, partly because it tends to be metaphysics or epistemology (not my thing) and partly because it so often seems to be wrapped up in concerns about what is cool (also not my thing). But Westworld, which also raises metaphysical questions, seems to me to demonstrate something important about ethics that is rarely shown.
Attempts to anticipate dissent:
1. The clients of Westworld don't do anything wrong--you can do what you like to robots, as long as no actual person's rights are violated. In a way I agree with this. But perhaps that just shows that there is more to ethics than questions about actions and their rightness or wrongness. There is, it seems to me, just something obviously very bad about choosing to have the experience as of shooting a man and watching him bleed to death, screaming in pain, etc. Perhaps the badness is located more in the heart-mind of the person choosing to behave this way, or in the choice to act this way, than in the act itself, but that there is badness there seems about as plain as it could be.
2. So you're saying it's wrong or bad to play video games that involve shooting, etc.? Not necessarily. But there is something bad about playing a game that focuses on violence and in which the players want the violence to be as realistic as possible. Space Invaders is not like this. Nor is Angry Birds. (Although would the Buddha play either of those games?) No doubt there is a gray area somewhere. Such is life.