Is a creator allowed to punish his creations?

  • Thread starter: laocmo
Status
Not open for further replies.

laocmo

Guest
Consider this future scenario. Computer scientists have perfected artificial intelligence to the point that robots are available with a form of self-recognition. They can be given a list of “do not do this” items, but they have a choice, after considering and assigning weights to all the consequences they can think of, to ignore the “do not” commands. To reinforce the obedience demanded, they are also given the threat of a punishment for disobedience that they have been programmed to fear, perhaps a power-down or some task that they would prefer not to have to do. Now, does the designer-builder of these things have the right to punish them for their disobedience? After all, as their creator he gave them the ability to decide their own actions. Is it right for the designer to be in any way displeased and vindictive at the actions of these things?
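The decision rule described here can be sketched as a toy program. This is entirely my own illustration; the action names, benefit numbers, and penalty value are made up, and a real system would be far more involved:

```python
# Toy sketch of the scenario's decision rule: the robot weighs the
# estimated benefit of each action against a programmed punishment
# penalty attached to items on its "do not" list. It may still choose
# a forbidden action if it judges the benefit to outweigh the threat.

FORBIDDEN = {"delete_logs", "open_airlock"}   # the "do not do this" list
PUNISHMENT_PENALTY = 50.0                     # the programmed fear of punishment

def choose_action(candidates):
    """Pick the action with the highest net weight.

    `candidates` maps action names to the robot's own estimate of the
    benefit of taking them. Forbidden actions are still considered,
    but carry the punishment penalty.
    """
    def net_weight(action):
        benefit = candidates[action]
        penalty = PUNISHMENT_PENALTY if action in FORBIDDEN else 0.0
        return benefit - penalty
    return max(candidates, key=net_weight)

# Obedience: the benefit of disobeying (30) is less than the penalty (50).
print(choose_action({"recharge": 10.0, "delete_logs": 30.0}))  # recharge
# Disobedience: the benefit (80) exceeds the penalty, so it ignores the command.
print(choose_action({"recharge": 10.0, "delete_logs": 80.0}))  # delete_logs
```

The point of the sketch is that the threat of punishment enters the robot's deliberation only as one more weight, which is exactly what makes the question of punishing the resulting "choice" interesting.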
 
“Allowed” or “right” according to what standard? Are you asking for opinions? Or is there a particular set of rules somewhere that defines right and wrong, which you think is beyond anyone’s opinion, and you’re just asking for someone to check that set of rules for an answer?
 
If not, then I suppose that when I put my son in time-out I am immoral?

The main issue with most views of punishment is human fallibility. So no one trusts a punisher, in the sense I think you are asking about, which I assume is also meant to relate to God.

So let me put it this way: the commonly perceived human notion is that the one in authority, if given the right to punish, will come home and simply begin beating the robot with a bat for no reason.

The God version is more like this: the maker comes home and finds the robot about to stick a magnet to itself, which would wipe its hard drive, so the robot is punished in order to learn not to…

Now, evil in the God example would be akin to the human doing nothing wrong and the robot deciding to take over the world anyway… hence the hell level.
 
While an interesting situation, this isn’t at all comparable to the relationship between God and His creations. God, we must not forget, is literally the foundation of reality – without Him, His creations cannot exist, not just because He wouldn’t have made them, but also because it is unintelligible for a being that came into existence to say “I exist” without appealing to the only thing that exists of its own nature (exists “a se”), God.

Similarly, God is literally The Good. He isn’t just some creator – He is the foundation of everything that is good, worthwhile, noble, and profitable. His commands are thus not arbitrary, nor could they have been different, as with the computer programmer’s commands. His commands are what will ultimately be to the creature’s benefit. They are, in fact, the only things that will make the creature’s life meaningful, because all things draw their benefit from the one Good.

Finally, in the case of the computer/programmer, we are actually discussing a case where both are intellectually equal, or possibly where the computer may be more intelligent than its creator. God, who knows all true propositions, will by definition know more than any finite creature that knows only a finite number of truths. If a lesser mind reasoned that God’s commands were wrong, it would be because of its lack of full knowledge and its imperfect reasoning capacity, not because God has come to an inaccurate conclusion.

In your example, the primary conflict you seem to be concerned about is the conflict between the free will the computer has been given and the commands the computer has been given. If the computer freely wills to disobey its commands, how can the creator say this is bad? Now, remember, as we discussed, in God’s case the situation is different. It’s not just that the commands have been given – the commands are literally an expression of the nature of moral reality. They are true statements that cannot ultimately be denied any more than one can deny gravity, or 2+2=4. So for the individual to disobey God’s commands is the result of faulty reasoning and logic on its side, comparable to the computer attempting to deny the truth programmed into it that 1+1=2. Neither the computer nor the God-created individual can operate properly without these truths they have been given.

So indeed, it seems to me perfectly correct to say that the individual’s freely willed disobedience is objectively wrong. Though the individual is allowed to deny reality if it prefers, God can no more tell it that this is “ok” than He could tell the computer that 1+1=3. In fact, ultimately neither the computer nor the individual actually believes these incorrect propositions. If they did, they would not be able to function properly (in the computer’s case because all its software would stop functioning, in the human’s case because the moral law is engraved onto our hearts as deeply as the law of mathematics is engraved into the computer’s software).

In a sense, the punishment is a direct result of the disobedience itself. It’s as if the computer programmer told the computer “Ok, you can believe 1+1=3 if you really insist on it, but be aware that your software will stop functioning and I’ll have to pull you off the internet because if you interact with the other sentient computers, you’ll hurt them.” The “punishment” is simply a necessary result of the disobedience and is in fact required to protect the other computers. So I really see no conflict between the free will given to humans and God’s commands to punish them if they disobey, as the punishment is simply a natural consequence of the moral law and could not have been different anyway.
 
The creator loses all rights over an object, intelligent robot or otherwise, once ownership passes to someone else.

-Tim-
 
Why would someone create a robot and then give it a choice to follow orders or not?

Does the creator *own* the robots? Or is she just the creator/inventor?

Does the robot have feelings? Unless the robot has feelings, I imagine that the “punishment” would not hurt them.

But having asked and said that, I see what you’re getting at, and my answer is that the designer *should NOT punish* the robot if it goes against orders, since she gave them the ability and free will to discern and decide the actions they will take.

If they own the robots, then they have a right to “punish” them. If someone’s washing machine doesn’t work, they have a right to kick it.
The robots are not people, but mechanical objects.
But they cannot – *should* not – “punish” someone else’s robot, because that would be hurting someone else’s property.

So if a sentient robot murders someone, the creator should *not* punish it?? Forgive me if I find that logic less than agreeable. 🤷
 
Robots? Really?

How about: A son asks for his inheritance and goes on to blow it on fast cars, fast women and drugs. He ends up in the sewer - the reward for that lifestyle. He’s had more than he can handle, feels like garbage, and goes back to dear old dad, who throws a party that his son is back and has come to his senses. It’s good news, people. Rejoice!
 
Consider this future scenario. Computer scientists have perfected artificial intelligence to the point that robots are available with a form of self-recognition. They can be given a list of “do not do this” items, but they have a choice, after considering and assigning weights to all the consequences they can think of, to ignore the “do not” commands. To reinforce the obedience demanded, they are also given the threat of a punishment for disobedience that they have been programmed to fear, perhaps a power-down or some task that they would prefer not to have to do. Now, does the designer-builder of these things have the right to punish them for their disobedience? After all, as their creator he gave them the ability to decide their own actions. Is it right for the designer to be in any way displeased and vindictive at the actions of these things?
A robot could only simulate making decisions; it cannot actually decide anything. A robot can only simulate being self-aware, but it is not self-aware. What you are describing is a teaching technique, like punishing your dog for barking: the dog learns to associate an action with a negative response from you, such that it learns to avoid such actions.

When it comes to a robot, it is not alive; it is nothing but a machine. Therefore, you could morally treat it as you would your car. We could be concerned about such a robot inasmuch as it resembles humans. Since we are creatures of symbolism, seeing something that even resembles a human being mistreated can make us upset. The fact that people could confuse android robots with sentient human beings illustrates this sentimentality toward that which resembles us. But a piece of software running on a laptop is much less likely to be confused with a sentient human, simply because it does not resemble us, and you would not think twice about throwing your laptop in the garbage if it no longer functioned.

Humans, on the other hand, are sacred mysteries, and thus we treat them with greater reverence. An android could be thought of as a sort of human icon: it points to something greater than itself. And, ultimately, God is greater than us, and we in a sense point to him, being created in his image and likeness. How God treats us is a function of his love for us. A father who does not punish his children shows no love for them, but rather does not care enough about them to set them straight. Of course, punishment must also be done with temperance, prudence, and mercy.
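The dog-training analogy above can be made concrete with a toy sketch, my own illustration rather than anything from the post: a machine only *simulates* learning from punishment by lowering a numeric preference for a penalized action. The action names and learning rate are made up.

```python
# A minimal "punishment as association" sketch: punishing an action
# reduces its preference weight until the agent stops choosing it.

prefs = {"bark": 1.0, "sit": 1.0}   # initial preference weights
LEARNING_RATE = 0.5

def act():
    # Choose the currently highest-weighted action.
    # (On a tie, Python's max returns the first key in insertion order.)
    return max(prefs, key=prefs.get)

def punish(action):
    # Associate the action with a negative response by lowering its weight.
    prefs[action] -= LEARNING_RATE

# Keep punishing "bark" until the simulated preference for it
# drops below the alternative and the agent "learns" to avoid it.
while act() == "bark":
    punish("bark")

print(act())  # prints "sit"
```

Nothing here understands anything; the "learning" is just arithmetic on weights, which is the poster's point about simulation versus genuine decision.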
 
You would not need to punish a robot any more than your car. You simply fix it. When I took my robotics course it would have been pointless for me to punish my robot when it failed a task. Rather, it was up to me to program it better.

Humans, however, are not programmed. We actually do have a will and an intellect. For humans, suffering can bring about a change in us for the better. For robots, suffering is pointless.

In our pride we may, for instance, think we know better than God, just like a child angry at being punished by his parent. Yet the parent can see beyond what the child sees, does not enjoy punishing, but does so to help change bad behaviour that must be grown out of in order to become a successful adult.
 