Agnostic.com

Robot Rights

If the human mind were replicated in computer form, with all of the human properties that one would expect (emotions, desires, curiosity, etc.), would that mind, by extension, also have human rights?

MShek 3 Aug 30

22 comments

I wanted to welcome you here and enjoyed the sweet pic of your child! About your robot, I do not see the day when emotions can be programmed! Human emotions arise from each individual’s exposure to stimuli and are stored in their nervous system... which happens according to each person’s particular nervous system characteristics (no two people are alike)!

So if you program a computer housed in the body of a robot to do exactly that, to learn, does it become on par with a human? Does it not, as a function of its unique experiences after its creation, become the same as us: a product of its environment and experiences? What happens if its computer "brain" contains greater than human potential? I am curious, because the history of technology is one of exponential growth, and quantum computing and the advance of AI will very soon make questions like this vital to our survival as a species.

@Quarm Well, for the future survival of humans, I sure hope not! Just look at the ‘dark’ side of technology today!

@Freedompath There is a Chinese curse: "May you live in interesting times." I think we are all in for very interesting times indeed.

@Freedompath And I do agree with you; just read about the development of AI. The most chilling line in a documentary I watched was: "The people working on AI do not even really understand what they are doing."

Why don't we worry about human and animal rights first?

As long as humans eat animals, they will not have rights. I have no plans to stop eating them.

@dahermit That has absolutely nothing to do with the humane treatment and killing of the animals you do eat.

@Sticks48 Then why did you mention "animal rights"? You brought it up.

@dahermit Eat a dick! That is made of meat. You know what I am talking about. Troll somebody else, you ignorant fuck. ☺

@Sticks48 Dunning-Kruger Effect.

Because we have done such a fabulous job with rights for women, gays, minorities, and the elderly.

I like Asimov's view on it in The Bicentennial Man.
Any intelligence that is evolved enough to understand the concepts and ask for freedom and rights deserves them and can't be denied.

Not human rights, maybe cyborg rights??

How would it ever be proven that robots have all the human properties that one would expect? They may pass the Turing test and appear to have such properties, but you can't put emotions in a test tube. I think humans will deny that robots are truly "human" long after they appear to have attained human equivalency, and so will deny them equal rights until forced to do so by the robots themselves.

If a real human consciousness were somehow preserved (not just duplicated) and constituted an actual person, we would then be talking "sentient being" or, essentially, intelligent life. In that scenario, rights could apply.

There was a great episode of the series Black Mirror which touched on a similar theme. I believe the episode was called “White Christmas”. Worth checking out if you get the chance.

But to address the topic, I guess the foundational question is on sentience. Once that has been achieved, whether organic or artificial in nature, there must at least be an acknowledgement of an evolving intelligence.

Where the interactions go from there would be up to the application and versatility of the technology. Though I hesitate to say any intelligence realised, even a copy of one, would accept slavery or neglect or threats. If it valued its own survival and could learn through experience, I find it hard to imagine any way it could be denied a claim to rights (publicly, anyway).

Would powering it down kill it? Can it use past experiences to improve? Can it replicate? Can it make an emotional decision? Does it regard its own life?

I don’t think it’s possible for consciousness to be created. No one truly understands what consciousness is. How could you write a program and create something when you don’t know what it is you are to create?

There is a slightly plausible other way, which I present in my book, “The Staggering Implications of the Mystery of Existence”, available in the Kindle store; however, the book contains woo and is unfit reading for proper atheists. 😟

Much better to have something in writing as a baseline... based on our Bill of Rights, written by Founding Fathers who understood first-hand what living without such rights was all about! Do you think it would be good to wrangle endlessly each time something came up, with no baseline to even start from?

To me, it depends on a few factors. What is a "right"? Does the created mind have the current or future capability to support itself by its own actions? Does the created mind have the capacity to reason, or is it more like an animal, which may have emotions, desires, and curiosity?

A "right", as originally used in the founding documents of the US, is a moral principle defining what actions people, by their nature, can and need to take in society without the forced interference from anyone, including the government. Something cannot be a "right" if it violates the "rights" of others. For a created mind to have "rights", those "rights" cannot come at the expense of others.

But if the created mind is rational and can take actions to support itself, I think it should have rights.

Why would you make a computer that could feel pain?

Consider that if true A.I. were to be achieved, it would think faster and be smarter than the average human, and therefore superior. Also consider that human emotions could not be allowed into A.I., inasmuch as emotions are counter to logical thought. Also, there would have to be an element of self-preservation programmed in. However, if humans developed A.I. sans emotion, the A.I.'s logical "mind" would likely conclude that humans were an ultimate danger to the A.I. life form(s) (destroying the planet, able to shut them down at any time). Would not a population of totally logical A.I.s then seek to destroy humans in self-defense?

Yup. And we would never see it coming either.

@Punkrockgirl77 Please explain beyond just a bumper-sticker, throwaway comment. Please make your case (explain your position).

@Punkrockgirl77 So, you define "superior" as "morally" superior then?

@Punkrockgirl77 "Superior" in my value system is higher intelligence. For instance Einstein, Hawking, et. al., were, in my view, "superior" to some knuckle dragger sitting in a bar covered in tattoos.

@Punkrockgirl77 If "superior" is measured by I.Q. alone, then yes, he was. His I.Q. allowed him to feed himself, whereas a severely mentally disabled person would likely starve to death if not cared for. The difference between the "superiority" of I.Q. and "moral" superiority is that I.Q. can be measured, whereas moral superiority is purely subjective. Consider that the conversation is about A.I.: how would something as subjective as morality be installed in A.I.?

@Punkrockgirl77 For want of a better definition, I.Q. is the de facto default for human intelligence in our society.

I think most of our disagreement centers on what we consider to be meant by "superior". A human is limited by his/her experiences, whereas A.I. (in theory) could draw from any number of databases of information. Therefore, an entity (A.I.) that could be an expert on law, physics, languages, medicine, biology, mathematics, etc., would be "superior" (in my view at least) to a mere human who was limited to only those things he/she had time to study in their own human lifetime.

If animals can, then I don't see why not.

Maybe in the next hundred years there will be cases about it running in all the supreme courts and at the UN.

We fear robots rising up against us because we treat everyone like shit, not just robots. With a cold objective look at our goal of equal rights versus our track record, they would quickly surmise that we can't accomplish shit and take matters into their own hands. The AI uprising won't be their fault, it'll be ours.

To answer your question, every living thing should have equal rights. As soon as robots can tell us that they're alive, they should too. Will we do it? Fuck no.

A more interesting question is how one would go about testing a robot for those qualities (emotions, desires, curiosity, etc.).

These are qualities that are ultimately known first-hand. Most of us assume that other humans possess them as well, but I don’t think we can extend the same courtesy to machines of our own making.

But to answer your question, of course they should have human rights.

No, they too will be exploited unless humanity gets its collective act together.

I've been reading/watching a ton of Philip K. Dick lately... this is a common theme he tackled. So... should? Probably. Would? Who knows?

I doubt it. I imagine that we would attempt to use it for some kind of weapon or to make slaves, and then say they have no rights because they don’t have a body or ‘soul’.

And then, as you can imagine, it would lead to an apocalypse for humans. There would have to be heavily restricted locks on its software in order to keep it from running wild.

Sounds far-fetched, but I genuinely think that if we dabble too far into AI or give a disembodied human brain the abilities of a computer, this will be the result.

I think we'd have to give rights to animals, especially mammals, first as they share most, if not all, of those qualities.
