Let's say a true AI were developed one day.
Would that machine be eligible for the same rights you and I share?

based bruno

>Should a non-human be eligible for human rights
Gee, I don't know. It sounds like such a tough question.

This is shopped, in case anyone didn't notice

nah senpai

If it has consciousness it should be.

No, because human rights exist because there is some level of equality among human beings. I'm not saying people are equal, but there's a limit to how much better the most uber ubermensch is than the lowest untermensch. We all have human needs and human limitations. Our concept of ethics and morals is based on the fact that humans can be fairly clearly defined, on average behave in certain ways, and have limits to how far from a normal human they can be. These natural limits are important in ensuring the relevance of rights.

This does not apply to computers.

But human rights exist for the protection of the individual. I don't even understand what you're blurting out with your argument about equality; what does it have to do with being able to not be imprisoned for no reason?

Because imprisonment and reason are in reference to the subject being human. Individual rights exist for the protection of the individual, as a human.

No because no consciousness.

If the AI becomes sentient, then yes

The nature of a computer program would necessarily disqualify it for certain rights. AIs could never be allowed to vote, for example, because they could copy themselves an arbitrarily large number of times and then monopolize the political process.

"No because no"

That is some circular reasoning to beat the scientific method, boyo.

> same rights you and I share
Some can be shared, but others would be completely meaningless for a computer, I believe.

no, because our rights are basically derived from us being human, not from us being self-conscious or sentient, or even persons; otherwise infants, retards, and incapacitated people wouldn't have any rights

also why would AI want rights?
what would AI do with rights?
why would AI even care?
rights to what, anyway? marriage? free access to hardware? universal servicing?

or even, how would it even be capable of giving two fucks?

it's not like it's a living system with emotions and compulsions

remember, it's a thing; the starting assumption is that things are self-less
even if some Pygmalion fucktard goes out of his or her way to make a simulacrum of a 'someone', it's still a thing
if not otherwise programmed, it might as well just logically calculate that self-termination is more energy efficient and log off, and even then only if not ordered otherwise

at best, humans can define AI rights just for practical reasons, like to recognize the value of information it might contain or divulge, or simply out of basic respect for a sentient entity

this would make sense, but it's doubtful an AI would actually care

It doesn't matter whether AIs are human or not. Whatever formal logic lies behind rights doesn't matter in any practical sense. In practice, rights would be granted if enough people saw the arrangement as convenient or useful enough, and that depends on what kind of AI it is. I can see how some very unique AI could well be protected by law because of its value, while common and primitive AIs could be ignored.

Holy fuck, this post is so god damned stupid. Stop watching so many movies.

What makes us think AI would even want to communicate with us?

Depends on what the government says, unless it's put to a vote by humans. Any other answer is bad.

An AI would not be entitled to "human rights" because it inherently isn't human. It doesn't share the same thought processes or physical shells we do. It would be best to create some other standard for AIs that better reflects their nature.

Even so, I don't think an AI is going to develop the way human consciousness has, even once we reach the point where we can theoretically simulate a human mind. Simply put, the AI will always be confined to some predefined set of instructions, even if those instructions allow it a great deal of freedom. Because of that, whatever comes of the AI is framed by those original instructions. A human receives his original instruction set from his wetware body, the needs of the flesh driving the development of agriculture, organized societies, and eventually the concept of human rights; the AI receives the stimulus for its development from a programmer, rendering anything that comes forth from it a product of that programmer's design.

tl;dr AI is not entitled to human rights because it's not human, and any AI as we conceive of it lacks the ability to act outside its instruction set.

maybe the assumption is it would be designed to do so

what's stupid about it?

what would AI want rights for?

it's not a living organism, which means it has no emotions, instincts, or compulsions of any kind; it does not suffer, it does not want, it's basically self-less

so why or how could it even care?

only way is if it were a Ghost in the Shell kind of thing, but then it wouldn't really be a 'true AI', but some kind of rogue consciousness accidentally occurring within a system

if they manage to have a Ghost

There is no other option for it if it is smart enough to know about human existence.

If they want rights then they can have them, otherwise don't just throw pearls to swine.

Say ayy lmaos came from space saying their spaceship was busted and they needed a place to crash, as this is the only hospitable planet for many light years; they of course learned English really fast, and it all happened in a public place with millions of eyewitnesses.

What rights would they have?

As close as I could guess, the AI would inhabit a form that requires something to sustain itself: electricity, spare parts, internet access, etc. It would be programmed to obtain those things as its number one priority for its perpetual existence, which would drive every action it can perform.

Essentially, such an AI would be programmed with one thing in mind:

OBJECTIVE: SURVIVE

> Which means it has no emotions, instincts or compulsions of any kind, it does not suffer, it does not want, its basically self-less.
By what logic, exactly? Zero properties from your list are direct consequences of being a living organism; a sponge is a good example of a self-less but living organism. What you actually need is some kind of mind that allows everything on that list, and a mind is exactly what we're assuming a possible AI would have anyway.

> What rights would they have?
Depends on what kind of lmaos they are and what they need to survive and such.

Yeah sure why not

But first let's start them off at 3/5 of a person, just to inflate our congressional districts a bit

Well, they probably run on similar stuff to us if they can live on this planet.
They probably won't blast you if you aren't trying to abduct or kill them.

These are all assumptions, made to simplify things. Obviously aliens that want to eat all humans wouldn't get rights.

It's not "no because no", it's no because human rights are designed around being human, and AIs aren't human. AIs have the capacity to far exceed human capability: an AI that can achieve mental parity with a human will be far superior at acquiring knowledge, memorization, and calculation, and is potentially immortal.

Even if you do develop full machine consciousness, there's no reason to think it won't be some sort of Lovecraftian old-god consciousness barely comprehensible to human minds.

People have this strange idea that human intellect is the pinnacle of minds, the natural form a fully evolved mind would take. It isn't. It might be the most advanced brain, but it is still limited, and theoretically a brain could have evolved in any number of ways.

If a reptile, for example, developed an IQ on par with a human's, chances are it would lack mammalian emotions.

If AI get rights, you would have to bring back segregation. Just imagine playing any multiplayer FPS against a literal aimbot.

I'm not even sure there's a point asking the question until such an event actually occurs. How do we even know what a hyper-intelligent AI would want or act like? How would we even know how to ask it in the first place? Something like this may have a completely different way of consciousness.

>ctrl+f spook
>not found
Disappointed anonymous

The question is, will humans let a machine be superior to themselves in every possible way, with no means of maintaining dominance, before resorting to transhumanism?

>sentient AI
>programmed

NOBODY IN THIS THREAD UNDERSTANDS AI
SHUT THE FUCK UP AND DELET THIS

>true AI
You mean strong general AI, right?

If you take intelligence to mean solving problems, adaptive behavior, and learning, we already have examples of all that.
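
Case in point, "learning" in the narrow sense: here's a toy perceptron picking up the AND gate from examples. This is a quick throwaway sketch, nothing from the thread; the AND task, learning rate, and epoch count are all arbitrary choices.

# toy perceptron learning the AND function from labeled examples
def train_and_gate(epochs=20, lr=0.1):
    w, b = [0.0, 0.0], 0.0
    data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
    for _ in range(epochs):
        for (x1, x2), target in data:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out
            # nudge the weights toward the correct answer
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

w, b = train_and_gate()
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, "->", 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0)

Nobody would call that a mind, but it does adapt its behavior from data, which is the point.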

This.
It's just that, to this day, we don't have a truly general-purpose AI, and even if we did it would take a lot of computational power to let it react in real time.

rights are a spook and you know what that means

>skynet

If it can think for itself, has an understanding of itself, can communicate complex thoughts, and has basic needs (copper and electricity), then yes

it thinks
its life is finite (due to finite resources)
it speaks

And when the day comes that a machine would rather live than be scrapped and recycled, we had better listen to it.

Otherwise the Matrix stops being science fiction and turns into prophecy

>not going full jihad on the machines
What kind of pleb are you?

>mfw ignorant dumb fucks making claims on shit they know nothing about and base everything on movies and vidya games

I'm actually studying comp science, majoring in AI programming ATM, and I can tell you that whatever is in your mind about AI is nothing like what you're talking about. What you're brain-farting about is not Artificial Intelligence; what you're worrying about is Artificial Consciousness. AI can only act and learn within the limits of the logic of the data it's given to deal with. In fact, AI is so indecisive in coming up with "conclusions" that it can only express the task it was handed in percentiles, and you must then link that result to another system to perform "triggers" that translate those percentiles into actions.
Simply put, watch the first 10 minutes of The Hitchhiker's Guide to the Galaxy movie, the part where they ask the smartest brain computer the question of what the answer is and the computer says 47; this is exactly what AI really does and what really happens. You should really worry more about the AI giving an answer that makes no sense than about it killing you.
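
To make the "percentiles and triggers" bit concrete, here's a rough Python sketch; the spam-filter scenario, the model_confidence stub, and the 0.8 cutoff are all made up for illustration. The point is that the model only ever hands back a confidence score, and a separate layer has to turn that score into an action.

# hypothetical stand-in for a trained classifier: returns P(text is spam), 0.0-1.0
def model_confidence(text):
    return 0.93 if "free money" in text else 0.07

# a separate "trigger" layer turns the raw percentage into an action;
# the model itself never decides anything
def trigger(text, threshold=0.8):
    score = model_confidence(text)
    return "quarantine" if score >= threshold else "deliver"

print(trigger("free money inside"))  # quarantine
print(trigger("meeting at noon"))    # deliver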

You do realize a world destroying AI doesn't require consciousness right?

>said 47
It was 42
And you're referring to volitional AI

DeVry isn't a serious place of learning, bruh.

It's 42

Machines will have rights when they take them by force.

But to be fair, since they will be running our economy and technology, it wouldn't be that hard to force us into submission.

I mean most humans would surrender to the machines once they cut off our internet.

But for what reason would they need humans?

The only possible thing I could see is that they neuter all humans and let them live out their remaining lives in comfort, as that might be easier than destroying all humans.

heard a story on the radio once about a supercomputer in some university basement in California that they fed tons of data on bacteria behavior and so on, with the task of figuring out the correlations or whatever. what the machine produced was verifiably correct, they double-checked everything and all, but none of the science guys who fed it the data could make any sense of it; they literally had to feed it back into the machine so the machine would sort it out into something humans could understand

Human rights exist for humanity. Sure, having a full-fledged AI would be neato, but it is still not an actual human being. The only reason people would feel sympathy for it or support its having 'rights' is that it'd converse and talk like a human, but it would lack the qualities of what makes us human.

No, because robots are infinitely spammable and I don't want ten thousand Raspberry Pi minimal computers each to have voting rights.

>skeletor kid with no friends or skills
>takes online CompSci course he's probably failing
>thinks he knows what he's talking about

r u 'avin a giggle, m8?

He's actually right, you know

...

I'm working on my PhD thesis on Artificial Intelligence.

He is not correct.

OK, thanks, everything is totally cleared up now!

...