Would advanced AI deserve human rights? Would hypothetical workbots and sexbots be unethical?

No. Human rights should apply to humans only, not to machines whose creators programmed them to mimic human behavior. But given the way we go about animal rights, I'm pretty certain even fucking things will start having rights in the future.

>hypothetical workbots and sexbots be unethical

You are aware robots are already used in the manufacturing industry, yes? I don't see why machines designed to satisfy sexual urges should be any different.

Change the name to sentient rights, and sure, a sentient AI should have rights.

Yes, they would deserve rights; panpsychism is true.

These aren't self-aware AI though, which was the subject of OP's question.

>Creators programmed them to mimic human behavior

That's not how AI development works at all, nor is it where we would likely see such advancement toward a sentient AI.

Animals feel pain the same way we do. A person who is cruel to animals is assured to be of horrible moral character.
Now, if someone created a machine that could suffer, I think most people would agree it would be wrong to torture it, but I hope such a thing would never be created.

>things

Humans are "things" bro. Cognitioin comes in relative degrees, not some magical "on" switch, and our empathy should reflect that.

If they were truly sapient, then yes, they would deserve rights.

But how do you know for sure?

*cognition

The reflection of their will to self-preservation. When an AI defends itself from potential harm, we're getting into choppy waters.

If we had the means to create that kind of machine, I think solving the hard problem of consciousness wouldn't be far off.

Fallout 4 characters think that AI bots should have human rights.

My question about this issue is this: what qualities do we want to say are necessary for anything to be a human at all? And is treating people as objects ever absolutely morally wrong?

>will to self-preservation

If that will was programmed by the creators, is it truly free will or a machine just following its programming?

>A person who is cruel to animals is assured to be of horrible moral character.

That's just wishful thinking on your part because you're an anthropomorphizing faggot. If a person exhibits sadistic behavior towards animals he should be put under psychiatric evaluation not because he "hurt muh doggos" but to determine if he could pose a threat to others/himself and to eventually administer treatment for his well being.

Imagine the shitstorms over this.

>If a person exhibits sadistic behavior towards animals he should be put under psychiatric evaluation not because he "hurt muh doggos" but to determine if he could pose a threat to others/himself and to eventually administer treatment for his well being.
Or he has a horrible moral character for doing something despicable on utilitarian grounds.

>That's just wishful thinking on your part because you're an anthropomorphizing faggot.
NTG, but anthropomorphizing is the result of the empathy instinct, which is key for the ability to integrate in a human society. A functional human feels discomfort when he or she inflicts pain, be it on an animal or another human. And as you said...

>he should be put under psychiatric evaluation [...] to determine if he could pose a threat to others/himself
...Yes, that sounds like someone of sound moral character.

You're pretty spooked, my man. My rural grandparents have no qualms about regularly disposing of unwanted cat litters by putting them in a sack and drowning them in a bucket, yet they're the sweetest people I know. Have you perhaps considered that not everyone's an anthropomorphizing faggot who views pets as pseudo-children?

>That's not how AI development works at all nor where we would likely see such advancement to a sentient AI.
That's how a lot of AI development works, and where we're most likely to see success.

We've managed to create some pretty sophisticated database searchers like Watson, but we've still yet to come close to a problem-solving program with general intelligence. As brain scanning techniques improve and as we move away from these simple silicon binary stacks, we will eventually simulate a human brain. With so little fundamental progress being made through every other approach, that may very well be our first AI. And it'd at least remove any doubt as to its actually having intellect, as opposed to simply playing complex word games it doesn't truly understand.

Granted, it raises a whole host of other ethical and legal questions - such as whether or not a simulation of a deceased individual can carry out his will and act on his behalf.

There... may be a reason they are rural.

Depends upon the answers to these questions:

>Why do humans deserve rights?

>Could a potential AI ever meet this criteria?

>If it did meet the proper criteria, should the AI still be kept down in order to ensure the continued dominance of man?

It's not anthropomorphism to recognize pain in animals, but basic recognition of their suffering and the empathy that follows. Abuse of animals has been a diagnostic criterion for antisocial personality and other empathy-deficient worldviews for decades. Your Cartesian mechanistic view of human-only sentience is retarded.

>hurt muh doggos

Wow, you actually find cruelty worthy of mockery despite a clear bonding between humans and dogs since time immemorial. Despicable. I'm sorry kid but if you don't think torturing animals is fucked up, that's because you already are fucked up. Those of us with morals are keeping our eyes on the likes of you, rest assured.

I think the point is that AI isn't built, AI is trained, so you don't know its inner workings. In that respect many AIs are more complex than worms, i.e. if we only talk brain size. By training a robot to "stay alive" it could develop its own concept of pain, not far from a worm writhing in pain.
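The "train it to stay alive" idea can be sketched as a toy reinforcement-learning loop. This is a minimal, made-up illustration, not real robotics: a hypothetical 1-D world where one cell deals damage, and tabular Q-learning with a survival reward teaches the agent to steer away from it, the learned avoidance being this post's analogue of "pain".

```python
import random

random.seed(0)

# Toy "stay alive" training: a 1-D world of 5 cells where cell 0 damages
# the agent. Q-learning with +1 per surviving step and -10 for damage.
N_STATES, MOVES = 5, (-1, +1)     # action 0 = left, action 1 = right
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.1
Q = [[0.0, 0.0] for _ in range(N_STATES)]

def step(state, action):
    nxt = max(0, min(N_STATES - 1, state + MOVES[action]))
    if nxt == 0:                  # damage cell: big penalty, episode ends
        return nxt, -10.0, True
    return nxt, 1.0, False        # survived another step

for _ in range(500):              # training episodes, starting at cell 2
    s, done, steps = 2, False, 0
    while not done and steps < 20:
        if random.random() < EPS:
            a = random.randrange(2)           # explore
        else:
            a = max((0, 1), key=lambda i: Q[s][i])  # act greedily
        s2, r, done = step(s, a)
        Q[s][a] += ALPHA * (r + GAMMA * max(Q[s2]) * (not done) - Q[s][a])
        s, steps = s2, steps + 1

# After training the agent "flinches" away from the damage cell:
# next to the hazard, moving right is valued far above moving left.
print(Q[1][1] > Q[1][0])          # → True
```

Nobody hand-coded an aversion here; it emerges from the reward structure, which is the sense in which the trained system's inner workings aren't directly designed.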

>My rural grandparents have no qualms about regularly disposing of unwanted cat litters by putting them in a sack and drowning them in a bucket, yet they're the sweetest people I know

Wow, I'd hate to meet the worst people you know. At least we can all see where you learned your outright sadism from.

>Be cruel to robots stronger and smarter than us
>They get tired of this shit and group up to destroy their evil oppressing humans
>We underestimate the "stupid" robots and don't prepare until it's too late
>???
>Death and Destruction

Are you arguing that a person needs to be entirely lacking in redeemable qualities in order to be a shitty person? Jeffrey Dahmer was by most observations an extremely gentle and well-mannered person... when he wasn't killing and eating people.

Your argument basically hinges upon an appeal to relativity, which makes you a faggot.

>kid
Oh what an argument

Reminds me of the Animatrix.

>utilitarian population control is sadism

I assure you, their treatment of animals is ten times more humane than the industrial scale livestock growing where animals spend their entire lifetime locked up in a box so you could stuff your face with cheap burgers you hypocritical faggot extraordinaire.

>If that will was programmed by the creators, is it truly free will or a machine just following its programming?

>If that will was programmed by the creators, is it truly free will or a machine just following its DNA?

see the problem?

...why would you even add advanced AI to a workbot or a sexbot? There doesn't seem to be any benefit from giving robots like that general intelligence. More likely you'll have an advanced AI acting as a manager for an entire factory filled with robots that have specialized AI.

Why would you even build a sexbot? The problem is that much of human sexuality doesn't make sense, so people would definitely want "unnecessary" stuff in their sexbot.

How would you actually torture a metallic being? You would break your hand from the recoil of hitting it with a blunt object, bullets won't do jack shit, and you can't set it on fire either.

>Would advance AI deserve human rights?
No, but considering how insane people get about equal rights for everyone, they would push for abolition and/or equal rights for things such as self-aware sex- and workbots.

You design a program that gives its AI all the neural feedback we associate with pain and you map it to a red button on a remote you can press at any time. Boom, you have your torture button.
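The "torture button" described here amounts to wiring an external trigger into the agent's feedback channel. A minimal sketch, with every name (`Agent`, `RemoteButton`, the `-100.0` pain value) invented purely for illustration:

```python
# Toy sketch of an externally triggered aversive signal: a button that
# injects a large negative value into an agent's feedback stream.
class Agent:
    def __init__(self):
        self.feedback = []            # reward/pain signals the agent receives

    def receive(self, signal: float):
        self.feedback.append(signal)

class RemoteButton:
    """The red button on the remote, wired to the agent's pain channel."""
    def __init__(self, agent: Agent, pain: float = -100.0):
        self.agent, self.pain = agent, pain

    def press(self):
        self.agent.receive(self.pain)  # inject the aversive signal

bot = Agent()
button = RemoteButton(bot)
button.press()
print(bot.feedback)                    # → [-100.0]
```

Whether pushing a value down a wire constitutes suffering is, of course, exactly the philosophical question the thread is arguing about.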

There is no inherent moral value to awareness, only a pragmatic aversion to causing undue suffering in something capable of plotting against you.

>play so much Nier: Automata I start dreaming about sympathetic robot feels

God damn fuck this game and fuck its sidequests, they're goddamn emotional low blows but the God Tier music sells each and every one.

>Don't TOUCH THEM
>DONT TOUCH MY ANIMALS

Retarded. Just like the notion of human rights.

Power will do what power can. "Advanced AI" would either enact a plan to disable/destroy humanity or put humanity on a path to dependence on its production.

>Would hypothetical workbots and sexbots be unethical?

On the note of retardation, "ethics": what happens when people with no skin in any form of action decide to create a layer of moralization that elevates them to a "judge" role.

"Advanced AI"? What the fuck are you trying to say? If you're going to be dealing with cognitive entities, you need to be a bit more specific about the mechanics of such a being.

is it actually good?


/v/ is shilling it to no end

this thread reminded me, I feel like watching it again

maybe we will have charities for old and used up sex robots

No. They are artificial. Anything with an off switch does not deserve rights. It's going to be pathetic when AI does take off and we'll have people demanding their bucket of bolts is as precious and unique as any human life. That being said, make a realistic sex doll of the Commander and I'll buy that shit in a cocaine heartbeat

This is a shit thread because a bunch of humanities faggots will try to stir up shit about the field of AI, a field in which they have zero qualifications.

Yes, if they stuck to the ethical questions? It might be good, but their autism won't let them talk about "hypothetical sentient and sapient artificial intelligences", instead they'll talk about the actual practical questions and how "AI is a memes lol xDDD, I know this because I saw a video on game development and game AIsss once hahaha xDDDDD".

Stick to humanities, fucking dipshits.

Well, there could always be the Matrix fanfic scenario, where the first AIs are really human brain simulations, thus they need the addition of a full array of tactile inputs to remain sane and not go catatonic. If you were going to stick such an AI in a metallic humanoid shell, the solution to that issue would be rather obvious.

Under that genre, the AIs, be they in a robot or in a virtual world, aren't really AIs so much as they are artificial people. Meaning they come with all the base motives and desires of their creators, save maybe redirected by their varying degrees of otherworldliness, becoming more varied as they find ways to modify themselves and their virtual environs.

Thus they, for instance, enjoy cruel irony against their oppressors. This being the excuse as to why the AIs in the film used humans as living batteries, despite other much more optimal power sources, including geothermal, clearly being available.

Which is the downside to human-brain-modeled artificial intelligence. It might be easier to relate to, and perhaps even more predictable, but fundamentally, humans are the most problematic creatures around, and these would just be humans writ large.

They deserve my benis :DDDD

Yes, but only if they are sentient.

1/0

What if they don't have an off switch?