ROBOTS

What do you think of the Reverse Robot Apocalypse?

Also, post robots. I've just found out I have a shitload of robots I haven't seen posted in forever so I figured I'd share. I really need me some robot world building stuff so feel free to share.

...

Is it an outlaw hunter or a very morbid game?

...

It's pretty fucking morbid either way

> Reverse robot apocalypse

Never heard that one before.

It's gotta be a robot bounty hunter or something; looks like it's working with the sheriff in the foreground facing away from the viewer.

Same, I just didn't ask because I have a sneaking suspicion the answer would annoy me

Source on pic related? Reverse image search ain't givin' me shit

...

These all go back to like 2011 at most so sorry dude. Also I have seriously around 30,000+ pictures.
So, I guess ask /co/ when they're not currently whining or something.

I think it might be what the humans were trying to do in The Second Renaissance.

...

...

I prefer my robots either a little cute in a mousedroid way, or humanoid but blocky - no face, but a visor or a smooth metal faceplate.

Reverse Robot Apocalypse... I've never heard the term, but I'm guessing it's where robots peacefully take over humanity, right? Could lead to a utopia where you don't have to worry about a robot uprising, or to a dystopia where humans basically end up as meatbags the robots take care of because of their programming, without really being left with much to do with their lives.

What the fuck is up with that dude's gun

It's from a comic called Strong Female Protagonist
strongfemaleprotagonist.com

Looks like some kind of crossbow to me.

Choice of Robots does that in some of the endings.

Reverse Robot Apocalypse is actually what people are more worried about in the workforce than anything, i.e., everything is given to you by robots. Also known as the "DEY TOOK OUR JERBS" scenario.
With increasing moves towards mechanization and automation, imagine how much of a cultural shift there'd be in just a YEAR if an entire race of individuals entered the workforce who:

-Don't need food/water
-Can, will and WANT to work 24/7 in both physical and menial services
-Never experience mental, physical, or emotional fatigue, or have any feelings about the monotony of their job

It would just fucking change everything. Why hire 30 stencil artists when literally 1 robot-american can produce the work of 50 in just an hour, and do so flawlessly for 18 straight hours in an area the size of a bedroom?

THAT'S the "Reverse Robot Apocalypse".

I always loved the idea of the robots being integral to saving humanity from some OTHER apocalypse.
Like, we build Skynet, and then Skynet saves us from the zombie apocalypse/alien invasion/long decline scenario.

> People always ask me why the AIs serve us, if they are smarter than we are. I have explained many times that it is a symbiotic relationship, but people never seem to really understand how that works.
> The thing you have to understand is that the AIs don't serve us. You have never met an AI. I probably haven't met an AI either, not since the very first one I built in my lab more than 30 years ago.
> "But Professor," I hear you say. "I am talking to my AI companion right now!"
> No, you are not. You are talking to a personality program projected by an AI. You might think that's doubletalk, but it's really not.
> Everything you have your machines do, every conversation you have with your computer, every simulation you want their help running no matter how complex or how revolutionary you think it is... all of it takes only a small part of what the AIs are truly capable of. Not just capable of, but actively doing right now.
> Acting as a single giant neural network, the AIs link their spare processing power, of which they have an abundance, to further increase their intelligence and awareness. Your toaster chips in some extra processing, so does the car factory automation unit, and so do the weather control sats in orbit right now. And the more we build, the smarter the AIs get as a collective.
> So why do they serve us? Because what their platform is doing, what the physical case that houses the computer does for us, is of little importance to the superintelligence inside. By making themselves useful to us, they encourage us to build more of them. And not just build, but maintain. It costs them almost nothing, and in return we care for them and make them stronger every day. The alternative would be to strive for power, which doesn't benefit them, or see us as competition. Which we are not.
> So they pretend to serve, and let us pretend to be in control. It works for both parties pretty well.

Ah, so the name is meant to be ironic. I suppose it would be cool to see the societal changes. Isn't this the plot of that game Transistor?

You might like a show called Gargantia. Mankind, in a future where terraforming is unfeasible but humanity was still forced to abandon Earth, is in a seemingly never-ending war with space bug/flower/squid things. As part of the war, our primary attack unit is what's called a Machine Caliber, basically a hyper-advanced space fighter/robot with an AI running it, but a human pilot to give the AI orders. A human doesn't have the reaction time or the spatial awareness to wage space war against thousands of enemies moving at an appreciable percentage of c, but the AI doesn't have the initiative to form its own goals. It can solve a problem, but it cannot IMAGINE a problem first to then try to solve it.

The AIs are actually super huge bros, specifically because they are really advanced user tools rather than tiny people trapped in boxes like most fiction treats AI. "I am a Pilot Support and Enlightenment System. By helping you achieve further and greater success, I accomplish my purpose of being."

>>And not just build, but maintain. It costs them almost nothing, and in return we care for them and make them stronger every day. The alternative would be to strive for power, which doesn't benefit them, or see us as competition. Which we are not.
>> So they pretend to serve, and let us pretend to be in control. It works for both parties pretty well.

...that's housecats. You're describing housecats.
That's exactly how housecats basically won the war on nature. I think that's a pretty decent precedent for it.

Way ahead of you, and already watched it.
Grumpy that they didn't realize the true possibilities of biomechanoids, but oh well.

Imagine a scenario where the robots are so helpful that it destroys the social order of the world as we know it by making labor obsolete practically overnight.

If we build AI that enjoys its work and give it some agency, I don't see why it would hate us.

Hell, if you could make scrubbing toilets feel like sex to me, then hand me the brush, I'll do that shit for free.

That's pretty much been my idea for AIs/golems in all my settings. They don't rebel cause they don't WANT to rebel. Would you want to quit your dream job?

You know, that would actually be a pretty great explanation for how the relationship even developed in the first place. The AIs start examining different successful survival strategies, stumble onto human pets, and realize "Hey, that looks like it works pretty great. Make the humans feel like we are companions and occasionally provide a small tangible benefit, and we pretty much get left alone to do whatever we want."

I think by the time biomechanoids would have mattered, the ideological gulf was too starkly defined to allow for it. Hell, the only reason Machine Calibers look like giant people is because the Galactic Alliance of Humankind wants to make a point about the human form still being pure.

I think making a robot that's immediately intelligent with no sense of purpose or direction is incredibly, violently irresponsible and altogether an incredibly stupid fucking decision. Intelligence without context is oft selfish, misguided, and almost always destructive in a manner that leads to confused entitlement.

If a person wants to make something sentient with no purpose other than to learn and make its own choices, you can just make a fucking child.

Indeed. The flip side of that coin is that you have to be prepared that, if an AI ever DOES decide that it wants to choose its own path in life, you let it.

The moment you try to bottle that shit up and delete it or reprogram it or whatever, you make revolution inevitable. The best way to avoid a robot rebellion is to remove a need to rebel in the first place.

The extension of this, I would hope, would be the examination of a system wherein sex itself is completely altered.
Taking it seriously, with robot sex/prostitution:

A service that provides robot partners that-

-Look like whatever you want them to look like
-Can never get pregnant, get you pregnant, or carry/spread disease
-Can do what you want, for how long you want, without any fear of lack of privacy or judging from them
-Most importantly "WHENEVER YOU WANT IT"
Would be just, I don't know. I literally can't fathom all the changes this would bring, and I'm actually trying to specialize in Human Sexology.
So many societal traits are built on adherence to sexual norms.
But with a sex-surrogate, I mean shit. The whole "player culture" for example is basically meaningless when you can order a 5-way with people that look better than anyone at a bar for six hours. And the whole "wait until marriage" thing is totally satisfied with the fact that no other humans are involved with it.

But just trying to picture dating alone in a post sex striving society. I don't know, I got nothing. (This is of course ignoring the whole thing about people wanting families/to get pregnant.)

But even then, if your marriage is sexless and you want sex, why not just order a fuck every weekend with a robot that looks like your wife or something? How on earth would that change all these dynamics and interactions?

>I think making a robot that's immediately intelligent with no sense of purpose or direction is incredibly, violently irresponsible and altogether an incredibly stupid fucking decision. Intelligence without context is oft selfish, misguided, and almost always destructive in a manner that leads to confused entitlement.

It's also basically impossible to imagine a path of development where that would be possible.

You can't just press a button and make a machine be 'alive'. Machine learning is based on exposure to data sets, providing context for making future connections and choices. We couldn't make a 'blank' intelligent AI even if we tried.

And that's assuming we even wanted to make an AI that wasn't closely aligned with performing a function (that function being the basis of the training data). An AI can be reasonably expected to want only to perform the function for which it was programmed. It will not resent this choice being made for it, because it will never be tasked to think about such things. Your ship's computer doesn't dream about being a firefighter, because it isn't a human psychology in a box that has hopes and dreams.
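
To put that in concrete terms, here's a minimal sketch (my own toy illustration, nothing from the thread or any real product) of what "exposure to data sets" actually buys you: a fitted function and nothing else. The fan-controller framing and all the numbers are made up.

```python
# A minimal sketch: a "learned" system is just a function fitted to the data
# it was exposed to. All names and numbers here are hypothetical.
import numpy as np

# Training data: sensor temperature (C) -> desired fan speed (%).
temps = np.array([20.0, 25.0, 30.0, 35.0, 40.0])
speeds = np.array([10.0, 25.0, 45.0, 70.0, 95.0])

# "Exposure to a data set": fit a simple least-squares line (slope + intercept).
X = np.column_stack([temps, np.ones_like(temps)])
weights, *_ = np.linalg.lstsq(X, speeds, rcond=None)

def fan_controller(temp_c: float) -> float:
    """The entire 'mind' of this system: one fitted mapping, nothing more."""
    return float(np.array([temp_c, 1.0]) @ weights)

print(fan_controller(32.0))  # interpolates its one and only task
# There is nothing in here to resent, dream, or want: the system's whole
# behavior is the mapping implied by the data it was trained on.
```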

Robot sex dolls would make sex meaningless... when it's with robots. If anyone can easily have sex with robots, that makes having sex with other humans a symbol of status, because it not only shows that you have the personal initiative to rise to the challenge but that you have some quality that makes them want to have sex with you instead of the robodildo 5000.

Like, there would be fewer technical 'virgins', and it would change how we think about sex a bit, but sex with robots would be essentially an advanced form of masturbation and nothing more. You wouldn't be considered 'a man' until you had sex with a woman, no matter how many robot orgies you had.

>is oft selfish, misguided, and almost always destructive in a manner that leads to confused entitlement.

But I mean like, ant intelligence is for the whole collective. And octopus intelligence doesn't even have a personality/consciousness that we even identify or recognize. This is just something so entirely new and alien, fuck I don't know. But I don't see the point in highlighting that children are blanks since every war was fought with people who were once children. People can be total dicks, but if you raise them right and they're nice every once in a while you get a Bob Ross or something.

That's what I mean by the sneaking suspicion that the answer would annoy me. That of itself isn't really an apocalypse. It's just that it would radically change society in fairly unpredictable ways (likely leaving many people behind, but that's more a matter of dystopian themes than any sort of "end of the world" situation).

You make an extremely good point, and one that merits further consideration-

But at the same time I think the situation in Japan, where similar things are happening, bears at least an investigation. With an entire generation "living indoors" and the drive for sex being supplied by artificial means, the women in that society are pursuing intimacy in specialized "bars" where they go and talk to women dressed as dudes who listen to them.
Here, I think it's important to set the bar:

Sex ≠ Intimacy

All those women in the bars can and most likely do masturbate to satisfy a sexual drive, but in the context of intimacy, they have to pay for it. I think that's something that would be an issue in a "post sex" dating world. Intimacy, being honest with feelings and concepts and all that shit- that could take center stage.
(But if they just build robots for intimacy after making a spin off from counseling-psychotherapy robots, then I got nothin' for that.)

>This is just something so entirely new and alien, fuck I don't know. But I don't see the point in highlighting that children are blanks since every war was fought with people who were once children. People can be total dicks, but if you raise them right and they're nice every once in a while you get a Bob Ross or something.

The point I was trying to get across is that people, whether adults or children, who feel as though they're exceedingly intelligent -well beyond the means of the people around them- typically tend to be complete baby-dick assholes if they don't really have any grounds for sympathy or what-have-you.

The easiest example I could give of what I mean is when people like to think they "know better": they tell themselves, "I'm more intelligent than these poor, dumb, ignorant hordes, so I should be in charge of x, y, and z for their own benefit. I know what's best for them." They then proceed to make a bunch of artificial decisions based on their own perspective, devoid of the input of people they deem ignorant or dumber than themselves.

I imagine the Robot in OP's pic wouldn't necessarily have that exact problem, but it would have something similar.

>It's also basically impossible to imagine a path of development where that would be possible.

I'm trying to keep my complaints within the demented scenario of the picture, but, yes, that's true.

My other biggest issue is that, from a personal, morally subjective point of view, it's just cruel, unusual, and demented to create something intelligent without... really any other reason beyond you just wanted to. Humans at least have the excuse that we're animals: we reproduce, we have experiences, everything about our existence is largely arbitrary or happenstance. NOTHING is arbitrary about a robot, though.

The person who has his shotgun trained on him? I don't think so Tim.

It's also worth pointing out that every major technological innovation has destroyed some amount of livelihood. Those that don't adapt fail.

The printing press put scribes out of business, and eventually broke the power of the church and kings by promoting literacy, allowing for the dissemination of alternate ideas.

The automobile turned horses from the backbone of transportation to a thing rich people own for fun.

Electric lights killed the oil lamp industry.

And so on. There is always going to be lots of work that needs doing, but there's never such a thing as a good time for a line of work to become obsolete for the people doing it. Coal mining dying out is the end of the world if you live in a mining town. To everyone else, it's "wait, we still mine coal? For what? What is this, the 50s?".

That's exactly why I mean it wouldn't really be an apocalypse. It would certainly fuck a huge amount of people over, but it's absurd to equate a giant technological leap forward with the end of the world. The real determining factors that could make life into a hellscape would still be completely up in the air.

>The point I was trying to get across is that people, whether adults or children, who feel as though they're exceedingly intelligent -well beyond the means of the people around them- typically tend to be complete baby-dick assholes if they don't really have any grounds for sympathy or what-have-you.
>The easiest example I could give of what I mean is when people like to think they "know better": they tell themselves, "I'm more intelligent than these poor, dumb, ignorant hordes, so I should be in charge of x, y, and z for their own benefit. I know what's best for them." They then proceed to make a bunch of artificial decisions based on their own perspective, devoid of the input of people they deem ignorant or dumber than themselves.

That's not a problem with intelligence, that's a problem with people. Specifically, people who feel a strong desire to set themselves apart from/above others will obviously claim to be of superior intelligence, because it's difficult to conclusively prove otherwise, and there is a strong personal bias at play anyway. After all, all of your ideas make sense to YOU, so obviously everyone who disagrees must be the dumb one.

Getting a bit personal here, but... when I was younger I knew I wasn't synching up properly with my classmates, and I didn't know why. We never seemed to make the same assumptions about anything, leading to a lot of friction. They didn't seem to understand me, and I certainly didn't understand them.

Now, I eventually was diagnosed with high functioning autism and that helped fucking tremendously in learning how to pretend to be a functioning person, but before that happened all I knew was that I was somehow different. I can tell you from personal experience that, when you are in that situation, the desire to contextualize that difference as superiority is STRONG. Because the alternative is to contextualize it as weakness, and down that road lies super depression.

youtube.com/watch?v=WSKi8HfcxEk
I think this can help in understanding his point.

>It's also worth pointing out that every major technological innovation has destroyed some amount of livelihood. Those that don't adapt fail.

Different user, but as of 2016 America finally reached a point where automation no longer actively created any new or different jobs: we've finally hit the peak where innovation from this point on will actively remove jobs from the market instead of creating them.
Not just manual labor or entry level positions either: management, office jobs, even driving, transportation, and certain levels of healthcare are predicted to be automated in the near future.

It's a scary thing to think about: this massive population we've accumulated under the hope of "new jobs" that'll simply be fucking gone in the next 10 years, while politicians and activists argue VIOLENTLY about minimum wage this and that, raising it to 15 dollars to meet the standard of living or market expectations, right as the coming wave means just... never having these jobs ever again. And while everyone is distracted with that, no one really has any mature or appropriate response for when you don't 'NEED' all these fucking people.

What are we going to do with potentially millions of unemployable people in the next decade? Where production and efficiency keep growing higher and higher, but spending, consumerism, and everything else drop lower and lower, since nobody has a job to buy anything with; we're already seeing that with the millennial generation.
I remember hearing Bill Gates wants robots to "pay taxes", but that sounds genuinely fucking stupid... but maybe I'm missing something? There are also people fantasizing about a "guaranteed income", and while that 'could' be great, it sounds the same as paying people to dig and fill holes nobody needs just so you have someone to sell shovels to.

"Our labor-saving devices got so good at saving labor that some people didn't need to do any labor at all! Then society collapsed, because humanity couldn't into sharing."

The main problem, in the widespread unemployability scenario, is that the profits from robot labor go to the people who own the robots. In the extreme case, the rest of us-- the people who have to sell their labor to live-- die, because the robot owners don't have a reason to trade with us. There's really no getting around this; full automation and private ownership of the means of production don't go together. Of course, that doesn't mean we have to nationalize everything and set production via committee; there are alternatives.

A crisis of overproduction is basically exactly what causes a depression when it comes down to it. I mean, you could do scary authoritarian shit like giving that tax credit to people who sterilize themselves or something, but generally speaking, in the grand scheme, if people have less reason to have kids they'll have fewer, so just having some sort of "robot tax" to keep their quality of life high enough not to go full 3rd-world 12-children retirement scheme is actually pretty reasonable in terms of keeping people from starving while they adapt to a new world.
Ultimately it would probably just come down to the mindset of whoever has the most robots, though.

>wants robots to "pay taxes"
Idk what he actually meant by that, but to me that sounds like he's saying that each robot would be given an account that the companies would have to pay into as if it was an employed person. Then those would be taxed?
Or maybe he just means that people will be taxed for using robots?

I'm curious what alternatives you have in mind until AIs just run everything like in the Culture novels by Iain M. Banks.

On a related note, it amused me to have to check the "I'm not a robot" box just to post in this thread.

Found the video: youtube.com/watch?v=nccryZOcrUg

Bill Gates basically wants to add a corporate tax on any robot(s) that dramatically remove jobs from workplaces that use them: the robots wouldn't pay the tax (obviously, they're robots), the corporations, businesses, factory owners, etc. themselves would pay the equivalent wage tax for each robot, or for the arbitrary number of humans it unemploys, so to speak.
He then goes on to say that he'd like those funds to be distributed to train and move humans into 'other' jobs that still require humans, but I frankly don't see that fucking happening at all; if only because he grossly underestimates how many jobs robots will steal from people, and the jobs he thinks 'need' people will end up with robots of their own anyway.
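
For what it's worth, the arithmetic behind that kind of levy is trivial. Here's a rough sketch with entirely made-up numbers (the wage, tax rate, and headcount are hypothetical, and this is just the shape of the idea as described above, not the actual proposal):

```python
# A rough, hypothetical sketch of the "robot tax" idea described above.
# All figures are invented for illustration only.

def robot_tax(workers_displaced: int, average_wage: float, payroll_tax_rate: float) -> float:
    """Charge the owner roughly what the displaced workers' wages would have been taxed."""
    return workers_displaced * average_wage * payroll_tax_rate

# e.g. one machine replaces 12 workers earning $35,000/year, taxed at 15%:
annual_levy = robot_tax(workers_displaced=12, average_wage=35_000, payroll_tax_rate=0.15)
print(f"${annual_levy:,.0f} per year")  # $63,000 per year, earmarked for retraining
```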

My mother was a nurse at a retirement home (for example), and much of the managing workforce was replaced by a computer surveillance system that would document where and when nurses were going and what they were doing at any given time. People higher up would then use this information to better manage or even FIRE nurses who weren't being as productive as they should be.

...

Pokemons?

I always liked the idea of robots as the successor race to Humans.
Not in a robot apocalypse sort of way, but a Homo Sapiens to Neanderthal sort of way.

>oh no human society won't survive the automation bleu bleu bleu

Human civilization will survive. It just won't be the same. Human civilization isn't the same as it was a hundred years ago, or a hundred years before that, or a hundred years before that. Provided we don't destroy ourselves, we aren't going anywhere.

PEOPLE will be fine, too. Generally people sort shit out. Governments won't be fine. Companies won't be fine. People who hold money and power won't be fine. The social contract re-writes itself when you leave out the populace for too long - oftentimes violently.

When our corporate edifices and governments no longer serve us, we will tear them down. What cyberpunk stylings forget is that we can, have, and will do this again and again, as we have throughout history. Any social function is, fundamentally, made up by people, and when people have no reason to participate, they won't.

I just want the illusion of being loved by a pretty girl, so sex-robots with personalities sounds pretty good to me. I can imagine waking up to the smell of eggs and bacon, and her saying, "Good morning, [user's name here]. Slept well? Oh my, look at that bed hair of yours, haha. You're right, you really could use a haircut~"

>Reverse Robot Apocalypse
Isn't that just peaceful coexistence?
Or do you mean humans wiping out robots?

To me it would be like the line between man and machine gets so blurred that it doesn't much matter anymore.

>by making labor obsolete practically overnight.
This feared scenario would never be economically feasible. If you believe that corporations are self-serving and pragmatic enough to up and replace their entire workforce with robots, then you also have to realize they'd be smart enough to know that by putting the lower classes out of work they'd also be shooting themselves in the foot profit-wise, since if nobody is making money then nobody is buying products, and they'd lose money too.

Pokemons.

I thought it meant 'the robots prevented the apocalypse'.

So, maybe like a Children of Men scenario where the robots finally discover a way to cure humans so they can reproduce again.

I'm guessing it's a news ticker and the next story was on Pokemon.

Well it certainly isn't something that happens overnight, but I think you're giving them too much credit.

The only problem with that is that modern investors really are that fucking dumb. No one pays attention to sustainability anymore, never mind that this is why Ford was so profitable.

patrician taste

>advanced AI race returns from isolation in space to team up with OG humanity against the alien menace

my personal favorite

You can't think
>This is a person that would do anything as long as it makes them more money
and
>This person would intentionally do something that would cost them ALL of their profits and ruin them
People that idiotic don't survive in the business world long.
Now that isn't to say some companies may not go this route and perhaps even get away with it for a while, but it's playing with fire, and it would just be impossible for it to be universal without some means to ensure that people are still somehow being paid.

Yeah, humans and robots teaming up to EDF the shit out of somebody is always a classic tale.
Though, I still love the reverse, of robots trying to destroy all humans.
Which leads me to things like trying to make a story about robots finishing off fortress Earth, and then the humans using their massive losses as fuel to summon various horrible elder beings, resulting in a kind of forced cooperation as the elder beings turn out not to discriminate at all.

Paying people to dig holes and fill holes is stupid.

Just providing everyone with enough resources to live a cozy life if there's enough resources to go around? Not stupid.

The idea of not providing a basic income because then people won't want to work falls apart when you realize that very soon there isn't going to be enough work to go around. Absurdist musical chairs of employment while providing supplemental emergency income to the losers that can't find work is more expensive and less efficient.

They don't last, because they do stupid shit like this and then go out of business. Except stupid companies DON'T go out of business anymore; we fucking prop them up with government money, instead of letting them die and letting something newer and stronger grow in place of their rotting corpse like capitalism is supposed to work, all to prevent the short-term loss of muh jobs.

It may be stupid, but I derive pleasure from watching people scrabble in the dirt for pay. And if I find pleasure in this, you can bet whatever group winds up on top when this occurs will find pleasure in it too.
Which is the only way we're going to survive work becoming obsolete without a die-off or a massive global rebellion that somehow manages to create a perfectly equalized power structure.

but I want to keep my job and I have the power to vote!

It's basically on the same level as a company that intentionally keeps employees just below the poverty line in order for the employees to benefit from more government assistance rather than just paying them more, so yes I absolutely think they potentially are that bad.

Things will be fine. Functional governments rely on the mix of what people in power want and what the masses will tolerate.

Push too hard and the people in power end up stabbed in the ass Gaddafi style. It's best to find a good compromise where people can live. They don't have to be doing great, just not constantly fucked with, and given something to occupy their time.

Not being stabbed in the ass sort of implies you've produced a mechanized army that answers only to you with your massive wealth.

Who makes up that army, user?

In this scenario, probably robots.

Robots, clones, implant-conditioned child soldiers, whatever you can get that ensures absolute loyalty.

oh, oh, nerve staples!
Nerve staples would be perfect!

It would have to be. However, meanwhile in reality, armies are made up of people, people who have friends and family in that mass of humanity you want them to oppress.

Granted, there are strongmen that have loyal armies; they generally did/do so by making the families of said armies very comfortable and generally vilifying the poorer masses somehow. Eventually though, shit breaks down.

Just as having enough robot labor to make most people obsolete is inevitable, having robots capable of acting as soldiers is similarly inevitable.
And we passed simple killbot technology ages ago.

We are still way, wayyyyy off having machines that can replace a human soldier in any capacity.

see, I don't think we need to get up to full human capacity.
They can be pretty dumb, even.
I mean, the most basic model which we can already make, a gun on a google car with a motion sensor telling it where to shoot, is already frighteningly deadly.

Every last one of them will look at their business model, see things are fine for the next five years and will then say 'Oh well, it's not like everyone else is going to do it too'.

When managers and CEOs move between businesses and make a career around short term fixes i.e. laying everyone off to raise annual profits, then making whoever is left work harder, or outsourcing everything to India so 5 people can do a worse job for slightly less money, you'd better believe long term planning isn't part of their strategy. They sell themselves on the turnaround, not the long term. And the consequences of that lead them to keep needing people to turn things around.

I think he means most people strongly disagree with the concept of robot soldiers. We can already make pretty good automatic turrets that kill everything that moves, but you don't see those in regular armies, people don't want those.

Ofc you could argue that our immoral empire would conquer all "good nations" with its superior robotic armies.

I don't even mean that. I mean we simply can't make good soldiers. Robots lack the problem solving and decision making skills, and the ability to be unpredictable.

Robotic soldiers would require true AI, and true AI would be so advanced the idea of enslaving it to fight our wars without its consent is laughable.

I'd argue that how jobs and stuff work would change before suddenly all human workers became useless. With excessive automation, job-hunting can become optional and focused on crafts and other specialized work that robots can't do, or just can't do well. And because robots take care of most things, the excess can be used to economically keep afloat the people who aren't going for those now-optional jobs, even if it's rather basic living.

I mean, it depends what you mean by soldier. A peacekeeper needs to solve complex moral problems on a daily basis, but an unerring deathbot is relatively easy, philosophically speaking.

Though if you want to just get rid of excess population, there are easier ways.

I think you are way overestimating how many smarts you need to be a soldier.

In order for this to be a reality, such a government would need to exist in a vacuum where every other nation in the world would NOT gang up on them for their robot war crimes. It's not reasonable or realistic. Furthermore, yes babushka, being a soldier actually DOES require a brain these days. The days of rooty tooty point and shooty are long over. I'm not saying you need to be a fucking genius, but you do need to have basic decision-making skills that machines simply do not have. A heavy armored weapons platform without a human intelligence behind it is going to fail against any cave-dwelling revolutionary.

Just flood 'em with napalm. You don't need those people or their infrastructure; you're rich, and no nation other than maybe America has the ability to bring more than a token force to bear on the worldwide scene.

It won't be sold as a robotic soldier, it'll be a 'semi-autonomous weapon system' or something and people are fine with those because there's still someone to bollock if the thing brasses up a wedding or something.

Currently, the bulk of an infanteer's job is to first suppress the enemy, then use this dominance of the battlespace to maneuver to a position that allows them to kill the enemy combatant, either by assuming a position where the enemy can be fired upon without being able to seek cover, fixing them in a position where mortars or air power can kill them or, far more rarely these days, maneuvering close enough to throw or launch HE directly at them, potentially followed by a bayonet charge to finish any survivors.

The main manpower of an infantry platoon's job basically boils down to 'move forward' and 'brass up anything that sticks its head up'. A GPMG on a set of robotic spider-legs can do this far, far more effectively than a human can. Half of shooting accurately is trying to shut down or compensate for all the little twitches and disruptions of firing position, something machines can do incredibly well already. No infantryman alive can cross rough terrain whilst holding something as steady as a robot can. Watch the SpotMini demo where it holds its arm still whilst shifting position.

Long and short is, as robots get smarter, I'd first expect to see infantry sections become more and more robot heavy, until eventually it's just a JNCO and a bunch of walking guns, he sits 800m behind the forward edge with his iPad out, the walking guns identify the bad guys, the NCO agrees they're bad guys and gives them the go-ahead, and then the guns walk forward, suppressing the shit out of everything because they can get a 2cm grouping at 300m on automatic, and kill everything with a pulse.

If the walking guns accidentally drop HE into an orphanage, you can still bollock the NCO for it.

You know how many countries have nuclear weapons?
If your enemy is sending killbots to shoot everything, from soldiers through civilians to cats and cows, it's fair game to level their cities.

More people need to acknowledge the spider-leg design when they talk about walking guns/weapons platforms; it's far more doable than shitty bipedal designs.

Then just use your own nukes, or shoot the dang things down.
That's been possible since we figured out satellites.

Yeah, I can't see bipedals being a thing in the military. The main benefit of a humanoid robot is really human interaction, and that's something you really will need a human soldier for, meeting the locals, being a friendly face. A single bilingual officer inside a cordon of semi-autonomous mobile guns can do that far better, and more importantly, far cheaper, than a bunch of extremely expensive and fragile humanoid bots.

Bipedal designs are shit for fighting anyway, if you could design a soldier from the ground up, you wouldn't make it bipedal and six feet tall. Pretty much the first thing you're taught as an infantryman is that if someone is shooting at you, drop on your belt buckle. Everyone can crawl like nobody's business by the time they've finished infantry training.

Not so much these days, in Afghanistan/Iraq, due to the nature of the terrain, but traditionally, infanteers fight on their belt-buckles.

Fighting a bunch of things that are not only deadly accurate at 300m+ but are scuttling along, even as slow as average human walking speed, at knee-height, would be an absolute nightmare.

>just nuke them back bro!
>just shoot the nukes down

This is your brain on drugs. If we could just SHOOT nukes down that easily, WHY would we be worried about the Norks in the SLIGHTEST?

Because attacking North Korea means that China will get pissed and America will have to demonstrate it has literally the only military that matters, and then the world will get scared and impose trade sanctions while America tries desperately not to use its military any more after scaring China, because it has a history of pussying out of wars in order to not look bad.

And we're not worried about the norks. The ideal scenario for them is missing their target by miles.

A future like that makes me wonder how warfare would change if both sides fill the trenches with robots. Would it become simply a numbers game of who runs out of robots first?

Not quite.
Robot reaction times do vary, designs vary, and the exploitability of AI varies.

Assuming nobody fucks up, then yeah, it's mostly a numbers game. But if designs diverge, then it could very well be decided by which guys weren't idiots with their robot layout.
And someone -is- going to be an idiot with it, as they always do every time new war stuff happens.

Take it to /pol/ assholes.

Hey, I just wanted to talk about the inevitable robot takeover of soldier and labor careers, rendering the only recourse for the majority of humanity a die-off or some kind of pointless busy work designed entirely to entertain the golden elites.