Okay Google, engage sport mode!

>Okay Google, engage sport mode!

*loud V8 and gear shift noises through the speakers*

Hot Russian Women are looking to meet you!

>all windows go black for a moment
>steering wheel goes to center, seat lowers and leans back
>windows fade back in, sitting on the start line of the Monaco GP
>Jackie Stewart is sitting in a 1970 F1 car, turns to you and yells "FIVE, FOUR, THREE, TWO, ONE"

16 laps later you arrive at work

>I found 3 Starbucks in your area. Would you like to drive to #3 to talk to Stacey? The barista you described as "cute" last night during your 4.2 hours of sleep?

I can play videogames at home. In my underwear. Drunk.
This is a horrible idea.

make a large luxury Cadillac autonomous vehicle and I'll buy it

now you can do it in your car too, although you will have to deal with decency laws once you reach your destination

>paying for the vintage race car DLC
eww those vehicles are old and outdated for a reason, get a Season Pass already.

I'd like to see one of these try to handle the canyon/mountain roads around here at any acceptable speed. Curvy roads with blind corners will forever be the thing that stops self driving cars from becoming mainstream. Well, that and weather. Nobody will want to crawl along at literally 10 miles per hour just because a car and a deer could suddenly appear at the same time. In the same regard, if one of these cars does manage to get into a collision with something that isn't a motor vehicle because it was driving at a reasonable speed, there will be no end of legal bs about it.

Also, how would they deal with no lane markings and judge where they should be? What about when the cameras get a bit of road grime on them and can no longer clearly see the cliff edge? Will the car shut down in the middle of the road and force you to get out to clean them? Some roads are dirt and full of holes as well, and if it isn't programmed to avoid the dips it will damage itself and get stranded. At the same time, that takes the lane selection software to a whole new level of "where the heck am I?"

Bad weather is another killer. The car has two options: either drive at an absurdly low speed whenever ice is even remotely possible, or risk losing grip and killing the occupants. This will result in a whole lot of going nowhere in the winter. Also, if it's raining and the owner has let the tires get worn down, will it simply refuse to go anywhere and not allow you to do some extremely important thing? Or will there be a certain amount of rain at which it cuts out, leaving the occupants parked on the side of some random road?

Self driving cars absolutely -can- work in the city or on the freeway, but once you get out of dry, sunny civilization they'll absolutely crap the bed. Possibly someday those sorts of problems will be worked out, but there's a very long way to go before they can become completely standardized.

>"I'm sorry James, I cant let you do that"
>"The current speed limit is 55 mph"

>click all boxes with cupcakes to activate sports mode

>OK Google, engage incognito mode

>reading in a car

Good luck doing that for more than a minute.

Just shut up and buy fucking bitcoin.

you have literally zero knowledge about self driving cars, try to learn a thing or two before sperging against them

what? it's not hard to read in a car unless you're on a dirt road

I didn't read his entire post but from skimming the first sentence of every paragraph, he does bring up a valid concern. The blindspot monitoring and radar cruise control functions on my Hyundai Sonata straight up don't work when it is raining/snowing. Isn't that the time I would need the assistance of those systems the most?

Nobody has really answered how self-driving works in inclement weather yet.

Level 2-3 autonomous cars are nothing in terms of sensors and capabilities compared to the Level 4-5 cars that should be on the market next decade

the combination of lidar/radar/visible spectrum and infrared sensors in prototypes is already vastly superior to human sight in basically any conditions
people should stop mistaking the embryonic self driving cars they can buy today, and the little gadgets manufacturers come up with now, for actual self driving cars
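
To put the fusion part in plainer terms: no single sensor has to see everything, the stack just weights each one by how much it can be trusted in the current conditions. A toy sketch of the idea in Python (the sensor list, confidence numbers, and the fuse() helper are all invented for illustration, not anyone's actual perception stack):

```python
# Toy confidence-weighted sensor fusion: each sensor reports an estimated
# distance to the nearest obstacle plus a confidence that depends on conditions.
# All values and the weighting scheme are illustrative only.

def fuse(readings):
    """Combine (distance, confidence) pairs into one weighted estimate."""
    usable = [(d, c) for d, c in readings if c > 0.05]  # drop sensors that are effectively blind
    if not usable:
        return None  # nothing trustworthy: caller should fall back to a safe stop
    total = sum(c for _, c in usable)
    return sum(d * c for d, c in usable) / total

# Night + heavy rain: the camera is nearly useless, radar barely cares,
# lidar is degraded, thermal still picks up the deer.
readings = [
    (41.0, 0.10),  # visible-spectrum camera
    (38.5, 0.90),  # radar
    (39.2, 0.55),  # lidar
    (38.9, 0.70),  # infrared / thermal
]

print(f"fused obstacle distance: {fuse(readings):.1f} m")
```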

It's hard to read through puke though

this
no matter how straight the road and good the pavement, after 2 minutes of reading in a car I puke
I'm going to be so bored in self driving cars, not able to do anything but look straight ahead

You two are complete faggots.

>getting travel sick
>being a faggot
how are the two related exactly?

>getting motion sick
Please kill yourself.

Sounds expensive user, and unreliable.

Same here, I would never own a self driving car because I get sick from even slightly winding roads unless I'm driving myself.

the driver master race genes in me are so strong that my body physically rejects being a plebeian passenger, why should I kill myself?

>a shit load of sensors that go into the invisible spectrum coupled with a powerful computer
vs
>a being that gets tired, distracted, bored, zones out, etc etc etc

yeah obvs the human bean is superior

Uh because you get car sick and are dense af

The biggest issue with self driving cars is people like you being completely retarded.
There is literally nothing you can see or do that an autonomous car can't do equally well or, most of the time, better. It's just a matter of the right hardware and software, with redundancies and safety modes in case something goes wrong.
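
To be concrete about "safety modes": the usual shape is a degradation ladder, where losing a non-critical sensor means slowing down and widening margins, and losing anything critical means pulling over, rather than the car bricking itself mid-corner. A rough sketch of that idea, with made-up subsystem names and rules (not any actual vendor's logic):

```python
# Toy degradation ladder: full autonomy -> reduced speed -> minimal-risk stop.
# Subsystem names and thresholds are hypothetical, just to show the shape of the logic.

from enum import Enum

class Mode(Enum):
    FULL_AUTONOMY = "full autonomy"
    DEGRADED = "degraded (reduced speed, increased following distance)"
    MINIMAL_RISK = "minimal-risk maneuver (pull over and stop)"

def select_mode(health: dict) -> Mode:
    critical = ("braking", "steering", "localization")
    # Any critical subsystem failing means get off the road, not keep driving blind.
    if any(not health.get(name, False) for name in critical):
        return Mode.MINIMAL_RISK
    # A dirty camera or weak GPS alone just means slow down and widen margins.
    if not all(health.get(name, False) for name in ("camera", "gps")):
        return Mode.DEGRADED
    return Mode.FULL_AUTONOMY

# Example: road grime on the camera (the cliff-edge scenario from earlier in the thread).
print(select_mode({"braking": True, "steering": True, "localization": True,
                   "camera": False, "gps": True}).value)
```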

the average Veeky Forumstard still clings to the notion that they're le hentai epic driver that can outdrive and outsmart a computer or even react in a better way to unexpected situations

this thread comes up every once in a while and it quickly devolves into NUH UH A LIDAR/SONAR/RADAR/CAMERA SET UP WILL NEVER BE BETTER THAN MUH SIQ MAD SKILLZ BRO

Did you make this thread just to make people angry? The only one angry here is you.

>muh ad hominems

>It's just a matter of the right hardware and software.

This would have sufficed instead of raping my feelings, user.

I want to see a fleet of autonomous cars take on a bunch of dilemmas, see how they would react.

...

You're a complete retard. It's just a matter of 1s and 0s.

Nice use of ad hominems. Besides, I said I wanted to see how they would react if they were put in tough situations.

so stop sperging.

what makes you think a human bean could be better than a computer solving a dilemma?

>what makes you think a human bean could be better than a computer solving a dilemma?

I never said they were better...? Now you're just putting words into my mouth. Either that or you have poor reading comprehension.

Was lurking here till I saw this picture and had to comment that I have unironically watched this video. 7.8/10 voyeur

The point that I'm making is that even if the issues are addressed, there are dozens of situations in which an autonomous car would have to choose between being a hazard and being a nuisance to its owner. In every case it would be forced to take the safe and horribly slow and frustrating option. People will never like that; humans are impatient by nature.

Plus, one thing that is absolutely unfixable is the fact that sensors and electronics wear. There is not a single computerized system which is absolutely safe from glitches, and all it takes is one time to kill an innocent helpless family.

Moral dilemmas are another overmemed yet valid concern. Each owner will have their own opinion on whether to splatter fido or hop a curb, and that just can't be decided by the manufacturer without someone throwing a fit.

It's much easier to just call me dumb without any evidence towards your side rather than addressing my points though, isn't it?

again, what makes you think that a human will be better at making hard choices?

> In every case it would be forced to take the safe and horribly slow and frustrating option

No, if you program it well (i.e. "run over the dog rather than hit the other car") any computer will outperform a human's reaction time by orders of magnitude. A computer's reaction time is something like a million times faster than a human's.

> thing that is absolutely unfixable is the fact that sensors and electronics wear

humans get tired, bored, distracted. Sensors can be replaced too, btw

>There is not a single computerized system which is absolutely safe from glitches, and all it takes is one time to kill an innocent helpless family.

all it takes is a million accidents a year thanks to human recklessness (DUI, distracted, hooning, etc) to prove you wrong

>Moral dilemmas are another overmemed yet valid concern. Each owner will have their own opinion on whether to splatter fido or hop a curb, and that just can't be decided by the manufacturer without someone throwing a fit.

again, this can be easily programmed

why is it that people think that unless computers are ABSOLUTELY PERFECT 100% OF THE TIME, self driving will never catch on? all it takes is for computers to be BETTER than humans to literally pave the way for self driving cars. Even if a computer is 15% better at preventing accidents, that's all it takes
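
And to show what "this can be easily programmed" actually looks like: the dog-vs-car rule from above is just a fixed severity ranking evaluated in microseconds, not the car having a moral crisis. A toy version (the categories and cost numbers are invented purely for illustration):

```python
# Toy hazard ranking: when every available path hits something, pick the path
# whose worst obstacle has the lowest assigned cost. The categories and numbers
# are made up; the point is only that the rule is fixed and evaluated instantly.

HAZARD_COST = {
    "pedestrian": 1000,
    "occupied vehicle": 500,
    "large animal": 200,   # hitting a moose is also dangerous to the occupants
    "small animal": 50,
    "property": 20,
}

def choose_path(paths):
    """paths: dict of path name -> list of obstacle categories on that path."""
    def worst(obstacles):
        return max((HAZARD_COST[o] for o in obstacles), default=0)
    return min(paths, key=lambda name: worst(paths[name]))

# The dog-vs-oncoming-car example from upthread:
print(choose_path({
    "brake in lane": ["small animal"],      # the dog
    "swerve across": ["occupied vehicle"],  # the oncoming car
}))  # -> "brake in lane": the rule from the post above, applied in microseconds
```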

Stupid premise. There will be no "morality" programmed into the car that "chooses" some complex emotional outcome. Self driving cars don't need to be able to do that to be 100x safer and better than human drivers. Those "what if the car has to choose" scenarios
A. don't occur frequently enough to be a consideration
B. are not likely to be better handled by human drivers anyway
C. will be FAR less likely to occur to begin with to a self driving car, because the car won't be in a position caused by poor judgement prior to the "decision," especially as cars in the future will broadcast their intent to each other (sketch of what that message could look like at the end of this post)

Humans are stupid chimpanzees that can barely operate a motor vehicle to begin with. It's the same stupid argument over and over against autonomous cars: "well what if a 1 in a million glitch happens at the same time a stroller rolls into the street and a bowling ball transport truck explodes so the car's radar sees 200 objects?! If there's a single tiny chance an autonomous car isn't perfect we have to stick with human drivers who are currently killing millions"
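
As for point C, "broadcasting intent" isn't sci-fi either; it just means each car periodically sends a small message with its position, speed, and planned maneuver so nearby cars don't have to guess. Real V2V systems (DSRC / C-V2X basic safety messages) are standardized and signed; the sketch below only shows the rough shape of the information, with invented field names and values:

```python
# Toy vehicle-to-vehicle intent message. Real deployments use standardized,
# signed message formats; this just illustrates the kind of data involved.

import json
import time
from dataclasses import dataclass, asdict

@dataclass
class IntentMessage:
    vehicle_id: str
    timestamp: float
    lat: float
    lon: float
    speed_mps: float
    heading_deg: float
    planned_maneuver: str    # e.g. "lane_change_left", "hard_brake", "continue"
    time_to_maneuver_s: float

def broadcast(msg: IntentMessage) -> str:
    # In practice this would go out over a radio link; here we just serialize it.
    return json.dumps(asdict(msg))

msg = IntentMessage("car-42", time.time(), 43.7384, 7.4246,
                    27.0, 90.0, "hard_brake", 0.8)
print(broadcast(msg))
```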

>*hacker known as fourchan takes control of your vehicle*
>nothin personnel kid

Sorry, when replying to your post I was thinking of the other guy's post that I only skimmed over; you seem to understand what's going on

A human is not necessarily objectively better at making decisions, but a human has accountability and morality to make decisions with. A person is capable of deciding what they themselves want, and accepting the associated risks.

Programming it well is totally subjective. Maybe someone would prefer to be free to choose to hit a telephone pole and suffer minor injuries rather than kill a dog. Or a kid, for that matter. What one person or committee deems to be the correct choice is not necessarily the course of action that every owner would want to take.

So self driving cars are prone to error just as humans are. Humans can take a nap and be alert again. Yes, humans are far more error prone, but at the same time, only they are accountable for their actions rather than being helpless victims of a rogue computer.

Do millions of accidents caused by humans prove that there is such a thing as an error-free computer system? My point, again, is not that humans are statistically better; it is that computers lack the ability to make a decision that they have not been, or cannot be, programmed for.

Computers cannot adapt and learn in the same way that a human can, and honestly we do not want them to be able to. Computers cannot be programmed for every possible driving situation any more than a human can be. With a computer doing the driving, there absolutely should be zero tolerance for error. Self driving cars do have a use. They're great for predictable, slow, or simple driving, but will never be a perfect solution. At least not within our lifetime.

The big disagreement that seems to exist is whether to allow a computer to kill people or not. If it is not allowed to, then self driving cars cannot exist. There must be a tolerance, however small, for human death at the decision of a machine in order to make self driving cars a commonplace reality. Some are willing to accept that, many are not.

...

>but a human has accountability and morality to make decisions with
So do auto manufacturers. You already put your life in the hands of auto engineers every time you are in a vehicle.

>Programming it well is totally subjective
No, it's really not

>Computers cannot adapt and learn in the same way that a human can,
They don't have to; autonomous cars will use fleet learning interpreted by engineers and programmers whose brains actually work, unlike most humans'. This will allow further development of the cars' abilities, while most humans have demonstrated they cannot learn by repetition. With autonomous cars, if a single car "learns" something (meaning a difficult situation arises, is interpreted, and is solved so it's handled better in the future), all cars do. This is why autonomous driving will become exponentially safer, more efficient, and more reliable as its adoption increases.
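
For anyone who thinks "fleet learning" is marketing fluff, the mechanism is simple: one car logs the edge case, engineers turn it into an update, and the update gets pushed to every car. A very simplified sketch (class and method names are invented for illustration, not any manufacturer's pipeline):

```python
# Toy fleet learning loop: one vehicle hits a hard case, engineers turn it into
# a policy update, and the update is pushed fleet-wide. Names are illustrative only.

class Vehicle:
    def __init__(self, vin: str):
        self.vin = vin
        self.policy_version = 1
        self.logged_edge_cases = []

    def encounter(self, situation: str):
        # A situation the current policy handled badly gets logged for review.
        self.logged_edge_cases.append(situation)

    def apply_update(self, version: int):
        self.policy_version = version

fleet = [Vehicle(f"VIN{i:04d}") for i in range(5)]

# One car in the fleet meets the weird situation...
fleet[2].encounter("unmarked gravel road, washed-out shoulder")

# ...engineers review the logs and ship policy v2 to everyone.
new_version = 2
for car in fleet:
    car.apply_update(new_version)

print({car.vin: car.policy_version for car in fleet})
# Every car now benefits from one car's bad afternoon, which is the whole point.
```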

>With a computer doing the driving, there absolutely should be zero tolerance for error
Complete stupidity. You put your life in the hands of computers already, every day. Every time you walk down the street, take a plane, train, drive, there are either computers making decisions or systems designed through computer simulation.

>The big disagreement that seems to exist is whether to allow a computer to kill people or not.
Stupid premise. A car will never have to make that decision for reasons explained in my last post.
>but what if (implausible situation that wouldn't be solved by a human anyway?!)

>Alexa, smoke that bitch at the next green light

>Humans are stupid chimpanzees that can barely operate a motor vehicle to begin with
speak for yourself retard. humans are the "dumb chimps" that are MAKING the cars and AI.

Right, but also wrong. They've got highly trained programmers developing the AI. They don't just give this job to an amateur, or in your words, "dumb chimps".

I already addressed this in this post
Engineers and programmers are indeed part of our dumb chimp family. But if a single engineer, while taking a break from throwing his own feces and eating ants with a bamboo straw, has a moment of genius that solves a large scale problem, this fix can improve the entire autonomous fleet, forever. If a driver has a genius realization about driving it's useless, it only helps that one driver, when he's paying attention and has perfect focus, maybe, until he dies.

I trust that the auto engineers have done a good job because I have seen every last inch of the car that I'm driving, and it has proven over time to be safe. Plus, there's a direct connection between the steering wheel and wheels, turning the key to off physically breaks the circuit that allows the engine to run, and there's a mechanical cable to the rear brakes. Ultimately, no matter what computer system whacks out, I still have a variety of options to lessen the risk of collision instead of being presented with a blank piece of plastic.

Tell me how the decision between suffering minor injuries and destroying a car versus killing an animal is not a subjective one that people would disagree on.

Having a team of engineers is a good method of doing that, but at the same time, there will be a learning curve. It's a good solution to one of multiple issues.

I'm not quite sure how a computer simulation that is then approved by an engineer is the same as a computer directing and controlling a motor vehicle without any way for a person to interfere. Even if a light goes green both ways, I can still do something about not hitting the other cars; even if that something isn't enough, it isn't putting a human at the complete mercy of a computer. If an airplane's autopilot fails, there's a pilot there to correct it. In that form, self driving could be useful, provided there are failsafes that allow the driver to override it.

A car that is fully autonomous may at some point have to make that decision, because situations that couldn't be solved by a human anyway do exist. Yes, they are stupidly unlikely to happen, but the reality is that at some point, out of billions of instances, an autonomous vehicle will have to make a decision regarding whether someone lives or dies. I'm okay with being unable to save myself from injury or death due to some crazy, stupid, unlikely situation, but I at least want half a chance instead of merely being along for the ride.

>I trust that the auto engineers have done a good job because I have seen every last inch of the car that I'm driving, and it has proven over time to be safe. Plus, there's a direct connection between the steering wheel and wheels, turning the key to off physically breaks the circuit that allows the engine to run, and there's a mechanical cable to the rear brakes. Ultimately, no matter what computer system whacks out, I still have a variety of options to lessen the risk of collision instead of being presented with a blank piece of plastic.
Fly by wire is much more robust than a cable. It has redundancy and is able to self diagnose. You're using fudd logic.

>Tell me how the decision between suffering minor injuries and destroying a car versus killing an animal is not a subjective one that people would disagree on.
The car would not be making a decision at all

> I'm okay with being unable to save myself from injury or death due to some crazy, stupid, unlikely situation, but I at least want half a chance instead of merely being along for the ride.

So this is your logic: you can choose one of two cars. Car number 1 is 100x safer than car number 2. You are 100x less likely to die in car 1. BUT, if a situation arises that has never happened before, in car number 2 you can use human input that is equally likely to make the situation worse on your way to death, while in car number 1 you can't do anything. This makes car number 2 better? Think a bit more before you post. All of your focus is on one hypothetical decision that will likely never be made, while you are at the same time scoffing at the actual decision you can make to improve your chances of survival (buying an autonomous car).

Ok google, i had enough of this, run the car off the cliff.
OK GOOGLE
GODDAMN IT YOU PIECE OF SHIT REEEEEEEEEEEE

Electronics by nature are prone to failure. Things that exist are prone to failure. I like having a car that gives me plenty of options in case a few fail. In the case of self driving, all it takes is a single failure to leave no options. I currently have a brake pedal, a steering linkage, a clutch pedal, an ignition switch, and a handbrake. That's a lot of different ways to get a car to stop acting in an unwanted way. I can personally test each one of them every time I drive and know if something seems a bit off, then decide whether it is worth the risk to still get where I want to go.

Regardless of who would be making the decision, it is one that should be made by the individual. Moral dilemmas should be solved by each individual person rather than mandated by a committee. Yes it's completely the trolley dilemma meme, but just as people disagree there, people will disagree on how their car should behave. For that reason, the behavior should be left to the choice of the driver/owner.

Car #1 is 100 times safer (I'll play the statistics-from-nowhere game), but gives me literally zero control of what happens, leaving my fate in the hands of a computer. Car #2 gives me responsibility for my own life. I would personally rather have freedom and responsibility for myself than be nannied around because it's "safer for me." If I'm screwed either way, I'd rather not be terrified and helpless in my final moments. I prefer a fighting chance.

>Electronics by nature are prone to failure.
Versus mechanical linkages? That's laughable. Solid state electronics are hundreds of times more reliable than moving parts. Massive redundancy and self-diagnosis are going to be a requirement for autonomous cars (just like they already are on planes), and that isn't possible with linkages. Your fear of electronics is just ignorance as to how they are implemented in modern cars.
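
"Massive redundancy and self-diagnosis" usually means something like triple-redundant channels with majority voting: the system keeps running off the two channels that agree and flags the outlier for service. A minimal sketch of the idea (the tolerance value and channel labels are assumptions for illustration):

```python
# Toy 2-out-of-3 voting for a redundant sensor channel (e.g. steering angle).
# The disagreement tolerance and channel labels are illustrative assumptions.

def vote(a: float, b: float, c: float, tolerance: float = 0.5):
    """Return (agreed value, list of suspect channels)."""
    readings = {"A": a, "B": b, "C": c}
    suspects = []
    for name, value in readings.items():
        others = [v for n, v in readings.items() if n != name]
        # A channel that disagrees with both of the others is the odd one out.
        if all(abs(value - o) > tolerance for o in others):
            suspects.append(name)
    good = [v for n, v in readings.items() if n not in suspects]
    if not good:
        return None, list(readings)   # total disagreement: escalate to a safe stop
    return sum(good) / len(good), suspects

value, suspects = vote(12.1, 12.3, 4.7)   # channel C has drifted
print(f"steering angle used: {value:.1f} deg, flag for service: {suspects}")
```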

>If I'm screwed either way, I'd rather not be terrified and helpless in my final moments. I prefer a fighting chance.
So you have never been a passenger in a car? Never taken any kind of bus, train, or plane? You are already helpless either way, because even in your own car you are at the mercy of everyone around you, whom you have zero control over. I can see by your posts that you have control issues, but they make no sense when really broken down.

Tell that to my power steering that shuts off occasionally; maybe you'll convince it that since it's electronic, it really is more reliable than the massive metal rod that connects the steering wheel to the steering rack. You can probably talk my O2 sensor into sending the right signal too with those facts. It's much easier to tell when something mechanical is whacking out than something electronic, and even if something is obviously wrong, sometimes I have somewhere to be. Some days I drive to work without power steering, but if I had a self driving car exhibiting the same level of gremlins it'd simply tell me "no, you are staying at home and losing your job." And before you say I don't keep up on maintenance, I have done literally every fix in the book short of modifying and hardwiring the whole system to make it work consistently.

I do occasionally ride as a passenger in cars or on planes, but I make a decision each individual time to trust those people based on what I know about them. I greatly prefer to be the one operating the vehicle. Regardless of what other drivers do, I can make an effort to avoid their stupidity, and I've been rather successful in that regard. If "control issues" is the medical diagnosis for being confident in my own abilities then sure, we can go with that. I have nothing against people who would rather not be responsible for themselves, as long as they don't try to push legislation on me.

Your power steering is not electronic, it's electric. Both of your failures are mechanical failures: a wiring issue, and fouling of your O2 sensor by a poorly conceived mechanical device spewing soot onto it. Your experience is a great example of why much more integrated, redundant, and self-diagnosing electronics will be the norm for all cars, autonomous or not. Your car is extremely simple, yet you do not know and cannot figure out what is wrong with it, similar to most humans. You are stumped by your car, yet drive it anyway despite safety issues (a strange decision for someone so obsessed with control of his automobile), and blame a magic entity, "gremlins." That doesn't really seem in line with anything else you say. You don't trust autonomous cars because you don't know how they work, but you trust your own car even though you don't know how it works AND it's demonstrating it is not working correctly.

It has a computer module, so there is an electronic component to it. A wire from a camera on a self driving car is no different from a wire within this system. The O2 sensor is not fouled due to combustion; it's even the post-catalytic-converter sensor, which is subjected to far less extreme abuse. The code is a low voltage on the circuit, meaning that it has either failed internally or the connection has been broken somewhere, a problem that no wiring, whether it is on a normal or a self driving car, is immune from. As it is a recent and rather non-critical failure, I haven't been that bothered to replace it. The car is still plenty functional and not unsafe in any manner.

The power steering is a problem which has supposedly been diagnosed by its own system, but upon testing, the diagnosis it gave was proven to be false. The real problem lies elsewhere, so the system clearly doesn't know any better. I'm not sure how lack of power steering is a specific safety issue; there's a great big metal rod connecting my input to the output, so I still have full control over the car when I have to drive like that, I just can't drive with such little effort. I know exactly how it works, and "gremlins" is used not so much as a magical entity as a description for a problem which defies even the logic of its own diagnostic system. Somewhere in the various wiring and modules is a point of improper connection, and even after locating and fixing that point multiple times the problem persists.

Source on the video?

KEK