Self-driving Cars

How far are we from them?

reminder that this completely BTFOs any self-driving car for the foreseeable future.

Loss, right?

>How far are we from them?
They already exist.
>Being less than perfect is a problem
Sure glad we have human drivers on the road keeping those perfect 0% injury and fatality stats year after year.
Meanwhile in the real world Uber already bought a fleet of self-driving Volvos for 2019.
Also:
arstechnica.com/cars/2017/09/hacking-street-signs-with-stickers-could-confuse-self-driving-cars/
>To repeat: these kinds of attacks worked on the specific machine vision system the researchers trained, and the altered signs in the gallery above would not fool any cars on the road today.

Ready now. The thing is you run it mostly on AI but have remote human error handlers standing by. Waymo is already deployed and running in Arizona.

There is not a huge rush though as the big tech behind it doesn't care about hitting market asap as much as getting it perfect.

That's because all the real money is in putting it into self-driving semi trucks and eliminating the trucking industry.

>Self-driving
programmers wrote their programs

general intelligence doesn't exist you morons

Good point. OTOH we might not need to rely on visual driving signs in the future.

That's actually not the problem. The problem is that we'll likely have the same algorithm working in every car in the self-driving fleet. Which means if you can break one, you can break all of them. (As opposed to humans, who just randomly crash.)

Your point of self-driving cars being safer than humans is well-taken, but is not relevant to the issue of adversarial attacks.

When that semi accident happens, the company will lose all support.

...

>if you can break one, you can break all of them. (As opposed to humans, who just randomly crash.)
That's not a new situation. Trains and planes are both more centralized forms of transportation than cars, and train wrecks and plane crashes have resulted in large numbers of people dying from a single problem many times in recent history.
>programmers wrote their programs
It's more nuanced than that. There are some explicit instructions that were implemented to comply with laws (e.g. you probably want to make sure behavior in response to traffic lights is pinned down to absolute directives). But the majority of how self-driving cars drive isn't anything a programmer planned out in advance. What the programmer does is write the method by which the self-driving car program will learn, and that's different from writing how the self-driving car will drive.
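The split described above can be sketched in a few lines. This is a toy illustration, not any real autopilot stack; every name in it is made up. The point is just that the legal rule is hand-written while the rest is delegated to whatever the training procedure produced:

```python
def learned_policy(observation):
    """Stand-in for a model trained from driving data.
    The programmer wrote the training procedure, not this mapping itself."""
    # Pretend the trained model wants to keep cruising.
    return "cruise"

def drive(observation):
    # Explicit directive: red lights are a pinned-down absolute rule,
    # never left to the learned component.
    if observation.get("traffic_light") == "red":
        return "stop"
    # Everything else is delegated to whatever the model learned.
    return learned_policy(observation)

print(drive({"traffic_light": "red"}))    # -> stop
print(drive({"traffic_light": "green"}))  # -> cruise
```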

>2018
>slapping a sticker on a stop sign is now terrorism
lol
i think 20 years OP. gradually more and more will become automatic. all new cars will have lane assist AI in 10 years

>How far are we from them?

I suppose that depends on where you live. There's something like 20 companies trying to get their shit out the door and on the road ASAP in the USA alone.

Basically, people with paintball guns can go around and whammy signs with like 3-5 shots and ruin all self-driving cars' ability to read them. Currently, that is.

Very far. Autists will keep claiming they're 20 years away just like they did 20 years ago even though we've seen minimal progress.

t. increasingly nervous truck driver

20 years

Are you all confusing "when will self-driving cars exist" with "when will self-driving cars be the only sort of vehicle in existence?"
They already exist. They've existed for years now. And Uber already bought 24,000 self-driving Volvos for 2019.

>24,000 self-driving Volvos for 2019
>0.000001% of market
>relevant
20 years

>That's not a new situation, trains and planes are both more centralized forms of transportation than cars too and train wrecks and plane crashes have resulted in large numbers of people dying from a single problem many times in recent history.

Not from a controller point of view (and we are talking about the controller, since we're discussing self-driving). You've had different pilots and drivers and so on. On the other hand, all of our ML/AI systems are much more centralized than almost anything before. Single point of failure is a real concern.

So you *are* confusing "when will self-driving cars exist" with "when will self-driving cars be the only sort of vehicle in existence," got it.

guys please no one take this autists bait

You're the autist who has some baseless personal "definition" of existence that depends on the existing thing being everywhere. I guess family owned small town diners don't exist because they aren't McDonalds, fuck off.

...

I think it's a concern, but not anything that's going to keep them off the roads. We live with all sorts of horrible consequences from technology constantly; insofar as this is a new horror, it'll just be another one on the list, and more work will probably be done to mitigate that sort of problem once it starts happening in practice and not just as a hypothetical.

Okay, retards, then put some RFID/QR-type code/transceiver on the sign that registers as a "stop sign" in the car's logic and won't be affected by paint or bird crap

>b-but they can just remove the sign

They can do that to real signs currently too
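The idea above amounts to a fallback order: trust a machine-readable code on the sign first, and only lean on the sticker-vulnerable camera classification when no code is present. A hypothetical sketch (the code registry and function names are invented):

```python
# Made-up registry mapping transponder codes to sign meanings.
KNOWN_SIGN_CODES = {0x51: "stop", 0x52: "yield"}

def read_sign(transponder_code, vision_guess):
    # Machine-readable code wins: paint and bird crap can't change it.
    if transponder_code in KNOWN_SIGN_CODES:
        return KNOWN_SIGN_CODES[transponder_code]
    # Degraded mode: unequipped legacy sign, trust the camera.
    return vision_guess

# Sticker-covered stop sign: vision misreads it, transponder still works.
print(read_sign(0x51, "speed limit 45"))  # -> stop
# Legacy sign with no transponder: vision is all we have.
print(read_sign(None, "yield"))           # -> yield
```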

Work on defenses against these attacks is actually going on in parallel. No good progress yet, but it's a topic that's been getting increasing attention, so that's good.

In all honesty I don't see complete driver-less (including remote drivers) self-driving cars on regular roads for a while. Most probably some city (in Asia/US) will start modifying roads/signs/infrastructure to support 100% self-driving cars.

>I don't see complete driver-less (including remote drivers) self-driving cars on regular roads for a while.
theverge.com/2017/11/7/16615290/waymo-self-driving-safety-driver-chandler-autonomous
>“Fully self-driving cars are here,” Krafcik said, according to a copy of his speech provided by Waymo.
They're starting out with people still in the car, but not in the driver's seat. Also California DMV is allowing autonomous cars without steering wheels or human drivers. There are potential legal fights to be had over whether California is allowed to do that or if that needs to be a federal decision, but I think it's a pretty strong sign these sorts of vehicles will be getting very popular very soon.

>A Waymo employee will remain in the vehicle for now.

Also I mean in wide deployment, not within a testing neighborhood. It's once it goes big that hackers will actively try attacking it.

Ideally any system would have access to an always up-to-date, GPS-based knowledge base of all road signs, etc., and only rely on image recognition (or RFID) as a backup/failsafe.
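Map-first, vision-as-failsafe could look roughly like this. Everything here is invented for illustration (the map database, coordinates, and function name); the interesting case is the disagreement branch, where a mismatch between map and camera means either a tampered sign or a stale map, and should be flagged either way:

```python
# Hypothetical map database: sign meanings keyed by GPS position.
SIGN_MAP = {(40.7128, -74.0060): "stop"}

def expected_sign(gps, vision_guess):
    mapped = SIGN_MAP.get(gps)
    if mapped is None:
        return vision_guess  # off-map road: vision is all we have
    if mapped != vision_guess:
        # Disagreement: prefer the map, but this should be reported --
        # either the sign was tampered with or the map is stale.
        return mapped
    return mapped

# Sticker attack fools the camera, but the map still says "stop".
print(expected_sign((40.7128, -74.0060), "speed limit 45"))  # -> stop
```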

>A Waymo employee will remain in the vehicle for now.
Why are you quoting that? I clearly addressed that already:
>They're starting out with people still in the car, but not in the driver's seat.
>not within a testing neighborhood
That's why I mentioned the new California DMV rule. It's not for a test neighborhood, it's for California. The state. All of it. And it's effective this year, though again you have the potential issues with the federal government.

Multiple backups would be needed.

Would it be easier if there was ONLY a population of self driving cars on a road? Seems to me like cohabitation of the roads with self driving vehicles and manually operated vehicles would be a greater challenge. Not to mention the whole question of legal liability and public safety.

>Would it be easier if there was ONLY a population of self driving cars on a road?
Yes, but human-driven cars do exist, so self-driving cars are all being trained to operate alongside them.