Driverless Car General- because Veeky Forums doesn't general

Starting a thread for the development, news and discussion of driverless cars in the United States and the world. This will include topics of electric and hybrid electric vehicles as they are being developed as one and the same.

What industries/jobs will be automated?
What industries/jobs will be created?
What laws need to be passed?
Who is pushing for this?
How do you make people accept driverless cars and get over Luddite thinking, stop worrying and learn to love the machine?


detroitnews.com/story/business/autos/2016/05/25/bills-aimed-make-michigan-leader-autonomous-cars/84937046/
>Michigan moves to lead driverless car testing
>Sweeping legislation was introduced Wednesday in the state Senate that aims to make Michigan the nation’s leader in autonomous vehicle testing by allowing manufacturers to produce and sell self-driving cars here and clearing the way for their use on state roadways.
>This week, a University of Michigan study found the public isn’t completely sold on autonomous vehicles. Less than 20 percent of respondents in a poll said they would prefer to ride in a fully self-driving car. Most wanted to retain full control while driving.
>John Simpson, privacy project director at the Santa Monica, California-based Consumer Watchdog group, cautioned that Michigan’s law should ensure a driver can take control of a self-driving auto if it runs into problems. He pointed to a recently proposed California law that would require the presence of a licensed driver in an autonomous vehicle who could take control.
>“That’s the thing we think is necessary because the technology is just not ready yet to go out into the streets without the ability to intervene,” Simpson added. “While it may be the case down the road, meaning decades, that the technology may be able to work, we’re not there.”

In before trolley problem.

wsj.com/articles/group-seeks-to-pave-way-for-nationwide-adoption-of-driverless-cars-1463588817
>Group Seeks to Pave Way for Nationwide Adoption of Driverless Cars
>Advocates for autonomous cars call for federal laws to supplant local regulatory barriers
>A group of business and former military leaders wants to limit states’ ability to regulate driverless cars, calling for sweeping federal legislation to avoid a patchwork of rules they believe could hinder adoption of the technologically advanced vehicles.
>Executives including FedEx Corp. Chief Executive Fred Smith and retired U.S. generals associated with a Washington group that lobbies to reduce America's oil dependence plan to meet with politicians in the nation's capital on Thursday. . .

I love this argument. You act like cars don't have brakes.

Computers don't need a moral compass to stop when an obstacle is in the road. Problem solved.

youtube.com/watch?v=rYqRpUzDnwU
>DustClean - Road Sweeping Robot
>DustClean is a road sweeping robot designed and developed by RoboTech srl in the framework of the DustBot project, a project funded by the European Commission aimed at designing, developing and demonstrating an innovative system based on robotic and information and communication technologies for the improvement of the management of urban hygiene.

>>>/G/

>What industries/jobs will be automated?
Predominantly the transport sector, obviously. To a lesser degree, it will affect the construction sector by integrating autonomous construction vehicles into the industry. It will be interesting to see how the military adopts autonomous vehicles.

>What industries/jobs will be created?
Nothing significant comes to mind, except construction work to adjust infrastructure (in the distant future) once all vehicles are automated in their entirety.

>What laws need to be passed?
Inevitably, laws will be passed making it illegal to drive manually on highways, and then eventually illegal to drive manually on any road (once automated vehicles achieve full public consensus and integration).

A crash that occurs while the car is not under autonomous control will not be covered by insurance (an inevitable insurance-company policy).

>Who is pushing for this?
The associated vehicle corporations, and some policy makers.

The issue is when the car is put into a situation where it will either kill a pedestrian or crash and kill its occupants.
What if there are two pedestrians vs. one driver? One pedestrian vs. three passengers? Two pedestrians vs. another car from the same manufacturer?

wired.com/2016/06/self-driving-cars-will-power-kill-wont-conscience/
Here's an article for you; it's written by a similarly stupid person. The "trolley problem" is an ethical problem for people, not a self-driving car problem.

This technology would sense an accident or people in the road and stop straight. If people suddenly jumped in front of the car, it would sense them and stop straight. If there was an accident ahead, the car would be alerted and slow down, avoid it, take an alternate route, or stop straight.

These cars will not have to "decide" who "gets to live or die".

The trolley problem is anthropomorphizing this technology. People in the comment section know what's up.

>The issue is when the car is put into a situation where it will either kill a pedestrian or crash and kill its occupants.
Sounds like a human problem. These cars will just stop and not kill anyone. Computers don't panic or make "split second decisions".
>something is in the road
>apply brake

>It will be interesting as to how the military adopts autonomous vehicles.
They are pushing for it just as hard. Goodbye supply vehicles and truck drivers.
>A crash that occurs while the car is not under autonomous control will not be covered by insurance (an inevitable insurance-company policy)
I've always wondered how this will happen. If an autonomous car crashes you'd probably get free lawyers from the manufacturer defending their product and a team of engineers looking at your vehicle. They might even pay to replace or fix it. They would take it very personally.

While I don't think they are as terrible as the media portrays them, these are the issues worth discussing, and potentially solving.

Say the car is driving at 140 km/h. It comes across a sharp turn in the road. A person jumps in front. Assuming it starts braking immediately, it would take the car about 100 meters to come to a full stop. People really underestimate inertia. Now, there is an alternative option: it can avoid the person. The problem is you would still be riding at 140 km/h, and say you avoid the person but the car hits the wall, killing the person sitting in the car.
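The 100-meter figure is roughly right. A quick sketch, using the standard constant-deceleration formula d = v²/(2a) and an assumed dry-road deceleration of 7.5 m/s² (that value is an assumption, not from the thread):

```python
# Braking distance at constant deceleration: d = v^2 / (2a).
# 7.5 m/s^2 is an assumed dry-road deceleration; reaction time is ignored.

def stopping_distance_m(speed_kmh: float, decel_ms2: float = 7.5) -> float:
    v = speed_kmh / 3.6  # convert km/h to m/s
    return v ** 2 / (2 * decel_ms2)

print(round(stopping_distance_m(140)))  # ~101 m, before any reaction time
```

On a wet road (say 4 m/s²) the same speed needs nearly twice the distance, which is the poster's point about inertia.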

This brings us back to that problem of who it should kill. Ideally it would find the way to avoid both, but what if it runs into such problems? Since we need to program the cars for every conceivable situation, these things need to be programmed in, otherwise the car would have no idea what it should do.

As far as I'm aware, there are readings on the vehicle indicating whether the car was autonomous at the time of the crash or under manual control. I'd expect it to be akin to a plane recording whether it was on autopilot at the time of the crash or not.
nytimes.com/2016/07/30/business/tesla-faults-teslas-brakes-but-not-autopilot-in-fatal-crash.html
>article related

>While I don't think they are as terrible as the media portrays them, these are the issues worth discussing, and potentially solving.
The problem is the premise is wrong. This question arises from people not understanding the technology at hand. This wouldn't happen with a self driving car. If the self driving car swerved and hit a pedestrian it is because it didn't sense the pedestrian for some reason. This is an engineering issue, not a moral issue.

The car would just stop.
>Say the car is driving at 140km/h
The cars MUST follow the law. Never would the car exceed the speed limit. Ever. I don't know of places in America where you can drive that fast. These cars will drive like old ladies.
>It comes across a sharp turn in the road.
It would follow its GPS map, slow down and take the turn safely.
>Person jumps in front.
If someone wants to commit suicide by tricking these cars then I don't know what to tell you. That doesn't sound like a mistake of the technology.
I don't know what the car would do except stop straight, not drive into a wall or into the "unknown" off the road. You won't see these cars wrapped around trees because they will follow the road and ALL road laws.
>This brings us back to that problem of who it should kill.
Again, the premise is incorrect and is ignorant of the technology at hand. It wouldn't make a decision of "Who to kill". The car would just stop. People jumping in front of cars on a busy highway to commit suicide or people jay walking on a busy highway is more of a safety issue with the people rather than a moral issue with a car's computer.
"USE AS DIRECTED".

The car. Will just. Apply the brake.

Might as well just put a snow-plough on the front and program it not to even slow down. If people are in the road other than at a crossing where traffic is stopped then they deserve to die.
This also applies to children, and all animals except cats.

>Since we need to program the cars for every conceivable situation, these things need to be programmed in, otherwise the car would have no idea what it should do.
This is more along the lines of thinking properly about this technology. You are correct. Someone once said to me, "Driving laws are written in blood," meaning people had to die from careless or dangerous driving for a law to be written. The same goes for regulations on road construction. All of it is to protect life.

I think with automated cars it'll be "code" written in blood. If ANY accidents happen, it's an engineering issue to be resolved. There would be a software patch, rarely a hardware upgrade, and all cars on the road would be updated so the problem would never happen again.

That is much safer, more effective, faster, and far more reliable than spending years passing a law and hoping all humans follow it all of the time.

>If people are in the road other than at a crossing where traffic is stopped then they deserve to die.
No, it'll most likely stop. But jaywalking is a crime for a reason and is the fault of the person, not the car. Expect these cars to hit jaywalking pedestrians 10,000 times less often than human drivers do, or better.

I have a better scenario:
>A self driving car has sophisticated AI.
>There is a fork in the road with two shoe boxes in the middle of each road.
>One box has a kitten the other has a puppy in it.
>The car cannot stop, because.
>Which box does the car run over? Who does the car kill?

Kill the vehicle occupants. Fuck deathcages and the entitled degeneracy they entail.

>what cagers actually believe

you don't leave your home do you?

>This is an engineering issue
Well, yeah, it's all an engineering issue. The car isn't actually deciding who has more or less worth and who it should kill. Like I previously mentioned, these things need to be programmed in: what it should do in these specific situations. Not sure what you know about programming, but you're looking at it from a human perspective. A human just decides what to do on the spot, based on previous experience. A computer needs to have everything declared: every object, every action, every reaction to an action, and so on. Until we have a sufficiently advanced AI with good heuristics, these are the things we need to worry about.
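The "everything must be declared" point can be sketched in a few lines. This is a hypothetical rule-based controller, not any vendor's actual code; every name in it is made up for illustration. Note there is no "who to kill" branch anywhere: every obstacle gets the same declared response.

```python
# Hypothetical rule-based obstacle response: every case is declared
# explicitly, and there is no ethics branch -- all obstacles are equal.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Obstacle:
    distance_m: float  # measured range to the obstacle ahead

def respond(obstacle: Optional[Obstacle], braking_distance_m: float) -> str:
    """Return the programmed action for the current sensor reading."""
    if obstacle is None:
        return "cruise"
    if obstacle.distance_m > braking_distance_m:
        return "brake"  # enough room to stop fully
    # Still braking hard; the car stays on the road rather than
    # weighing occupants against pedestrians.
    return "brake_and_swerve_within_lane"

print(respond(Obstacle(distance_m=50.0), braking_distance_m=100.8))
```

The point of the sketch is that the "decision" is just whichever branch the sensors select; anything not declared here simply cannot happen.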

The slobs in the vehicle should assume all the risk. Innocent bystanders should be protected at all costs including the sacrifice of all vehicle occupants.

A fun game to play with autonomous cars and trucks is "nudging".

You can't replace all the cars and trucks at once, so there will be a good 10-to-20-year period of half and half.
The face when I nudge a truck into the guard rails.
The face when cars cost twice as much and the sensors break all the time.

I am totally fine with this

>Until we have a sufficiently advanced AI with good heuristics, these are the things we need to worry about.
No. This is a stupid question, and you are the one humanizing a machine.

These cars need to avoid hitting humans the same way they need to avoid hitting anything. It doesn't need to get fucking philosophical about it.