Driverless Cars, Good Idea, Horrible Idea?

 Ragnarok.Sekundes
Offline
Server: Ragnarok
Game: FFXI
user: Sekundes
Posts: 4189
By Ragnarok.Sekundes 2015-12-17 21:34:43
Link | Quote | Reply
 
And on that note...

Relevant I'd think:
YouTube Video Placeholder
Offline
Posts: 3299
By Clinpachi 2015-12-17 21:35:21
Link | Quote | Reply
 
I'm 100% for it. I just want it to be done right and feel safe about it, which is asking a LOT.
 Siren.Kyte
Offline
Server: Siren
Game: FFXI
Posts: 3331
By Siren.Kyte 2015-12-17 21:42:07
Link | Quote | Reply
 
Rooks said: »
Hi, random programmer here.

No, we should not have driverless cars. If you feel otherwise, then you have a much rosier picture of software and the people who make it than I do.

Hi, random human here. If you feel otherwise, you have a much rosier picture of humans than I do.
[+]
 Ragnarok.Sekundes
Offline
Server: Ragnarok
Game: FFXI
user: Sekundes
Posts: 4189
By Ragnarok.Sekundes 2015-12-17 21:52:04
Link | Quote | Reply
 
Clinpachi said: »
I'm 100% for it. I just want it to be done right and feel safe about it, which is asking a LOT.
While I do agree, we've set a pretty damn low bar to make driving safer than it is now.
[+]
 
Post deleted by User (2015-12-17 21:56:49).
 Phoenix.Dabackpack
MSPaint Winner
Offline
Server: Phoenix
Game: FFXI
Posts: 2011
By Phoenix.Dabackpack 2015-12-17 22:01:58
Link | Quote | Reply
 
Rooks said: »
Hi, random programmer here.

No, we should not have driverless cars. If you feel otherwise, then you have a much rosier picture of software and the people who make it than I do.

Seconding this

When you've been in the industry long enough, you see some ***
[+]
 Phoenix.Dabackpack
MSPaint Winner
Offline
Server: Phoenix
Game: FFXI
Posts: 2011
By Phoenix.Dabackpack 2015-12-17 22:10:36
Link | Quote | Reply
 
On a serious, ethical note -----

The problem is fairly easy in the toy scenario where all cars are driverless: everyone acts consistently and could, in principle, follow the same set of rules.

But in reality, if such a technology were to become legal, we'd see both human-driven and driverless cars on the streets. As you might imagine, this is more complicated, because the agents need to reason about both human and non-human actors in the world. Not an insurmountable problem, but not a trivial one either. I am not aware of how the car developers are addressing this problem.

However, this is the real problem that these companies are facing:

In a real life scenario, you, as a driver, might have to solve moral dilemmas. In an extreme case, you might have to decide between saving your own life and saving a pedestrian's life. In the real world, the decision is on you, the driver.

With driverless cars, Google and Tesla programmers now have to decide whether the passengers or the pedestrians should be saved in these situations. You now add the politics of driver morality to the actual production and deployment of these cars! It's extremely messy since these decisions are no longer made ad hoc, but are programmed directly into the logic of the vehicles themselves (or perhaps by some appendage software).

This is a real problem facing the industry (as somebody actually in AI research)
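To make that concrete, here's a rough, purely illustrative sketch of why the choice ends up in the code. Everything below (the names, the weights, the structure) is invented for the example; it is not how Google or Tesla actually do it. The point is just that somebody has to pick the numbers, and that choice is the moral policy.

    # Purely illustrative sketch, not any real system: the weights below are
    # the moral policy, and a person has to choose them before the car ships.
    from dataclasses import dataclass

    @dataclass
    class Outcome:
        passenger_risk: float   # estimated probability of serious harm to occupants
        pedestrian_risk: float  # estimated probability of serious harm to pedestrians
        property_damage: float  # rough cost estimate

    PASSENGER_WEIGHT = 1.0      # somebody has to pick these numbers
    PEDESTRIAN_WEIGHT = 1.0
    PROPERTY_WEIGHT = 1e-6

    def cost(o: Outcome) -> float:
        # Lower is better; every trade-off lives in the weights above.
        return (PASSENGER_WEIGHT * o.passenger_risk
                + PEDESTRIAN_WEIGHT * o.pedestrian_risk
                + PROPERTY_WEIGHT * o.property_damage)

    def choose(options: dict) -> str:
        # options maps action names ("brake", "swerve_left", ...) to predicted Outcomes
        return min(options, key=lambda name: cost(options[name]))

Whether the weights are equal, or favor the occupants, or favor pedestrians is exactly the political question; the code itself is trivial.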
[+]
Offline
Posts: 4394
By Altimaomega 2015-12-17 22:11:23
Link | Quote | Reply
 
Asura.Floppyseconds said: »
These things are not issues.

Yeah... these things are issues. As soon as a comp car kills someone, that someone's family is going to sue the person who owned it, the person who designed the software, and the company that sold it.

I'm sure the person who owned the car signed something that says they would take all responsibility. I and many millions of other people have not. The legality alone will keep these cars off the roads.

Asura.Floppyseconds said: »
Cars already autobrake, and have for years.
Not on ice.
Offline
Posts: 4394
By Altimaomega 2015-12-17 22:15:19
Link | Quote | Reply
 
Phoenix.Dabackpack said: »
In a real life scenario, you, as a driver, might have to solve moral dilemmas. In an extreme case, you might have to decide between saving your own life and saving a pedestrian's life. In the real world, the decision is on you, the driver.

This was already covered on page one.

Asura.Floppyseconds said: »
The rest is silly to debate about as there is not going to be a school bus of kids and a cliff and such an accident all happening. That is just a ridiculous notion.

I dunno if Floppy lives in the real world however..
 
Post deleted by User (2015-12-17 22:19:48).
 Phoenix.Dabackpack
MSPaint Winner
Offline
Server: Phoenix
Game: FFXI
Posts: 2011
By Phoenix.Dabackpack 2015-12-17 22:20:33
Link | Quote | Reply
 
Altimaomega said: »
Phoenix.Dabackpack said: »
In a real life scenario, you, as a driver, might have to solve moral dilemmas. In an extreme case, you might have to decide between saving your own life and saving a pedestrian's life. In the real world, the decision is on you, the driver.

This was already covered on page one.

Asura.Floppyseconds said: »
The rest is silly to debate about as there is not going to be a school bus of kids and a cliff and such an accident all happening. That is just a ridiculous notion.

I dunno if Floppy lives in the real world however..

I know, I just wanted to state that this literally is not a solved problem in the industry.

And @Floppy, it doesn't matter "if it will never happen", because the agent needs to be able to handle those situations if they do occur. This isn't a trivial point. Autonomous car producers literally need to consult and decide how to prioritize safety between passengers, pedestrians, and other cars. I'm not bullshitting you, this is real life, not theorycrafting.

All of the above is for a truly autonomous vehicle, though.

EDIT: Reiterating that the above is for pure automation.

If the driver still has agency over the car (regarding actual driving and split-second decision making), then the above points are less important. However, manufacturers are pushing toward pure automation in the future. (And by "pure automation", I mean having all humans in the car either serving as passengers or giving directions, not actually affecting the motion of the automobile.)
[+]
 Phoenix.Dabackpack
MSPaint Winner
Offline
Server: Phoenix
Game: FFXI
Posts: 2011
By Phoenix.Dabackpack 2015-12-17 22:25:05
Link | Quote | Reply
 
Asura.Floppyseconds said: »
Yeah, okay.

Is that the same real world that envisions a cartoon scenario of a bus of children, a car, a cliff? That the only two options are to either die off a cliff or hit the bus? How *** realistic is that?

This particular scenario is cartoonish, but there are very real scenarios that are similar:

If you're driving and a deer jumps in front of you, what do you do? Common knowledge would tell you to hit the deer, even if it kills it. But what if it's a toddler? (Both are real things that have happened to me.)
[+]
Offline
Posts: 3299
By Clinpachi 2015-12-17 22:26:46
Link | Quote | Reply
 
To be honest with you, I feel like the videos from both Google and Tesla are hiding one simple fact: somebody was out of view of the camera with their hand on a kill switch or safety mechanism. Even more likely, the driver was instructed to watch carefully and be ready to intervene at any given moment. We're just being shown the good parts and the big strides.

Marketing and brand recognition are playing key roles in getting you interested in this and thinking great things about their company.

Also, the legal issues Dabackpack outlined are on point too.
[+]
 
Post deleted by User (2015-12-17 22:27:21).
 Phoenix.Dabackpack
MSPaint Winner
Offline
Server: Phoenix
Game: FFXI
Posts: 2011
By Phoenix.Dabackpack 2015-12-17 22:32:30
Link | Quote | Reply
 
Asura.Floppyseconds said: »
Phoenix.Dabackpack said: »
Asura.Floppyseconds said: »
Yeah, okay.

Is that the same real world that envisions a cartoon scenario of a bus of children, a car, a cliff? That the only two options are to either die off a cliff or hit the bus? How *** realistic is that?

This particular scenario is cartoonish, but there are very real scenarios that are similar:

If you're driving and a deer jumps in front of you, what do you do? Common knowledge would tell you to hit the deer, even if it kills it. But what if it's a toddler? (Both are real things that have happened to me.)

The toddler, much like Bambi, would be dead without the system in place. So it is not like we need to blame the system for being faced with an event that can't be stopped.
It doesn't matter if kittens, babies, or hail fall in front of your car as you are driving. With or without a system, if you can't stop then you can't stop, and nothing can be done.

Therefore I do not believe it is a factor in the way it is being presented.

I should have specified an alternative.

Usually, it's "you either hit the deer, or you swerve the car and endanger yourself."

In a pure autonomous situation, the cognition is redirected from human to machine. So yes, the manufacturer is now directly involved in the moral dilemma, a priori.
[+]
Offline
Posts: 3299
By Clinpachi 2015-12-17 22:34:15
Link | Quote | Reply
 


Wake me up when we have holodeck technology :3
[+]
 Phoenix.Dabackpack
MSPaint Winner
Offline
Server: Phoenix
Game: FFXI
Posts: 2011
By Phoenix.Dabackpack 2015-12-17 22:35:09
Link | Quote | Reply
 
Programmatically, it's not difficult to implement these decisions. But the auto producers need to say something like "Okay, we will always prioritize the driver over any pedestrians in this situation." But that can be a huge legal liability, and I think it's a discussion we as a society need to have.
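A rule like that would literally come down to something hard-coded, along these lines. This is an invented sketch, not real product code from any manufacturer, and the category names are mine:

    # Invented sketch of a hard-coded protection policy; not real product code.
    # Earlier categories win: this particular ordering always favors occupants.
    PROTECTION_PRIORITY = ["occupants", "pedestrians", "other_vehicles", "property"]

    def preferred(option_a: dict, option_b: dict) -> dict:
        # Each option maps a category name to its estimated risk (0.0 to 1.0).
        # Compare categories in priority order; the first difference decides.
        for category in PROTECTION_PRIORITY:
            a = option_a.get(category, 0.0)
            b = option_b.get(category, 0.0)
            if a != b:
                return option_a if a < b else option_b
        return option_a

One line of configuration, and a mountain of legal and ethical consequences.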
[+]
 
Post deleted by User (2015-12-17 22:37:40).
 Phoenix.Dabackpack
MSPaint Winner
Offline
Server: Phoenix
Game: FFXI
Posts: 2011
By Phoenix.Dabackpack 2015-12-17 22:42:53
Link | Quote | Reply
 
Asura.Floppyseconds said: »
Phoenix.Dabackpack said: »
Asura.Floppyseconds said: »
Phoenix.Dabackpack said: »
Asura.Floppyseconds said: »
Yeah, okay.

Is that the same real world that envisions a cartoon scenario of a bus of children, a car, a cliff? That the only two options are to either die off a cliff or hit the bus? How *** realistic is that?

This particular scenario is cartoonish, but there are very real scenarios that are similar:

If you're driving and a deer jumps in front of you, what do you do? Common knowledge would tell you to hit the deer, even if it kills it. But what if it's a toddler? (Both are real things that have happened to me.)

The toddler, much like Bambi, would be dead without the system in place. So it is not like we need to blame the system for being faced with an event that can't be stopped.
It doesn't matter if kittens, babies, or hail fall in front of your car as you are driving. With or without a system, if you can't stop then you can't stop, and nothing can be done.

Therefore I do not believe it is a factor in the way it is being presented.

I should have specified an alternative.

Usually, it's "you either hit the deer, or you swerve the car and endanger yourself."

In a pure autonomous situation, the cognition is redirected from human to machine. So yes, the manufacturer is now directly involved in the moral dilemma, a priori.

Doesn't matter because the human is supervising the car. They are still "driving". So it is not the fault of the manufacturer until there is no longer human control over the vehicle. This includes the ability to intervene. As I said, "driving".

As for the swerve: considering the ability of the sensors, in an improved system I would expect it to handle this much better than a human reacting to the situation. First off, it can calculate the options MUCH faster than a human, and then it can react MUCH faster. All that is left is improving the system and sensors. If there is room, swerve; if there is not, then you don't swerve.

That's why I specified pure autonomy.
And I am absolutely certain that pure autonomy is a prospect Google and Tesla will push toward, so this is a problem they will need to face.
If the driver is still 100% engaged, not listening to music or reading a book, and is still able to take total control at any time, then there isn't a problem (yet).

It doesn't matter if this toy example is avoided with stronger sensors. This is a decision-making problem, not a technological one. Any decision-making system needs to produce a decision for every possible configuration of perceptual data. It might decide "well, I'll just do nothing if a child runs across", but programmers absolutely need to make those judgments in order to give defined behavior in every scenario.
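"Defined behavior in every scenario" means, in practice, something like this. The names are made up for illustration; the point is the mandatory catch-all branch, because "do nothing" is itself a programmed choice:

    # Illustrative only, with invented field names. Whatever the perception layer
    # reports, the planner must return *some* action, including a default.
    def plan_action(perception: dict) -> str:
        if perception.get("obstacle_ahead") and perception.get("clear_lane_left"):
            return "swerve_left"
        if perception.get("obstacle_ahead") and perception.get("clear_lane_right"):
            return "swerve_right"
        if perception.get("obstacle_ahead"):
            return "emergency_brake"
        if perception.get("sensor_confidence", 1.0) < 0.5:
            # degraded or unclassifiable input still needs a decision
            return "slow_and_pull_over"
        return "continue"   # the catch-all: doing nothing is also a decision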
[+]
Offline
Posts: 4394
By Altimaomega 2015-12-17 22:48:27
Link | Quote | Reply
 
Asura.Floppyseconds said: »
I really don't care about sue happy Americans.

There is a person behind the wheel because the system is not perfect. It is the driver's fault and not the automaker's.

Doesn't matter if you care or not; in real life this *** happens.
We are talking about automated cars here: nobody is behind the wheel. Please try and keep up.

Asura.Floppyseconds said: »
Cars themselves without the system can't even handle unplowed or icy roads. So what is your point?
I seem to have had zero problems in the almost twenty years I have been driving. It would piss me off to no end when/if a comp car hit me on icy roads.

Asura.Floppyseconds said: »
Is that the same real world that envisions a cartoon scenario of a bus of children, a car, a cliff? That the only two options are to either die off a cliff or hit the bus? How *** realistic is that?
It's called an analogy, but it could actually happen.

Asura.Floppyseconds said: »
The toddler, much like Bambi, would be dead without the system in place. So it is not like we need to blame the system for being faced with an event that can't be stopped.
It doesn't matter if kittens, babies, or hail fall in front of your car as you are driving. With or without a system, if you can't stop then you can't stop, and nothing can be done.

Therefore I do not believe it is a factor in the way it is being presented.



Asura.Floppyseconds said: »
Doesn't matter because the human is supervising the car. They are still "driving". So it is not the fault of the manufacturer until there is no longer human control over the vehicle. This includes the ability to intervene. As I said, "driving".

Why have an automated car that you have to "drive"?
 
Post deleted by User (2015-12-17 22:52:51).
 Phoenix.Dabackpack
MSPaint Winner
Offline
Server: Phoenix
Game: FFXI
Posts: 2011
By Phoenix.Dabackpack 2015-12-17 22:52:53
Link | Quote | Reply
 
I really do see the utility of having "semi-driverless" cars in which cognitive load is shifted off humans and onto machines, but I don't think that's the topic of discussion at the moment.
[+]
 
Post deleted by User (2015-12-17 22:56:24).
 
Post deleted by User (2015-12-17 22:59:33).
 Phoenix.Dabackpack
MSPaint Winner
Offline
Server: Phoenix
Game: FFXI
Posts: 2011
By Phoenix.Dabackpack 2015-12-17 23:01:58
Link | Quote | Reply
 
Asura.Floppyseconds said: »

In a completely autonomous situation? As of now they are programmed to stop. If the calculations come out that it is not possible to stop, I imagine it could be programmed in the same breath to examine the space around the car for room to swerve. This is already better than leaving it up to a human in this semi-hypothetical situation. If it can't, then it just brakes while it hits the obstacle. Cars already autobrake faster in these situations than humans do, so hit-for-hit it will be better.

(Resetting the quote train)

Yes, I think that is the proper course of action in a low-level assessment of the environment (low-level meaning low levels of abstraction). For a purely autonomous situation, I think the automobiles just need to behave predictably almost all the time, at least to start.

But in the quest of "making things better and safer" (which is always on our minds as scientists), developers will want to have the agents think at a higher level of awareness: "well, if I'm about to hit a kid on the highway but there's a car behind me, what should I do?" instead of "well, there's a kid in front of me, what should I do?", for example.

And eventually at higher levels of abstraction: "do I save the passenger or two pedestrians?"

For purely autonomous vehicles, the developers are in a pinch because they will need to make decisions about subjective analyses regarding human safety. It might be really easy: "Well, we'll always stop in X scenario." But that will reduce the appeal of the vehicles and will reduce the likelihood of autonomous vehicles being accepted legally and socially. It's a difficult situation.
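For what it's worth, the low-level "stop if you can, otherwise look for room to swerve" behavior described above really is about this simple. This is a toy sketch with made-up constants, not anyone's actual controller:

    # Toy sketch of "brake if possible, otherwise swerve if there's room".
    # The deceleration and latency figures are made up for illustration.
    def stopping_distance_m(speed_mps: float, decel_mps2: float = 8.0,
                            latency_s: float = 0.1) -> float:
        # distance covered during system latency plus braking distance v^2 / (2a)
        return speed_mps * latency_s + (speed_mps ** 2) / (2.0 * decel_mps2)

    def react(speed_mps: float, obstacle_distance_m: float,
              clear_gap_left: bool, clear_gap_right: bool) -> str:
        if stopping_distance_m(speed_mps) <= obstacle_distance_m:
            return "brake_to_stop"
        if clear_gap_left:
            return "brake_and_swerve_left"
        if clear_gap_right:
            return "brake_and_swerve_right"
        return "full_brake"   # impact unavoidable; shed as much speed as possible

The hard part is not this logic; it's everything described above it.
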
VIP
Offline
Posts: 21757
By Kalila 2015-12-17 23:05:42
Link | Quote | Reply
 
I think I'd rather have pilotless flying transport vehicles/aircraft than driverless cars. It just seems to me that there is less room for error up in the air, where you navigate in three dimensions, than on the ground, where you really only navigate in two dimensions plus incline/decline slopes.

If a driverless car detects a human-driven car slowing down in front of it, it will slow down as well, most likely with a faster reaction time than a human could manage >>>IF<<< there isn't anything wrong with the detection software. There will most likely be cases where the software/hardware doesn't detect anything and the car just ends up running into something.

However, in a perfect software world, let's say there isn't anything that could go wrong with the detection software/hardware. Then what about other drivers? We don't have patience, at all. Some of us do, or try to practice it, but when it comes to drivers on the road, we are very impatient. If there was a driverless vehicle in front of us, it would most likely be going the EXACT speed limit. People would rage, hard. They would constantly be passing it because, well, on the highway it's very rare that people are going the exact speed limit, and when someone is, everyone is passing them.

Also, what about crash detection? Say an oncoming driver is about to hit the driverless vehicle. Is the software going to come to an automatic stop as soon as it detects a threat of collision? That might not be the best thing; it might get false threats and make random stops on the highway. That would be horrible. Also, what if it does get hit, what does it do then? Does it go into hyper-detection mode and try to come to a safe stop while avoiding other drivers in other lanes?

To me, it would seem safest if we had roads meant only for driverless vehicles, and once you're about to get onto a driver-only road, the car warns you to take control while it's coasting down the off-ramp or whatever. Once the driver responds and the car recognizes that they have full control over the car, it stops controlling it completely.

But what if the driver doesn't respond? What if they ended up falling asleep? Not the craziest thing that could happen; it would probably be quite common, honestly. Do they just stay on the ramp and go back on an on-ramp back to the driverless road and circle? Does the car pull over to a stop? A rest stop? There would need to be solutions to problems like this.

Though it could have a backup system where it keeps its detection software/hardware running to help the driver avoid accidents, which I think luxury cars have been doing for a while now. I could be wrong, but I think I've seen commercials advertising stuff like that for Mercedes-Benz or whatever.
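The off-ramp handoff described above is basically a small state machine. Here's a sketch of the idea only; the states, timings, and names are invented rather than taken from any real system:

    # Sketch of the handoff-at-the-off-ramp idea; states and timings are invented.
    HANDOFF_WARNING_S = 20.0   # how long to ask the driver before giving up

    def handoff_step(state: str, driver_confirmed: bool, seconds_waiting: float) -> str:
        if state == "autonomous_road":
            return state                    # full autonomy on the dedicated road
        if state == "approaching_exit":
            return "requesting_handoff"     # warn the driver and start the timer
        if state == "requesting_handoff":
            if driver_confirmed:
                return "manual_control"     # driver took over; stop controlling
            if seconds_waiting > HANDOFF_WARNING_S:
                return "pull_over_safely"   # no response (asleep?), so park it
            return state
        return state

The interesting design question is exactly the one raised above: what the "pull_over_safely" state should actually do.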
[+]
 Phoenix.Dabackpack
MSPaint Winner
Offline
Server: Phoenix
Game: FFXI
Posts: 2011
By Phoenix.Dabackpack 2015-12-17 23:05:43
Link | Quote | Reply
 
I think the problem is that it's "not enough" to say "well, we're better than human drivers!"

Crashes will still happen, obviously. And even in these scenarios, they will likely still be better than human drivers. But part of the process of reaching that "final result" is making decisions on the road. That's a fundamental part of driving.

Safe drivers make decisions and the developers understand that the decisions they make in programming the agents are really important.

EDIT: Driving is one part "knowing the rules of the road" and one part "knowing how to make safe driving decisions and how to handle unique scenarios", or something like that.

The automated drivers are great at the former, but special attention needs to be given to the subjective latter portion. The subjective stuff is always the stuff that we have problems with.
 Phoenix.Dabackpack
MSPaint Winner
Offline
Server: Phoenix
Game: FFXI
Posts: 2011
By Phoenix.Dabackpack 2015-12-17 23:09:47
Link | Quote | Reply
 
Kalila said: »
I think I'd rather have pilotless flying transport vehicles/aircraft than driverless cars. It just seems to me that there is less room for error up in the air, where you navigate in three dimensions, than on the ground, where you really only navigate in two dimensions plus incline/decline slopes.

This is also a very controversial prospect in the AI community! Well, for reasons regarding airspace laws and the like.

Any drone that flies at a certain altitude is subject to aviation law. You need a permit or some kind of allowance in order to fly at certain altitudes. I don't know what the specifics are. But someone at NIST said something along the lines of "Amazon's drone service is not going to happen on a grand scale because it's a huge safety liability, regarding both airplanes and national security."
[+]
Offline
Posts: 4394
By Altimaomega 2015-12-17 23:09:57
Link | Quote | Reply
 
Phoenix.Dabackpack said: »
I think the problem is that it's "not enough" to say "well, we're better than human drivers!"

Crashes will still happen, obviously. And even in these scenarios, they will likely still be better than human drivers. But part of the process of reaching that "final result" is making decisions on the road. That's a fundamental part of driving.

Safe drivers make decisions and the developers understand that the decisions they make in programming the agents are really important.

Now that Floppy has caught back up, that brings us to this.

Altimaomega said: »
As soon as a comp car kills someone, that someone's family is going to sue the person who owned it, the person who designed the software, and the company that sold it.

I'm sure the person who owned the car signed something that says they would take all responsibility. I and many millions of other people have not. The legality alone will keep these cars off the roads.