Self-driving Uber car killed a pedestrian

Status
Not open for further replies.
Originally Posted By: ZeeOSix
Even if the gal was "monitoring" the self driving car instead of looking at her cell phone, she might have had 1-2 seconds max to react. Still might not have made any difference. I know if I was cruising along in the dark and all of a sudden someone came walking out of the shadows 50 ft in front of me, I'd be pretty lucky to react in enough time to not run over them.


The video camera doesn't have the dynamic range that the eyes have. When have you been driving at night in good weather and not been able to see? That's what headlights are for. While unexpected, that's just everyday driving. People do crazy things all the time. The report claims she should have been able to stop if she had been driving.

Also, there's a cutoff speed for surviving a car accident. I think if the impact is under 25 mph, they're much more likely to survive. So even if you can't bring the car to a complete stop, getting the speed down could have been the difference between life and death.
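As a rough sanity check on that claim, basic kinematics (v² = v₀² − 2ad) shows how much speed even partial braking sheds over a given distance. The deceleration figure and distances below are illustrative assumptions, not numbers from the Uber report:

```python
import math

def impact_speed_mph(initial_mph, brake_distance_ft, decel_g=0.8):
    """Speed remaining after braking over a given distance.

    Uses v^2 = v0^2 - 2*a*d. decel_g=0.8 is a typical dry-pavement
    emergency deceleration; all numbers here are illustrative.
    """
    fps = initial_mph * 5280 / 3600          # mph -> ft/s
    a = decel_g * 32.174                     # g -> ft/s^2
    v2 = fps * fps - 2 * a * brake_distance_ft
    if v2 <= 0:
        return 0.0                           # car stops before impact
    return math.sqrt(v2) * 3600 / 5280       # ft/s -> mph

# Braking over even 50 ft from 40 mph brings the impact speed down
# to roughly 20 mph, below the ~25 mph threshold mentioned above.
print(round(impact_speed_mph(40, 50), 1))    # -> 20.1
```

The point being: full stop or not, every foot of braking matters.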
 
Originally Posted By: Astro14
Originally Posted By: Shannow
Problem with pointing at the safety "driver" is that they are tasked with a job that the human mind can't do.

Sit there and do nothing at all, but be prepared to step in and take control in seconds...for the duration of your shift.


Sounds a lot like my job when the airplane is in cruise flight, crossing an ocean...

However, I AM ready to step in and take control in seconds.


Definitely, but you also have alarms and things...things which were programmed out of the Uber control system, such that the controls thought she was a paper bag.

We did an Advanced Error Reduction programme at work, focussed on trying to take innate human behaviour out of the error equation. You can't expect a human to instantly insert themselves into an automated process, assess the current state, and take appropriate action.

When quizzed about the plane-in-the-Hudson incident, the lecturer went to great pains to explain that Sully had to follow protocols to make the correct assessment of the plane's status and capabilities, while those who successfully landed the simulator after their first failure had all of that knowledge, and thus a head start in the process.

As an aside, the lecturer asked how many of us leaders in the room could completely and accurately multi-task...a few said yes.

Response was that it was unlikely, as they didn't have the physique (or the career) of a fighter pilot...he stated that's the only type of person who can genuinely be relied on to multitask.
 
Originally Posted By: Shannow
Originally Posted By: Astro14
Originally Posted By: Shannow
Problem with pointing at the safety "driver" is that they are tasked with a job that the human mind can't do.

Sit there and do nothing at all, but be prepared to step in and take control in seconds...for the duration of your shift.


Sounds a lot like my job when the airplane is in cruise flight, crossing an ocean...

However, I AM ready to step in and take control in seconds.


Definitely, but you also have alarms and things...things which were programmed out of the Uber control system, such that the controls thought she was a paper bag.

We did an Advanced Error Reduction programme at work, focussed on trying to take innate human behaviour out of the error equation. You can't expect a human to instantly insert themselves into an automated process, assess the current state, and take appropriate action.

When quizzed about the plane-in-the-Hudson incident, the lecturer went to great pains to explain that Sully had to follow protocols to make the correct assessment of the plane's status and capabilities, while those who successfully landed the simulator after their first failure had all of that knowledge, and thus a head start in the process.

As an aside, the lecturer asked how many of us leaders in the room could completely and accurately multi-task...a few said yes.

Response was that it was unlikely, as they didn't have the physique (or the career) of a fighter pilot...he stated that's the only type of person who can genuinely be relied on to multitask.


I'll make the argument that there really wasn't a full automated process in place. What's the difference between driving on cruise control and what that car was doing? Lane keeping. Right?

You'd be able to hit the brakes and take evasive action on cruise control, so why not in the Uber case? Because she was being lazy and over-trusting the system. It's as simple as that.

That's my fear with these new automated collision systems. People will think they don't need to drive the car anymore. Heck, just text or whatever because the car will keep you out of trouble, right?

The video we keep seeing is deceptive. The biker just appears suddenly. The camera can't see ahead as well as a human can, and in this case it seems to have a limited field of view. A human should have been able to see her much sooner and farther out ahead of the car. If not, there is something wrong with the headlights on that car.
 
Originally Posted By: turtlevette
I'll make the argument that there really wasn't a full automated process in place. What's the difference between driving on cruise control and what that car was doing? Lane keeping. Right?


No, it's not lane keeping...it's full "autonomous driving", which is why they are doing it in the nearly rainless, fogless desert.

Thus not responding to the rest of the post, as it's not advanced cruise control...and the "biker"...you aren't paying attention.
 
A word on multitasking. I've seen this creep into the workplace over the years. A good employee has a lot on their plate and can handle a wide variety of things at once.

The more tasks someone has to do, the less well they can do any one of those tasks. Asking an engineer to multitask tends to make them less of an engineer. A service employee, sure. They can make the fries, flip the burgers, pour the drinks and so on. But I can't be creative as an engineer when I'm trying to solve multiple problems, design multiple systems, answer the phone, write application guides and write technical papers all at the same time.

The problem is we have "McDonald's" MBA types running most businesses these days. They don't understand what their people do and think it's no different from cooking fries and flipping burgers. We've all heard the line "it's all just widgets". I have very little respect for the bulk of management types these days. There are almost no inquisitive, introspective, technically competent people in high-level management or at the CEO level. It's killing creativity.

I mean, I don't want some MBA suit who's never flown a plane training Astro how to fly a fighter into combat. Doesn't that sound incredibly stupid?
 
Originally Posted By: Wolf359


The video camera doesn't have the dynamic range that the eyes have. When have you been driving at night in good weather and not been able to see? That's what headlights are for. While unexpected, that's just everyday driving. People do crazy things all the time.


Not really. People will adapt their speed to the visibility conditions. It’s there in the driving manual.

They also have to take an eyesight exam periodically. If they can’t see well enough they don’t get a driving license (renewed).

Apparently these self driving wonders are exempt from such laws.
 
Originally Posted By: nap
Originally Posted By: Wolf359


The video camera doesn't have the dynamic range that the eyes have. When have you been driving at night in good weather and not been able to see? That's what headlights are for. While unexpected, that's just everyday driving. People do crazy things all the time.


Not really. People will adapt their speed to the visibility conditions. It’s there in the driving manual.


The cars can "see" in visible light, IR, and radar, with all sorts of abilities that are beyond human capabilities, headlights on or off.

Programming the car to think that a human walking a bike across the road is less than a breeze-blown paper bag negates every single one of those abilities.

The programmers and people deploying these technologies on public roads are the responsible party.
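The failure mode being described here, where a perception label gates the braking decision, can be sketched in a few lines. Everything below (the class names, the confidence threshold, the decision rule) is a hypothetical illustration of the idea, not Uber's actual logic:

```python
# Toy decision rule: emergency braking fires only for object classes
# deemed "braking-worthy". A misclassification upstream silently
# negates every sensor advantage downstream.
BRAKE_WORTHY = {"pedestrian", "cyclist", "vehicle"}

def should_brake(detected_class: str, confidence: float,
                 threshold: float = 0.7) -> bool:
    """Brake only if the classifier is confident the object matters."""
    return detected_class in BRAKE_WORTHY and confidence >= threshold

# A pedestrian walking a bike, labelled as debris, never triggers a
# stop, no matter how well the radar and lidar actually saw her.
print(should_brake("pedestrian", 0.9))   # True
print(should_brake("paper_bag", 0.99))   # False: the mislabel wins
```

It doesn't matter how good the sensors are if the label they feed into says "ignore this".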
 
Programmers don’t have a “professional engineer” licensing program. The onus is completely on the guys who decided to deploy these things in the streets.
 
Originally Posted By: nap
Programmers don’t have a “professional engineer” licensing program. The onus is completely on the guys who decided to deploy these things in the streets.


That's why self-learning AI, tagging along with "fallible humans", is the only way that this can work.

A perfectly programmed vehicle will be infinitely better than a human, but being programmed by humans, it can never be perfectly programmed...at least a tag-along self-learning AI, sharing the global (cloud) knowledge of every ambiguous incident and aberrant behaviour, teaches every other connected vehicle.
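The "shared cloud knowledge" idea roughly corresponds to what the machine-learning world calls federated learning: each car trains locally, and a central model averages the updates before pushing them back out. A minimal sketch with made-up weight vectors, purely to illustrate the mechanism:

```python
def federated_average(local_weights):
    """Average per-car model weights into one shared (cloud) model."""
    n = len(local_weights)
    size = len(local_weights[0])
    return [sum(w[i] for w in local_weights) / n for i in range(size)]

# Three cars each learned slightly different weights from their own
# miles; every car then downloads the averaged model before its next
# trip, so one car's hard lesson reaches the whole fleet.
fleet = [[0.9, 0.1], [0.7, 0.3], [0.8, 0.2]]
print(federated_average(fleet))   # approximately [0.8, 0.2]
```

Real systems are far more involved (weighting by data volume, secure aggregation, validation before deployment), but the averaging step is the core of how one vehicle's experience can teach the rest.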
 
Originally Posted By: Shannow
Originally Posted By: turtlevette
I'll make the argument that there really wasn't a full automated process in place. What's the difference between driving on cruise control and what that car was doing? Lane keeping. Right?


No, it's not lane keeping...it's full "autonomous driving", which is why they are doing it in the nearly rainless, fogless desert.

Thus not responding to the rest of the post, as it's not advanced cruise control...and the "biker"...you aren't paying attention.






NO, if emergency braking is disabled, it ain't fully automated.

They should train the safety drivers to be skeptical of the system, not to trust it so much that they nap or watch TV. Gross negligence on the part of the driver.
 
Agreed...Uber was the operator...not the human "safety backup" plan that they lumbered with the charges.
 
Originally Posted By: Shannow
Agreed...Uber was the operator...not the human "safety backup" plan that they lumbered with the charges.


I don't know how you criminally charge a corporation, but it needs to get figured out. Lock up the CEO. Corporations have been getting away with murder for way, way too long.
 
Originally Posted By: Shannow
Originally Posted By: nap
Programmers don’t have a “professional engineer” program. The onus is completely on the guys who decided to deploy these things in the streets.


That's why self-learning AI, tagging along with "fallible humans", is the only way that this can work.

A perfectly programmed vehicle will be infinitely better than a human, but being programmed by humans, it can never be perfectly programmed...at least a tag-along self-learning AI, sharing the global (cloud) knowledge of every ambiguous incident and aberrant behaviour, teaches every other connected vehicle.



I hope you’re familiar with this:

https://en.m.wikipedia.org/wiki/Correctness_(computer_science)

and the fact that in practice non-trivial computer programs cannot be proven correct.
 
That's why I like the self-learning "tag-along" AI...

We can't even fathom what the programmes that arise ARE, or even know whether they are similar between two parallel processes with the same learning inputs...let alone critique them.


Like the deep-learning medical AI that can predict mental illness episodes. (BTW, if you are a GP or an accountant, chances are your job is about to go to AI...neither is as complicated as driving a car.)
 
All these AI-based things have a greater-than-zero error rate. In many cases they perform worse than a human expert. What they bring to the table is speed, i.e. they can sift through a humongous set of data in hours as opposed to a lifetime for a human.
 
Agree that the error rate is non-zero...and with "expert" being 10,000 hours, or 600,000 miles, many of us are only becoming expert as we cognitively decline.


An AI cloud car starts its first trip with the accumulated knowledge, experience, and "muscle memory" of every mile that's been driven before it, and finishes its last trip having uploaded every strengthened neural network to the cloud.


When THAT system says that a paper bag is blowing across the road, its error rate is clearly going to be lower than this Uber car's.
 
The question is who will correct the AI’s impression that some people are paper bags, and how. Left uncorrected, that classification may be self-reinforcing in future cases.

It’s all nice and cool until you realize that you may be the next paper bag and nobody would take any responsibility for that.
 
Originally Posted By: nap
It’s all nice and cool until you realize that you may be the next paper bag and nobody would take any responsibility for that.


Who's taking responsibility for this woman being classified as a paper bag and thus driven through?
 