Tesla Crash (Autopilot, maybe) -- Automatic Braking?

I guess we have to determine why the sensors didn't see this very large object, because I assume automatic emergency braking ("Intelligent Emergency Braking" is Nissan's name for it) is active whether the autopilot is on or off.
I don't know if the driver is telling the truth about the autopilot being on.

I just read the https://forums.tesla.com/forum/forums/automatic-emergency-braking-autopilot-turned discussion.
One guy in there thinks the Automatic Emergency Braking feature in the Tesla only slows the car by a delta of 25 mph! If true, that would explain why the car did NOT look like it hit at 65 mph. It does indeed look more like a 40 mph hit; this is starting to make sense.
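If that delta-25 figure is real, the arithmetic is trivial. A back-of-the-envelope sketch (the 25 mph braking authority is just the forum claim, not a confirmed spec):

```python
# Back-of-the-envelope check of the forum claim: if AEB can only
# shed a fixed 25 mph before impact, what speed does the car hit at?
# The 25 mph "authority" figure is an unverified forum claim.

def impact_speed(cruise_mph: float, aeb_delta_mph: float = 25.0) -> float:
    """Impact speed assuming AEB bleeds off at most aeb_delta_mph."""
    return max(0.0, cruise_mph - aeb_delta_mph)

print(impact_speed(65.0))  # 40.0 mph -- consistent with how the damage looks
```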
 
Originally Posted By: Shannow
Originally Posted By: 555
Person behind the wheel at fault.
Driver is always supposed to be observant and attending to the controls. How does one not see a fire truck and slow down?


There's a hundred years of experience in industrial automation that says your attitude towards the driver's responsibility is unfounded, as soon as you start removing the operator from the process.

Re-read my previous post...when something else is doing most of the thinking, once alerted the operator takes MUCH longer to become aware of the problem, analyse it, and take effective action.



https://www.amazon.com/Managing-Risks-Organizational-Accidents-Reason/dp/1840141050

Good book for anyone to read before blaming the operator.

Remember the semi: Tesla blamed the open area under the load for their car not "seeing" it...there's no excuse for them in this case (until they come up with one).

Are Tesla using everything they could (e.g. LIDAR) before blaming the operator, who THEY say is fully in control?

I agree with you and the problems that arise from removing the operator from the process.
This situation is a legal slippery slope because of shifting responsibilities. Maybe by keeping the operator at the center of legal responsibility, the appeal of this technology will be limited, or drivers will start paying attention...I know, I know...they won't.
I'm bummed that so many vehicle owners would prefer not to drive their purchase. What is better than controlling your own destiny?
This technology could provide mobility for those that otherwise would have none. I see that as one of its greatest benefits.
If a driver, operator, "bag of plasma with a wallet" doesn't have to focus on driving, then odds are they will be looking at a screen, whether on the instrument panel or a mobile device. If the driver is looking at that, then one can sell them stuff and/or "drive" them to where the desired product is located. So now that vehicle you're making payments on for the next 5 years is also a stream of digital junk mail, e.g. "Mr. Johnson, we at Cheapo Rim Protectors have noticed that you have driven on your tires for more than 40,000 miles, and today we have a sale on some tires we think you'll like better." That's why I want the operator to be responsible: because maybe, just maybe, there will be some legal ground to keep this sort of marketing distraction on a leash.
Going to buy that book. Looks like my type of reading. Thanks for the recommendation.
 
No probs at all.

Think we are both on the same page.

The book details why the various models of culpability create different outcomes, e.g. follow the rules and you are blameless. He works through the aircraft climb-rate meters that were mandated by some airlines, which caused pilots to focus solely on that meter even while every other instrument was telling them of impending disaster.

The summary (from his book) is this model:

[Image: culpability decision tree from Reason's book (decision-tree-lg.jpg)]


As a manager, I like it. Using it gets institutional problems fixed.

The problem that you describe is in the "substitution test": would other people of similar background and training make the same error? If the answer is "Yes, they'll get comfortable, take their eyes off the road and start texting", then the autopilot needs to be flawless.
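For what it's worth, the gating logic around the substitution test can be sketched in a few lines (this is my paraphrase of Reason's tree; the branch order and labels are simplified, not the book's exact model):

```python
# A much-simplified sketch of the culpability logic around the
# "substitution test" in James Reason's decision tree. The branch
# order and labels here are a paraphrase, not the book's exact model.

def culpability(followed_procedure: bool, peers_would_err: bool) -> str:
    if not followed_procedure:
        return "possible negligence -- look at the individual"
    if peers_would_err:
        # Substitution test: if comparable operators would make the
        # same error, the problem is systemic, not personal.
        return "system-induced error -- fix the design/training"
    return "individual factors -- investigate further"

# Drivers in general get comfortable and stop watching the road:
print(culpability(followed_procedure=True, peers_would_err=True))
```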
 
I think that "deep learning", or self programming devices

https://devblogs.nvidia.com/explaining-deep-learning-self-driving-car/

Are the only answer. They are proving themselves in medical diagnosis already.

The program that evolves can't even be understood by its designers, as the networks that are generated can't be known...which probably makes them hard to hack as well.

And they accumulate the sum total of human inputs observed, which, if linked with cloud-based sharing, would also give every other car on the road perfect examples of the "wrong answer".
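For the curious, the linked NVIDIA post describes an end-to-end network: camera frame in, steering angle out. A minimal sketch, loosely modeled on their published PilotNet layer sizes (illustrative only, not Tesla's or NVIDIA's actual production network):

```python
# Minimal sketch of the end-to-end approach in the linked NVIDIA post
# (camera image in, steering angle out). Layer sizes loosely follow
# their published PilotNet; this is illustrative, not a production net.
import torch
import torch.nn as nn

class SteeringNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 24, 5, stride=2), nn.ReLU(),
            nn.Conv2d(24, 36, 5, stride=2), nn.ReLU(),
            nn.Conv2d(36, 48, 5, stride=2), nn.ReLU(),
            nn.Conv2d(48, 64, 3), nn.ReLU(),
            nn.Conv2d(64, 64, 3), nn.ReLU(),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 1 * 18, 100), nn.ReLU(),
            nn.Linear(100, 50), nn.ReLU(),
            nn.Linear(50, 1),  # single output: steering angle
        )

    def forward(self, x):
        return self.head(self.features(x))

# One 66x200 camera frame (PilotNet's input size) -> one steering angle.
net = SteeringNet()
angle = net(torch.randn(1, 3, 66, 200))
print(angle.shape)  # torch.Size([1, 1])
```

Training such a net against millions of logged human steering inputs is how the "sum total of human inputs" gets accumulated.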
 
Re the NVidia thing, and the self-driving semi-trailers...


They can add or subtract snow from a known route, and make the other sensors an interrupt/fine-tune.
 
Originally Posted By: oil_film_movies

I just read the https://forums.tesla.com/forum/forums/automatic-emergency-braking-autopilot-turned discussion.
One guy in there thinks the Automatic Emergency Braking feature in the Tesla only slows the car by a delta of 25 mph! If true, that would explain why the car did NOT look like it hit at 65 mph. It does indeed look more like a 40 mph hit; this is starting to make sense.


If that's true, then the person who programmed it not to come to a dead halt is the one responsible...for the initial collision. If they programme a 40 mph collision, then it's their choice.

Consequential accidents arising are the next rung down the ladder, but preventing the initiator is primary...and that's what the autopilot is offering, PLUS it's being advertised as safer than you or I.
 
Originally Posted By: Shannow
Originally Posted By: oil_film_movies
I just read the https://forums.tesla.com/forum/forums/automatic-emergency-braking-autopilot-turned discussion.
One guy in there thinks the Automatic Emergency Braking feature in the Tesla only slows the car by a delta of 25 mph! If true, that would explain why the car did NOT look like it hit at 65 mph. It does indeed look more like a 40 mph hit; this is starting to make sense.
If that's true, then the person who programmed it not to come to a dead halt is the one responsible...for the initial collision. If they programme a 40 mph collision, then it's their choice. Consequential accidents arising are the next rung down the ladder, but preventing the initiator is primary...and that's what the autopilot is offering, PLUS it's being advertised as safer than you or I.


I'd like to verify what was written on the Tesla forum about the car only having the automatic authority to slow by a delta of 25 mph. It might be true.
I could see how the Systems Engineer (the algorithm writer, not the programmer, BTW) might have decided delta-25 mph is enough, since Tesla presents the "Autopilot" as something that requires the human to constantly monitor with their hands near the steering wheel. But for this accident we'll look for an orange rolling around in the cabin, partly smashed from the accident. (Rescue workers were oddly seen eating an orange at the scene... kidding.)
 
Originally Posted By: Nick1994
Where was the driver during this? Hands have to be on the wheel for autonomous driving; maybe one hand was holding a book or a newspaper?

Or he purposely let it crash to try for a lawsuit against Tesla.

You really think that?... 65 mph into the back of a firetruck?
 
Originally Posted By: jeepman3071
You really think that?... 65 mph into the back of a firetruck?
Actually 40 mph. The crash certainly doesn't look like a 65 mph crash. Remember, kinetic energy goes up as the square of the speed, so a 65 mph crash has about 2.6 times as much energy to dissipate as a 40 mph crash.
In fact, the IIHS runs their small-overlap frontal test at 40 mph, and you can see the similarity... 65 mph would be 2.6 times as bad!
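The square law is easy to check (plain arithmetic, nothing assumed beyond KE = ½mv²):

```python
# Kinetic energy scales with the square of speed, so the energy the
# structure must dissipate at 65 mph vs 40 mph:
ratio = (65 / 40) ** 2
print(round(ratio, 2))  # 2.64 -- roughly "2.6 times as bad"
```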
 
This also happened: on Jan. 19 in CA, a driver was found drunk and passed out behind the wheel of... you guessed it... a Tesla on autopilot!
 
People have blamed auto innovation for their lack of responsibility from the beginning of time.

Automatic transmissions would lull the inattentive driver to sleep for lack of interaction.

Cruise control was blamed for countless accidents after its inception: "the car just took off".

Anti-lock brakes were the thing to blame in my time - "they don't have the power to lock" and "people are slamming into walls" because they can't stop.

It isn't called "auto driver" - it's called "auto pilot" - agreed, the name is misleading (especially for the sheeple who always want to blame someone else), but the name does make a point.

The pilot or driver is always in charge. Should the families of Air France 447 sue Airbus because flight envelope protection couldn't overcome the iced-up/failed sensors? Or are the pilots who couldn't actually fly the plane manually in a storm to blame?

People often don't even know what pedal they are stepping on and laughably blame the auto for everything (Audi 5000), claiming that the car just took off at full speed, the brakes failed at the same time, and then magically repaired themselves after the accident.

This latest clown is no different, blaming the car for the crash when all he had to do was steer a little bit to the right.

Let's see what the data says - and like all other accidents, learn from it and improve.

Nvidia can't deliver PCIe cards on time (they are the old SGI guys, for those that don't remember); they have a long way to go.

UD
 
The NTSB has taken an interest in investigating this. In my observation, they are the most commonsense people to investigate such incidents. I'd like to see what their determinations and suggestions would be.
 
Originally Posted By: Kestas
The NTSB has taken an interest in investigating this. In my observation, they are the most commonsense people to investigate such incidents. I'd like to see what their determinations and suggestions would be.

They likely looked over the basic operational control laws already, before it was ever installed. Or the NHTSA should have.
Similar to, but not as intensive as, the airliner flight control laws the FAA looks over for certification.
Given that, I wonder if the NTSB or NHTSA approved the current Tesla autopilot operation under some formal document. It issues a visual and audible warning to wake up the driver. A voice calls out "Put down the newspaper and brake!!!"... hee-hee, or something like that. Maybe chiming over the speaker system.

Of course they could revise their control-law opinions. Wait and see if they like or don't like the whole control strategy and warnings.
Or maybe there was a hardware (sensor, computer, or wiring) failure, although even that should be subject to BIT warnings (BIT = Built-in Test, like OBDII errors).
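A toy illustration of the BIT idea (the channel names and the 0-5 V plausibility check are invented for illustration; real automotive BIT is far more involved):

```python
# Toy illustration of a built-in test (BIT): each sensor channel
# self-checks and the system reports fault codes, OBDII-style.
# Channel names and the 0-5 V range are invented for illustration.
from typing import Dict, List, Optional

def bit_check(readings: Dict[str, Optional[float]]) -> List[str]:
    faults = []
    for channel, value in readings.items():
        if value is None:                    # no signal at all
            faults.append(f"{channel}: NO DATA")
        elif not (0.0 <= value <= 5.0):      # outside plausible 0-5 V range
            faults.append(f"{channel}: OUT OF RANGE ({value} V)")
    return faults

print(bit_check({"radar_pwr": 4.9, "front_cam": None, "ultrasonic": 7.2}))
# ['front_cam: NO DATA', 'ultrasonic: OUT OF RANGE (7.2 V)']
```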
 
Originally Posted By: UncleDave
Nvidia can't deliver PCIe cards on time

You make a great point: they should get into bed with TESLA, eh, who can't deliver cars on time.

The Nvidia link was about how these things should be programmed...nothing more.
 
Originally Posted By: oil_film_movies
Originally Posted By: jeepman3071
You really think that?... 65 mph into the back of a firetruck?
Actually 40 mph. The crash certainly doesn't look like a 65 mph crash. Remember, kinetic energy goes up as the square of the speed, so a 65 mph crash has about 2.6 times as much energy to dissipate as a 40 mph crash.
In fact, the IIHS runs their small-overlap frontal test at 40 mph, and you can see the similarity... 65 mph would be 2.6 times as bad!



I was asking if he really thinks the guy would purposely cause a crash in order to sue Tesla. Granted, I worked in insurance and saw a lot of fraud and people doing stupid things, but none of them were as severe as that. The guy's risk of injury is a bit higher at those speeds, especially with something bigger like a firetruck, where the car could go under the vehicle.

My bet is 100% on him setting it on autopilot and being distracted enough that he wasn't paying attention. I've seen accidents where the driver had a passenger hold the wheel while they did something, so I can easily see this happening.
 
Originally Posted By: Shannow
Originally Posted By: UncleDave
Nvidia can't deliver PCIe cards on time

You make a great point: they should get into bed with TESLA, eh, who can't deliver cars on time.

The Nvidia link was about how these things should be programmed...nothing more.


Match made in heaven, right?

Actually, I read Nvidia is making a whole generation of GPU hardware as the brains for the code, and writing some of the AI (that part's harder than designing the hardware).
Note they indicate they're making source-code libraries available; they aren't providing a turnkey package ready to go, but a framework.

https://www.nvidia.com/en-us/self-driving-cars/drive-px/

It would be interesting to see whose autonomous scheme has the most miles under its belt - I'd guess that would be Tesla's, but I don't really know.

UD
 