FSD/Tesla driving software

DOGE just fired most of the NHTSA group responsible for regulating FSD. So I'm sure Tesla's camera-only FSD will be given the full green light moving forward.
I've had this concern for a long time about the position Musk has gotten himself into. He can't accomplish it by normal means.
 
But you said to judge it against the human driver. That’s an open-ended argument. What counts as better? 75%? 50%? Many folks who have had to deal with auto fatalities would probably say saving just one person would be a win. But then you go down a slippery slope of nanny requirements.

I agree that if someone doesn’t maintain a system and it fails, that’s on them. That also sounds like another opportunity to litigate and let lawyers earn massive salaries, while the schmuck working a regular job can’t afford the vehicle, registration, and insurance. Meanwhile, the affected parties are never actually made whole. What a racket.
Until it has been released out in the wild for a few years, nobody will have real-world data on this. The problem with self-driving or AI is that we don't have an objective way to measure pass or fail yet. You can't control what a toddler will do, you can't control what other drivers will do, you can't control whether a road-construction sign is bad, etc.

I do think we will eventually get there. Road designs will improve to make self-driving easier, and we will likely have designated roads with little to no pedestrian traffic, like a highway or the autobahn, where self-driving is easier to pull off. Many believe that long-haul self-driving trucking will start in the south, with little to no snow and very predictable weather, instead of on routes up north. A longer self-driving route may end up being effectively shorter if there is no need for a trucker to rest and no trucker to pay.

If we had a consistent way to judge a human driver, the insurance companies would have found it by now. So far they can only guess; they can't accurately predict which of a hundred candidates will keep a clean accident record for the next 20 years.
 
If the death rate is similar to human-driven cars, then it's not worth going autonomous. But if the fatality rate drops by 75% with autonomous vehicles, you're opposed to that? If a fatality is due to a flaw in the software, then sue the manufacturer. If it's due to a broken sensor the owner didn't repair, sue the owner.
It would happen in the commercial world due to cost savings. Most likely, if they can lobby hard enough, they would eventually roll out at safety parity with human drivers. It would likely mean a completely new type of liability insurance for the adventurous insurance companies, who would eventually drop policies when they find bugs the manufacturers refuse to fix, forcing updates after each accident is investigated.

Every new type of transportation goes through this. Nobody has it perfect before rolling out, but over time things get better.
 
Not that easy when it turns into a liability situation, passing blame, etc.

Someone’s kid gets killed by an autonomous EV. Is that OK because its death rate is similar to overall motor vehicle fatalities? No thanks. Sounds like a way for deregulation advocates and profit mongers to play sea lawyer amid their failures.
The same can be said of any driver with one prior at-fault accident on record. Nobody gets their license revoked after one accident, anywhere in the world.

Also, about self-driving: it is not as if it is always going to be more dangerous. What about the situation where it's safer in 9 out of 10 cases, but you don't know about that 1 in 10? Say someone misplaces a road-construction sign, or a young punk decides to jaywalk on a dare in front of a self-driving car. A human driver may spot a young punk about to do something dangerous 300 feet away, but a computer may not. That human may call 911 or pull out a baseball bat and shout at the kids to go away, but a computer won't, and the young punks know it and will constantly look for ways to mess with it.

Only after a few of these court cases will things settle into common sense. Today, people would assume you were suicidal if you played dares in the middle of a highway with a 65 mph speed limit, but when those 65 mph highways first showed up, I'm sure some young punks tried to mess around until one got killed and some drivers got sued.
 
Personally, I find it hard to believe that FSD could be anywhere near as bad as human drivers. At least FSD could follow the rules...
"Could". Since it apparently learns from drivers, it picked up the propensity to get impatient and run red lights two updates ago. I get making mistakes, but that shouldn't even be possible. At times it's only slightly better than inattentive drivers. I've debated just intentionally hitting people who pull out in front of me. They already aren't paying attention while driving; I'm not sure they're capable of even supervising FSD.
 
We don't have a standardized way to test self-driving yet. If we did, we would probably need to spec how the road should look, how traffic signs should look, how construction cones and signs should be placed, how pedestrians should behave, etc. Outside of freeways and highways, I haven't seen humans follow the rules to the letter around construction and street crossings. We will probably need to tighten some rules over time to make self-driving easier. But then you can't stop people from taunting a self-driving car the way a human driver can call the cops on those guys.

One thing I think self-driving can do better than a human is using more than the human eye's spectrum for driving. Maybe infrared? Maybe UV? Maybe a network connection to a local traffic camera to see the next intersection?
 
That would all be great. None of that is what is happening with Tesla though.
 
Presumably FSD would have at least slowed down for the fog. I don't know whether that's actually true.
It likes to just disengage. Basically, it’ll just throw it in your lap. Then again, the cruise control on the car does the exact same thing.
 