Originally Posted By: ZeeOSix
Major screw up with the programming logic that was too dumbed down to distinguish a plastic bag from someone walking a bicycle across the road. Wow ...
You see that in any kind of industrial automation. I was using an optical reader that judges the quality of each character it reads and gives it a "score". Most of the time it is pretty good, and when things don't work as expected, we adjust the position and angle through trial and error.
It is a lot more trial and error (i.e. statistical) than mathematical (i.e. deterministic) than you might think. Unfortunately, this incident is what happens when there is a problem: the training session went bad and the car hit someone.
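The scoring workflow above can be sketched roughly like this (a minimal illustration only; the threshold value and function names are hypothetical, since real readers expose vendor-specific APIs):

```python
# Hypothetical sketch of score-based OCR acceptance.
# Each read returns the decoded character plus a confidence score; reads
# below the threshold get flagged for trial-and-error adjustment
# (reposition the part, change the reader angle, re-scan).

SCORE_THRESHOLD = 0.80  # tuned empirically, not derived analytically


def triage_reads(reads):
    """Split (char, score) reads into accepted and flagged-for-adjustment."""
    accepted, flagged = [], []
    for char, score in reads:
        if score >= SCORE_THRESHOLD:
            accepted.append((char, score))
        else:
            flagged.append((char, score))
    return accepted, flagged


# Example: one marginal character triggers a manual adjustment pass.
reads = [("A", 0.97), ("B", 0.62), ("8", 0.85)]
accepted, flagged = triage_reads(reads)
```

The point is that the cutoff itself is a statistical judgment call, not something derived from first principles, which is exactly the trial-and-error character of this kind of automation.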
Reflective-based technologies will always have uncertainty and risk involved, compared to through-beam sensing (like an elevator door light curtain) or dedicated transponders (e.g. your smartphone emitting a signal telling the driverless car you are there, or magnets in the road to guide the cars). We will probably get there when self-driving cars become standardized and well understood, and when all the roads are retired and rebuilt to work better with them.
In the meantime, buckle up and drive safely.
Originally Posted By: Shannow
You guys just don't get what automation does to the "worker", whom these manufacturers state is "always ultimately in control"...
You can't take away 100% of the input/decision/output for 99% of the easy decisions, and then require the "driver" to instantaneously react to the most difficult ones.
If you can stop the line and wait for a supervisor to take a look, yes.
Unfortunately, on the road this may not work, unless you are tailgating another car and just following at a close distance. That is why I think that will be the first step for self-driving cars, until a majority of the infrastructure (road tags, safety transponders on humans / smartphones, traffic laws regarding right of way) is redesigned to handle them.
Originally Posted By: ArrestMeRedZ
My prediction is that soon after self-driving cars are deployed, lawyers will litigate them out of existence. Every branch in the AI decision-making process will be repeatedly evaluated by unqualified jury members led on by emotionally persuasive litigators. Legal fees and award costs will overwhelm even the deep-pocketed developers.
You have to remember laws are written by the wealthy and influential, and products liability insurance can cover a majority of the incidents. Unless a company intentionally cheats and lies about a known issue, it is unlikely to face awards more expensive than, say, a drunk-driver fatality.