Author Topic:   Self-Driving Cars
Percy
Member
Posts: 17581
From: New Hampshire
Joined: 12-23-2000
Member Rating: 2.3


Message 106 of 112 (830436)
03-29-2018 10:05 AM
Reply to: Message 104 by Phat
03-29-2018 9:02 AM


Re: Uber's no good very bad week
Phat writes:

...which tasks should we entrust to the robotic computers and which should we keep for ourselves...

My own opinion: Level 2 is as far as we should go. To me that means safe distance maintenance and crash avoidance. No wheel control.

...that annoying Siri.

To me the most annoying thing about Siri is that in 2011 (when she was first released as a native part of the iPhone) she was just starting 1st grade, and now, 7 years later, she's only halfway through 1st grade.

--Percy


This message is a reply to:
 Message 104 by Phat, posted 03-29-2018 9:02 AM Phat has acknowledged this reply

    
Phat
Member
Posts: 10975
From: Denver,Colorado USA
Joined: 12-30-2003
Member Rating: 1.2


Message 107 of 112 (830439)
03-29-2018 10:33 AM
Reply to: Message 105 by jar
03-29-2018 9:15 AM


Re: Uber's no good very bad week
Perhaps they could someday have a lane on major highways dedicated solely to driverless vehicles.

"Chance as a real force is a myth. It has no basis in reality and no place in scientific inquiry. For science and philosophy to continue to advance in knowledge, chance must be demythologized once and for all." ~ R.C. Sproul
"A lie can travel half way around the world while the truth is putting on its shoes." ~ Mark Twain
~"If that's not sufficient for you go soak your head."~ Faith
Paul was probably SO soaked in prayer nobody else has ever equaled him. ~ Faith :)

This message is a reply to:
 Message 105 by jar, posted 03-29-2018 9:15 AM jar has responded

Replies to this message:
 Message 108 by jar, posted 03-29-2018 10:57 AM Phat has acknowledged this reply

  
jar
Member
Posts: 30712
From: Texas!!
Joined: 04-20-2004
Member Rating: 1.7


Message 108 of 112 (830441)
03-29-2018 10:57 AM
Reply to: Message 107 by Phat
03-29-2018 10:33 AM


Re: Uber's no good very bad week
Phat writes:

Perhaps they could someday have a lane on major highways dedicated solely to driverless vehicles.

Better yet, have a separate and walled off lane for the non-driverless cars.


My Sister's Website: Rose Hill Studios | My Website

This message is a reply to:
 Message 107 by Phat, posted 03-29-2018 10:33 AM Phat has acknowledged this reply

  
Percy
Member
Posts: 17581
From: New Hampshire
Joined: 12-23-2000
Member Rating: 2.3


Message 109 of 112 (832678)
05-07-2018 10:09 PM
Reply to: Message 101 by Percy
03-27-2018 2:39 PM


Re: Video of the Pedestrian Collision
Percy writes:

It was a software problem, possibly in the GPU, but most likely in the computer software dedicated to scene analysis.

Did I call it or what: Uber vehicle reportedly saw but ignored woman it struck:

quote:
The only possibilities that made sense were:

A: Fault in the object recognition system, which may have failed to classify Herzberg and her bike as a pedestrian. This seems unlikely since bikes and people are among the things the system should be most competent at identifying.

B: Fault in the car's higher logic, which makes decisions like which objects to pay attention to and what to do about them. No need to slow down for a parked bike at the side of the road, for instance, but one swerving into the lane in front of the car is cause for immediate action. This mimics human attention and decision making and prevents the car from panicking at every new object detected.

The sources cited by The Information say that Uber has determined B was the problem. Specifically, it was that the system was set up to ignore objects that it should have attended to; Herzberg seems to have been detected but considered a false positive.


Percy


This message is a reply to:
 Message 101 by Percy, posted 03-27-2018 2:39 PM Percy has acknowledged this reply

Replies to this message:
 Message 110 by Stile, posted 05-09-2018 11:15 AM Percy has responded

    
Stile
Member
Posts: 3223
From: Ontario, Canada
Joined: 12-02-2004
Member Rating: 2.1


Message 110 of 112 (832746)
05-09-2018 11:15 AM
Reply to: Message 109 by Percy
05-07-2018 10:09 PM


Re: Video of the Pedestrian Collision
The plot thickens!

Percy's Article writes:

The sources cited by The Information say that Uber has determined B was the problem. Specifically, it was that the system was set up to ignore objects that it should have attended to; Herzberg seems to have been detected but considered a false positive.

I wonder where the issue lies here.

Options I can think of (can certainly be more than 1 going on at a time):

1 - Hardware (radar, lidar, vision system, sensors...) was not purchased at the level it should have been for the application.
That is, the company saved money on "cheaper equipment" that could only be right most-of-the-time instead of all-the-time for this scenario.
-fault is on designers

2 - Programmers were not very good. Bad programmers = bad programming = they simply "didn't think" that this scenario would come up.
-fault is on programmers

3 - Programmers were good, but not given enough time to set up the system to the levels the equipment is capable of... they were pushed to get something out that was "good enough" even though it could have been better given more time/money.
-fault is on leaders (owners/managers...)

Taking the quote literally "the system was set up to ignore objects that it should have attended to" implies to me that it's more on the programmers and/or leaders. But it's possible this wording is not meant to be taken that literally and it's still a design problem.
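
Just to make option B concrete, here's a tiny C sketch of what "set up to ignore objects" could look like in code. Everything in it (the names, the threshold number) is invented by me, not taken from Uber's actual system.

#include <stdio.h>

/* Hypothetical detection record; fields invented for illustration. */
struct detection {
    const char *label;   /* e.g. "pedestrian", "bicycle", "unknown" */
    double confidence;   /* 0.0 .. 1.0 */
};

/* If this threshold is tuned too high to suppress false positives,
 * a real but low-confidence detection gets silently discarded. */
#define IGNORE_BELOW 0.80   /* invented value */

static int should_attend(const struct detection *d)
{
    return d->confidence >= IGNORE_BELOW;
}

int main(void)
{
    struct detection d = { "bicycle", 0.65 };  /* real object, modest confidence */

    if (!should_attend(&d))
        printf("detection '%s' (%.2f) treated as a false positive\n",
               d.label, d.confidence);
    return 0;
}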


This message is a reply to:
 Message 109 by Percy, posted 05-07-2018 10:09 PM Percy has responded

Replies to this message:
 Message 111 by Percy, posted 05-10-2018 9:55 AM Stile has acknowledged this reply

    
Percy
Member
Posts: 17581
From: New Hampshire
Joined: 12-23-2000
Member Rating: 2.3


Message 111 of 112 (832779)
05-10-2018 9:55 AM
Reply to: Message 110 by Stile
05-09-2018 11:15 AM


Re: Video of the Pedestrian Collision
If you're breaking things down generally then yeah, sure, an accident with a self-driving car could be hardware or software or a combination.

I should mention one little wrinkle about the hardware. While there is at some level a clear and explicit line of demarcation between hardware and software, that line is probably not one that Uber programmers have any control over or visibility into. I'm not speaking from direct knowledge about the lidar, radar, and camera systems that the self-driving car companies employ, but in all likelihood their software doesn't interact directly with the perception systems. Rather, those systems probably have their own APIs, and Uber links those APIs into its own software and makes calls to them. This raises questions. How well are those APIs documented (i.e., how easy would it be for programmers to misunderstand what an API routine is doing)? What is the quality level of those API routines?
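
To give a flavor of the arrangement I'm describing, here is a toy C sketch of calling a perception system only through its API. The header, the function, and every field name are hypothetical; I'm not quoting any real lidar SDK, and a stub stands in for the vendor's library so the example actually runs.

#include <stdio.h>

/* What a hypothetical vendor header might declare (all names invented). */
struct vendor_object {
    double range_m;      /* distance to object, meters */
    double bearing_rad;  /* bearing relative to vehicle heading, radians */
    int    class_code;   /* vendor-defined: what exactly does 3 mean? */
};

/* Stub standing in for the vendor's library so this compiles and runs. */
static int vendor_poll(struct vendor_object *out, int max)
{
    if (max < 1) return 0;
    out[0] = (struct vendor_object){ 38.5, -0.12, 3 };
    return 1;
}

/* Caller-side glue: everything the planner knows arrives through that API. */
int main(void)
{
    struct vendor_object objs[16];
    int n = vendor_poll(objs, 16);
    for (int i = 0; i < n; i++)
        printf("object %d: %.1f m away, class_code=%d\n",
               i, objs[i].range_m, objs[i].class_code);
    /* If the documentation for class_code is vague, misreading it here is
     * exactly the kind of API misunderstanding I'm worried about. */
    return 0;
}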

And concerning Uber specifically, their self-driving cars happened so suddenly that I bet Uber did not write the scene composition software. They likely bought it from someone. In fact, they likely bought a lot of their software from other sources, then attempted to integrate it. That's the way large software systems happen these days. They aren't written from scratch - they're amalgamations of software from many different sources.

In some ways this is the realization of a dream. When I started programming nearly a half century ago it was already understood how low productivity was when every new software system began from scratch, and there was already the hope of drawing upon preprogrammed modules. C was an early example: the base language had no I/O, which was instead added through an included header file declaring the interface (the API) and a separately linked module implementing its routines.
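
The familiar pattern, for anyone who hasn't seen it spelled out: the header supplies only the declarations, and the implementation arrives from a separately linked library.

#include <stdio.h>   /* header: declares the I/O interface (printf, fopen, ...) */

int main(void)
{
    /* printf itself lives in the C library the linker adds in; the base
     * language defines no I/O of its own. */
    printf("Hello from the linked-in I/O module\n");
    return 0;
}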

Over time this dream has become a reality, but all dreams have the possibility of becoming nightmares, and that is the case here, at least in part. Huge software systems can be built almost overnight simply by combining software garnered from many sources, but at the cost of a loss of control. You can't enforce the quality of the acquired software. When there's a new release of the acquired software, will it be the same quality as the previous version? Have new bugs been introduced? (Undoubtedly.)

And the acquired software will likely do most of what you need it to do, but not all, and it will not do it in ways that you would prefer. The data provided by one set of software will often not be precisely what is required by other sets of software. In the end a great deal of glue software must be written. There's ample opportunity for mistakes of all types.
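
A trivial C illustration of that glue, with made-up structures and units: one component reports feet and miles per hour, the consumer wants meters and meters per second, and somebody in the middle has to write the adapter and get the constants right.

#include <stdio.h>

/* Output format of one acquired component (names and units invented). */
struct radar_track { double range_ft; double speed_mph; };

/* Input format another component expects (also invented). */
struct planner_obstacle { double range_m; double speed_mps; };

/* The glue: a wrong constant here is a bug no vendor's test suite owns. */
static struct planner_obstacle adapt(struct radar_track t)
{
    struct planner_obstacle o;
    o.range_m   = t.range_ft  * 0.3048;   /* feet to meters */
    o.speed_mps = t.speed_mph * 0.44704;  /* mph to meters per second */
    return o;
}

int main(void)
{
    struct planner_obstacle o = adapt((struct radar_track){ 120.0, 43.0 });
    printf("%.1f m away, closing at %.1f m/s\n", o.range_m, o.speed_mps);
    return 0;
}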

Programmers are never given enough time to do a good job on the software, and software testing is almost always inadequate because QA departments are not profit centers, and so when layoff time rolls around the departments that get hit the hardest are personnel, finance, program management and QA. Programmers frequently get stuck testing their own software, which is a major no-no because the guy who programmed it has major blind spots about where the weaknesses in his software lie. Also, QA is its own specialty, and just because you're a crack programmer doesn't mean you're any good at QA.

Much software is released prematurely, meaning that the customers become a reluctant adjunct to the software company's QA efforts. The standard estimate is that a bug is ten times more costly to fix when detected in the field (i.e., by a customer) than when detected before release, but this fact is rarely heeded. Companies get burned, fix their policies, then over time they pick away at those policies to speed up release cycles, and pretty soon they're back where they started.

If the software in question is something like a schedule calendar or photo album then the consequences of bugs are minor, but if the software is for a nuclear power plant or a space shuttle or a self-driving car then the consequences of bugs can be deadly.

Programmers have little leverage. Vaguely expressed concerns about possible remaining problems in the software will always go unheeded, because the programmer can't know the specific consequences, like that under certain circumstances it could fail to properly classify an object as something to be avoided and will plow right into it, and if that object is a person then it could kill them. Rather, all he can say is, "If we don't delay the release x weeks (thereby delaying revenue) and spend y dollars on more testing (thereby increasing the cost center's debits and making managers look bad) then something bad might happen." And managers will comfort themselves that there's always a driver in the car monitoring things and ready to take over in a split second, though obviously they should know that's just overoptimistic bullshit.

Completely self-driving cars are a utopian dream for the foreseeable future. What they can already do is amazing, but what they can't do is formidable and frightening. Google and Tesla and Uber and all the rest can do all the development and testing they want, but for a long time people are still going to have to drive their own cars. But crash avoidance systems alone will significantly reduce injuries and deaths due to accidents.

--Percy


This message is a reply to:
 Message 110 by Stile, posted 05-09-2018 11:15 AM Stile has acknowledged this reply

    
Percy
Member
Posts: 17581
From: New Hampshire
Joined: 12-23-2000
Member Rating: 2.3


Message 112 of 112 (833695)
05-25-2018 1:18 PM


Blowing own horn again...
I already practically broke my arm patting myself on the back in Message 109:

Percy in Message 109 writes:

Percy writes:

It was a software problem, possibly in the GPU, but most likely in the computer software dedicated to scene analysis.

Did I call it or what: Uber vehicle reportedly saw but ignored woman it struck:

But the NTSB report about the Uber crash just came out (at bottom of NTSB: Uber Self-Driving Car Had Disabled Emergency Brake System Before Fatal Crash) and it says:

quote:
According to data obtained from the self-driving system, the system first registered radar and LIDAR observations of the pedestrian about 6 seconds before impact, when the vehicle was traveling at 43 mph. As the vehicle and pedestrian paths converged, the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path. At 1.3 seconds before impact, the self-driving system determined that an emergency braking maneuver was needed to mitigate a collision (see figure 2).

So the problems with scene analysis were far worse than I could ever have imagined, and far slower, too. It detected an unidentified object 6 seconds before impact, but didn't figure out it was a bicycle until 1.3 seconds before impact. What was it doing for 4.7 seconds?
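
A back-of-the-envelope check of those numbers (my own arithmetic and an assumed braking rate, nothing from the report): 43 mph is about 19 m/s, so the first detection came roughly 115 m out, while 1.3 seconds leaves only about 25 m, which is barely the stopping distance even with instant, maximum braking.

#include <stdio.h>

int main(void)
{
    double v     = 43.0 * 0.44704;  /* 43 mph in m/s, about 19.2 */
    double decel = 7.0;             /* assumed hard braking, m/s^2 (my guess) */

    printf("distance at first detection : %.0f m\n", 6.0 * v);
    printf("distance at 1.3 s to impact : %.0f m\n", 1.3 * v);
    printf("distance needed to stop     : %.0f m\n", v * v / (2.0 * decel));
    printf("time needed to stop         : %.1f s\n", v / decel);
    return 0;
}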

But it gets worse, though this next part has nothing to do with scene analysis:

quote:
According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator.

The automatic emergency braking was disabled, so the driver had to do the braking. But as the video shows, the cyclist was practically invisible until the car was on top of her. The system wasn't designed to alert the driver to brake, but even if it had been, the driver would have had only 1.3 seconds to hit the brakes and turn the wheel.
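
Reading that NTSB passage literally, the decision logic amounts to something like this minimal C sketch. The flags and structure are my invention, not Uber's code; they just mirror what the report describes.

#include <stdio.h>
#include <stdbool.h>

/* Invented flags mirroring the NTSB description, not actual Uber code. */
static const bool auto_emergency_braking = false;  /* disabled under computer control */
static const bool alert_operator         = false;  /* "not designed to alert the operator" */

int main(void)
{
    bool emergency_braking_needed = true;  /* determined 1.3 seconds before impact */

    if (emergency_braking_needed) {
        if (auto_emergency_braking)
            printf("brake\n");
        else if (alert_operator)
            printf("warn the driver\n");
        else
            printf("do nothing and hope the driver notices\n");  /* what actually happened */
    }
    return 0;
}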

So Uber's system was bad in many ways.

But Tesla isn't covering themselves with glory, either. Did everyone hear about the Tesla that killed its driver (who wasn't paying attention, a major no-no, but still) by colliding with an already-collapsed crash cushion? From Tesla says Autopilot was on during deadly California crash:

quote:
There was a major announcement from Tesla Friday evening about last week's crash in Mountain View, California, that killed an engineer from Apple. The company confirms the autopilot "was" engaged when the Model X slammed into a collapsed safety barrier.

Thirty-eight-year-old Apple engineer and father of two, Walter Huang, died one week ago on his way to work when his Tesla Model X slammed into a crash cushion that had collapsed in an accident eleven days before -- basically like hitting a brick wall, the experts say.
...
Friday morning, a science director at an environmental startup took the ABC7 News I-Team's Dan Noyes in his Model X on the same route Huang drove last week to Apple. He was heading to the 85 carpool lane off 101 in Mountain View. "I see what the issue is," said Sean Price. "That line in the pavement could potentially be a problem," he said, pointing out a break between the asphalt and concrete and two white lines.


So possibly the Tesla thought the joint between pavement sections was the lane divider line and followed what it thought was the lane right into a collapsed crash cushion. This not only seems likely to me, I'm certain of it. My minimally self-driving car (cruise control with auto-distance maintenance) also has "crossing out of lane without signaling" detection (it beeps). This lane maintenance detection regularly goes off when the car moves across pavement joints. All the pothole patching crews are out right now, so it also regularly goes off these days as I cross actual dividing lines to go around repair crews.

These repair crews have men stationed at each end of the repair area with "Stop/Slow" signs, and I'm still very doubtful these self-driving cars can properly handle them. And a policeman with his hand up? Not a prayer.

Then a week or two ago there was the Tesla crash at 60 mph into the rear of a firetruck stopped at a red light. From Tesla in Autopilot mode sped up before crashing into stopped fire truck, police report says:

quote:
Data from the Model S electric sedan show the car picked up speed for 3.5 seconds before crashing into the fire truck in suburban Salt Lake City, the report said. The driver manually hit the brakes a fraction of a second before impact.

Police suggested that the car was following another vehicle and dropped its speed to 55 mph to match the leading vehicle. They say the leading vehicle then probably changed lanes and the Tesla automatically sped up to its preset speed of 60 mph without noticing the stopped cars ahead.


I've experienced the same thing in my minimally self-driving car. The cruise control feature is only to be used on the highway, but I use it everywhere. You can't trust it when the car in front of you moves out of your lane and the car tries to sync up with the next car in front. A sudden acceleration is common. Sometimes it detects the next car up and slows down again, sometimes not. And if the next car up is stopped at a light? Nothing. No braking. Sounds a lot like that Tesla that hit the rear of the firetruck. In my previous post I wrote about software systems today being amalgamations of software pieces from a variety of vendors. It isn't impossible that the bit of software responsible for that firetruck crash is the same as in my own car.
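
If I had to guess at the adaptive-cruise logic behind both incidents, it would look something like this C sketch: when the lead car leaves the lane and the stopped vehicle ahead is never registered as the new lead, the controller simply accelerates back to the preset speed. The names and structure are mine, not Tesla's and not my car's vendor's.

#include <stdio.h>

/* Hypothetical adaptive-cruise step: lead_speed < 0 means "no lead vehicle detected". */
static double cruise_target_speed(double set_speed, double lead_speed)
{
    if (lead_speed < 0.0)
        return set_speed;  /* nothing detected ahead: resume the preset speed */
    return (lead_speed < set_speed) ? lead_speed : set_speed;
}

int main(void)
{
    double set_speed = 60.0;  /* mph */

    /* Lead car doing 55 mph: match it. */
    printf("following: %.0f mph\n", cruise_target_speed(set_speed, 55.0));

    /* Lead car changes lanes; the stopped truck ahead is never reported as a
     * lead vehicle, so the controller sees an empty lane and speeds back up. */
    printf("lead gone: %.0f mph\n", cruise_target_speed(set_speed, -1.0));
    return 0;
}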

--Percy


    