EvC Forum
Topic: Self-Driving Cars
jar Member Posts: 33957 From: Texas!! Member Rating: 2.0
Or here to serve you...
Percy Member Posts: 20977 From: New Hampshire Member Rating: 3.4
This article contains the video of the pedestrian collision: Tempe Police release footage of fatal crash from inside self-driving Uber
It looks to me like the vehicle failed to detect the pedestrian. —Percy
Stile Member Posts: 4085 From: Ontario, Canada Member Rating: 5.2
Yes, I agree. Did we learn of any specifics along the way? If so, this seems to me like a failure of the AI that should be corrected. That is, I can understand why a human may not have been able to stop (or even slow down) at all in that allotted time frame. Could an AI have completely stopped the vehicle? I'm not sure. It depends on how good detection can/should be during night conditions.

1 - Possible issues in low-light (night) conditions. Perhaps a monitor of 'how bright' the headlights are needs to be implemented in order to verify that the AI is safe to be in control during low-light environments.

2 - Perhaps other ways to detect pedestrians/objects need to be implemented. Not just "visual" camera-type monitoring... but perhaps radio-like density checks, or radar-like bounce-back signals. Multiple means of detection would let you set up the computer with overlapping verification for object detection.

I still like the direction of AI vehicles. But on the face of it, this problem (not slowing down at all in this scenario) seems to be something that should have been solved "in the lab" before allowing the vehicle out on actual roads with actual people around - like with moving AI vehicles into bad rain or snowy weather. That is, with the information we currently have available on what happened, anyway.
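Stile's "overlapping verification" idea can be sketched very simply. This is a hypothetical illustration, not any vendor's actual fusion logic: require at least two of three independent sensing modalities to agree before an object is treated as confirmed, so a single blinded sensor (say, a camera at night) cannot suppress a detection on its own.

```python
# Hypothetical 2-of-3 sensor vote: an obstacle is confirmed when at least
# two independent sensors report it, so one failing modality (e.g. a
# camera in darkness) cannot veto the detection by itself.

def confirmed_obstacle(camera_hit: bool, radar_hit: bool, lidar_hit: bool) -> bool:
    """Return True when at least two of the three sensors agree."""
    return sum((camera_hit, radar_hit, lidar_hit)) >= 2

# A dark pedestrian: camera misses, but radar and lidar still see a return.
print(confirmed_obstacle(camera_hit=False, radar_hit=True, lidar_hit=True))  # True
```

The trade-off in such a scheme is between false negatives (requiring agreement can miss objects only one sensor sees) and false positives (a single noisy sensor triggering phantom braking), which is presumably why real systems weight sensors rather than vote so crudely.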
Percy Member Posts: 20977 From: New Hampshire Member Rating: 3.4
My slightly self-driving car (cruise control distance maintenance) does not detect anything moving across its path. If a car runs a stop sign my car will not brake. The Tesla accident in Florida where a man was killed also involved a vehicle moving across the car's path.
My car also will not brake for stopped cars. If I'm traveling down the highway on cruise control with no cars in front of me and then chance upon a traffic jam that isn't moving, I have to brake myself - the car will not. If I don't brake, then at the last minute it flashes up a big red "BRAKE" warning on the dashboard and sounds an audio chime.

In the same situation on the highway with no cars in front of me, if I catch up to moving cars, even cars moving very slowly, my car will brake. It is only stationary cars it won't brake for. My car also gets "confused" when one lane becomes two and the car that was in front of me chooses the other lane. My car seems slow to detect the new car in front and begins accelerating toward it.

From watching ads on TV I know other cars have better crash avoidance systems, and in fairness to my car, what it has is not a crash avoidance system - it's a cruise control distance maintenance system.

My opinion is beginning to lean toward not having fully autonomous vehicles. Crash avoidance systems alone should greatly reduce vehicle fatalities. I wonder if an autonomous vehicle can pull into a parking place at the supermarket or pull into my garage?

The systems used by Uber and Waymo and so forth use cameras, radar and lidar. From what I've read they would have had no trouble detecting that pedestrian, so software was likely responsible.

--Percy
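One commonly given explanation for why adaptive cruise systems ignore stopped cars is that radar measures relative speed: a return whose absolute speed is near zero looks just like roadside clutter (signs, bridges, parked cars), so many systems simply discard such returns. A minimal sketch of that filtering idea - the function name, sign convention and threshold are all invented for illustration, not any manufacturer's actual logic:

```python
# Illustrative sketch (not real ACC logic): radar reports each target's
# speed relative to our car. Adding our own (ego) speed gives the target's
# absolute speed; targets near zero absolute speed are indistinguishable
# from stationary roadside clutter, so they get dropped - which is why a
# stopped traffic jam produces no braking response.

def should_track(ego_speed_mps: float, relative_speed_mps: float,
                 min_abs_speed_mps: float = 2.0) -> bool:
    """Track a radar return only if its absolute speed exceeds a floor."""
    absolute_speed = ego_speed_mps + relative_speed_mps
    return abs(absolute_speed) > min_abs_speed_mps

# Ego at 30 m/s closing on a stopped car: relative speed -30, absolute 0.
print(should_track(30.0, -30.0))   # False: return discarded, no braking
# Slow car ahead doing 5 m/s: relative speed -25, absolute 5.
print(should_track(30.0, -25.0))   # True: tracked, the car will brake
```

This matches Percy's observation exactly: slow-moving cars are tracked, stationary ones are not.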
caffeine Member (Idle past 344 days) Posts: 1800 From: Prague, Czech Republic
A lot of talk in this thread treats computers as if they're magic - they can do everything at once, instantaneously, while a human is still distracted with changing the radio station.
Thing is, I think about my day-to-day experience of working with computers. Sure - they can do amazing things compared to us humans. I can feed in a dataset and get a load of statistics back that would be totally impractical to ever calculate by hand. But it's not instant. If the dataset's big enough or the cross-referencing complicated enough, I have to sit there for a few minutes while my screen goes grey, the little blue circle wiggles around and everything becomes unresponsive. Sometimes this lasts a looooong time.

I'd hope the computer driving me around town is more powerful than my work laptop, of course, but what sort of processing power and software is necessary to do all this magical tracking of everything that's happening around us while simultaneously downloading GPS satellite data and communicating with other smart cars, then using all this data to calculate optimum routes and speeds and initiate urgent evasive manoeuvres when necessary? Do we have consumer computers that can fit in cars and do that reliably? Not a rhetorical question - I'm curious if anyone knows.
Stile Member Posts: 4085 From: Ontario, Canada Member Rating: 5.2
A normal computer does many things. This is very different from an industrial computer. We are also in an age where a "normal computer" can be used as an "industrial computer" when designed and programmed correctly.
Surprisingly little. The typical 40-robot, 300-motion side-rail example I mentioned above would run off a program taking up less than 1 MB of file size.
No, not a consumer computer. My personal experience: I've worked as what's called a "Controls Specialist" for the last 17 years. I design, create, develop, commission, upgrade or fix industrial computers. Computers are not magic. But taking inputs, making decisions, and setting outputs can be extremely fast - on the order of microseconds. Perhaps they're not using industrial computers (or something similar) to run the AI in cars.
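The inputs-decisions-outputs cycle Stile describes is the classic "scan cycle" of an industrial controller (PLC). A toy sketch in Python - the sensor names and the 1 ms scan period are illustrative, not taken from any real system:

```python
# Toy sketch of the read-inputs -> solve-logic -> write-outputs "scan
# cycle" of an industrial controller. The light-curtain sensor and the
# 1 ms period are hypothetical examples.
import time

def scan_cycle(read_inputs, solve_logic, write_outputs, period_s=0.001):
    inputs = read_inputs()          # 1. sample every sensor at once
    outputs = solve_logic(inputs)   # 2. run the whole program on that snapshot
    write_outputs(outputs)          # 3. latch every actuator output
    time.sleep(period_s)            # 4. idle until the next scan

# One scan: stop the motor when an object breaks the light curtain.
state = {"motor": True}
scan_cycle(
    read_inputs=lambda: {"light_curtain_clear": False},
    solve_logic=lambda i: {"motor": i["light_curtain_clear"]},
    write_outputs=state.update,
)
print(state["motor"])  # False -> motor commanded off within a single scan
```

The point of the fixed cycle is determinism: every input is re-read and every output re-evaluated on each scan, so worst-case reaction time is bounded by the scan period rather than by whatever else the machine happens to be doing.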
Percy Member Posts: 20977 From: New Hampshire Member Rating: 3.4
Adding to what Stile says, the car's computer has to build a model of the world around it, as shown in this TED talk by Waymo's former director. I've positioned this to begin at the right spot. If you watch a minute or two you'll get the idea:
How often does the computer have to update the model? I'll make some basic assumptions: the fastest speed the system has to deal with is 200 mph, and the model must be updated often enough that nothing moves more than an inch between one frame of the model and the next. A simple calculation tells us that the model must be updated around 3500 times per second, or roughly once every .3 milliseconds.

3D graphical modeling is a solved problem, but as every gamer knows, if you're particular about the quality of the game simulations then you pay the big bucks for some serious graphical power. A standard PC can't smoothly handle the graphical requirements without a special graphics processor (GPU), the part of your computer made by NVidia, AMD, etc. Could a GPU deliver an updated view of the world every .3 milliseconds? I guess so, because NVidia has a webpage about their hardware: NVIDIA DRIVE: Scalable AI platform for Autonomous Driving. And here's a page about NVidia's Xavier processor: Introducing Xavier, the NVIDIA AI Supercomputer for the Future of Autonomous Transportation.

These webpages make references to AI and deep learning. The term "AI" resurfaces every few years as a marketing/promotional tool, even though nothing approaching true AI has been achieved. That being said, the NVidia pages are referring to learning through neural nets, a way of configuring autonomous car behavior by driving the car around. Feedback from the driver when he takes over would be especially valuable.

One article I read about Uber's autonomous efforts said that situations they found particularly difficult were construction zones and being adjacent to tall vehicles. I can imagine the difficulty of construction zones. Just yesterday I came across one of the most common construction situations in my neck of the woods: a utility vehicle was taking up half the road, cones were out, and two men were stationed at opposite ends of the vehicle with "Slow/Stop" signs so that traffic could alternately proceed in the single remaining lane. I believe it will be many years before autonomous vehicles can handle this situation. And how well will autonomous vehicles follow the dictum, "Don't hit the damn pothole that will throw my front end out of alignment"?

I don't know why Uber has trouble with tall vehicles. I imagine that's the case where you're next to a tractor trailer. Maybe they can't tell the difference between that and being inside a tunnel, or driving along a tall hedgerow (think Britain) or a tall wall. Can these GPUs tell the difference between an adjacent truck and an adjacent wall?

--Percy
Percy Member Posts: 20977 From: New Hampshire Member Rating: 3.4
Waymo and Uber and GM and all the rest have human drivers busily tooling about the country's roads logging hundreds of thousands of miles in an effort to fine tune the programming of their autonomous vehicles. It may be an exercise doomed to failure. More serious than the mere technical obstacles may be the human one: humans just aren't reliable instructors.
Waymo vehicles require human intervention on average every 5000 miles. During the periods when intervention isn't required, which is almost always, safety drivers are required to sit watchful, attentive and ready to intervene at an instant's notice. But research shows that on average people can only maintain vigilance for a monotonous task for less than half an hour. After the first half hour an autonomous vehicle's safety driver becomes less and less useful. He is likely letting situations go by where intervention would have improved the software. Source: The Unavoidable Folly of Making Humans Train Self-Driving Cars

In the full repertoire of situations a vehicle might encounter, if even one is unmanageable for autonomous vehicles then they are not autonomous, since they will require a safety driver. If all autonomous vehicles require safety drivers, then how are they any different from normal Ubers, Lyfts, Curbs and taxis? I think it will be many, many years before Waymo and the rest conquer the human-directing-traffic problem.

Yet despite this, many companies seem to have targeted Level 5, which means full autonomy. Why? Because of the handoff problem. This is the interval between when an autonomous vehicle decides it requires human intervention and when the human is actually ready to take over. Accepting control of a formerly autonomous vehicle is harder than it seems at first glance. Say you're fully absorbed in your iPad game, helping your hero maneuver along a cliff face while battling a dragon, when the alert buzzer goes off. How long before you even become aware of the buzzer? And when you finally do, how long before you register what it means, set the iPad down, look out at the road, put your hands on the wheel, position your foot on the pedals, and assess what is happening? Or even worse, say you're wearing your virtual reality helmet? And so, because of the handoff problem, many autonomous vehicle companies are pursuing Level 5 autonomy.
I think this is a mistake and can see two possible outcomes in the near term (the next few years):
Some companies like Audi think they can solve the handoff problem and are pursuing Level 3 autonomy. Sources:
In my opinion there will never be any such thing as a car with no steering wheel or pedals, because without them there would be no way to do simple maneuvers, like moving the car in the driveway that is blocking Uncle Fred.

I have read that Waymo has some driverless cars out there. This could only be possible on roads mapped down to the finest detail on which they know there is no construction or accidents. As soon as there's construction or an accident, driverless cars must reroute or pull to the side and park. But how fast can Waymo know that there's been an accident and that a passing motorist is already directing traffic? Here's a video that says Waymo has solved the problems of construction zones and emergency vehicles. I'm skeptical myself:

--Percy
Minnemooseus Member Posts: 3884 From: Duluth, Minnesota, U.S. (West end of Lake Superior)
It's been a few days since I watched the video and my computer is being balky about playing it now.
My impression: at least in what the camera captured, the headlights didn't seem to be covering the area very well, especially to the left side where the pedestrian was.
It also looks like the pedestrian failed to detect the vehicle. Why would someone walk out in front of a moving vehicle like that??? Busy texting???

Moose
NoNukes Inactive Member
The solution for this seems rather obvious: put several folks in the car and allow them to switch off as needed. This problem is not impossible to solve; solving it is just a little more expensive. If needed, you can enforce the limitation in software by preventing the machine from learning lessons when the supervisor is not at his best.
I think unfortunate rather than mistaken fits better here. I don't see a viable alternative.
I agree. But if that were the only limitation, then wow, what a fantastic change that could be for all of us! All of that driving time spent in productive activities while still retaining the point-to-point convenience of operating your own vehicle.
Percy Member Posts: 20977 From: New Hampshire Member Rating: 3.4
Yeah, weird. Seems like the headlights on the vehicle had a very short range. But Lidar all by itself should have detected the pedestrian. I saw some comments by industry experts who seemed to agree that it looked more like a software than a sensor problem.
We still have so little information on the pedestrian beyond her name. I saw some speculation that she was homeless. Maybe more information will become available soon. --Percy
Percy Member Posts: 20977 From: New Hampshire Member Rating: 3.4
I've read that safety drivers are supposed to keep their hands within an inch of the steering wheel, their foot next to the brake, and their eyes on the road. I can see driver rotation alleviating attention fatigue the first week or two on the job, but might not the sheer passivity and monotony of the exercise for week after week make the effective attention period for each shift shorter and shorter?
To pursuing Level 5? Wouldn't Level 2 be as effective in saving lives?
It'd be wonderful, but I wonder if the problem isn't more like AI than chess in difficulty. --Percy
Stile Member Posts: 4085 From: Ontario, Canada Member Rating: 5.2
I completely agree that, from the video anyway, it seems the lady could very easily have saved her own life. As my Dad told me when we talked about right/wrong and traffic legalities: "You don't want to be dead right."
Stile Member Posts: 4085 From: Ontario, Canada Member Rating: 5.2
I would be aiming for something like this:
A level of autonomy where the user is not legally obligated to control the vehicle. Now, that said, the vehicle will still have the ability for manual driving, although perhaps the wheel/pedals can be stored away and become available when required.

Vehicle identifies a need for manual control? Pull over and stop as safely as possible. If manual is activated before then... well, fine. If not, the vehicle sits at the side of the road until manual is engaged and the problem is dealt with.

The situation in question with the lady being hit would be under the control of the programming; I think technology/programming can deal with this situation. Construction? Any time you can identify that the car doesn't "know" exactly what it's doing... swap to manual by slowing down, pulling over and waiting for a person to take over. How does the car "know" exactly what it's doing? Can't program your vehicle so that it can safely pull over and stop?

That would be my very zoomed-out scope and goals, anyway. What level of autonomy would that be? Level 3? Level 4?
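Stile's scheme reads like a small state machine: drive autonomously, pull over when the system loses confidence in what it "knows," then wait for a human. A hypothetical sketch - the states, the confidence signal and the threshold are all invented for illustration:

```python
# Sketch of a pull-over-and-wait fallback policy. How a real system would
# compute "confidence" is an open question; here it is just a number.
from enum import Enum, auto

class Mode(Enum):
    AUTONOMOUS = auto()
    PULLING_OVER = auto()
    WAITING_FOR_MANUAL = auto()
    MANUAL = auto()

def next_mode(mode: Mode, confidence: float, driver_took_over: bool,
              stopped_at_roadside: bool, threshold: float = 0.9) -> Mode:
    if driver_took_over:
        return Mode.MANUAL                 # human may take over at any point
    if mode is Mode.AUTONOMOUS and confidence < threshold:
        return Mode.PULLING_OVER           # car doesn't "know" -> bail out safely
    if mode is Mode.PULLING_OVER and stopped_at_roadside:
        return Mode.WAITING_FOR_MANUAL     # sit at the roadside until engaged
    return mode

# Construction zone drops confidence: the car begins pulling over.
m = next_mode(Mode.AUTONOMOUS, confidence=0.5,
              driver_took_over=False, stopped_at_roadside=False)
print(m)  # Mode.PULLING_OVER
```

Because the fallback is "stop safely," not "hand control back while moving," this design sidesteps the handoff problem Percy describes, at the cost of the vehicle parking itself whenever it is confused.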
Stile Member Posts: 4085 From: Ontario, Canada Member Rating: 5.2
Just a few quick notes on this: The video may or may not be what "the program" is seeing. Ever notice that sometimes things look darker/lighter in pictures you take than in reality?
I find it strange that she doesn't seem to be aware of the vehicle at all. I don't see her turn to look at the camera/car before it hits her. Who wouldn't notice bright lights and big noise bearing down on you? Or is this one of those more-silent vehicles? Not that this moves the legal responsibility off the vehicle... but it does seem that "something" isn't right with this pedestrian. Perhaps a mental issue? Perhaps headphones? Perhaps an indignant "vehicles stop for me!" attitude? Perhaps a family emergency or other super-important issue is on this lady's mind at the moment? Something else?
Copyright 2001-2018 by EvC Forum, All Rights Reserved
Version 4.1
Innovative software from Qwixotic © 2022