Author Topic:   Self-Driving Cars
jar
Member (Idle past 394 days)
Posts: 34026
From: Texas!!
Joined: 04-20-2004


Message 76 of 142 (830095)
03-21-2018 2:21 PM
Reply to: Message 75 by 1.61803
03-21-2018 2:15 PM


Re: How Safe Are Autonomous Vehicles?
Or here to serve you...

My Sister's Website: Rose Hill Studios My Website: My Website

This message is a reply to:
 Message 75 by 1.61803, posted 03-21-2018 2:15 PM 1.61803 has not replied

  
Percy
Member
Posts: 22392
From: New Hampshire
Joined: 12-23-2000
Member Rating: 5.3


Message 77 of 142 (830129)
03-21-2018 10:09 PM


Video of the Pedestrian Collision
This article contains the video of the pedestrian collision: Tempe Police release footage of fatal crash from inside self-driving Uber
It looks to me like the vehicle failed to detect the pedestrian.
Percy

Replies to this message:
 Message 78 by Stile, posted 03-22-2018 10:17 AM Percy has replied
 Message 84 by Minnemooseus, posted 03-24-2018 8:10 PM Percy has replied

  
Stile
Member
Posts: 4295
From: Ontario, Canada
Joined: 12-02-2004


(1)
Message 78 of 142 (830136)
03-22-2018 10:17 AM
Reply to: Message 77 by Percy
03-21-2018 10:09 PM


Re: Video of the Pedestrian Collision
Percy writes:
It looks to me like the vehicle failed to detect the pedestrian.
Yes, I agree.
Did we learn of any specifics along the way?
I seem to remember that the vehicle did not slow down at all before the collision... does anyone else remember that or am I making that up?
If so, this seems to me like a failure of the AI that should be corrected.
That is, I can understand why a human may not have been able to stop (or slow down) at all in that allotted time frame.
But a computer? A computer should have at least started to hit the brakes before the pedestrian was hit in this scenario, if you ask me. Otherwise something is wrong: either the object was never detected, or no decision was made (threshold for what counts as an "object"? program corruption or error?), or the output (the connection to the brakes) malfunctioned.
Could an AI have completely stopped the vehicle? I'm not sure. Depends on how good detection can/should be during night conditions.
But begin to stop? Absolutely. And a big problem if it did not.
1 - Possible issues in low-light (night) conditions. Perhaps a monitor of 'how bright' the headlights needs to be implemented in order to verify that the AI is safe to be in control during low-light environments.
2 - Perhaps other ways to detect pedestrians/objects need to be implemented. Not just "visual" camera-type monitoring... but perhaps radio-like density checks, or radar-like bounce-back signals. Multiple means of detection would let you set up the computer with overlapping verification for object detection (a bare-bones sketch follows below).
Some will work better than others in different situations. Optical cameras may work best in daytime, nice-weather conditions. Perhaps radar-like technology would work better at night. Or maybe infra-red cameras or even heat detection.
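To make the overlapping-verification idea concrete, here is a bare-bones sketch with every detector name and threshold invented for illustration; it shows the shape of a 2-of-3 vote, not how any real vehicle's stack is written:

```python
# Hypothetical 2-of-3 voting across detector types (all names made up).
# Each detector independently reports whether it sees an obstacle ahead;
# requiring agreement from at least two guards against any single sensor
# type failing in conditions it handles poorly (e.g. a camera at night).

def obstacle_confirmed(camera_hit: bool, radar_hit: bool, lidar_hit: bool) -> bool:
    """True if at least two of the three detectors agree an obstacle is there."""
    return sum([camera_hit, radar_hit, lidar_hit]) >= 2

def braking_decision(camera_hit: bool, radar_hit: bool, lidar_hit: bool) -> str:
    if obstacle_confirmed(camera_hit, radar_hit, lidar_hit):
        return "BRAKE"
    if any([camera_hit, radar_hit, lidar_hit]):
        return "SLOW_AND_FLAG"   # sensors disagree: be cautious, log it for review
    return "CONTINUE"

# Example: the camera misses a pedestrian at night, but radar and lidar both report her.
print(braking_decision(camera_hit=False, radar_hit=True, lidar_hit=True))   # BRAKE
```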
I still like the direction of AI vehicles.
But I think this should be taken as a serious issue that requires serious re-programming or tweaking of the system as a whole before continuing.
"Low visibility at night" isn't exactly a new problem for driving. I would have suspected that this would have been accounted for before allowing AI vehicles to operate at night.
On the face of it, this problem (not slowing down at all in this scenario) seems to be something that should have been solved "in the lab" before allowing the vehicle out on actual roads with actual people around.
Like with moving AI vehicles into bad rain or snowy weather.
I like to assume that the solution isn't "well, it works good when it's sunny... let's send it out on busy streets and see what happens in a storm!"
Such things need to be tested in a lab-setting to at least reach certain acceptable thresholds before releasing them into live situations where people can get hurt.
Seems like basic due-diligence.
That is, with the information we have currently available on what happened, anyway.
Maybe we'll learn more soon.

This message is a reply to:
 Message 77 by Percy, posted 03-21-2018 10:09 PM Percy has replied

Replies to this message:
 Message 79 by Percy, posted 03-22-2018 11:47 AM Stile has seen this message but not replied
 Message 80 by caffeine, posted 03-22-2018 2:53 PM Stile has replied

  
Percy
Member
Posts: 22392
From: New Hampshire
Joined: 12-23-2000
Member Rating: 5.3


(1)
Message 79 of 142 (830140)
03-22-2018 11:47 AM
Reply to: Message 78 by Stile
03-22-2018 10:17 AM


Re: Video of the Pedestrian Collision
My slightly self-driving car (cruise control distance maintenance) does not detect anything moving across its path. If a car runs a stop sign my car will not brake. The Tesla accident in Florida where a man was killed also involved a vehicle moving across the car's path.
My car also will not brake for stopped cars. If I'm traveling down the highway on cruise control with no cars in front of me and then chance upon a traffic jam that isn't moving, I have to brake myself - the car will not brake. If I don't brake then at the last minute it does flash up a big red "BRAKE" visual warning on the dashboard and sounds an audio warning chime.
In the same situation on the highway with no cars in front of me, if I catch up to moving cars, even cars moving very slowly, my car will brake. It is only stationary cars it won't brake for.
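My guess (and it is only a guess, not anything from the car's documentation) is that radar-based cruise systems deliberately ignore returns that aren't moving over the ground, because a stopped car looks just like a sign, bridge, or parked car to the radar. A rough sketch of that kind of filter, with made-up names and thresholds:

```python
# Hypothetical stationary-target filter of the kind radar-based cruise
# systems are widely believed to use (an assumption here, not taken from
# any owner's manual). A target whose speed over the ground is near zero
# reads the same as roadside clutter (signs, bridges, parked cars), so
# the cruise logic ignores it and leaves the braking to the driver.

STATIONARY_THRESHOLD_MPH = 3.0   # illustrative cutoff

def tracked_by_cruise(ego_speed_mph: float, relative_speed_mph: float) -> bool:
    """Track a radar return only if the target appears to be moving over the ground."""
    target_ground_speed = ego_speed_mph + relative_speed_mph
    return abs(target_ground_speed) > STATIONARY_THRESHOLD_MPH

# Slow-moving traffic ahead (doing 10 mph) is tracked, so the car brakes for it:
print(tracked_by_cruise(ego_speed_mph=65, relative_speed_mph=-55))   # True
# A fully stopped traffic jam reads as stationary clutter and is ignored:
print(tracked_by_cruise(ego_speed_mph=65, relative_speed_mph=-65))   # False
```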
My car also gets "confused" when one lane becomes two and the car that was in front of me chooses the other lane. My car seems slow to detect the new car in front and begins accelerating toward it.
From watching ads on TV I know other cars have better crash avoidance systems, and in fairness to my car what it has is not a crash avoidance system - it's a cruise control distance maintenance system.
My opinion is beginning to lean toward not having fully autonomous vehicles. Crash avoidance systems alone should greatly reduce vehicle fatalities.
I wonder if an autonomous vehicle can pull into a parking place at the supermarket or pull into my garage?
The systems used by Uber and Waymo and so forth use cameras, radar and lidar. From what I've read they would have had no trouble detecting that pedestrian, so software was likely responsible.
--Percy

This message is a reply to:
 Message 78 by Stile, posted 03-22-2018 10:17 AM Stile has seen this message but not replied

  
caffeine
Member (Idle past 1025 days)
Posts: 1800
From: Prague, Czech Republic
Joined: 10-22-2008


(1)
Message 80 of 142 (830149)
03-22-2018 2:53 PM
Reply to: Message 78 by Stile
03-22-2018 10:17 AM


Re: Video of the Pedestrian Collision
A lot of the talk in this thread seems to treat computers as if they're magic - they can do everything at once instantaneously while a human is still distracted with changing the radio station.
Thing is, I think about my day-to-day experience of working with computers. Sure - they can do amazing things compared to us humans. I can feed in a dataset and get a load of statistics back that would be totally impractical to ever calculate by hand. But it's not instant. If the dataset's big enough or the cross-referencing complicated enough I have to sit there for a few minutes while my screen goes grey, the little blue circle wiggles around and everything becomes unresponsive. Sometimes this lasts a looooong time.
I'd hope the computer driving me around town is more powerful than my work laptop of course, but what sort of processing power and software is necessary to do all this magical tracking of everything that's happening around us while simultaneously downloading GPS satellite data and communicating with other smart cars; then using all this data to calculate optimum routes and speeds and initiate urgent evasive manoeuvres when necessary? Do we have consumer computers that can fit in cars and do that reliably? Not a rhetorical question - I'm curious if anyone knows.

This message is a reply to:
 Message 78 by Stile, posted 03-22-2018 10:17 AM Stile has replied

Replies to this message:
 Message 81 by Stile, posted 03-22-2018 3:27 PM caffeine has replied

  
Stile
Member
Posts: 4295
From: Ontario, Canada
Joined: 12-02-2004


(1)
Message 81 of 142 (830151)
03-22-2018 3:27 PM
Reply to: Message 80 by caffeine
03-22-2018 2:53 PM


Re: Video of the Pedestrian Collision
caffeine writes:
If the dataset's big enough or the cross-referencing complicated enough I have to sit there for a few minutes while my screen goes grey, the little blue circle wiggles around and everything becomes unresponsive. Sometimes this lasts a looooong time.
A normal computer does many things.
Even if you've only requested it do one thing... Windows (or whatever operating system, some are better, some are worse) is always doing many things in the background simply sustaining the system and all the other things it's capable of.
This is very different from an industrial computer.
Or, really, a computer that's specifically programmed/constructed to read inputs, make decisions, and then write outputs.
It doesn't do anything else, and it's very good at what it does.
A typical scan time (reading inputs, making the decisions, and then writing outputs) for a very large system (say, controlling 40 robots and 300 motions in order to build a side rail for a vehicle) would be less than 100 ms. Sometimes as quick as 10 ms or even faster.
This is a "typical" system that is not meant for high-speed applications.
A system built for high-speed applications (think of fast conveyor belts flying by very quickly) can be designed using interrupts (specifically run a motion due to an input, regardless of the rest of the program) that can be 100 to 1,000 times faster than that (maybe even more?), depending on how much money you'd like to put in.
We are also in an age where a "normal computer" can be used as an "industrial computer" when designed and programmed correctly.
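To illustrate the shape of that scan cycle, here's a minimal sketch; real industrial controllers run compiled ladder logic or structured text rather than Python, and all the names below are invented:

```python
import time

# Illustrative shape of the scan cycle described above: read every input,
# evaluate all the logic, write every output, repeat. Only the structure
# of the loop is the point here.

def read_inputs():
    # sample sensors, proximity switches, encoder positions, etc.
    return {"part_present": True, "clamp_closed": False}

def evaluate_logic(inputs):
    # every decision the controller makes happens here; nothing else runs
    return {"close_clamp": inputs["part_present"] and not inputs["clamp_closed"]}

def write_outputs(outputs):
    # energize valves, start motors, etc. (stubbed out here)
    pass

for _ in range(5):                        # a real controller loops forever
    start = time.perf_counter()
    outputs = evaluate_logic(read_inputs())
    write_outputs(outputs)
    scan_ms = (time.perf_counter() - start) * 1000
    print(f"scan time: {scan_ms:.3f} ms")  # a dedicated controller keeps this small
```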
I'd hope the computer driving me around town is more powerful than my work laptop of course, but what sort of processing power and software is necessary to do all this magical tracking of everything that's happening around us while simultaneously downloading GPS satellite data and communicating with other smart cars
Surprisingly little.
The bulk of your computer's processing is running stuff it doesn't really need to run... just stuff you want it to be running so that it's ready for your day-to-day stuff at any time.
Strip all that away, design and program it specifically for the task at hand and nothing more, and things get much faster. A lot faster.
The typical 40 robot, 300 motion side-rail example I mentioned above would run off a program taking up less than 1 MB of file size.
The maximum storage of the entire industrial computer would even be in the 2-to-4 MB range.
It's not a very "powerful" computer, it's simply a specifically, efficiently designed one used for the tasks it's programmed for and nothing else.
Do we have consumer computers that can fit in cars and do that reliably?
No. Not a consumer computer.
But industrial computers can do it very reliably and very quickly.
My personal experience:
I've worked as what's called a "Controls Specialist" for the last 17 years.
I design, create, develop, commission, upgrade or fix industrial computers.
I've worked in:
-the auto industry (robots, welding, pneumatic motions and basic sensors).
-the packaging industry (robots, high-speed movement, labelling, vision-detection/sensing).
-the food and beverage industry (basic process control - temperature, pressure, fluid levels..., government verification of certain procedures such as milk pasteurization)
-the pharmaceutical industry (robots, high-speed rotating tablet presses, granulation, process control)
Computers are not magic.
But taking inputs, making decisions, and setting outputs can be extremely fast. On the order of microseconds.
I've seen systems react to things much, much faster than would be needed to react to that person showing up on the street in front of the car.
Perhaps they're not using industrial computers (or something similar) to run AI in cars.
Maybe there is a bunch of bloat in it that will cause the types of delays you're talking about.
I've never worked on AI before.
But I don't think so. The AI industry is so big and has so much potential that they must have experts doing these sorts of things - people who are aware of what computers are capable of when used correctly and specifically.

This message is a reply to:
 Message 80 by caffeine, posted 03-22-2018 2:53 PM caffeine has replied

Replies to this message:
 Message 82 by Percy, posted 03-24-2018 12:23 PM Stile has seen this message but not replied
 Message 94 by caffeine, posted 03-26-2018 4:34 PM Stile has replied

  
Percy
Member
Posts: 22392
From: New Hampshire
Joined: 12-23-2000
Member Rating: 5.3


(1)
Message 82 of 142 (830183)
03-24-2018 12:23 PM
Reply to: Message 81 by Stile
03-22-2018 3:27 PM


Re: Video of the Pedestrian Collision
Adding to what Stile says, the car's computer has to build a model of the world around it, as shown in this TED talk by Waymo's former director. I've positioned this to begin at the right spot. If you watch a minute or two you'll get the idea:
How often does the computer have to update the model? I'll make some basic assumptions. The fastest speed the system has to deal with is 200 mph, and the model must be updated often enough that nothing moves more than an inch between one frame of the model and the next. A simple calculation tells us that the model must be updated around 3,500 times per second, or roughly once every 0.3 milliseconds.
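For anyone who wants to check the arithmetic, here it is written out, using only the assumptions stated above:

```python
# Worked version of the update-rate estimate above.
MAX_SPEED_MPH = 200        # fastest closing speed the model must handle
MAX_MOVE_INCHES = 1.0      # nothing may move more than an inch between updates

inches_per_second = MAX_SPEED_MPH * 5280 * 12 / 3600    # 3,520 inches/second
updates_per_second = inches_per_second / MAX_MOVE_INCHES
interval_ms = 1000 / updates_per_second

print(f"{updates_per_second:.0f} updates/second, one every {interval_ms:.2f} ms")
# prints: 3520 updates/second, one every 0.28 ms
```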
3D graphical modelling is a solved problem, but as every gamer knows if you're particular about the quality of the game simulations then you pay the big bucks for some serious graphical power. A standard PC can't smoothly handle the graphical requirements without a special graphics processor (GPU), the part of your computer made by Gigabyte, NVidia, etc.
Could a GPU deliver an updated view of the world every 0.3 milliseconds? I guess so, because NVidia has a webpage about their hardware: NVIDIA DRIVE: Scalable AI platform for Autonomous Driving
And here's a page about NVidia's Xavier GPU: Introducing Xavier, the NVIDIA AI Supercomputer for the Future of Autonomous Transportation
These webpages make references to AI and deep learning. The term "AI" resurfaces every few years as a marketing/promotional tool, even though nothing approaching true AI has been achieved. That being said, the NVidia pages are referring to learning through neural nets, a way of configuring autonomous car behavior by driving the car around. Feedback from the driver when he takes over would be especially valuable.
One article I read about Uber's autonomous efforts said that situations they found particularly difficult were construction zones and being adjacent to tall vehicles. I can imagine the difficulty of construction zones. Just yesterday I came across one of the most common construction situations in my neck of the woods. A utility vehicle was taking up half the road. Cones were out, and two men were stationed at opposite ends of the vehicle with "Slow/Stop" signs so that traffic could alternately proceed in the single remaining lane. I believe it will be many years before autonomous vehicles can handle this situation.
And how well will autonomous vehicles follow the dictum, "Don't hit the damn pothole that will throw my front end out of alignment"?
I don't know why Uber has trouble with tall vehicles. I imagine that's the case where you're next to a tractor trailer. Maybe they can't tell the difference between that and being inside a tunnel or driving along a tall hedgerow (think Britain) or along a tall wall. Can these GPU's tell the difference between an adjacent truck versus an adjacent wall?
--Percy

This message is a reply to:
 Message 81 by Stile, posted 03-22-2018 3:27 PM Stile has seen this message but not replied

  
Percy
Member
Posts: 22392
From: New Hampshire
Joined: 12-23-2000
Member Rating: 5.3


Message 83 of 142 (830190)
03-24-2018 2:04 PM


The Downside of Human Instructors
Waymo and Uber and GM and all the rest have human drivers busily tooling about the country's roads logging hundreds of thousands of miles in an effort to fine tune the programming of their autonomous vehicles. It may be an exercise doomed to failure. More serious than the mere technical obstacles may be the human one: humans just aren't reliable instructors.
Waymo vehicles require human intervention on average every 5000 miles. During the periods when intervention isn't required, which is almost all the time, they are expected to sit watchful, attentive, and ready to intervene at an instant's notice.
But research shows that, on average, people can maintain vigilance for a monotonous task for less than half an hour. After the first half hour an autonomous vehicle's safety driver becomes less and less useful. He is likely letting situations go by where intervention would have improved the software.
Source: The Unavoidable Folly of Making Humans Train Self-Driving Cars
In the full repertoire of situations a vehicle might encounter, if even only one is unmanageable for autonomous vehicles then they are not autonomous, since they will require a safety driver. If all autonomous vehicles require safety drivers, then how are they any different from normal Ubers, Lyfts, Curbs and taxis? I think it will be many, many years before Waymo and the rest conquer the human-directing-traffic problem.
Yet despite this many companies seem to have targeted Level 5, which means full autonomy. Why? Because of the handoff problem. This is the interval between when an autonomous vehicle decides it requires human intervention and when the human is actually ready to take over. Accepting control of a formerly autonomous vehicle is harder than it seems at first glance. Say you're fully absorbed in your iPad game, helping your hero maneuver along a cliff face while battling a dragon, when the alert buzzer goes off. How long before you even become aware of the buzzer? And when you finally do, how long before you register what it means, set the iPad down, look out at the road, put your hands on the wheel, position your foot on the pedals, and assess what is happening?
Or even worse, say you're wearing your virtual reality helmet?
And so because of the handoff problem, many autonomous vehicle companies are pursuing Level 5 autonomy. I think this is a mistake and can see two possible outcomes in the near term (the next few years):
  1. This one I think unlikely. Eventually, after the expenditure of many billions of dollars, the autonomous car industry realizes that Level 5 autonomy is not achievable for at least the next 20 or 30 years and backs off to Level 2 autonomy. That's basically lane maintenance, stopping distance maintenance, and crash avoidance. Vehicular deaths will plummet.
  2. This is the one I think likely. Because billions of dollars are involved government will be helpless in preventing the industry from moving ahead with Level 5 autonomy. Vehicular deaths will plummet, but construction zones and humans-directing-traffic will be an ongoing problem that will take many years to solve.
Some companies like Audi think they can solve the handoff problem and are pursuing Level 3 autonomy.
Sources:
In my opinion there will never be any such thing as a car with no steering wheel or pedals, because without them there would be no way to do simple maneuvers, like moving the car in the driveway that is blocking Uncle Fred.
I have read that Waymo has some driverless cars out there. This could only be possible on roads mapped down to the finest detail on which they know there is no construction or accidents. As soon as there's construction or an accident then driverless cars must reroute or pull to the side and park. But how fast can Waymo know that there's been an accident and that a passing motorist is already directing traffic?
Here's a video that says Waymo has solved the problems of construction zones and emergency vehicles. I'm skeptical myself:
--Percy
Edited by Percy, : Typo.

Replies to this message:
 Message 85 by NoNukes, posted 03-24-2018 8:36 PM Percy has replied
 Message 89 by Stile, posted 03-26-2018 12:36 PM Percy has replied

  
Minnemooseus
Member
Posts: 3941
From: Duluth, Minnesota, U.S. (West end of Lake Superior)
Joined: 11-11-2001
Member Rating: 10.0


Message 84 of 142 (830199)
03-24-2018 8:10 PM
Reply to: Message 77 by Percy
03-21-2018 10:09 PM


Re: Video of the Pedestrian Collision
It's been a few days since I watched the video and my computer is being balky about playing it now.
My impression: at least in what the camera captured, the headlights didn't seem to be covering the area very well, especially to the left side where the pedestrian was.
It looks to me like the vehicle failed to detect the pedestrian.
It also looks like the pedestrian failed to detect the vehicle. Why would someone walk out in front of a moving vehicle like that??? Busy texting???
Moose
Edited by Minnemooseus, : Typo.

Professor, geology, Whatsamatta U
Evolution - Changes in the environment, caused by the interactions of the components of the environment.
"Do not meddle in the affairs of cats, for they are subtle and will piss on your computer." - Bruce Graham
"The modern conservative is engaged in one of man's oldest exercises in moral philosophy; that is, the search for a superior moral justification for selfishness." - John Kenneth Galbraith
"Yesterday on Fox News, commentator Glenn Beck said that he believes President Obama is a racist. To be fair, every time you watch Glenn Beck, it does get a little easier to hate white people." - Conan O'Brien
"I know a little about a lot of things, and a lot about a few things, but I'm highly ignorant about everything." - Moose

This message is a reply to:
 Message 77 by Percy, posted 03-21-2018 10:09 PM Percy has replied

Replies to this message:
 Message 86 by Percy, posted 03-24-2018 8:53 PM Minnemooseus has seen this message but not replied
 Message 88 by Stile, posted 03-26-2018 12:22 PM Minnemooseus has seen this message but not replied

  
NoNukes
Inactive Member


Message 85 of 142 (830200)
03-24-2018 8:36 PM
Reply to: Message 83 by Percy
03-24-2018 2:04 PM


Re: The Downside of Human Instructors
But research shows that, on average, people can maintain vigilance for a monotonous task for less than half an hour. After the first half hour an autonomous vehicle's safety driver becomes less and less useful. He is likely letting situations go by where intervention would have improved the software.
The solution for this seems rather obvious. Put several folks in the car and let them switch off as needed. This problem is not impossible to solve; it's just a little more expensive. If needed, you can enforce the limitation in software by preventing the machine from learning lessons when the supervisor is not at his best.
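Here is a rough sketch of what that software gate might look like; the 30-minute window comes from the research Percy cited, and everything else is invented for illustration:

```python
from datetime import timedelta

# Hypothetical gate: feedback gathered after the current supervisor's
# assumed attention window has expired is simply not used for training.

ATTENTION_WINDOW = timedelta(minutes=30)

def usable_samples(samples):
    """samples: list of (time_since_shift_start, observation) pairs."""
    return [obs for elapsed, obs in samples if elapsed <= ATTENTION_WINDOW]

shift_log = [
    (timedelta(minutes=5),  "clean merge, no intervention"),
    (timedelta(minutes=50), "possible missed intervention"),
]
print(usable_samples(shift_log))   # only the first observation is kept
```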
And so because of the handoff problem, many autonomous vehicle companies are pursuing Level 5 autonomy. I think this is a mistake and can see two possible outcomes in the near term (the next few years):
I think unfortunate rather than mistaken fits better here. I don't see a viable alternative.
In my opinion there will never be any such thing as a car with no steering wheel or pedals, because without them there would be no way to do simple maneuvers, like moving the car in the driveway that is blocking Uncle Fred.
I agree. But if that were the only limitation, then wow, what a fantastic change that could be for all of us! All of that driving time spent in productive activities while still retaining the point to point convenience of operating your own vehicle.

Under a government which imprisons any unjustly, the true place for a just man is also in prison. Thoreau: Civil Disobedience (1846)
"Give me your tired, your poor, your huddled masses yearning to breathe free, the wretched refuse of your teeming shore. Send these, the homeless, tempest-tossed to me, I lift my lamp beside the golden door!
We got a thousand points of light for the homeless man. We've got a kinder, gentler, machine gun hand. Neil Young, Rockin' in the Free World.
Worrying about the "browning of America" is not racism. -- Faith
I hate you all, you hate me -- Faith

This message is a reply to:
 Message 83 by Percy, posted 03-24-2018 2:04 PM Percy has replied

Replies to this message:
 Message 87 by Percy, posted 03-24-2018 9:08 PM NoNukes has replied

  
Percy
Member
Posts: 22392
From: New Hampshire
Joined: 12-23-2000
Member Rating: 5.3


Message 86 of 142 (830204)
03-24-2018 8:53 PM
Reply to: Message 84 by Minnemooseus
03-24-2018 8:10 PM


Re: Video of the Pedestrian Collision
Minnemooseus writes:
My impression: At least what the camera captured, the headlights didn't seem to be covering the area very well, especially to the left side where the pedestrian was.
Yeah, weird. Seems like the headlights on the vehicle had a very short range. But Lidar all by itself should have detected the pedestrian. I saw some comments by industry experts who seemed to agree that it looked more like a software than a sensor problem.
It also looks like the pedestrian failed to detect the vehicle. Why would someone walk out in front of a moving vehicle like that??? Busy texting???
We still have so little information on the pedestrian beyond her name. I saw some speculation that she was homeless. Maybe more information will become available soon.
--Percy

This message is a reply to:
 Message 84 by Minnemooseus, posted 03-24-2018 8:10 PM Minnemooseus has seen this message but not replied

Replies to this message:
 Message 90 by Stile, posted 03-26-2018 12:44 PM Percy has seen this message but not replied

  
Percy
Member
Posts: 22392
From: New Hampshire
Joined: 12-23-2000
Member Rating: 5.3


Message 87 of 142 (830205)
03-24-2018 9:08 PM
Reply to: Message 85 by NoNukes
03-24-2018 8:36 PM


Re: The Downside of Human Instructors
NoNukes writes:
But research shows that, on average, people can maintain vigilance for a monotonous task for less than half an hour. After the first half hour an autonomous vehicle's safety driver becomes less and less useful. He is likely letting situations go by where intervention would have improved the software.
The solution for this seems rather obvious. Put several folks in the car and let them switch off as needed. This problem is not impossible to solve; it's just a little more expensive. If needed, you can enforce the limitation in software by preventing the machine from learning lessons when the supervisor is not at his best.
I've read that safety drivers are supposed to keep their hands within an inch of the steering wheel, their foot next to the brake, and their eyes on the road. I can see driver rotation alleviating attention fatigue the first week or two on the job, but might not the sheer passivity and monotony of the exercise for week after week make the effective attention period for each shift shorter and shorter?
And so because of the handoff problem, many autonomous vehicle companies are pursuing Level 5 autonomy. I think this is a mistake and can see two possible outcomes in the near term (the next few years):
I think unfortunate rather than mistaken fits better here. I don't see a viable alternative.
To pursuing Level 5? Wouldn't Level 2 be as effective in saving lives?
In my opinion there will never be any such thing as a car with no steering wheel or pedals, because without them there would be no way to do simple maneuvers, like moving the car in the driveway that is blocking Uncle Fred.
I agree. But if that were the only limitation, then wow, what a fantastic change that could be for all of us! All of that driving time spent in productive activities while still retaining the point to point convenience of operating your own vehicle.
It'd be wonderful, but I wonder if the problem isn't more like AI than chess in difficulty.
--Percy

This message is a reply to:
 Message 85 by NoNukes, posted 03-24-2018 8:36 PM NoNukes has replied

Replies to this message:
 Message 95 by NoNukes, posted 03-26-2018 5:25 PM Percy has replied

  
Stile
Member
Posts: 4295
From: Ontario, Canada
Joined: 12-02-2004


(1)
Message 88 of 142 (830288)
03-26-2018 12:22 PM
Reply to: Message 84 by Minnemooseus
03-24-2018 8:10 PM


Re: Video of the Pedestrian Collision
Minnemooseus writes:
It also looks like the pedestrian failed to detect the vehicle. Why would someone walk out in front of a moving vehicle like that??? Busy texting???
I completely agree that, from the video anyway, it seems the lady could have very easily saved her own life.
Of course, though, the vehicle would be "at fault" in this vehicle-pedestrian collision as pedestrians always have the right-of-way (as far as I'm aware?).
As my Dad told me when talking about the right/wrong and traffic legalities: "You don't want to be dead right."

This message is a reply to:
 Message 84 by Minnemooseus, posted 03-24-2018 8:10 PM Minnemooseus has seen this message but not replied

  
Stile
Member
Posts: 4295
From: Ontario, Canada
Joined: 12-02-2004


Message 89 of 142 (830290)
03-26-2018 12:36 PM
Reply to: Message 83 by Percy
03-24-2018 2:04 PM


Re: The Downside of Human Instructors
I would be aiming for something like this:
A level of autonomy where the user is not legally obligated to control the vehicle.
You get in, tell it where you want to go, and it takes you there.
Now, that said, the vehicle will still have the ability for manual driving. Although perhaps wheel/pedals can be stored away and become available when required.
Vehicle identifies need for manual? Pull over and stop as safely as possible. If manual is activated before then... well, fine. If not, vehicle sits at side of road until manual is engaged and problem is dealt with.
This situation in question, with the lady being hit, would be under the control of the programming. I think technology/programming can deal with this situation.
Perhaps I'm wrong, but I certainly think it's a goal we can aim for in the very short term.
Construction?
Person directing traffic?
Maybe autonomy only works on highways and exit is coming up?
-Any time you can identify that the car doesn't "know" exactly what it's doing... swap to manual by slowing down and pulling over and waiting for person to take over.
How does the car "know" exactly what it's doing? (a rough sketch follows below)
-program parameters to verify vision system is working well enough (lighting? identification of horizon? identification of current lane and surrounding objects?)
-program parameters to verify inputs are working (redundant cameras/sensors... compare input and if they are off by too much... not good enough to continue)
-program parameters to verify outputs are functioning (when used, monitor state of output and verify it's within acceptable parameters)
-program parameters to verify code is not corrupted (run two identical logic routines... stored in different locations, this way system can verify it's running itself correctly... like a redundant system... this is how "safety computers" work in industrial automation).
Can't program your vehicle so that it can safely pull over and stop?
-Then you should not be making an autonomous vehicle in the first place
-This should be highest/top priority to figure out. If you can't slow, pull over and stop under various/unknown conditions... how do you expect to go 200 mph and make reliable decisions?
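To pull those checks together, here is a compressed sketch; every name is invented, and the only point is that any failed check funnels into the same top-priority pull-over behavior:

```python
# Hypothetical self-check structure. All names are made up for illustration;
# if any check fails, the only remaining job is the top-priority maneuver
# of slowing down, pulling over, stopping, and waiting for a human.

def system_healthy(status: dict) -> bool:
    checks = [
        status["vision_confidence_ok"],    # lighting, horizon, lane/object identification
        status["redundant_inputs_agree"],  # duplicate cameras/sensors compared against each other
        status["outputs_verified"],        # actuator feedback within expected range when used
        status["logic_copies_agree"],      # two identical routines, stored separately, agree
    ]
    return all(checks)

def control_decision(status: dict) -> str:
    if system_healthy(status) and status["situation_understood"]:
        return "CONTINUE_AUTONOMOUS"
    return "PULL_OVER_STOP_AND_REQUEST_MANUAL"   # the one maneuver that must always work

# Example: one redundant sensor pair disagrees, so the car pulls over.
status = {
    "vision_confidence_ok": True,
    "redundant_inputs_agree": False,
    "outputs_verified": True,
    "logic_copies_agree": True,
    "situation_understood": True,
}
print(control_decision(status))   # PULL_OVER_STOP_AND_REQUEST_MANUAL
```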
That would be my very zoomed-out scope and goals, anyway.
What level of autonomy would that be? Level 3? Level 4?

This message is a reply to:
 Message 83 by Percy, posted 03-24-2018 2:04 PM Percy has replied

Replies to this message:
 Message 92 by Percy, posted 03-26-2018 3:16 PM Stile has replied

  
Stile
Member
Posts: 4295
From: Ontario, Canada
Joined: 12-02-2004


Message 90 of 142 (830291)
03-26-2018 12:44 PM
Reply to: Message 86 by Percy
03-24-2018 8:53 PM


Re: Video of the Pedestrian Collision
Percy writes:
Seems like the headlights on the vehicle had a very short range.
Just a few quick notes on this:
The video may or may not be what "the program" is seeing.
The video may be stored at a lower quality so that review is possible without overloading the system with too much data storage, while a full "hi-res" version might be used by the program.
Ever notice that sometimes things look darker/lighter in pictures you take than in reality?
-Something like this may be going on.
-Perhaps the headlights do light up a lot more if you were in the car looking out the windshield, but maybe this is all the camera was able to pick up.
We still have so little information on the pedestrian beyond her name. I saw some speculation that she was homeless. Maybe more information will become available soon.
I find it strange that she doesn't seem to be aware of the vehicle at all. I don't see her turn to look at the camera/car before it hits her. Who wouldn't notice bright lights and big noise bearing down on you? Or is this one of those more-silent vehicles? Not that this moves the legal responsibility off the vehicle... but it does seem that "something" isn't right with this pedestrian. Perhaps a mental issue? Perhaps headphones? Perhaps an indignant "vehicles stop for me!" attitude? Perhaps a family emergency or other super-important issue is on this lady's mind at the moment? Something else?

This message is a reply to:
 Message 86 by Percy, posted 03-24-2018 8:53 PM Percy has seen this message but not replied

Replies to this message:
 Message 91 by ringo, posted 03-26-2018 1:04 PM Stile has seen this message but not replied

  