Understanding through Discussion




Author Topic:   Self-Driving Cars
Percy
Member
Posts: 18871
From: New Hampshire
Joined: 12-23-2000
Member Rating: 2.4


Message 1 of 142 (767510)
08-30-2015 9:32 AM


We've touched on the topic of self-driving cars in several threads recently, and this morning I came across a TED talk given this past July by Chris Urmson, director of Google's self-driving car program:

Urmson describes what Google self-driving cars "see" when they're making decisions about how to react, and explains a little bit about how they're solving some of the more difficult problems. Pretty interesting stuff.

Urmson makes it seem like they can unleash their self-driving cars anywhere in the country, but this article from a year ago, also based on information from Urmson, says that anywhere Google's cars go must be microscopically mapped, "vastly more effort than what's needed for Google Maps": Hidden Obstacles for Google’s Self-Driving Cars.

How receptive will the American public be to self-driving cars? Right now the answer seems to be, "pretty receptive," but how will public opinion react the first time a self-driving car causes a 13-car pileup with fatalities on the interstate?

Let me put the question in concrete terms. Currently about 33,000 people die each year on American roads. Let's say that self-driving cars cut that number to 10,000, but that almost all of those 10,000 deaths are due to failures in the self-driving technology. Will the American public accept that? They've behaved irrationally before; they could again.

--Percy


Replies to this message:
 Message 2 by Capt Stormfield, posted 08-30-2015 11:33 AM Percy has responded
 Message 9 by Jon, posted 08-30-2015 1:42 PM Percy has acknowledged this reply
 Message 25 by Thugpreacha, posted 09-01-2015 2:07 AM Percy has acknowledged this reply

    
Percy


Message 3 of 142 (767522)
08-30-2015 12:54 PM
Reply to: Message 2 by Capt Stormfield
08-30-2015 11:33 AM


Capt Stormfield writes:

I don't accept the premise that the choice is irrational. Given the existence of a skill set that helps avoid accidents, it is quite rational for individuals who possess that skill set to choose their own judgement over a random chance of technical failure.

Watch these accidents for a few minutes. Some of the accidents are sheer idiocy, but a number of them are not avoidable no matter how elite one's skill set:

With self-driving cars, not only are you protected against your own idiocy, but everyone else's, too. For example, here's one where the driver is waiting to turn left:

How does one avoid an accident when one isn't moving? Had he suddenly accelerated to the right he would have struck the car passing him on his right. How does the van avoid this one:

How does the dark sedan on the right avoid the little pickup:

How does the oncoming traffic avoid this:

Or how often will someone be vigilant enough to avoid this car going through a red light:

Here's a similar red light situation at night:

--Percy


This message is a reply to:
 Message 2 by Capt Stormfield, posted 08-30-2015 11:33 AM Capt Stormfield has responded

Replies to this message:
 Message 5 by Capt Stormfield, posted 08-30-2015 1:05 PM Percy has responded
 Message 13 by ringo, posted 08-30-2015 3:07 PM Percy has acknowledged this reply

    
Percy


Message 16 of 142 (767585)
08-30-2015 5:46 PM
Reply to: Message 5 by Capt Stormfield
08-30-2015 1:05 PM


Capt Stormfield writes:

I didn't say all accidents were avoidable, but many are.

I agree with you that many accidents are avoidable, but the accidents I indicated in that video show that many are not avoidable by even the most elite of drivers.

As you state in the part of the post I responded to, there will also be accidents with driverless cars.

But many, many fewer. I suggested the example of reducing annual motor vehicle deaths from 33,000 to 20,000, and that's a very conservative estimate. If self-driving cars were the norm the toll would be much lower still. Yes, there will still be accidents with self-driving cars, but what do you think annual fatalities would be if all cars were self-driving? Don't you think it would be a very small number, say in the low thousands? Pedestrian fatalities are included in the motor vehicle fatality figures, and we shouldn't forget that they, too, would see dramatic reductions. As you saw in that TED video, the Google car does an excellent job tracking pedestrians.

I think the desire to retain as much control as possible over one's fate is one of the better parts of human nature, not an irrationality.

I agree that a desire for control is part of human nature, but it isn't often claimed that human nature is rational. I also agree that many will still want to drive their own cars even after self-driving cars are available, but they won't be safer than self-driving cars, and if safety is the criterion then wanting to drive anyway won't be rational, either.

--Percy


This message is a reply to:
 Message 5 by Capt Stormfield, posted 08-30-2015 1:05 PM Capt Stormfield has not yet responded

    
Percy


Message 18 of 142 (767587)
08-30-2015 6:11 PM
Reply to: Message 17 by NoNukes
08-30-2015 5:54 PM


Re: Idiots
NoNukes writes:

Perhaps. But in the meantime, who gets sued when your autobot hits a 1st year medical student, and the damages are calculated to be an entire doctor's career worth of earnings? You? GM?

Interesting question. Insurance will play a role, and the details of car insurance will likely evolve with the arrival of self-driving cars. What could an occupant of a self-driving car do to be held responsible for an accident? GM seems a more likely possibility (assuming it's not a Ford).

Actually there is a lot of stuff written about this subject already. The discussion here just pokes around at a lot of low hanging fruit.

Do you have some useful links?

--Percy


This message is a reply to:
 Message 17 by NoNukes, posted 08-30-2015 5:54 PM NoNukes has responded

Replies to this message:
 Message 20 by NoNukes, posted 08-30-2015 8:48 PM Percy has responded

    
Percy


Message 22 of 142 (767633)
08-31-2015 10:20 AM
Reply to: Message 20 by NoNukes
08-30-2015 8:48 PM


Re: Idiots
Thanks for the links, here's one more that one of your links pointed to:

Using Future Internet Technologies to Strengthen Criminal Justice

Here are some interesting issues they raised:

  • Weather. Conditions that a human might manage to navigate could be too much for a self-driving car. I imagine that a self-driving car would be forced to pull over to the side of the road and stop in some conditions, such as glare ice, or blinding rain or snow. Whether that's a good thing or bad thing, it's certainly a safe thing.

    Or would the vision systems of self-driving cars, be they lasers or radar or infrared or ultrasonic or a combination or whatever, be able to see through the weather far better than humans?

  • Inter-car communication, also known as V2V (vehicle-to-vehicle). Will cars be able to communicate with one another? If so, it opens up a wealth of possibilities. In the ultimate degree the need for stop signs, yield signs, traffic lights and even overpasses goes away as self-driving cars communicate with one another to negotiate their way through lane changes, turns and intersections. Highway capacity could possibly increase enormously.

  • Laws governing self-driving cars will likely vary by state. It is entirely possible that the self-driving car you use in your daily commute would be illegal during portions of a trip to visit grandma several states away.

  • Liability. If a self-driving car provides the driver the option of asserting control, who is responsible in the event of an accident? Presumably the data history would reveal whether the car or the driver was in control at the time of an accident, so maybe it's not an issue.

    It does seem to me that whatever the initial legal framework when self-driving cars first start appearing on our roads in significant numbers, their dramatically lower accident rate should drive rapid changes. And as a couple articles note, the share of liability of manufacturers and maintenance/repair businesses will likely increase.

  • Cost. Who knew? Cost will be a big factor in how fast self-driving cars fill our highways. I didn't see any cost estimates, but several comments made in passing lead me to believe that self-driving cars might be very expensive. Perhaps people could afford it because of reduced insurance costs. From Self-Driving Cars and Insurance: "Data from the Institute for Highway Safety (IIHS) and Highway Loss Data Institute (HLDI) already show a reduction in property damage liability and collision claims for cars equipped with forward-collision warning systems, especially those with automatic braking."

  • The human urge to be in control. Self-Driving Cars and Insurance also echoes some of the comments in this thread: "In addition, some people who enjoy driving and do not want control to be taken from them may resist the move to complete automation."

  • Ethical issues. An example from How to Help Self-Driving Cars Make Ethical Decisions: "A child suddenly dashing into the road, forcing the self-driving car to choose between hitting the child or swerving into an oncoming van."

    Those who have seen the movie I, Robot might recall that just such an ethical dilemma was at the heart of main character Del Spooner's objection to robots, that after an accident one chose to save him instead of a young child. (It occurs to me now that a problem in that movie is that even given the incredible abilities of the robots, humans still drove their own cars.)

    But is that a child in the street? Or a dog? Or even just a beach ball? Imagine the liability if the car's autonomous systems directed the car into the path of an oncoming van to avoid a beach ball. Just how good will the car's recognition systems be?

    The article concludes by posing this thoughtful quandary: 'Walker-Smith adds that, given the number of fatal traffic accidents that involve human error today, it could be considered unethical to introduce self-driving technology too slowly. “The biggest ethical question is how quickly we move. We have a technology that potentially could save a lot of people, but is going to be imperfect and is going to kill.”'
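As an aside on the V2V bullet above: the idea of cars negotiating an intersection without lights can be sketched as a toy reservation protocol. Everything here is hypothetical and vastly simplified compared to any real V2V system; it only illustrates the scheduling idea of cars requesting non-overlapping time slots to cross:

```python
class IntersectionManager:
    """Toy V2V intersection scheduler (hypothetical). Cars request a
    crossing time; the manager grants the earliest non-conflicting slot."""

    def __init__(self, crossing_time=2):
        self.crossing_time = crossing_time   # seconds a car occupies the intersection
        self.reserved_until = 0              # time at which the intersection frees up

    def request_slot(self, arrival_time):
        """Grant the earliest start time >= arrival_time that doesn't
        overlap an existing reservation."""
        start = max(arrival_time, self.reserved_until)
        self.reserved_until = start + self.crossing_time
        return start

mgr = IntersectionManager()
print(mgr.request_slot(0))   # 0  -- first car crosses immediately
print(mgr.request_slot(1))   # 2  -- second car waits for the intersection to clear
print(mgr.request_slot(10))  # 10 -- gap in traffic, no waiting
```

With zero reaction time, cars slot through the gaps instead of queuing at a light, which is exactly why stop signs and traffic lights start to look unnecessary.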

--Percy

Edited by Percy, : Typo.


This message is a reply to:
 Message 20 by NoNukes, posted 08-30-2015 8:48 PM NoNukes has acknowledged this reply

Replies to this message:
 Message 23 by Stile, posted 08-31-2015 2:30 PM Percy has acknowledged this reply

    
Percy


Message 27 of 142 (767712)
09-01-2015 8:30 AM


Google Goes Bladeless, Plus Random Thoughts
The new Google self-driving cars not only have no accelerator, brake pedal or steering wheel, they also have no windshield wipers.

I haven't seen this mentioned explicitly anywhere, but I'm guessing that the Google cars also have no turn signal controls, no windshield wiper controls, no cruise control.

Speaking of cruise control, how closely do self-driving cars obey the speed limit? If they strictly follow the speed limit then they will be very annoying to almost all other cars on the road. Can you set how much over the limit they should go? Can you tell them to increase or decrease their average speed? Can you tell a self-driving car to go 100 mph? I did find this at Google's driverless cars designed to exceed speed limit:

quote:
Google's self-driving cars are programmed to exceed speed limits by up to 10mph (16km/h), according to the project's lead software engineer.

Dmitri Dolgov told Reuters that when surrounding vehicles were breaking the speed limit, going more slowly could actually present a danger, and the Google car would accelerate to keep up.


This means that when self-driving cars are surrounded only by other self-driving cars, they'll all be doing the speed limit. On our local highway where the speed limit is 50 mph, no car is doing less than 60, many are doing 70, and the left hand lane is generally going 75-80.

This means that self-driving cars have the potential to greatly increase travel times. Imagine driving cross country through empty wide-open spaces at 65 mph instead of 80 mph. The travel time for a trip with 400 highway miles would increase from 5 hours to 6 hours 9 minutes.
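The travel-time arithmetic is easy to verify; here it is as a quick Python sketch:

```python
def travel_time(miles, mph):
    """Return (hours, minutes) for a trip of the given length at a steady speed."""
    total_hours = miles / mph
    hours = int(total_hours)
    minutes = round((total_hours - hours) * 60)
    return hours, minutes

print(travel_time(400, 80))  # (5, 0)  -- 5 hours
print(travel_time(400, 65))  # (6, 9)  -- 6 hours 9 minutes
```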

I wonder how easy it will be to suddenly change your destination in a self-driving car. Say I'm driving home with no intention of stopping, but a couple hundred feet before I reach the turn-in for the grocery store I remember I need pimentos. If I'm driving the car myself this isn't a problem since I have plenty of time to signal and make a safe turn into the grocery store parking lot, but if I'm in a self-driving car I assume I'll be way past the grocery store by the time I even call up the screen to enter my new destination. What is needed is a quick way to instruct the car to "turn right here."

Here's another hypothetical scenario. You don't know the street address of where you're going, but you know how to get there (this makes perfect sense to those of us who are pre-GPS (I still don't have a GPS), but will become less and less common as time goes by). Will there be an easy way to give a self-driving car a series of instructions that boil down to, "Turn left at the next light; turn right in three blocks; turn left at the Exxon station; stop here." When you're in a city you need a "troll for a parking spot" mode.

Speaking of parking, which parking space does a self-driving car choose in a parking lot? For how long does it seek the closest space? If it chooses a space you don't like (say it's raining), how do you tell it to find a better one? When you're in a huge shopping mall parking lot, how do you tell a self-driving car to park by Macy's and not by Sears? When parking for a local sporting event overflows onto a grassy field, how does the self-driving car know where to park? Will it be able to detect the 16-year-old kid beckoning from 300 feet away for you to drive down and park at the end of the row? Or will people attempting to give parking instructions to self-driving cars just give up and let them do whatever they want?

But why even worry about parking? What you'll usually want is for the self-driving car to drop you off at the front door of your destination, then go find a parking spot on its own. But if you're in a city maybe it won't be able to find a parking spot, so it will just keep looking. But what if it runs low on fuel? Will a self-driving car be able to refuel on its own? Maybe we also need auto-refueling stations.

AbE: But what if after your self-driving car drops you off it *does* find a parking space. How will it put coins in the meter? Obviously parking meters will have to have WiFi or Bluetooth.

--Percy

Edited by Percy, : AbE.


Replies to this message:
 Message 28 by New Cat's Eye, posted 09-01-2015 8:39 AM Percy has acknowledged this reply
 Message 29 by AZPaul3, posted 09-01-2015 8:59 PM Percy has responded
 Message 32 by NoNukes, posted 09-02-2015 12:17 AM Percy has responded

    
Percy


Message 34 of 142 (767791)
09-02-2015 7:50 AM
Reply to: Message 29 by AZPaul3
09-01-2015 8:59 PM


Re: Google Goes Bladeless, Plus Random Thoughts
Yes, precisely, well said, but I do have some pretty strong concerns in one area, and Cat Sci brought up the same issue:

Cat Sci writes:

Voice commands, not clunky touch-screens.

AZPaul3 writes:

"Hey, Lexus2! Take me to that Burger King we just passed." Have it your way.

I think correctly interpreting voice commands is still a ways off, further off than self-driving cars, and that we'll be stuck with touchscreens for a while. I guess it's a debatable point, but to my mind if it were close then we'd already be interacting with our smart phones via voice commands instead of punching away at teensy tiny buttons. Siri and Google Voice are examples of how primitive the technology is today.

I take it Lexus1 is the wife's car?

Yep! Why do you ask?

--Percy


This message is a reply to:
 Message 29 by AZPaul3, posted 09-01-2015 8:59 PM AZPaul3 has acknowledged this reply

    
Percy


Message 35 of 142 (767793)
09-02-2015 8:06 AM
Reply to: Message 32 by NoNukes
09-02-2015 12:17 AM


Re: Google Goes Bladeless, Plus Random Thoughts
NoNukes writes:

The advantage of not having to drive becomes even more pronounced if some substantial portion of that 400 miles takes you through rush hour traffic during which you cannot drive as fast as the speed limit. Maybe having self driving cars might mean fewer traffic jams.

I'm sure you're right, and it could be even better than that, with traffic jams becoming a thing of the past as automated route choosers steer cars away from congestion, as GPS apps already do by flagging congested roads. Automated route selection should also mean that non-highways and non-main thoroughfares will see increased traffic.

Another factor reducing traffic, and I think others have already mentioned this, is more efficient use of the available space on roads and highways. There's no need to maintain long following distances (headways) between vehicles when reaction time is effectively zero. Of course, most people today already drive too close, so the savings in road space from smaller headways will be less than the ideal, but it should still be substantial.
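To put rough numbers on the headway point, here's a back-of-the-envelope lane-capacity model in Python. The 15-foot car length and the two headway values are illustrative assumptions, not measured figures:

```python
def lane_capacity(speed_mph, headway_s, car_len_ft=15.0):
    """Vehicles per hour one lane carries at a steady speed, assuming each
    car occupies its own length plus a time-based following gap."""
    speed_fps = speed_mph * 5280 / 3600            # mph -> feet per second
    spacing_ft = car_len_ft + speed_fps * headway_s
    return speed_fps * 3600 / spacing_ft

# A typical human following gap vs. a (hypothetical) automated one, at 65 mph:
print(round(lane_capacity(65, 1.5)))   # 2172 vehicles/hour
print(round(lane_capacity(65, 0.3)))   # 7872 vehicles/hour
```

Even with these toy numbers, shrinking the headway more than triples what a single lane can carry.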

--Percy


This message is a reply to:
 Message 32 by NoNukes, posted 09-02-2015 12:17 AM NoNukes has acknowledged this reply

    
Percy


Message 37 of 142 (767823)
09-02-2015 1:35 PM
Reply to: Message 36 by ringo
09-02-2015 12:29 PM


Re: Google Goes Bladeless, Plus Random Thoughts
ringo writes:

You won't even need to stop at intersections. The traffic in both directions will be neatly spaced so that you can cross another vehicle's path, missing it by inches.

Earlier in the thread I suggested that possibly stop signs, yield signs, traffic lights and even overpasses might be rendered unnecessary.

I wonder if anyone has run traffic simulations for self-driving cars. Is the efficient use of road space improved so much that even multilane roads become unnecessary? Certainly there's no way to pack cars more densely than this:

But with more efficient use of roadway space combined with automated routing maybe it never gets to this point.

--Percy


This message is a reply to:
 Message 36 by ringo, posted 09-02-2015 12:29 PM ringo has acknowledged this reply

Replies to this message:
 Message 38 by Jon, posted 09-02-2015 6:45 PM Percy has acknowledged this reply

    
Percy


Message 41 of 142 (768811)
09-14-2015 8:23 AM


Google Weighing Sale of Self-Driving Cars
In the news today: Google moves towards self-driving car sales

The article says they'll be limited to 25 mph. I assume they'll be able to go faster when under manual control.

--Percy


    
Percy


Message 42 of 142 (772386)
11-13-2015 8:12 AM


Apparently Google is Serious about 25 mph
Reported in today's Washington Post: Google self-driving car pulled over for going too slow.

The Google vehicle was going 24 mph on a road marked 35 mph.

Where I live there are many roads marked 35 mph. I'd estimate that less than 5% of vehicles travel under 35 mph, 10% go 35 mph, 60% go 40 mph, 20% go 45 mph, and the rest go whatever the road will take.

How many cars actually go under 25 mph on roads marked 35 mph? Well, ignoring the occasional tractor and slowing due to construction, obstacles and weather conditions, I'd say none. Since the opportunities to use self-driving features that only operate at 25 mph and lower seem minuscule (traffic jams, driveways, some residential streets, etc.), what is Google thinking? I wouldn't myself be interested in such a car.

There's still something I'm curious about, and I think I've asked this before. I'm in my garage in my self-driving car, I give it the destination, and off we go out the three quarters of a mile of residential street to the main road. Ahead I see some neighbors on a walk and I want the car to stop next to them. Will it be able to do this? Yes? No? Sometimes? Sort of?

One can imagine the car being able to follow an instruction to stop at the next set of people, but if there are several sets of neighbors walking, not necessarily on the same side of the road (happens all the time on nice days), and I want to talk to one of them, how do I even tell the car where to stop? If my daughter were driving I could say, "Oh, there's Jane, stop a minute, I need to tell her something." What do I tell the car? I guess the easiest thing is just to revert to manual control.

--Percy


Replies to this message:
 Message 43 by ringo, posted 11-13-2015 11:06 AM Percy has acknowledged this reply
 Message 44 by AZPaul3, posted 11-13-2015 12:46 PM Percy has responded

    
Percy


Message 45 of 142 (772407)
11-13-2015 2:08 PM
Reply to: Message 44 by AZPaul3
11-13-2015 12:46 PM


Re: Apparently Google is Serious about 25 mph
AZPaul3 writes:

I should hope Google is serious about the 25 mph. California law limits their vehicles to that speed...

I hadn't heard this before and wasn't able to confirm it. I did find this: Self-driving cars now legal in California. But it doesn't mention speed limits. I think it may be a limit Google is imposing on its new car that it designed itself, this one:

Not the older Lexus version:

Ahead I see some neighbors on a walk and I want the car to stop next to them. Will it be able to do this? Yes? No? Sometimes? Sort of?

That's really not important in an R&D test vehicle.

Right. I was thinking about the future, not wondering about the current Google prototype. How, exactly, are they going to solve these niggling little problems, which are actually huge when it comes to delivering a successful product? If a self-driving car can safely get you all the way to grandma's and back but can't figure out where you want it to stop halfway down the driveway ("Next to the bicycle lying on the grass, you self-driving dolt!"), will that be good enough? Just wondering.

--Percy


This message is a reply to:
 Message 44 by AZPaul3, posted 11-13-2015 12:46 PM AZPaul3 has responded

Replies to this message:
 Message 46 by AZPaul3, posted 11-13-2015 3:19 PM Percy has responded

    
Percy


Message 48 of 142 (772414)
11-13-2015 3:55 PM
Reply to: Message 46 by AZPaul3
11-13-2015 3:19 PM


Re: Apparently Google is Serious about 25 mph
You're right, the Mountain View Police Blog contains a link to the Low Speed Vehicle law, but there's nothing in that law about Google. I did finally find a news item on the web from last year that accurately reports the car's speed limitation: Google’s New Self-Driving Car Ditches the Steering Wheel. There's no reference to that law, but obviously Google was aware of it:

quote:
The car is limited to a maximum speed of 25 miles per hour. That speed covers most driving in most cities, Urmson said. It’s legal to drive 25 miles per hour on a street with a 35 mile per hour limit.

The blog also contains a link to California's Minimum Speed Law, which only applies to highways, and the blog correctly concludes that the Google car was operating within the law.

I'm guessing the car can exceed 25 mph but is programmed not to. This incident with the Google car creating a rolling roadblock says 25 mph is too slow. Even in Manhattan it's too slow. Sure, there are traffic jams in Manhattan, but not everywhere all the time. Unless they've changed it, the lights in Manhattan turn green all north/south, then all east/west. When traffic jams aren't an issue there's an expectation that other cars will accelerate briskly and travel at a fair clip so as to make it through as many green lights as possible. The speed limit in Manhattan is 25 mph, but as is true in most parts of the country, if you go the speed limit you're going to annoy people. There are other articles reporting that Google's driverless cars (the Lexus ones) are designed to exceed the speed limit, so it's odd that the new car isn't.

--Percy


This message is a reply to:
 Message 46 by AZPaul3, posted 11-13-2015 3:19 PM AZPaul3 has acknowledged this reply

    
Percy


Message 50 of 142 (772438)
11-14-2015 7:55 AM
Reply to: Message 49 by ThinAirDesigns
11-13-2015 5:37 PM


But it was a Lexus, right?

--Percy


This message is a reply to:
 Message 49 by ThinAirDesigns, posted 11-13-2015 5:37 PM ThinAirDesigns has responded

Replies to this message:
 Message 51 by ThinAirDesigns, posted 11-14-2015 10:28 AM Percy has acknowledged this reply

    
Percy


Message 52 of 142 (780728)
03-19-2016 12:58 PM


How Safe Are Autonomous Vehicles?
An article from CBS Detroit (Autonomous Cars Aren’t Perfect, But How Safe Must They Be?) is very positive about the potential for saving lives, but the statistics it musters in support are weak:

quote:
A Virginia Tech University study commissioned by Google found that the company’s autonomous cars crashed 3.2 times per million miles compared with 4.2 times for human drivers. But the study had limitations. The human figures were increased to include an estimate of minor crashes that weren’t reported to police. All autonomous car crashes in California, however, must be reported. The study also didn’t include potential crashes that were avoided when human backup drivers took control.

The human crash rate is 4.2/mm (mm = million miles) and Google's is 3.2/mm? That's not counting the 13 times humans had to intervene in a Google car to avoid a crash, which, counted against the 1.5 million miles Google cars have driven, would raise the rate to 11.9/mm, far above the human rate. That's not terribly reassuring.

But the death rate across the US is 1.08/hmm (hmm = hundred million miles), while Google's is 0/hmm. Then again, Google cars haven't even driven a hundred million miles yet; they have 98.5 million miles to go, and at their current pace that will take years. We don't yet have enough evidence to judge their safety in terms of saving human lives, and we won't for a long time.
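For anyone who wants to check the arithmetic in the last two paragraphs, here it is in Python, using the figures quoted above:

```python
# Figures from the CBS Detroit article and the US totals quoted above.
google_rate = 3.2        # Google crashes per million miles
interventions = 13       # crashes avoided only by a human takeover
miles_mm = 1.5           # Google miles driven, in millions

adjusted = google_rate + interventions / miles_mm
print(round(adjusted, 1))          # 11.9 -- well above the human 4.2/mm

# Fatalities: at the US rate of 1.08 deaths per hundred million miles,
# 1.5 million miles would be expected to produce almost none anyway,
# so zero deaths so far tells us very little.
expected_deaths = 1.08 * miles_mm / 100
print(round(expected_deaths, 3))   # 0.016
```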

Another factor is that at least some Google cars drive at a diminished speed of 25 mph according to some articles, and so must avoid the highways and the greater speeds where many fatalities happen, so the Google statistics for vehicle death rates will always be biased on the low side until they start driving a lot of highway miles at speeds of 60, 70 and 80 mph. If car companies want us to start risking our lives in their autonomous cars, then they should at least begin risking their experimental cars in real highway driving situations. Just how well does an autonomous car do on highways at 70 mph, or in New England snow, or even just a heavy downpour?

Imagine you're driving through open countryside on a New England highway on a clear day in winter. Ahead you see that a long stretch of snow has been wind-blown onto your lane, but not the adjacent lane. Cars occupy the other lane, and the vehicle behind you is tailgating. What do you do? What would an autonomous car do? What is the best course of action?

Or you're driving through the mountains on Route 89 in Vermont on a clear night on your way into ski country, the traffic is heavy, and ahead you see a dance of many spinning headlights and taillights: glare ice. You hit the brakes lightly and discover that you, too, are already on glare ice, with cars still under control ahead of you, behind you and beside you, but you're all about to plow into an unfolding disaster. What's the best course of action? Could an autonomous car handle it at least as well as a human?

Obviously I'm in favor of autonomous vehicles eventually taking over all driving responsibilities, but I think it should happen no faster than the evidence justifies, and when evidence doesn't exist then we shouldn't move ahead. We may be in the experimental stage of complete vehicle autonomy for quite a while.

But in the meantime autonomous features will continue rolling out. We'll be getting autonomous emergency braking in all cars within the next six years, and some cars already have automatic lane maintenance. My current car is getting up there in age and will have to be replaced soon, but I think I might like to try to extend its life until more of these kinds of autonomous features are available.

I can't help adding a pet peeve. While I'm looking forward to autonomous cars, I can't stand much of the technology in newer cars, or at least in my wife's car. One simple thing I can't stand is the door locks. On the outside of the car each door handle has a lock button. When exiting I close the door, hit the button and start walking away. No beep. So, thinking I didn't hit the button hard enough, I turn and hit it again, just as the car beeps and locks, and so I have now unlocked the car. This happens a lot because I drive my wife's car only a couple of times a month and it has a lot of idiosyncrasies to remember, and this is one I always forget.

One of the wonderful things about older technology is that when you hit a button, things happen right away. On my old digital radio you hit a program button and it changed to the new station immediately. On my Internet clock radio you hit a program button for another radio station, and the current station keeps playing for another second or two. So thinking you didn't hit the button hard enough you hit it again, starting the delay process all over again. How about shutting off the current station immediately so you know something happened? The necessity for feedback for what you've done is why phone and tablet screens have started issuing a little vibration or buzz every time you hit a button on the screen, but the need for immediate feedback is widely ignored in this digital age.

--Percy

Edited by Percy, : Typos.


Replies to this message:
 Message 54 by Hyroglyphx, posted 03-20-2016 12:43 AM Percy has acknowledged this reply

    


Copyright 2001-2018 by EvC Forum, All Rights Reserved
