Classification - Types of Drivers Essay
787 Words | Sep 4, 2010 | 4 Pages
There are three types of drivers in this world: competent, overcautious, and reckless. After driving for many years in frustrating rush hour traffic, one might find that competent drivers keep the flow going, overcautious drivers cause slow and backed-up traffic, and reckless drivers weave in and out of traffic, causing one near-death experience after another. Sorting out what type of driver a person might be is an extremely challenging task. In their own minds, people think they are the aggressive type of driver, or the cautious type, but no one will ever admit to being the reckless kind. In most cases they are simply too oblivious to these classifications.
A final type of overcautious driver is one who was previously in a bad accident. Feeling the mental and financial sting of having to buy a whole new car really puts things into perspective, but it also hinders them on the road.
Now we get to what is known as the reckless driver. A picture comes to mind of a frazzled man or woman driving a beat-up Oldsmobile, cigarette dangling from the lip, swerving in and out of traffic while others beep their horns in disapproval. This type of driver can emerge when a competent driver has had a bad day and encounters the overly cautious driver. Stereotypically this driver is young and male, but I feel that the reckless driver has no set age or sex. They get so angry for the smallest of reasons that they don't really care what they do as a result, so long as they go out in a blaze of glory. They tend to disregard most driving signs and have no qualms about risking their lives or the lives of others. These are the drivers who barely apply the brake when coming to a stop, more of a "tap, tap, and go." My uncle Jim is a prime example of a reckless driver. At the age of 82, he drives a boat of a Cadillac, commanding the streets with his lead foot and somewhat delayed reactions to those around him. Try telling him to ease up or look out, and he's bound to increase the gas flow to his V8 engine.
(TNS) -- WASHINGTON -- As auto accidents go, it wasn't much: twelve minutes before noon on a cool June day, a Chevrolet Bolt was rear-ended as it crawled away from a stoplight in downtown San Francisco.
What made this fender bender noteworthy was the Bolt’s driver: a computer.
In California, where companies like Cruise Automation and Waymo are ramping up testing of self-driving cars, human drivers keep running into them in low-speed fender benders. The run-ins highlight an emerging culture clash between humans who often treat traffic laws as guidelines and autonomous cars that refuse to roll through a stop sign or exceed the speed limit.
“They don’t drive like people. They drive like robots,” said Mike Ramsey, an analyst at Gartner who specializes in advanced automotive technologies. “They’re odd and that’s why they get hit.”
Companies are now testing autonomous vehicles from Phoenix to Pittsburgh and developers are closely watching how they interact with their human-driven counterparts as they prepare for a future in which they will be sharing the road.
What they’ve found is that while the public may most fear a marauding vehicle without a driver behind the wheel, the reality is that the vehicles are overly cautious. They creep out from stop signs after coming to a complete stop and mostly obey the letter of the law — unlike humans.
Smoothing out that interaction is one of the most important tasks ahead for developers of the technology, says Karl Iagnemma, chief executive officer of self-driving software developer NuTonomy.
“If the cars drive in a way that’s really distinct from the way that every other motorist on the road is driving, there will be in the worst case accidents and in the best case frustration,” he said. “What that’s going to lead to is a lower likelihood that the public is going to accept the technology.”
Sensors embedded in autonomous cars allow them to “see” the world with far more precision than humans, but the cars struggle to translate visual cues on the road into predictions about what might happen next, Iagnemma said. They also struggle to handle new scenarios they haven’t encountered before.
California is the only state that specifically requires reports when an autonomous vehicle is involved in an accident. The records show vehicles in autonomous mode have been rear-ended 13 times in the state since the beginning of 2016, out of 31 collisions involving self-driving cars in total, according to the California Department of Motor Vehicles.
The collisions also almost always occur at intersections rather than in free-flowing traffic. A Cruise autonomous vehicle was rear-ended last month, for example, while braking to avoid a vehicle drifting into its lane from the right as traffic advanced from a green light.
Waymo's now-retired "Firefly" autonomous vehicle prototypes were rear-ended twice at the same intersection in Mountain View, Calif., in separate instances less than a month apart in 2016. In both cases, the Waymos were preparing to make a right-hand turn when they stopped to yield to oncoming traffic and were hit from behind.
Another time, a vehicle was rear-ended by a cyclist after it braked to avoid another car. And a truck racing to pass a slow-moving self-driving vehicle before a stop sign clipped it as it scooted back into the right lane.
The state’s crash reports don’t assign blame and provide only terse summaries of the incidents, but a few themes are common. They’re almost always low-speed fender benders with no injuries. The Bolt, for example, was traveling at less than 1 mile per hour when it was rear-ended. While they represent a minuscule share of crashes in the state, autonomous vehicles are also a very small share of the vehicles on the road.
“You put a car on the road which may be driving by the letter of the law, but compared to the surrounding road users, it’s acting very conservatively,” Iagnemma said. “This can lead to situations where the autonomous car is a bit of a fish out of water.”
A spokeswoman for Cruise, which was acquired by General Motors Co. last year, said the crash reports speak for themselves.
The company’s chief executive officer Kyle Vogt said in a September blog post that the company’s third-generation autonomous Chevrolet Bolts are “designed to emulate human driving behavior but with the human mistakes omitted.”
San Francisco’s streets are chaotic, but that’s helping Cruise program its cars to learn how to react to those challenges, Vogt said in a separate blog post.
“People put junk in the street. They park everywhere. People don’t obey crosswalks,” Vogt wrote. “Our vehicles must be assertive, nimble, and sometimes a bit creative.”
Waymo, Alphabet’s self-driving car unit, has tried to refine how its vehicles act so that they are more natural. For example, the developer altered its software dictating how the cars handled turns to be more comfortable for passengers, says Duke University robotics professor Missy Cummings.
“They were cutting the corners really close, closer than humans would,” she said. “We typically take wider turns.”
Waymo is also using simulations to try to teach its cars to inch forward at flashing yellow lights. Dmitri Dolgov, Waymo's technology chief, wrote in a December 2016 blog post that the company's cars were getting better at navigating the nonverbal dance of interacting with others on the road.
Ford Motor Co. went so far as to put a vehicle on the road along with a driver masked to resemble the car’s seat. The experiment, conducted in cooperation with the Virginia Tech Transportation Institute, was designed to assess how driverless cars could communicate with other roadway users, using light signals to replace the eye contact and other signals that humans use to navigate city streets.
“Humans violate the rules in a safe and principled way, and the reality is that autonomous vehicles in the future may have to do the same thing if they don’t want to be the source of bottlenecks,” Iagnemma said.
A warm, clear climate and a hands-off approach to regulation have recently made Phoenix a hotbed of testing. Waymo began offering rides in a fleet of self-driving Chrysler Pacifica minivans to the public there in April.
Sergeant Alan Pfohl, a spokesman for the Phoenix Police Department, says the testing is going smoothly thus far.
The only crash he’s aware of is one last March in which an Uber self-driving Volvo SUV was toppled after being hit by another vehicle that failed to yield. No injuries were reported.
“Technology can always fail, but so can humans,” Pfohl said.
©2017 Bloomberg News Distributed by Tribune Content Agency, LLC.