There are autonomous
cars, and there are drivers’
cars. Now we have something in the middle. Sterling Anderson and Karl
Iagnemma of MIT have created a
semi-autonomous driving system that gives drivers full control of the
vehicle, but kicks in when the car gets too close to another object. This sounds
like the adaptive
cruise control found in expensive Mercedes-Benzes, but this software is much
more nuanced and ambitious than anything on the road.
Anderson, a Ph.D. student, and Iagnemma, a principal research scientist at the
MIT Robotic Mobility Group, designed the algorithm with Quantum Signal LLC, a
small technology developer in Michigan, where most of the testing was conducted.
Unlike in fully autonomous vehicles, the sensors and software continuously evaluate the
environment and adjust the safe boundaries as conditions change. A self-parking car, for
example, follows a planned path of travel among static obstacles, a much
simpler task than adapting to constantly changing variables, including a human driver.
There are certain advantages to synergy. “Automation excels at responding
quickly and precisely to well-defined or repetitive control objectives; humans
tend to make more mistakes as the frequency and complexity of the control task
increase,” Anderson says. In the opening to their paper for the
2012 Intelligent Vehicles Symposium, Anderson elaborates: “Conversely,
humans have the unique ability to detect and contextualize patterns and new
information, reason inductively, and adapt to new modes of operation, whereas
automation typically struggles at these tasks.”
Anderson and Iagnemma’s program evaluates the constraints in the surrounding
environment, then decides which of them require action. Using data from a
front-mounted camera and a laser range-finder, the algorithm picks out
“homotopies,” or safe zones, then incorporates them into a map divided into
triangles, with the edge of each shape representing a lane boundary or tree. It
accounts for the vehicle’s limits, such as steering and tire friction, plans an
escape maneuver, intervenes only long enough to correct the car, then relinquishes
control to the driver.
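In rough outline, that control loop can be sketched in code. The following is a minimal Python sketch under loose assumptions, not the researchers' actual algorithm: it assumes a simple bicycle model for prediction, represents the safe corridor as a pre-computed list of triangles, and invents the function names (predict_path, path_stays_in_corridor, shared_control) and the escape_planner placeholder for illustration.

```python
import math
from dataclasses import dataclass

@dataclass
class VehicleState:
    x: float        # position (m)
    y: float
    heading: float  # radians
    speed: float    # m/s

def point_in_triangle(p, a, b, c):
    """Barycentric sign test: is point p inside triangle (a, b, c)?"""
    def sign(p1, p2, p3):
        return (p1[0] - p3[0]) * (p2[1] - p3[1]) - (p2[0] - p3[0]) * (p1[1] - p3[1])
    d1, d2, d3 = sign(p, a, b), sign(p, b, c), sign(p, c, a)
    has_neg = (d1 < 0) or (d2 < 0) or (d3 < 0)
    has_pos = (d1 > 0) or (d2 > 0) or (d3 > 0)
    return not (has_neg and has_pos)

def predict_path(state, steering_angle, horizon_s=2.0, dt=0.1, wheelbase=2.5):
    """Forward-simulate a simple bicycle model under the driver's current input."""
    x, y, heading = state.x, state.y, state.heading
    points = []
    for _ in range(int(horizon_s / dt)):
        x += state.speed * math.cos(heading) * dt
        y += state.speed * math.sin(heading) * dt
        heading += state.speed / wheelbase * math.tan(steering_angle) * dt
        points.append((x, y))
    return points

def path_stays_in_corridor(path, triangles):
    """The safe corridor is the union of triangles from the environment map."""
    return all(any(point_in_triangle(p, *tri) for tri in triangles) for p in path)

def shared_control(state, driver_steering, driver_throttle, triangles, escape_planner):
    """Pass the driver's inputs through unchanged unless the predicted path
    leaves the safe corridor; only then substitute a corrective maneuver."""
    predicted = predict_path(state, driver_steering)
    if path_stays_in_corridor(predicted, triangles):
        return driver_steering, driver_throttle      # driver keeps full control
    return escape_planner(state, triangles)          # brief corrective intervention
```

The point of the sketch is the arbitration logic: the automation stays silent as long as the driver's predicted trajectory remains inside the triangulated safe zones, and hands control straight back once the correction is made.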
They’ve conducted over 1,200 trials using a Kawasaki Mule
outfitted with LIDAR, inertial gauges, GPS, a Linux PC, and actuators for
steering, acceleration and braking. Drivers sat before a computer monitor showing
live video from the Mule and drove as normal with a virtual steering wheel
and pedals. From these test runs, the researchers measured a 75 percent reduction
in the chance of an accident; they attribute the remaining 25 percent largely to
the LIDAR unit's 9.8-foot blind spot.
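A trial of that kind could be wired up roughly as below, reusing the shared_control sketch above. Everything here is hypothetical scaffolding rather than the published test harness: read_vehicle_state, read_safe_triangles, read_driver_inputs, send_to_actuators, and escape_planner stand in for the real perception, teleoperation, and actuation stack.

```python
import time

def run_trial(read_vehicle_state, read_safe_triangles, read_driver_inputs,
              send_to_actuators, escape_planner, steps=600, dt=0.05):
    """One remote-driving trial: forward the driver's commands every cycle,
    letting the safety layer intervene only when a collision is predicted."""
    for _ in range(steps):
        state = read_vehicle_state()            # e.g. from GPS and inertial gauges
        triangles = read_safe_triangles()       # e.g. from the LIDAR-derived map
        steer, throttle = read_driver_inputs()  # virtual steering wheel and pedals
        steer, throttle = shared_control(state, steer, throttle,
                                         triangles, escape_planner)
        send_to_actuators(steer, throttle)      # steering, acceleration, braking servos
        time.sleep(dt)
```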
The end goal would be to squeeze all of this into a dash-mounted smartphone.
Using a quality phone’s high-resolution camera, accelerometers, and gyroscope,
the software could gather all the necessary data to steer a driver away from a
potential accident, making divine robotic intervention an app away.
Source: Wired