Lyft crowdsources driver data to train its autonomous vehicle systems


In a blog post published early this morning, Lyft announced it has begun leveraging data from its ride-hailing network to improve the performance of its autonomous vehicle systems. A subset of cars on the network is now equipped with inexpensive camera sensors: currently select Express Drive vehicles, along with Lyft’s autonomous vehicles in Palo Alto and the cars that follow them for safety purposes. The cameras let Lyft capture challenging driving scenarios and tackle problems like generating 3D maps and improving simulation tests.

Lyft, which was among the companies forced to pause driverless vehicle testing as a result of the pandemic, is looking to bolster development while much of its fleet and its Palo Alto pilot remain grounded. The company told VentureBeat in an earlier interview that it would “double down” on simulation by developing against data from the roughly 100,000 miles covered by its self-driving cars, but there’s a limit to what simulation can accomplish. The halt to real-world testing has already taken a toll across the industry: the coronavirus has delayed Lyft rival Waymo’s work by at least two months, and Ford pushed the launch of its driverless vehicle service from 2021 to 2022. Analysts like Boston Consulting Group’s Brian Collie now believe broad commercialization of autonomous cars won’t happen before 2025 or 2026, at least three years later than originally anticipated.

Lyft says the data from drivers’ vehicles will allow it to continuously update the “city-scale” 3D maps it built using technology from Blue Vision Labs, which it acquired in 2018. Like other outfits developing self-driving vehicle systems, Lyft creates high-definition, centimeter-level maps of roads, buildings, vegetation, and other objects to help its vehicles localize. These maps also provide contextual information like speed limits and the locations of traffic lanes and pedestrian crossings. Lyft’s backend generates this contextual information from ride-sharing data, using a combination of computer vision and AI to automatically identify traffic objects (e.g., traffic lights) and pairing those detections with situational data, such as where lanes and crossings are, to understand how drivers handle risky situations.
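Lyft hasn’t published the internals of this pipeline, but the general idea of promoting repeated crowdsourced detections into a semantic map layer can be sketched as follows. This is a minimal, hypothetical illustration; the `Detection` fields, clustering radius, and observation threshold are all assumptions, not Lyft’s actual code.

```python
import math
from dataclasses import dataclass

# Hypothetical sketch: fold crowdsourced traffic-object detections into a
# semantic map layer. All names and thresholds are illustrative, not Lyft's.

@dataclass
class Detection:
    label: str   # e.g., "traffic_light", output by an onboard vision model
    lat: float   # position after localizing the camera frame on the 3D map
    lon: float

def _dist_m(a, b):
    # Equirectangular approximation; accurate enough at city scale.
    dlat = (a[0] - b[0]) * 111_320
    dlon = (a[1] - b[1]) * 111_320 * math.cos(math.radians(a[0]))
    return math.hypot(dlat, dlon)

def _centroid(points):
    return (sum(p[0] for p in points) / len(points),
            sum(p[1] for p in points) / len(points))

def cluster_detections(detections, radius_m=5.0):
    """Greedily merge nearby same-label detections so repeated passes by
    many drivers reinforce a single candidate map object."""
    clusters = []
    for det in detections:
        for c in clusters:
            if c["label"] == det.label and _dist_m(c["centroid"], (det.lat, det.lon)) < radius_m:
                c["points"].append((det.lat, det.lon))
                c["centroid"] = _centroid(c["points"])
                break
        else:
            clusters.append({"label": det.label,
                             "points": [(det.lat, det.lon)],
                             "centroid": (det.lat, det.lon)})
    return clusters

def promote_to_map(clusters, min_observations=3):
    """Only clusters confirmed by enough independent drives reach the map."""
    return [c for c in clusters if len(c["points"]) >= min_observations]
```

The observation threshold is the key design choice in a scheme like this: it trades map freshness against robustness to one-off misdetections from any single inexpensive camera.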

According to Lyft, every driver in the program gets a one-page disclosure detailing information about the camera and the data being collected. The camera isn’t linked to the driver in any way, as it’s forward-facing and doesn’t collect audio.


Drawing data from Lyft’s network — in tandem with visual localization technology — also helps shed light on human driving patterns, the company says. Lyft tracks the trajectories of real-world drivers on its maps with “great accuracy,” enabling it to ensure, for example, that its autonomous vehicles maintain optimal lane locations. “Thanks to ride-sharing data, our [autonomous vehicle] motion planner does not need to use ad-hoc heuristics like following lane centers when deciding where to drive, which requires various exceptions to handle all possible corner cases,” the company explained in a blog post. “Instead, the planner can rely on the real-world information and … human driving experience that are naturally encoded in the ride-share trajectories … While common sense suggests that staying close to the center of the lane is the safest option, historical [ride-sharing] data proves that this assumption is not always true. Human driving is much more nuanced due to local features in the road (like parked cars or potholes) and other facets, such as road design or road shape and visibility.”
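To make the contrast with lane-center heuristics concrete, here is a minimal sketch, assuming a planner that scores candidate paths by their deviation from historical human trajectories through the same road segment. The function names and toy numbers are hypothetical, not Lyft’s implementation.

```python
import numpy as np

def deviation_from_human_prior(candidate_xy, human_trajectories):
    """Mean distance from each candidate waypoint to the nearest point on
    any historical human trajectory through the same road segment."""
    human_points = np.vstack(human_trajectories)        # (N, 2) points from past drives
    dists = np.linalg.norm(
        candidate_xy[:, None, :] - human_points[None, :, :], axis=-1
    )                                                   # (M, N) pairwise distances
    return dists.min(axis=1).mean()

def pick_plan(candidates, human_trajectories):
    # Lowest deviation wins. A real planner would combine this human-driving
    # prior with safety, comfort, and progress terms rather than use it alone.
    return min(candidates, key=lambda c: deviation_from_human_prior(c, human_trajectories))

# Toy example: past drivers hugged the left side of the lane (say, to avoid
# a parked car), so the shifted candidate beats the lane-center one.
humans = [np.array([[0.0, -0.4], [5.0, -0.6], [10.0, -0.5]])]
center = np.array([[0.0, 0.0], [5.0, 0.0], [10.0, 0.0]])
shifted = np.array([[0.0, -0.5], [5.0, -0.5], [10.0, -0.5]])
assert pick_plan([center, shifted], humans) is shifted
```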

This approach led Lyft to adopt an autonomous systems design paradigm it calls “human-inspired” planning, which it first detailed in a press release last December. Lyft’s planning system uses ride-sharing data to learn things like how to slow down for cars performing high-speed merges, and it validates the safety and legality of planned behaviors before executing them, akin to Nvidia’s Safety Force Field and Intel subsidiary Mobileye’s Responsibility-Sensitive Safety. The system also weighs perceived safety, which refers to minimizing passengers’ and other drivers’ sense of being unsafe (for instance, by increasing the distance to a lead car or keeping the autonomous car from drifting too close to a lane divider), as well as comfort, like reducing speed in turns that might induce nausea, and route efficiency.
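Lyft hasn’t published this planner’s code, but a minimal sketch of the “validate, then rank” pattern might look like the following; the constraints, field names, and weights are invented for illustration and only loosely echo the RSS-style checks mentioned above.

```python
from dataclasses import dataclass

@dataclass
class Plan:
    speed_mps: float          # planned speed
    gap_to_lead_m: float      # following distance to the car ahead
    dist_to_divider_m: float  # lateral clearance from the lane divider
    lateral_accel: float      # peak lateral acceleration in turns (m/s^2)

SPEED_LIMIT_MPS = 13.4        # ~30 mph; stands in for the legality check

def is_safe_and_legal(plan: Plan) -> bool:
    """Hard constraints: any violating plan is rejected outright."""
    min_gap = 2.0 * plan.speed_mps   # crude two-second following rule
    return plan.speed_mps <= SPEED_LIMIT_MPS and plan.gap_to_lead_m >= min_gap

def discomfort(plan: Plan) -> float:
    """Soft costs: perceived safety (small gaps, tight divider clearance)
    plus comfort (lateral acceleration). Lower is better."""
    return (1.0 / max(plan.gap_to_lead_m, 0.1)
            + 1.0 / max(plan.dist_to_divider_m, 0.1)
            + 0.5 * plan.lateral_accel)

def select_plan(plans):
    """Filter on hard safety/legality checks, then rank the survivors."""
    valid = [p for p in plans if is_safe_and_legal(p)]
    return min(valid, key=discomfort) if valid else None
```

Keeping safety and legality as hard filters rather than weighted costs mirrors the frameworks named above: comfort and perceived safety can be traded off, but an unsafe plan can never win on other merits.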


“Every day, trips are completed on our network that cover a wide variety of driving scenarios, ranging from pickups and drop-offs to situations that require immediate and critical thinking … But as autonomous vehicles (AVs) become a mainstream transportation option, the need to make such real-time assessments is no longer isolated to human drivers,” wrote Lyft in a blog post. “By leveraging [ride-hailing] data, Lyft is uniquely positioned to develop safe, efficient, and intuitive self-driving systems.”

In some ways, Lyft’s approach is much like that of Tesla, which conducts driverless vehicle testing via simulation, test tracks, and public roads but also “shadow-tests” its cars’ capabilities by collecting billions of miles of data from hundreds of thousands of customer-owned vehicles “during normal driving operations.” Tesla’s Autopilot — the software layer running atop its custom chips — is effectively an advanced driver assistance system (ADAS) that taps machine learning algorithms and an array of cameras, ultrasonic sensors, and radars to perform self-parking, lane-centering, adaptive cruise control, highway lane-changing, and other feats. The company previously claimed that cars with Full Self-Driving Capability, a premium Autopilot package, will someday be ready for “automatic driving on city streets” and to “recognize and respond to traffic lights and stop signs.”


The R&D division behind Lyft’s efforts — Level 5 — was founded in July 2017, and it has developed novel 3D segmentation frameworks, methods of evaluating energy efficiency in vehicles, and techniques for tracking vehicle movement using crowdsourced maps, among other things. Last year, Lyft announced the opening of a new road test site in Palo Alto, California, near its Level 5 division’s headquarters. That development came after a year in which Lyft expanded access to its employee self-driving service in Palo Alto with human safety drivers on board in a limited area.

In November 2019, Lyft revealed that its autonomous cars were driving four times as many miles per quarter as they had six months earlier, and that it had about 400 employees dedicated to development globally (up from 300). In May, the company partnered with Google parent company Alphabet’s Waymo to let customers hail driverless Waymo cars from the Lyft app in Phoenix, Arizona, and it maintains an ongoing collaboration with self-driving car startup Aptiv, which makes a small fleet of autonomous vehicles available to Lyft customers in Las Vegas.



