TG's correspondent plays passenger in a self-driving hatchback heading your way soon
Driving down an East London street. A woman walks along the pavement. Her image, outlined by a green box, shows up on a dashboard video display. Ahead, a zebra crossing, with Belisha beacons, not traffic lights. She steps onto the crossing. The car smoothly stops. She passes, and the car proceeds. Easy.
But not easy. A towering technical achievement. All this was happening without any input from the driver. This is a fully autonomous Nissan Leaf. Driving on real roads in London. And doing such a great job of it that, paradoxically, it actually feels rather routine rather than what it truly is: absolutely extraordinary.
I’m chatting away with its engineer Tetsuya Iijima and he seems pretty relaxed, but he’s got to keep an eye on what’s happening and wrest back control if something goes wrong.
But he doesn’t have to. The Leaf just beetles along, following the 20-minute route on its sat nav. Avoiding other cars and cycles. Stopping at red lights and starting on green. Obeying roundabout rules. Keeping clear of kerbs. And yes, giving way on zebras. All without human intervention of any kind.
To be fair, we’re in a pretty quiet part of London, around the docklands. Iijima says it’d take six times this processing power before the car could operate in the bedlam of Central London. Autonomous engineers all say European cities are more of a challenge than the American or Japanese ones where everyone’s tested up to now.
But get this. In 2020 Nissan will offer for sale a car that can do what this prototype does. (And as an intermediate step, next year’s Qashqai will have ProPilot, an extremely advanced active cruise control that can self-steer even on bendy single carriageways.)
A few years beyond that, Nissan expects to have production cars with the necessary 6x braininess that’ll let them operate in city centres autonomously. “We will put that on the market in some territories then,” Nissan’s global head of R&D Takao Asami tells TopGear.com.
Probably not London then. Or most places, because lawmaking won’t have caught up. Nor hi-res mapmaking. So it’ll happen only in special areas that are mapped and where experimental rules and exemptions apply, he explains.
The Leaf prototype has 12 cameras aiming around its body, plus four laser scanners and five radars. They put together a detailed 3D map of what – and who – is around the car. Also, it has super-hi-res maps stored in the car. So the processors can correlate the maps’ road edges, lane markings and street furniture, with the sensor system’s understanding of the world.
A third layer of data comes from the cloud: other vehicles can upload positions of roadworks or unexpected obstructions. But of course – duh – when there’s only one car in Europe, there’s barely any cloud data. If all manufacturers were to co-operate, the cloud data could be detailed and useful, but there’s no protocol yet. “It will come,” Asami says. “But we cannot wait for a perfect standard of data. We’ll proceed anyway.”
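For the technically curious, the three data layers described above can be sketched in code. This is purely illustrative: every class and field name here is hypothetical, invented for the example, and bears no relation to Nissan's actual software.

```python
# Illustrative sketch of the three-layer world model the article describes:
# onboard sensors, stored hi-res maps, and crowd-sourced cloud reports,
# all correlated into one picture of the road. Names are hypothetical.

from dataclasses import dataclass, field

@dataclass
class Obstacle:
    x: float          # position relative to the car, metres
    y: float
    source: str       # "sensor", "map" or "cloud"

@dataclass
class WorldModel:
    obstacles: list = field(default_factory=list)

    def fuse(self, sensor_hits, map_features, cloud_reports):
        """Correlate the three layers into a single obstacle list."""
        for x, y in sensor_hits:      # cameras, laser scanners, radars
            self.obstacles.append(Obstacle(x, y, "sensor"))
        for x, y in map_features:     # kerbs, lane markings, street furniture
            self.obstacles.append(Obstacle(x, y, "map"))
        for x, y in cloud_reports:    # roadworks uploaded by other vehicles
            self.obstacles.append(Obstacle(x, y, "cloud"))
        return self.obstacles

model = WorldModel()
picture = model.fuse([(5.0, 1.2)], [(0.0, 2.0)], [(120.0, 0.0)])
print(len(picture))  # 3
```

The point of the layering is redundancy: if the sensors disagree with the stored map, or the cloud flags roadworks the map doesn't know about, the car has more than one source of truth to fall back on.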
Nissan’s Silicon Valley lab is working on what it calls “Seamless Autonomous Mobility”. If an autonomous – possibly driverless – taxi or delivery vehicle can’t figure its way past an obstruction, a remote human technician will tell it what to do. So one person can effectively control a fleet of vehicles.
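The one-technician-to-many-vehicles idea can be sketched as a simple queue. Again, this is a hypothetical illustration of the concept, not Nissan's actual system; all names are invented.

```python
# Hypothetical sketch of the "Seamless Autonomous Mobility" idea: a stuck
# vehicle queues a request, and one remote human technician resolves
# requests for the whole fleet, first come first served.

from collections import deque

class Vehicle:
    def __init__(self, vid):
        self.vid = vid
        self.stuck = False

    def report_stuck(self, control_centre):
        """Flag that the car can't plan past an obstruction and ask for help."""
        self.stuck = True
        control_centre.queue.append(self)

class ControlCentre:
    """One technician effectively controls a fleet of vehicles."""
    def __init__(self):
        self.queue = deque()

    def resolve_next(self, instruction):
        """Tell the longest-waiting vehicle what to do, then release it."""
        vehicle = self.queue.popleft()
        vehicle.stuck = False
        return f"{vehicle.vid}: {instruction}"

centre = ControlCentre()
taxi = Vehicle("taxi-07")
taxi.report_stuck(centre)
print(centre.resolve_next("cross the centre line to pass the obstruction"))
# taxi-07: cross the centre line to pass the obstruction
```

Because each intervention takes seconds rather than a whole journey, one person can plausibly shepherd dozens of driverless taxis or delivery vans at once.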
Nissan’s autonomous cars – all manufacturers’, surely – will have to obey the rules of the road. What if the woman on the zebra crossing felt like winding us up? What, I ask Iijima, if she just decided to stop in the middle of the road? “We would have to wait forever,” he laughs.
He acknowledges that autonomous cars won’t just have to follow the rules of the road. Because we humans don’t. Sometimes we jostle, other times we hang back to let another car in from a side road, and always we’re reading the subtle body language of other vehicles to see where they’re headed. If everyone in London stuck punctiliously to the Highway Code like this Leaf does, there’d be gridlock.
So an autonomous car needs more than simple artificial intelligence, says Iijima. “It needs deep learning. And I don’t know how long that will take.”