By Cade Metz, Ben Laffin, Hang Do Thi Duc and Ian Clontz. Cade and Ian spent six hours riding in a self-driving car in Jacksonville, Fla., to report this story. Nov. 14, 2022
When we decided it was time for lunch, Chuck Cook tapped the digital display on the dashboard of his Tesla Model Y and told the car to drive us to the Bearded Pig, a barbecue joint on the other side of town.
“I don’t know how it’s gonna do. But I think it’s gonna do pretty good,” he said with the folksy, infectious enthusiasm he brought to nearly every moment of our daylong tour of Jacksonville, Fla., in a car that could drive itself.
This is Chuck.
This is Chuck’s Tesla.
This is Cade.
For more than two years, Tesla has been testing a technology it calls Full Self-Driving with Mr. Cook, a 53-year-old airline pilot and amateur beekeeper, and a limited number of car owners across the country.
Tesla has long offered a driver-assistance system called Autopilot, which can steer, brake and accelerate its cars on highways. But Full Self-Driving is something different. It is an effort to extend this kind of technology beyond highways and onto city streets.
This summer, Elon Musk, the company’s chief executive, said the system would be available in more than a million cars by the end of the year. In August, we spent a day driving around with Mr. Cook and his Tesla to assess the progress of this experimental technology.
Over six hours, his car navigated highways, exit ramps, city streets, roundabouts, bridges and parking lots. With his hands near or on the wheel and his eyes on the road, the car attempted more than 40 unprotected left-hand turns against oncoming traffic. It kept us on the edge of our seats.
All the while, video cameras recorded everything we experienced, including a GoPro mounted on the roof as well as the eight cameras installed by Tesla on the front, back and sides of the car.
The Trip to the Bearded Pig
The most telling moment came as the car drove us to lunch. After navigating heavy traffic on a four-lane road, taking an unexpected turn and quickly remapping its route to the restaurant, the car took a right turn onto a short street beside a small motel.
Cade: Did you intervene with a turn signal?
Chuck: No, no.
Chuck: I am not doing anything.
Chuck: It is going to have to remap though.
Chuck: I’m doing everything I can to have this take us to lunch.
But watch as the Tesla struggles to make sense of its environment, veering from the road into a motel parking lot. Chuck is forced to retake control.
Cade: Whoa!
Cade: What’s this?
Chuck: I don’t know.
Cade: Whoa!
After driving around the motel, the car almost immediately made the same mistake, jerking into the lot this time.
Chuck: I don’t know why it did that.
Chuck: So we had one disengagement and a reroute into a …
Cade: Whoa!
Chuck: So let’s see what it’s doing here.
From a different angle, it was sobering to see how close we came to hitting a parked car after we rolled over a low curb separating the parking lot.
Even the car’s internal display, which uses red lines to denote boundaries that the computer vision system detects, suggests that the car struggled to recognize the curb between the road and the lot.
Tesla is constantly modifying the technology, working to fix its shortcomings. Since the day we drove around Jacksonville, the company has twice released new versions of the technology that show signs of improvement. But the moment in the motel parking lot showed why it may be a long time before cars can safely drive anywhere on their own.
The experiences of beta testers like Mr. Cook are a window into the enormously ambitious and expensive bet that Tesla is making on self-driving technology. It and other companies are investing billions into researching and developing autonomous vehicles — taxis that can ferry us around town, trucks that will deliver our online orders and maybe even one day cars that will take our children to soccer practice.
Elon Musk and Tesla did not respond to requests to participate in this story. But Mr. Cook’s Model Y provides a glimpse of the future we are moving toward, which may prove to be safer, more reliable and less stressful — but is still years away from reality.
Tesla’s technology can work remarkably well. It changes lanes on its own, recognizes green lights, and is able to make ordinary turns against oncoming traffic.
Chuck: This is beautiful.
Chuck: I love this when it happens.
Chuck: It’s just like…
Chuck: Slows, sees, turns.
Chuck: It’s so different without traffic interaction, right?
Cade: Sure.
Chuck: It’s just so confident when it knows.
But every so often, it makes a mistake, forcing testers like Chuck to intervene.
“That moment shows that the car can only know what it is trained to know,” Mr. Cook said of the sudden turn into the parking lot. “The world is a big place, and there are many corner cases that Tesla may not have trained it for.”
Experts say no system could possibly have the sophistication needed to handle every possible scenario on any road. This would require technology that mimics human reasoning — technology that we humans do not yet know how to build.
Such technology, called artificial general intelligence, “is still very, very far away,” said Andrew Clare, chief technology officer of the self-driving vehicle company Nuro. “It is not something you or I or our kids should be banking on to help them get around in cars.”
‘Chuck’s Turn’
Ian Clontz for The New York Times
In the tight-knit community of Tesla enthusiasts, stockholders, bloggers and social media mavens, Chuck Cook is famous. This summer, Mr. Musk noticed the meticulous way he explored the boundaries of the technology in a series of YouTube videos.
Mr. Cook had been posting online clips of his Tesla trying to navigate an unprotected left turn near his home in Jacksonville. To make this turn, the car must pass through three lanes of traffic approaching from the left, squeeze through a gap in the median and merge into three more lanes of traffic approaching from the right.
Sometimes, the car made the turn with aplomb, edging into the thoroughfare and waiting for a moment when it could speed into a far lane.
Other times, it got stuck beside the median in the middle of the turn — its rear bumper jutting into the oncoming traffic:
Aerial imagery by Chuck Cook
Soon, Mr. Musk noticed the videos and vowed to solve what Tesla enthusiasts began calling “Chuck’s turn.” In the weeks that followed, Tesla equipped several test cars with a new version of its self-driving technology and sent them to Mr. Cook’s neighborhood, where they spent several weeks testing the new software and gathering data that could help improve it.
Mr. Cook and I spent a good chunk of our day asking his car to navigate the turn named after him. Each attempt was different from the last. Sometimes, the cars approached much faster from the left. Other times, from the right. Sometimes, the gap between the two was enormous. Other times, it was tiny.
Not long after that day in Jacksonville, Tesla released a new version of its software to Mr. Cook and other beta testers.
The car’s display now showed a blue overlay indicating a safe zone in the median.
Before the software update
After the software update
When facing heavy traffic, it could navigate Chuck’s turn with a precision that was not possible in the past. So if it needed to stop next to the median, it would position itself so that traffic could safely pass both in front and behind.
Aerial imagery by Chuck Cook
Chuck’s turn is just one scenario among the endless scenarios a Tesla might face on American roadways.
Some are relatively common. Companies like Tesla can test and retest their technologies in these situations until they are confident a car can handle them safely. But other scenarios are rare and unexpected — what industry experts call “edge cases.”
“It is very easy to solve the first 90 percent of the problem, very hard to solve the last 10 percent,” Mr. Clare said, referring to the decades-long effort to create self-driving cars. “You need to be able to handle those edge cases gracefully.”
Facing the unexpected
After lunch, when Mr. Cook told the car to drive us to a small neighborhood park near the river, the skies were overcast and the streets were wet from summer rain.
Guided by Tesla’s self-driving technology, the car drove along the river and over a bridge before reaching an intersection lined with trees. Then it turned left toward an unmarked road that ran between several giant oaks draped in Spanish moss.
As the car approached the shadows beneath this mossy canopy, it suddenly changed course, turned sharply right and headed the wrong way down a one-way street:
Chuck: Let’s see what it does here.
Chuck: Traffic there.
Chuck: Took the right of way.
Cade: Whoa, whoa, whoa!
Chuck: It didn’t find it.
The moment highlighted the difference between Tesla’s self-driving technology and “robotaxi” services being developed by companies like Waymo, owned by the same parent company as Google, and Cruise, backed by General Motors.
The robotaxi companies are trying to reduce these unexpected moments by tightly controlling where and how a car can drive. Using laser sensors called lidar, they build three-dimensional digital maps of individual neighborhoods that give cars a fine-grained understanding of their environment. Then they spend months or even years testing cars in these contained areas.
These companies are now preparing self-driving car services that will operate without backup drivers in places like San Francisco and Austin, Texas. But these services will have strict limitations that make the task easier. The cars will travel only in certain neighborhoods under certain weather conditions at relatively low speeds. And company technicians will provide remote assistance to cars that inevitably find themselves in situations they cannot navigate on their own.
Tesla is not operating in this way. Lidar sensors are too expensive for consumer vehicles. Building three-dimensional maps and testing vehicles on every American roadway is impractical. So is remote assistance. This means that Tesla cars face the unexpected more often than Waymo or Cruise cars — and that testers like Chuck Cook must keep their hands on the wheel at all times.
Just last week, he and his car revisited a few of the scenarios we encountered in August. Sometimes, the car performed perfectly. Sometimes, it did not. It drove past the motel on the way to the Bearded Pig six times: three times it stayed on the road, and three times it mistakenly drove into the parking lot.
When it did veer into the parking lot, it did not swerve as egregiously as it had in August. Mr. Cook says he is impressed with the progress of the technology, but he knows that far more progress is needed. He also knows that Tesla engineers are focused on the behavior of his car, and that other cars may not perform as well in situations that have not been closely scrutinized.
“The technology is not ready to take the driver out of the seat,” Mr. Cook told me on a recent morning. “As they continue to iterate on the hardware and the software, it is like a salmon going upriver.”
After releasing the new beta, Mr. Musk softened his claims about the immediate future of the technology. He now says that the technology will not be widely available until next year — and that regulators are unlikely to approve it for use without hands on the wheel. Autopilot still requires this oversight.
Federal regulators have spent the past several months investigating a series of crashes involving Autopilot, and they have not yet revealed the results. Safety experts worry that the arrival of Full Self-Driving will lead to more accidents.
“It is inevitable,” said Jake Fisher, senior director of Consumer Reports’ Auto Test Center, who has used the technology. “The problem comes as this system gets better and people get complacent. It will still do the unexpected.”