6 unexpected things autonomous cars still struggle with

Part 8: A short analysis of the car market

Raymond Meester
4 min read · Sep 12, 2021

When I told my colleagues in 2015 that we would have autonomous cars on the road within ten years, they couldn’t believe it. Just a few years later, almost everyone seemed to agree they were arriving soon. But now, in 2021, the goal is slowly slipping out of reach. Even the ever-so-optimistic Elon Musk admits that it’s hard.

Yes, reality is more unruly than we can imagine. It turns out that autonomous driving doesn’t need narrow AI for one specific domain, but a very general kind of artificial intelligence. Out in the real world, a car has to take harsh weather, bad roads and, most of all, unpredictable humans into account. Here are 6 situations that came unexpectedly to the brains of autonomous cars.

1. Tesla Autopilot mistakes the moon for a traffic light

You’d think it’s easy to distinguish a low-hanging moon from a stoplight. Think again.

In the video, the car thinks again and again that a traffic light has turned yellow and starts to brake.

2. Volvo’s autonomous cars are confused by kangaroos

Volvo invested a lot in the automatic detection of animals like deer and elk. It turns out that on the other side of the world, this system gets confused by kangaroos… The reason is that, well, they jump. The car uses the ground as a reference point to judge distance, and a hopping kangaroo keeps leaving it. Volvo engineer David Pickett explains:

“When it’s in the air, it actually looks like it’s further away, then it lands and it looks closer,” he said.
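
To see why a jump confuses the car, consider how a single camera often estimates distance: it assumes the object is standing on the road. Here is a minimal sketch of that ground-plane model; all the numbers are hypothetical, not Volvo’s actual parameters.

```python
# A minimal sketch of monocular ground-plane distance estimation,
# the kind of model a jumping kangaroo breaks. All values assumed.

FOCAL_LENGTH_PX = 1000.0  # camera focal length in pixels (assumed)
CAMERA_HEIGHT_M = 1.4     # camera height above the road (assumed)
HORIZON_ROW_PX = 540.0    # image row where the horizon sits (assumed)

def ground_plane_distance(bbox_bottom_row: float) -> float:
    """Estimate distance to an object, assuming it stands on the road.

    With a pinhole camera at height h over a flat road, a point on the
    ground at distance d projects f * h / d pixels below the horizon.
    Solving for d gives the formula below.
    """
    pixels_below_horizon = bbox_bottom_row - HORIZON_ROW_PX
    if pixels_below_horizon <= 0:
        raise ValueError("object bottom is at or above the horizon")
    return FOCAL_LENGTH_PX * CAMERA_HEIGHT_M / pixels_below_horizon

# Kangaroo standing on the road: box bottom is 100 px below the horizon.
print(ground_plane_distance(640.0))  # 14.0 m

# Same kangaroo mid-jump: its feet (and the box bottom) rise 50 px in
# the image, so the model suddenly reports it twice as far away.
print(ground_plane_distance(590.0))  # 28.0 m
```

The object never moved, but because its lowest point left the ground, the estimated distance doubles and then snaps back on landing, exactly the effect Pickett describes.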

3. People using their phones on the sidewalk

Sometimes people just stand on the sidewalk, looking at their phone. Human drivers can usually estimate whether such a person is about to cross the street. For autonomous cars, this intent is still very difficult to figure out.

4. Confusion over road construction

There have been several accidents where Tesla’s Autopilot got confused by road construction, where the car needs to follow the temporary yellow markings instead of the default white ones.

5. Pedestrians who jaywalk

People are sometimes unpredictable (or just plain reckless). In March 2018, Uber’s self-driving technology didn’t take this into account, which led to a fatal accident. The software couldn’t properly distinguish between people on the sidewalk and people on the road.

6. Hesitating on lane changes

Some people (especially men in premium cars) don’t hesitate at all when changing lanes. No signal, they just go. Self-driving cars seem to be the opposite: they signal and then wait and wait until it’s really safe to go. That’s not confusing for the car itself, but it is for other road users.
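
To make the contrast concrete, here is a toy sketch of gap acceptance when merging. The thresholds and the sensor_confidence knob are invented for illustration; this is not any vendor’s real planner.

```python
# A toy sketch of why a conservative gap-acceptance rule turns into
# visible hesitation. All thresholds are invented for illustration.

def human_accepts_gap(gap_seconds: float) -> bool:
    # An assertive human driver takes almost any gap over ~2 seconds.
    return gap_seconds > 2.0

def av_accepts_gap(gap_seconds: float, sensor_confidence: float) -> bool:
    # A cautious planner inflates the required gap when perception is
    # uncertain, so borderline gaps get rejected again and again.
    required_gap = 4.0 / max(sensor_confidence, 0.1)
    return gap_seconds > required_gap

for gap in (2.5, 3.5, 6.0):
    print(f"gap {gap}s: human={human_accepts_gap(gap)}, "
          f"av={av_accepts_gap(gap, sensor_confidence=0.8)}")
# gap 2.5s: human=True, av=False
# gap 3.5s: human=True, av=False
# gap 6.0s: human=True, av=True  (required gap is 5.0 s at 0.8 confidence)
```

The car that signals and then rejects gap after gap is, from the outside, a driver whose intentions are impossible to read.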

When will self-driving be normal?

Most car accidents have a human cause, and research statistics on autonomous driving show that self-driving is actually pretty good. Judging from the examples above, future systems will improve further as they get better at three things: awareness of the car’s position relative to other objects, recognition of those objects (including their color), and estimation of distance.

But in the end, none of that matters if autonomous cars make mistakes that a human would never make. It doesn’t matter if a self-driving car drives perfectly 99% of the time; people want it to work 99.999999% of the time. It’s all about trust.

Check here for part 1:

More reading:
