Self-driving cars are the future, and we're here for it. The automotive world is about to be turned on its head, but the question on all our minds is: who is to blame when a self-driving car crashes?
Whether it's partial or full autonomy, this question needs to be answered.
In a conventional, human-driven car, the answer is simple: the driver is responsible because they are in control. The end. When it comes to self-driving cars, the answer isn't so clear-cut.
What is a self-driving car?
Also known as a driverless car or an autonomous car, a self-driving car is one that uses a range of cameras, sensors, and on-board intelligence to travel from one place to another without human operation. Sounds great, doesn’t it?
It gets a bit more technical when we’re talking about the different levels of autonomy. For a car to be completely autonomous, it has to be able to navigate itself to a destination using streets and roads that haven't been modified for it. Scary, or what!
Parties to a crash
So if there is someone behind the wheel, but they're not controlling the car in any way, and the car crashes, you'd automatically assume that the company behind the self-driving technology should be liable. Logically, this would seem like the most realistic answer. When all else fails, blame the technology. Even with driver negligence, you would still assume that the product, the technology itself, would be the one to blame.
There's no denying that self-driving cars can crash. You only have to read about what happened when Uber's self-driving car killed a pedestrian in the USA. It made headlines around the world, and it showed us that there are many issues within self-driving systems and that the technology is far from perfect.
If we look at partially autonomous cars, you've still got the human element of control, so if there was an accident, surely the question of liability depends on what specific action led to the collision and whether it was the driver or the vehicle at fault. Whereas for fully self-driving cars, the blame could be passed here, there, and everywhere!
If we're talking about the manufacturer's fault, then there could be a design issue, or perhaps inadequate servicing of the vehicle. Or maybe the owner of the car could be liable if they have failed to update their vehicle's software. It's a tricky one. And if accidents keep happening that a human driver could have prevented, self-driving cars will only be heading one way, and that's probably not going to be onto the road.
It's all about the sensors
When thinking about the parties involved and who is actually to blame in the event of a self-driving car crash, you need to think about how the circumstances around the accident can be determined.
The answer is: it's all about the rich range of sensors. Fortunately for us, self-driving cars are information-rich! Thanks to their sensors, which track, monitor, and measure everything, they can provide the experts with exactly what they need in the event of an accident. All they have to do is retrieve the sensor data to reconstruct the scene. Simples.
The sensors detect and observe the objects surrounding the car, picking up every little detail. There is, of course, a risk that this won't always go exactly to plan. You could get sneaky parties who might fancy altering the data to steer the liability decision in their favour. This means it's crucial that we record not only tamper-free sensor data but any interactions with the car too.
Tampering, I hear you say? You can prevent it with a blockchain-based solution. Essentially, blockchain technology can help ensure there is untampered evidence of the conditions of an accident, which in turn helps when making a decision about liability.
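To make the idea concrete, here is a minimal sketch in Python of the core trick blockchains use to make logs tamper-evident: each sensor record is chained to the hash of the one before it, so editing any past record breaks every link after it. The field names and readings are invented for illustration, and a real deployment would also distribute copies of the ledger across independent parties so no single one could quietly rewrite it.

```python
import hashlib
import json

GENESIS_HASH = "0" * 64  # placeholder hash for the first link

def record_hash(record: dict, prev_hash: str) -> str:
    """Hash a sensor record together with the previous link's hash."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def append_record(chain: list, record: dict) -> None:
    """Append a record, linking it to the hash of the chain's last entry."""
    prev_hash = chain[-1]["hash"] if chain else GENESIS_HASH
    chain.append({"record": record, "hash": record_hash(record, prev_hash)})

def verify_chain(chain: list) -> bool:
    """Recompute every link; any altered record breaks the chain."""
    prev_hash = GENESIS_HASH
    for link in chain:
        if record_hash(link["record"], prev_hash) != link["hash"]:
            return False
        prev_hash = link["hash"]
    return True

# Log a few hypothetical sensor readings in the moments before a collision.
log = []
append_record(log, {"t": 0.0, "speed_kmh": 42.0, "brake": False})
append_record(log, {"t": 0.1, "speed_kmh": 41.5, "brake": True})
append_record(log, {"t": 0.2, "speed_kmh": 38.0, "brake": True})
print(verify_chain(log))  # True: the chain is intact

# A sneaky party edits a reading after the fact...
log[1]["record"]["brake"] = False
print(verify_chain(log))  # False: the tampering is detected
```

The point of the design is that the evidence doesn't depend on trusting whoever stores the log: anyone handed a copy can recompute the hashes and see whether the record of the accident has been touched.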
All of this considered, going forward, the self-driving car industry might not be such a smooth ride after all. It's quite clear that this technology is still under development, and when lives are at stake, self-driving cars might need to up their game a little. We also need to know exactly which party will be held accountable when things go wrong.