
Tesla's Autonomous Driver Death, Car Accidents and Personal Injury Claims

Author: Tim Bradley | Published: 7 July 2016

The recent tragic death of the driver of an autonomously driven vehicle raises interesting theoretical questions about how the law of negligence would apply to a comparable motor vehicle accident death or injury here in Western Australia.

Who would be at fault if the crash happened here in Western Australia? What is the liability of the driver of the autonomous car? Does Tesla have any culpability?

Circumstances of the Accident

In Florida on 7 May 2016, Joshua Brown died when his semi-autonomous, Autopilot-driven Tesla Model S collided with another vehicle. Brown was travelling on a highway with Autopilot engaged when a tractor towing a trailer crossed the highway. The Tesla hit the trailer side-on, resulting in Brown’s death.

Tesla confirmed the Autopilot system failed to “see” the trailer and apply the brakes. The failure was attributed to unusual circumstances: the side of the trailer was white and set against a brightly lit sky, and the trailer’s ride-height was elevated, so Brown’s car passed under the trailer with the windscreen impacting its underside.

It seems there was no evidence of any attempt by Brown, the deceased driver, to retake control just prior to the crash.

Tesla noted that the Autopilot system is initially disabled and that, prior to its activation, owners must acknowledge the Autopilot system is experimental and requires driver supervision. Further, every time a driver uses the Autopilot system a warning is given instructing drivers to keep both hands on the wheel and be ready to take back control.

This is the first known motor vehicle accident fatality involving a car being autonomously driven.

According to Wikipedia, at the time of writing the relevant road user fatality statistics include:

  • The worldwide fatality rate is 17.4 deaths per 100,000 people per year
  • Half of those deaths are “vulnerable road users” - motorcyclists (23%), pedestrians (22%) and cyclists (5%)
  • Low income countries have the highest rate (24.1 per 100,000)
  • High income countries have the lowest rate (9.2 per 100,000)
  • The United States has 10.6 deaths per 100,000 people and averages 7.1 deaths per 1 billion kilometres
  • Australia has 5.4 deaths per 100,000 people and averages 5.2 deaths per 1 billion kilometres.

According to Tesla’s records, this is the first death after about 130 million kilometres of semi-autonomous driving, an equivalent rate of about 7.69 deaths per 1 billion kilometres - slightly higher than the United States’ per-kilometre average.
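For transparency, that rate comparison is simple arithmetic. Below is a minimal sketch of the calculation, using the approximate figures quoted in this article; the variable names and the use of Python are mine, purely for illustration.

```python
# Rough check of the fatality-rate figures quoted above (approximate, mid-2016).
autopilot_deaths = 1
autopilot_km = 130_000_000        # ~130 million km of semi-autonomous driving
us_rate_per_billion_km = 7.1      # US average deaths per 1 billion km

autopilot_rate = autopilot_deaths / (autopilot_km / 1_000_000_000)
print(f"Autopilot:  {autopilot_rate:.2f} deaths per billion km")   # ~7.69
print(f"US average: {us_rate_per_billion_km:.2f} deaths per billion km")
```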

Me, Tesla and Autonomous Driving

Before looking at this case from a Western Australian personal injury perspective, let me first say that I am a fan of both Tesla and autonomous driving systems.

Tesla’s semi-autonomous Autopilot system, and other similar applications being developed by Google and others, are already arguably safer than an average driver in certain circumstances.

Moreover, these systems are still being developed, and it is my view that they will greatly reduce road injuries and accidents into the future.

With that bias out of the way, let’s move on.

Who Would be Responsible for the Crash in Western Australia

I am not going to pretend to know motor vehicle law in Florida, USA. Instead, let’s look at the accident in the context of Western Australian law - as though it had happened here.

Western Australian motor vehicle accident claims are based in the law of negligence. The law of negligence is well settled in Australia, our road infrastructure is well designed, and our road rules are well developed. As a result, it is usually very simple to determine responsibility for a car crash here in Western Australia.

This case is no different, at least in relation to the drivers’ negligence, although I have had to make some factual assumptions for want of more information. However, there is the more novel aspect of “product liability” being brought into the mix.

If you want a quick answer about who I think would be liable if the accident happened here in Western Australia, then in a nutshell:

  • The tractor driver is the main culprit;
  • The deceased, Joshua Brown, has some liability too; and
  • Tesla might be liable too, but indirectly through ‘product liability’ rather than traditional ‘road negligence’.

For a little more detail, read on.

The Tractor Driver

Right of way is an important part of road usage. It informs road users on how to behave with others. Lights, lanes, lines and other devices are all deliberately used in combination to create clear rights of way, because clear rights of way reduce confusion and, by extension, accidents.

Brown was a highway user when the accident happened. A highway user can generally expect to be able to continue on the highway in their lane unimpeded, though they must still remain alert so as to avoid accidents in the event of unexpected conduct. A highway user has the right of way.

The Tractor driver was crossing the highway and did not have right of way over Brown. That an accident occurred means the Tractor driver interfered with Brown’s right of way, causing the collision.

The Tractor driver was negligent.

Yes, road users should be attentive, but with mobile phones, maps, conversations and various other distractions about, the Tractor driver was not entitled to assume that oncoming traffic would be attentive. He should have waited to cross until no one would be affected by his crossing.

I have assumed:

  • The Tractor driver intended to cross the highway
  • The Tractor driver didn’t stop for an unforeseen emergency
  • The Tractor driver knew the highway traffic was going to be impeded by his vehicle

Brown’s Responsibility

Just because one person is responsible for an accident doesn’t mean that no one else can be. Multiple people’s negligence can contribute to the same accident. This is called “contributory negligence”, and Courts regularly deal with this issue by apportioning liability (blame) for an accident as a percentage.
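To show how percentage apportionment works in practice, here is a minimal worked example. The dollar figure and the percentage are entirely hypothetical assumptions, used only to illustrate the arithmetic, not a prediction of how a Court would split the blame in this case.

```python
# Purely illustrative apportionment example - the figures are hypothetical.
total_damages = 1_000_000        # assumed value of the injured party's claim
claimant_share_of_blame = 0.30   # assumed finding of 30% contributory negligence

recoverable = total_damages * (1 - claimant_share_of_blame)
print(f"Damages recoverable after apportionment: ${recoverable:,.0f}")  # $700,000
```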

Tractors pulling trailers are not known for being quick. Yet the Tractor had enough time to pull out onto the highway and cross the Tesla’s path. We know this because the Tesla did not take evasive action and impacted the trailer that was being pulled behind the tractor.

The assumptions we can draw from this are:

  1. Brown would have had time to react and avoid his death had he been paying proper attention;
  2. Brown was not paying proper attention.

Brown was liable for failing to take proper care.

Relevantly, Brown’s reliance on Tesla’s Autopilot system has no bearing here. By choosing to use the Autopilot system he is NOT relieved of his responsibility to drive in a safe manner. He has a non-delegable duty to take due care when he is the vehicle’s driver.

Tesla’s Responsibility?

Tesla cannot be responsible for the crash in the typical sense. Brown cannot delegate his duty to drive safely, so under motor vehicle accident law only Brown and the Tractor driver are liable.

However, Tesla may not be off the hook.

Tesla’s Autopilot is a ‘product’; namely, a semi-autonomous driving functionality requiring driver supervision. Tesla could be liable in negligence to Brown under “product liability” and so indirectly liable for the accident.

Product liability is also governed by the law of negligence, but supplemented with specific statutory laws too. It is generally more complex to run in practice than typical ‘road negligence’ because more technical investigations and specific expert opinions are usually required.

In essence, however, the issue here would probably come down to:

Did Tesla do enough to force a driver to remain attentive and be ready to take control?

Of particular relevance, Tesla had two in-built warnings:

  1. To activate the Autopilot system, the driver must acknowledge Autopilot “is an assist feature that requires [the driver] to keep [their] hands on the steering wheel at all times” and that the driver needs “to maintain control and responsibility for [the] vehicle” while using it; and

  2. Each time Autopilot is engaged, the driver is reminded: “Always keep your hands on the wheel. Be prepared to take over at any time”.

Telling the driver to keep their hands on the wheel and maintain attention would go a long way towards shifting responsibility for the product from Tesla back onto the driver.

But if the Autopilot feature performs consistently well day after day after day, as it seems it has, it is (ironically) reasonable to expect people will become complacent about the warnings, which increases the risk of drivers travelling without their hands on the wheel and/or without giving the road their full attention.

So, were the warnings enough?

SIDENOTE: There are some specific laws in Western Australia regarding warnings and personal injury claims but it is not clear whether those laws would apply here - or if they should apply here. I have assumed they do NOT apply but I could be wrong - it is untested at this time.

When considering the adequacy of the warnings, the Courts would likely look at what other reasonable steps could have been taken to reduce the risk of an accident. What is reasonable is determined by having regard to many factors, including the likelihood of the risk (here, an accident), the harm posed by the risk, and the cost of implementing countermeasures.

The Risk of an Accident from Autopilot Failing Was Not Remote.

Tesla’s Autopilot is:

  • experimental
  • novel
  • still being improved

Common sense dictates that it should be expected to fail, which it did. Tesla knows this. That is the purpose of the warnings and of attempting to keep drivers vigilant. The risk is not remote.

The Seriousness of Harm Posed by a Crash is Significant.

The possible effect of a crash is significant harm. We are not talking paper cuts or pin pricks here. Motor vehicles are heavy, fast and dangerous, and the severity of likely harm is high. Here the failure resulted in a death, and it is not a stretch of the imagination to think that in other circumstances more people may have been killed or injured.

Were there Reasonably Available Protective Measures?

This part is a bit more speculative, but in my view Tesla could have done a number of things to protect against the risk.

For example, Tesla could have had sensors built into the steering wheel requiring it to be grasped when using Autopilot. If not properly grasped, the car could issue a loud alarm (a rough sketch of that logic follows below). This measure would probably be relatively easy to implement, and it would almost certainly help ensure drivers keep their hands on the wheel and, in turn, pay better attention. In those conditions a driver is not likely to lose attention and suffer an accident like Brown’s.
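To make that suggestion concrete, here is a purely illustrative sketch of the kind of monitoring logic I have in mind. The sensor and alarm functions are hypothetical placeholders - this is an assumption about how such a measure could work, not a description of Tesla’s actual software.

```python
import time

HANDS_OFF_LIMIT_SECONDS = 10  # hypothetical grace period before the alarm sounds

def monitor_hands_on_wheel(autopilot_engaged, hands_detected, sound_alarm):
    """Illustrative loop: while Autopilot is engaged, sound a loud alarm if the
    wheel has not been grasped for longer than the grace period. The three
    arguments are hypothetical callables standing in for the car's own sensors
    and alarm - they are assumptions, not part of any real vehicle API."""
    hands_off_since = None
    while autopilot_engaged():
        if hands_detected():
            hands_off_since = None          # wheel is grasped; reset the timer
        elif hands_off_since is None:
            hands_off_since = time.time()   # start timing the hands-off period
        elif time.time() - hands_off_since > HANDS_OFF_LIMIT_SECONDS:
            sound_alarm()                   # driver has ignored the grace period
        time.sleep(0.5)                     # poll the sensors twice a second
```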

And this is where I think Tesla has some exposure. Given the relative immaturity of the technology and the gravity of harm posed by a car crash, a Court could conclude Tesla ought to have implemented wheel sensors or some other further measure which would have protected someone like Brown.

Overall, I think Tesla would probably be found liable to Brown under Product Liability if the accident happened here.

Assumptions:

  • Brown was not paying attention
  • Brown had taken his hands off the steering wheel
  • Brown would have spotted the Tractor if forced to keep both hands on the steering wheel

Summing Up

If this crash had happened here in Western Australia, my view is that all three parties would hold some liability, with the lion’s share of the blame going to the Tractor driver - who should have assumed an inattentive driver might be coming when he tried to cross the highway.

Relevantly, semi-autonomous driving systems will not exculpate a driver from responsibility, because the decision to use a driving system does not remove the obligation to drive safely. However, that introduces a responsibility on the provider of the autonomous system to ensure that their systems are implemented in a safe manner - a responsibility which may or may not be discharged by warnings alone.

Hopefully this tragic accident will be one of the last where an autonomous driving system failure results in death, and such systems will soon improve road safety worldwide. The only people perhaps to lose out from their evolution are insurers and personal injury lawyers!