More cautionary tales
Sep 20th, 2022

Whilst more and more vehicles incorporate technology to assist the driver with automatic functions, the future seems to lie with vehicles that are capable of ‘driving themselves’, i.e. where the ‘driver’ of a vehicle is relegated to the status of ‘user’. Control is transferred either to automation embedded in the vehicle, or to a remote ‘driver’.
Sections 1-8 of the Automated and Electric Vehicles Act 2018, which contain provisions relating to automated vehicles and the liability of insurers, came into force on 21st April 2021. Earlier this year, the Law Commission of England and Wales and the Scottish Law Commission jointly recommended that human ‘drivers’ should not be legally accountable for road safety in the era of autonomous cars. A ‘driver’ would instead be referred to as a ‘user-in-charge’, and whether a car is autonomous would be a binary decision, i.e. there would be no gradation of the degree of automation.
This will undoubtedly provide a rich seam of law in the future, but we are promised an era in which collisions – and therefore personal injuries – are dramatically reduced if not eliminated.
Of course, where a human is involved somewhere in the chain of command, poor decisions can be made, sometimes with serious consequences for the safety of vehicle occupants.
Such decisions are on a wide spectrum. For example, I can recall a family holiday in the 1960s, when I was around 10 years old, and my brother was around 5. My parents had rented a cottage in Devon, and the four of us (and dog) were crammed into an Austin 1100 (estate, to be fair), which slowly made the journey back up to Lancashire at the end of the holiday. It was dark by the time we arrived in St Annes-on-Sea where we lived, we had been on the road since very early in the morning, and there was only a sniff of petrol left in the tank. With no 24-hour petrol stations, the car nearly made it home; it finally coasted to a halt about half a mile away. My dad could have decided to call it a day, parked up and sorted out refuelling the following day, as we were within easy walking distance of home. But, unwilling to be defeated, or perhaps just fatigued by the whole thing, he put me in the driving seat, with mum and dad pushing and offering ‘remote driver input’ to me via the open window. My brother, who was fast asleep on the rear seat, woke up during this part of the journey, realising his worst nightmare of (a) losing his parents and (b) me driving. Luckily, we all made it home unscathed.
That was of course a very different era, in every way. However, there is a legitimate and serious question to pose: do safety features actually lull us into a false sense of security, tempting us to take greater risks than we otherwise would?
Aeroflot Flight 593 was a scheduled passenger flight from Moscow to Hong Kong. On 23 March 1994, the aircraft, an Airbus A310, crashed into a mountain range in Kemerovo Oblast, Russia, killing all 63 passengers and 12 crew on board.
Air accident investigators found no evidence of a technical malfunction. However, on-board flight recorders revealed the presence of the relief pilot’s 12-year-old daughter and 16-year-old son on the flight deck. The relief pilot, Kudrinsky, was taking his two children on their first international flight, and they were brought into the cockpit whilst he was on duty.
With the autopilot active, Kudrinsky let the children sit at the controls. Yana, the 12-year-old, took the pilot’s front seat. Kudrinsky adjusted the autopilot’s heading to give her the impression that she was turning the plane, though in fact she had no control. Shortly afterwards, Eldar, the 16-year-old, took the pilot’s seat in place of his sister. He was physically stronger than her, and applied enough force to the aircraft’s control column to oppose the flight computer for 30 seconds.
This caused the flight computer to switch the aircraft’s ailerons (which control the banking, or turning, of an aircraft) to manual, whilst leaving all other systems on autopilot. An indicator light came on to alert the pilots, but there was no audible warning. Eldar was the first to notice a problem, when he observed that the plane was banking to the right. The predicted flight path shown on a screen confused the pilots for nine seconds, during which time the plane banked to almost 90 degrees, a steeper bank than the design allowed. The A310 cannot turn this steeply whilst maintaining altitude, and the plane descended rapidly. The autopilot used its remaining controls to compensate for the manual ailerons, pitching the nose up and increasing thrust. As a result, the aircraft stalled and, unable to cope any longer, the autopilot disengaged completely.
The pilots eventually regained control, but they did not know how far they had descended during the crisis, and by then their altitude was too low to recover. The plane crashed at a high vertical speed, estimated at 14,000 feet per minute (roughly 160 mph).
The subsequent inquiry into the accident concluded that if the pilots had simply let go of the control column, the autopilot would automatically have taken action to prevent the stall, and the accident could have been avoided. Allowing the children into the cockpit was against regulations, and the airline initially denied their presence; it was forced to accept otherwise after the transcript of the cockpit voice recorder was published.
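For readers curious about the mechanics, a minimal sketch of the kind of ‘partial disengagement’ logic described above might look like the Python below. It is purely illustrative: the force threshold, the 30-second timer and every name in it are my own assumptions, not the actual Airbus A310 control law.

```python
# Toy model of "partial autopilot disengagement", loosely inspired by the
# behaviour described above. All thresholds, names and logic are hypothetical;
# this is NOT the real Airbus A310 control law.

from dataclasses import dataclass

OVERRIDE_FORCE_THRESHOLD = 10.0  # arbitrary units of force on the column
OVERRIDE_DURATION_LIMIT = 30.0   # seconds of sustained opposing force

@dataclass
class ToyAutopilot:
    aileron_channel_auto: bool = True   # roll (banking) control
    pitch_and_thrust_auto: bool = True  # the other axes stay automated
    indicator_light_on: bool = False    # visual-only annunciation, no alarm
    _override_seconds: float = 0.0

    def update(self, pilot_roll_force: float, dt: float) -> None:
        """Advance the model by dt seconds, given the force the pilot is
        applying against the autopilot's roll servo."""
        if not self.aileron_channel_auto:
            return  # roll has already been handed over to the pilot

        if abs(pilot_roll_force) > OVERRIDE_FORCE_THRESHOLD:
            self._override_seconds += dt
        else:
            self._override_seconds = 0.0

        # Sustained opposition: yield the roll axis only, and announce the
        # handover with a silent light rather than an audible warning.
        if self._override_seconds >= OVERRIDE_DURATION_LIMIT:
            self.aileron_channel_auto = False
            self.indicator_light_on = True

    def on_stall(self) -> None:
        """Beyond its envelope, the autopilot gives up entirely."""
        self.aileron_channel_auto = False
        self.pitch_and_thrust_auto = False

# A teenager pushing on the column for just over 30 seconds:
ap = ToyAutopilot()
for _ in range(31):
    ap.update(pilot_roll_force=25.0, dt=1.0)
print(ap.aileron_channel_auto)   # False - roll is now manual
print(ap.pitch_and_thrust_auto)  # True  - everything else still automated
print(ap.indicator_light_on)     # True  - but only a light, easy to miss
```

Even in this toy version, the design choice that caused the confusion is visible: the handover is partial and silent, so the ‘user’ receives no forceful cue that the machine has ceded part of its authority.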
Automated vehicles hold great promise, but there is a clear need for caution in assuming that the way they behave will be wholly independent of their ‘users’.