A self-driving Google car made global headlines this week after it crashed into a bus in California on 29 February, in what is believed to be the first recorded road traffic accident caused by a robotic car. Fortunately, no one was hurt.
While Google's cars have been involved in accidents before, this was the first time one of the autonomous vehicles had been the cause of a collision. The Lexus-model autonomous vehicle, referred to in reports as the Google AV, hit a bus at 2 mph on El Camino Real in Mountain View, in the heart of Silicon Valley.
“The Google AV test driver saw the bus approaching in the left side mirror but believed the bus would stop or slow to allow the Google AV to continue. Approximately three seconds later, as the Google AV was reentering the center of the lane it made contact with the side of the bus,” said the official incident report.
No injuries were reported, and the bus was undamaged. The Google AV, however, sustained damage to its left side after striking the bus, which was travelling at 15 mph. It could be argued, however tentatively, that the Google AV was ‘responsible’ for the crash. But can machines truly be responsible?
This question has been at the heart of artificial intelligence since its beginning. The famous ‘Three Laws of Robotics’, devised by Isaac Asimov, the writer behind ‘I, Robot’, set out the rules robots are bound by. Questions of responsibility, however, extend beyond philosophical musings.
The case of the Google AV colliding with the bus shows that these questions will have to be answered in the near future. Nobody was hurt this time; however, this is likely to be only the first of such incidents.
The use of driverless cars is set to increase dramatically, with Google AVs expected to reach the market in 2018, and companies such as Nissan announcing driverless cars in their showrooms as early as 2020. The Institute of Electrical and Electronics Engineers projects that 75% of cars will be autonomous by 2040.
Traditional laws may have to be changed to account for collisions and resulting injuries caused by AVs. The United Kingdom, in a Department for Transport report entitled ‘The Pathway to Driverless Cars’, attempts to establish culpability.
“The person seated in this position will continue to be commonly referred to as the ‘driver’, even if the vehicle is in an automated mode,” the report reads. Some cars, however, may not even have manual controls; in that case, the report uses the term ‘vehicle user’.
For insurance and legal purposes, accountability will have to be established in the event of accidents. This may be the first collision caused by a Google AV, but as autonomous vehicle use grows, it is unlikely to be the last.