Who was to blame when the groom got a drone square in the face?
In 2013, a Wyoming couple hired a local photographer to record and edit a bridal video — basically moving wedding photos. Using a combination of traditional and drone-mounted video cameras, the photographer captured the couple in full wedding attire running hand-in-hand through a field and practicing their vows of love. At one point, the footage also included a first-person view of a quadcopter bulleting headlong into the face of the vest-clad groom-to-be.
The gracious groom accepted the photographer’s profuse apologies and attended the wedding as planned two days later, albeit with some fresh cuts on his face. The couple even asked that the photographer post the drone video of the collision online, where it went viral.
It was a happy ending for the pair the photographer called “the coolest couple in the world,” but the incident is one example of the growing number of ways new technologies can cause injury — and legal liability.
As lawmakers in Chicago and elsewhere figure out how to regulate drones, another new technology is forcing attorneys to rethink the liability behind a simple car crash. Although driverless cars have not yet hit Illinois streets, the state is prepping for the eventuality, mulling several measures to regulate vehicles that can drive for us.
The cars are coming. Tech-heavy California reported 180 autonomous test vehicles on public roads in March, up from 33 test vehicles in 2014, according to a Financial Times report citing data from the California Department of Motor Vehicles.
These new technologies whizzing by our heads and maneuvering our roads pose a new set of questions for personal-injury attorneys to consider when applying legal precedent, and they introduce unforeseen variables into the legal framework for accident liability.
In a world with fully autonomous vehicles, the fundamental concepts of negligence and product liability will remain the same but will be applied differently, said Steven Levin, founder and senior partner at Chicago personal-injury firm Levin & Perconti.
“Autonomous vehicles, to the extent that they reasonably eliminate human error as a cause, are going to be treated just like other personal-injury cases, but you’re just going to have to look at them a little differently,” he said. “So one of the questions in a personal-injury case will be: Was the human entitled to totally rely on the automated system?”
Attorneys will have to ask other questions, he said. Was there any reason that the human would have been aware that the automated system wasn’t working? Or was there some sort of defect in the manufacture of either the hardware or the software that operated the vehicle?
“From a personal-injury liability aspect, autonomous vehicle cases are going to involve apportionment of liability among the various actors — that would include the product manufacturer both from a hardware point of view and a software point of view. And that would also include an examination of the operator’s action,” he said. “So they will be a little more difficult to determine who is or who is not responsible because there are different people doing different things.”
Like crashes involving fully autonomous vehicles, accidents with drones — or unmanned aerial systems — will also raise new questions for determining liability in personal-injury cases. A March report from the U.S. Federal Aviation Administration estimates there were 1.1 million small hobbyist drones and 42,000 commercial drones by the end of 2016.
But John Heil, a partner at Heyl Royster and chair of the drone law practice at the firm’s Peoria office, said it’s unlikely that the basic principles of personal-injury law will change as drones become more prevalent.
“Whether you get hit on the head with a baseball or a drone, you are still getting hit on the head. So I don’t think the fundamentals of what personal-injury attorneys know is going to necessarily change,” Heil said. “What is going to change is the way in which certain occurrences happen and the variables that will have to be studied in one of those cases.”
Charles L. Mudd Jr., owner and principal attorney at Mudd Law, said the responsibility for controlling many of these potential variables that could cause injury to a person, animal or property would fall to the drone operator rather than the manufacturer.
“Yes, there are issues with the hardware and software that might cause the drone to go awry and cause injury, but the majority of these issues are going to be with the operator and making the right decisions,” Mudd said. “It’s not only just knowing how to fly it and not ramming it into someone but also knowing when not to fly the drone at all, when weather is bad or there’s too many people or lacking the line of sight. All of that is the [responsibility of the] operator.”
Drones’ legal landscape
In November 2015, Chicago became the first major city to enact policies regulating drones.
The Chicago ordinance prohibits drones from flying higher than 400 feet, outside the operator’s line of sight or between 8 p.m. and 8 a.m. It also bans drones from flying within 5 miles of the city’s airports or above schools, hospitals, churches and outdoor stadiums as well as property not owned by the drone operator.
Similarly, the state has gotten into the game.
Two years ago, Illinois created the Unmanned Aerial System Oversight Task Force “to study and make recommendations for the operation, usage and regulation of” drones. But the General Assembly has not passed legislation since then to govern commercial or recreational drone use.
The Freedom From Drone Surveillance Act, which became state law in 2014, restricts the use of drones by state law enforcement agencies, except in certain situations.
For example, drones can be used to counter the high risk of a terrorist attack based on credible intelligence. Law enforcement can also use drones in an attempt to locate a missing person or if they obtain a search warrant.
In the 100th General Assembly, Illinois lawmakers introduced two bills establishing penalties for illegal drone operations — one dealing with privacy and the other involving trespassing.
House Bill 3906 would make it a criminal offense if a drone is used to video- or audio-record someone at home without his or her consent, and if the drone is used in “a manner that … invades the other person’s reasonable expectation of privacy.”
The offense would be a Class A misdemeanor, unless the video or audio is disseminated, which would raise the penalty to a Class 3 felony.
House Bill 3838 would create a new offense called “criminal trespass to a critical infrastructure facility.”
A person would commit this offense if he or she uses a drone over a “critical infrastructure facility at an altitude not higher than 400 feet above ground level” or allows a drone “to make contact with a critical infrastructure facility, including any person or object on the premises of or within the facility,” with some exceptions. A violation would be a Class A misdemeanor.
Both HB 3906 and HB 3838 have been stalled in the House Rules Committee since March.
At least 40 states have put regulations in place for commercial drone use in addition to rules existing at the national level.
The Federal Aviation Administration implemented the first operational rules last year for commercial drones weighing less than 55 pounds. The FAA rules, among other standards, require drone pilots to keep an unmanned aircraft within a visual line of sight.
Mudd said he anticipates commercial drone operators, who are seeking an exemption to the line-of-sight rule, may be motivated to operate with a higher sense of responsibility than recreational drone pilots.
He predicts any injuries that do occur would make the FAA less likely to grant exemptions to line-of-sight requirements. Further, legislators are more concerned about, and more restrictive of, drone use in public spaces like parks.
“The commercial space, I think, is going to be less prone to personal-injury incidents than the amateur drone operations,” Mudd said.
Controlling driverless cars
As of June, 18 states have passed laws to implement rules for autonomous vehicles, according to the National Conference of State Legislatures.
Meanwhile, legislators in Illinois have proposed at least five bills regarding driverless cars. None, however, has become law.
House Bill 791 would prevent a local government from passing an ordinance that prohibits the use of cars with an “automated driving system” on its roadways.
The bill defines these cars with an “automated driving system” as a “vehicle equipped with … hardware and software that are collectively capable of performing the entire dynamic driving task on a sustained basis … ”
The measure passed both chambers unanimously and, as of July, awaits Gov. Bruce Rauner’s signature.
Another proposal, House Bill 2747, would create the Safe Autonomous Vehicle Act. Under this bill, a fully autonomous vehicle can drive on state highways, regardless of whether a human is physically present.
The bill states that accident liability involving a fully autonomous vehicle will “be determined under existing product-liability law or common-law negligence principles.”
The measure has been stalled in the Rules Committee since late April. Three other pieces of legislation relating to autonomous vehicles — HB 2997, HB 4050 and SB 1432 — have likewise been inching through the Statehouse since spring.
Jonathan Rosenfeld of Rosenfeld Injury Lawyers said laws governing autonomous vehicles are important in terms of shaping human behavior.
“If you do not have laws that are specific to driverless cars, you have a very gray area where people really don’t know how to act or how to behave and it poses a danger really to everyone,” he said.
In response to technological advances in vehicle automation, the National Highway Traffic Safety Administration issued nonbinding guidance on the subject last September.
The guidance focused on important areas that manufacturers and other entities should consider when designing, testing and deploying fully automated vehicles.
The guidance defines levels of automation on a scale from Level 0 to Level 5, where at Level 0 “the human driver does everything” and at Level 5 “the automated system can perform all driving tasks, under all conditions that a human driver could perform them.”
Fewer car crash cases
A 2016 study by the National Highway Traffic Safety Administration found that human error is the critical factor in roughly 94 percent of all car crashes.
If autonomous vehicles can largely eliminate human error, the major cause of car accidents, the reasoning goes, we can expect fewer collisions and fewer of the personal-injury cases that result from them.
In a hypothetical fully autonomous world, Levin & Perconti’s Levin said, there will be significantly fewer collisions but those that do occur will be more complicated to investigate and litigate because they will involve product liability as opposed to human error.
“It’s much harder showing product liability, much more expensive and more difficult to show product-liability-type defects than it is to analyze human error,” he said.
In today’s world, Levin said, humans are pretty well trained to assess fault in collisions involving human error.
But, he said, autonomous vehicle collisions resulting from a product defect will require an expert to secure the product, analyze it and determine the source of the defect — a costly and time-consuming process even in routine car accident cases.
“It’s very expensive,” Levin said. “You need time, money and experts. And oftentimes, it is inconclusive because maybe the collision itself so damaged the vehicles involved that it’s hard to reconstruct it.”
Some personal-injury attorneys, like Ken Apicella, expect to see fewer collisions when fully automated cars take over the roads. But Apicella does not anticipate the shift will be the nail in the coffin for personal-injury cases.
“I think the laws are going to evolve and this area of practice is going to have to evolve with it,” said Apicella, a founding partner of Drost, Gilbert, Andrew and Apicella. “I don’t foresee the inception of driverless cars being the end of personal-injury cases because I don’t think you can program for every possible contingency that’s out there. And just like everything else, there is going to be some trial and error.”
But others, like Rosenfeld of Rosenfeld Injury Lawyers, say the safety of fully automated vehicles could make those types of personal-injury cases close to extinct.
“I don’t think there’s any doubt that the driverless cars are significantly more safe and less accident-prone than any type of vehicle that’s operated by a human being,” he said. “As a consequence, I think we’re going to see a drastic reduction in the number of accidents involving all types of motor vehicles.”
He said litigation of personal-injury cases will “slowly decline and ultimately probably come close to dying off at some point in the future.”
Still, Rosenfeld said fully automated vehicles that require no human intervention remain a long way off.
“I just think that any attorney who thinks that motor vehicle cases are just going to go on indefinitely, in terms of the stream of business, they need a wake-up call because I think we are going to see a real reduction in the number of these cases.”