Washington, DC (CNN Business) —

A year after the first pedestrian fatality caused by a self-driving car, questions about who is liable when one of these vehicles kills someone remain unresolved.

Officials announced earlier this week that Uber won’t face criminal charges in the death of a pedestrian struck and killed by one of its self-driving cars nearly a year ago in Tempe, Arizona.

The Yavapai County Attorney’s Office said it conducted a thorough review of the evidence and determined there was no basis for criminal liability against Uber. It did not detail how the decision was made and has declined to answer any questions on the case.

The pedestrian was walking a bicycle across a road at night. Uber’s self-driving software system initially classified the pedestrian as an unknown object, then as a vehicle, then as a bicycle, but never braked.

However, the Uber employee who was behind the wheel of the SUV could still face criminal charges. Companies working on self-driving cars, such as Uber, have test drivers who are supposed to intervene if the car fails to act properly.

It’s not clear why the prosecutor’s office made its decision: whether it concluded Uber bore no responsibility, or whether there was simply no law under which the company could be charged. That means a first chance for the industry, legal experts and society to discuss this issue may have been lost. And it’s a discussion that needs to happen, because the problem is hard to solve: Experts say regulators will have to balance holding businesses accountable for wrongdoing against the concern that harsh penalties could slow a technology expected to save lives.

“If we start throwing Uber executives in jail and having large criminal penalties, it’ll be harder to test and more lives will be lost because of human errors,” said Ed Walters, who teaches courses on robotics law at Georgetown University and Cornell University. “We should limit criminal liability for the most extraordinary circumstances.”

Walters said criminal liability should begin when there’s evidence that a company knew people’s lives would be lost and ignored that information: for example, a company that released a product knowing its sensors failed to identify people on bicycles or couldn’t detect pedestrians in the rain.

Criminal liability would be different in a case involving a self-driving car that had a test driver behind the wheel, Walters said. The company would have to know that its self-driving technology would fail in a way that threatened lives, and that the test driver would be unable to fix it.

It’s generally difficult to hold companies criminally liable, according to Bryant Walker Smith, a professor at the University of South Carolina School of Law who studies autonomous vehicles. The circumstances would have to be egregious and reckless: for example, a company that paid bonuses to test drivers whose vehicles had the most close calls with pedestrians. A company that falsified information on the quality of its cars, misleading the government, would also have a higher chance of being criminally liable.

Ultimately, Walters said, he could foresee fully autonomous vehicles and human-driven vehicles being held to different standards. An autonomous vehicle might make mistakes a human would never make, but it would also avoid some mistakes that humans routinely make. Self-driving cars could instead be judged against each other. The rationale, according to Walters, is that self-driving cars could be much safer than human drivers overall, even if they’re worse in a handful of situations.

The Uber incident had unusual circumstances, so it’s not a model case for setting law and policy around fully self-driving cars, according to Walters. Video showed the test driver was distracted and not watching the road.

And test drivers aren’t expected to be present when companies eventually deploy self-driving cars on a large scale.

Arizona is a rarity among US states because it does not have a vehicular manslaughter statute. Prosecutors would have to rely on a general manslaughter charge, which in Arizona typically involves DUIs, speeding, aggressive driving or racing, none of which match the circumstances of the Uber fatality.

“The attorney’s office seems to have concluded either that Uber’s conduct was not bad enough to warrant criminal charges or that there just isn’t a specific crime to fit Uber’s conduct, however bad,” Smith said.

Uber isn’t facing criminal charges, but the incident has nonetheless hurt its business. The company temporarily halted its testing program and shut down its self-driving operations in Arizona, laying off 300 workers.

In December, Uber resumed testing vehicles in autonomous mode. But the fatality increased public skepticism of self-driving vehicles, and slowed efforts to pass autonomous vehicle legislation on Capitol Hill.

Uber also faced the risk of a civil lawsuit. Walters said that risk is why the company moved quickly to settle with the family of Elaine Herzberg, the pedestrian who was killed. Uber and Herzberg’s family reached a settlement less than two weeks after the crash; details of the agreement weren’t revealed.

Ultimately, it may take another tragedy before there is any clarity on criminal liability for self-driving cars.

“The larger issue that not many people seem to be talking about is, should we require self-driving cars to be perfect before they can be sold?” said Todd Benoff, an attorney at the law firm Alston & Bird who focuses on autonomous vehicles. “No one has ever even suggested we do the same thing with human drivers.”