CNN —

Picture: A bleak dystopian future in which humanity is ruled by robots, suppressed by its own creation. Not an original thought, admittedly. Pop culture is filled with examples of robot overlords and android uprisings; of artificial intelligence gone bad.

But so far, few – if any – of these dystopias begin with mall cops.

Dubai Police is about to test these waters in the mildest way possible when it introduces the first of a fleet of robots into its ranks on May 24. Skynet this is not, but with Dubai planning to recruit enough robots to make up 25% of its police force by 2030, it does throw up plenty of questions.

Is this a novelty, a PR stunt, or a small step, a slow creep, towards a RoboCop future? As robot intelligence increases and a Russian android learns to shoot guns, the answer is less straightforward than you might think.

The lone ranger

Writer Isaac Asimov’s Three Laws of Robotics are a fictional creation with real-world value:

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

These 75-year-old laws provide an outline for benign creations designed for our benefit, and intended to protect us from harm.

"I, Robot" (2004), based on a selection of sci-fi shorts by author Isaac Asimov, explores the Three Laws of Robotics.

But what if a robot’s actions put someone in prison? Does this violate either the First or Second Law, or uphold them? Can a robot do this safely? Should robots – could robots – delve into the nuances of criminal law and one day replace the role of humans as arbiters?

This is the ethical debate stirred up by Dubai Police’s plans for robot officers.

Its first step is the Dubai Police Robot, a single unit entering service this week. An adapted REEM humanoid robot, it allows citizens to report crimes and kick-start real-life human investigations. But while it may not be able to arrest people – or chase down suspects, for that matter – Dubai Police is working towards one that can.

Designed by PAL Robotics of Spain, REEM was first unveiled in 2011. Weighing 220 pounds and standing 5 feet 6 inches tall, the two-armed, wheel-based service robot can speak nine languages out of the box and is highly customizable, say its creators.

REEM, designed by PAL Robotics of Spain.

“The robot is going to be an interactive service for the people,” says Brigadier Khalid Nasser Alrazooqi, general director of the Smart Services Department at Dubai Police and the man responsible for the police robot project.

Citizens can ask the robot questions, pay fines and access a variety of police information via purpose-built software. Its facial recognition technology is only 80% accurate, says Alrazooqi, but the robot’s camera eyes will send live feeds to a command control center for analysis.

Brigadier Alrazooqi with the Dubai Police Robot.

After a review period, more Dubai Police Robot units will be rolled out. (PAL Robotics declined to share the number of units ordered.)

“In the first phase it’s going to be available (at) all tourist attractions (and) shopping malls in the city,” the brigadier says, adding that further units will soon act as receptionists in police stations.

But this model is only the beginning for Dubai Police.

More human than human?

The next stage, already in research and development says Alrazooqi, is to create a “fully-functional robot that can work as (a) normal police officer.” He declined to name the companies involved on record.

“Technology-wise, we are still struggling,” he admits, citing the need for greater dexterity – “robots (that) can run similar to a human being, capturing criminals, carrying weapons.”

“We are still waiting until manufacturers reach that target. We’re working very closely with them, and we even provided them (with) all the system requirements,” he adds.

Mobility aside, Alrazooqi might be heartened to see developments elsewhere. In China, the E-Patrol Robot Sheriff, also designed for use in public spaces, was rolled out in February. According to local reports it can cross-reference faces against police databases, and when the robot detects a wanted person it will follow them until the police arrive.

The E-Patrol Robot Sheriff on duty at Zhengzhou East Railway Station on February 15, 2017, Henan Province, China.

A fully functional robot officer may sound like pie in the sky, but Dubai Expo 2020 is around the corner, and Alrazooqi says the Smart Services Department hopes to have a prototype to show by then.

Alan Winfield, Professor of Robot Ethics at the University of the West of England, has some reservations.

“There are big ethical problems,” he argues. “If you’re asking a robot to apprehend criminals, how can you be sure that the robot would not injure people?” Making a robot which safely intervenes in crime-related scenarios would be “extremely difficult,” he adds.

In the eventuality that a robot makes a mistake, what then? In July 2016, an egg-shaped mall security robot in California was involved in a hit-and-run incident with a toddler. The child suffered only minor injuries, but confidence in the robot was dented, prompting maker Knightscope to go public about the “freakish accident.”

“Of course, when humans make mistakes they are held to account,” says Winfield. “The problem is that you can’t make a machine responsible for its mistakes… How do you punish it? How do you sanction it? You can’t.”

Chrissie Lightfoot, CEO of Robot Lawyer LISA and a leading figure in the robot law debate, sees vicarious responsibility as a huge issue. If a robot makes a mistake, blame can fall on many people. Its manufacturer, programmer, owner and operator all sit within the liability milieu.

“Are insurance companies going to be happy to actually insure these kinds of scenarios going forward?” she wonders.

“I think we need to have the law in place as robots and AI evolve so that at every step and every stage there is recourse, one way or another.”

Why guns are a “red line”

One thing Dubai’s future robot officer won’t be doing is carrying a gun, says Alrazooqi.

Doing so would constitute a “very serious red line,” argues Winfield. PAL Robotics concurs, saying its company policy is not to engage in military-style projects, although business manager Carlos Vivas admits that military contracts are potentially extremely lucrative.

With funding there’s always the possibility of breakthroughs in AI, but to what end?

One example – perhaps relevant to the Dubai Police’s efforts – emerged recently from Russia.

On Twitter in April, Russia’s Deputy Prime Minister Dmitry Rogozin previewed the capabilities of FEDOR (Final Experimental Demonstration Object Research), an android built by the Russian Foundation for Advanced Research Projects in the Defense Industry – an agency that’s been compared to DARPA in the US.

This gunslinger appears to be able to drive cars, lift dumbbells and shoot with deadly accuracy. For reasons not fully explained, the agency says it plans to send FEDOR into space in 2021.

Video: This Russian robot shoots guns (01:39). Source: CNNMoney

“Combat robotics is key to making intelligent machines,” Rogozin wrote on the social network. “Shooting training is a way for machines to prioritize and, moreover, instantly make decisions. We’re not creating Terminator, but an artificial intelligence.”

Winfield is skeptical. Beyond the moral question of whether a robot should be allowed a weapon, he argues “artificial intelligence is not good enough” for a robot to safely carry a gun, “and is unlikely to be good enough for a very long time.”

Part of the issue is bias, which is heavily prevalent in the sub-sentient robots we have today. Put simply, their decision-making capabilities are almost unavoidably influenced by their programmers.

We’ve seen numerous examples of “algorithmic bias” in tech: translation apps assigning genders to job titles, or facial recognition technology that detects Caucasian faces but not black ones. How these biases might manifest as prejudices is a crucial issue for robot policing.

Providing a reality check for both Rogozin and Dubai Police and their AI ambitions, Winfield says there’s a long way to go.

“We’re very good at building ‘narrow’ AI. That is, AI that can solve a problem – drive a car or play chess,” he explains. But general intelligence – “learning from one domain of knowledge and then transferring it to another,” like a human – is another matter.

“We simply can’t make artificial general intelligence,” he adds. “It’s such a difficult problem that we don’t even know what problems we have to solve in order to get there.”

“I’m not saying we’ll never solve it – I’m sure we will – but not for a very long time.”

"Ex Machina" (2015) imagines a near future in which artifical general intelligence has been realized.

This knowledge gap opens the potential for deadly mistakes. (Some will say this is no different to current policing, however.) Developments in AI remain relatively slow, but that hasn’t stopped the likes of Stephen Hawking and Elon Musk raising their concerns. The latter has called AI “our biggest existential threat.”

Musk and other tech influencers invested $1 billion in research group OpenAI in 2015 precisely because, if AI development must take place, the industry needs moderating voices to ensure it is developed safely and only for our collective benefit.

Especially, suggests Lightfoot, as there’s no sign that nation states will reach a consensus on robot use and limitations.

“I don’t think (we’re) going to get a universal, global bill and act on robot law,” she believes. “We cannot expect the law to be unanimous. It must be heterogeneous, based on different nations having different laws, because of (their) cultures, laws and ethics.”

What about us?

So is our paranoia justified, or are we merely wary of change? What’s more likely: that robots will take our lives or our jobs?

Dubai Police say no human jobs will be lost in the move to a part-robot force, with Brigadier Alrazooqi telling CNN that personnel will be reassigned “to concentrate in different fields.”

Nonetheless, he acknowledges it’s natural for his staff to have concerns. As might we all: A McKinsey report from January estimated 1.1 billion employees globally work in roles that could be automated entirely with “currently demonstrated technology.”

"RoboCop" (1987). The titular cyborg's remit was to "Serve the public, protect the innocent and uphold the law." In Paul Verhoeven's film, members of the Detroit Police Department go on strike due to RoboCop, fearing for their future employment.

PAL Robotics says it envisions a future in which robots work alongside humans rather than replace them. “My personal opinion is that humans are very bad at predicting the future,” Vivas cautions. “People predict something and it just ends up (going) in another direction.”

Speculation is the lifeblood of robot ethicists, however. It’s part of their job to workshop scenarios with technology not yet in existence – and then wonder how we’ll react.

Winfield believes robot law enforcement officers are “likely to be treated with disdain,” and potentially exploited by the public through hacking or gaming. “Human beings are much more reassuring than robots,” he adds, saying a machine cannot help engender a sense of community.

Lightfoot is more positive. “Humans will become more trusting (of robots),” she argues. “As they become more trusting, then we can move (and) engage at a higher level, to the point where we would probably trust a robot rather than a human.”

Brigadier Alrazooqi will be hoping that by 2030, Dubai’s citizens and visitors will be closer to Lightfoot’s projection than Winfield’s. How Dubai Police’s 2020 Expo prototype will be received could be a key indicator.

The benefits of successfully integrating robot AI into policing are huge: more information shared more efficiently, and fewer human officers in harm’s way. On the flipside, others will see only danger. They will argue these tentative first steps should be confined to the laboratory and not played out on the streets.

It is, as Lightfoot puts it, a “divide between promise and peril” in the public imagination, and the battle for hearts and minds is underway.

In the meantime, a five-foot-six, 220-pound mall cop is about to begin its first day on the job.
