Human-Robot Collaboration... on Crime?


Not long ago, a group of hackers took control of a huge number of home electronic devices and used them to knock Twitter, Netflix, and other super important websites offline. People were deeply upset about not getting to see Nick Cage, and some businesses lost money and possibly customers.

We’re not really saying that this is unimportant, but the consequences of wayward robots could be far more serious in the industrial sphere. Gordon Briggs and Matthias Scheutz recently wrote an article in Scientific American suggesting that it would be easy for “a disgruntled employee who understands the limited sensory or reasoning capabilities of a mobile industrial robot [to] trick it into wreaking havoc in a factory.”

Briggs and Scheutz suggested that programming robots to refuse to carry out orders could be the solution. Right now, industrial robots often have strong safety features built in, which effectively cause them to refuse to do dangerous things. There are also often limits on which team members have the power to program robots and other machinery. The movie scenario in which a random human walks up to a robot and barks out an order that the robot must then obey is pure fantasy in an industrial setting.

These safety measures are likely to be more effective than teaching robots to question human orders. Briggs and Scheutz ran an experiment in which their robots were able to make human beings feel bad about ordering a robot to knock down a tower it had proudly displayed. This experiment reinforces years of data showing that people very quickly grow fond of machines, or at least empathetic toward them. Give a machine the slightest appearance of human-like feelings, and people will engage in human-like interactions with it. The number of people who dress up their household robots or sleep with their phones under their pillows demonstrates this.

You’ve probably done it yourself with Siri, with Old Bess, your first car, or with that machine you were swearing at yesterday in the factory.

But humanizing machines and encouraging them to interact with people is not what would make human-robot crime collaboration easier. The disgruntled employee Briggs and Scheutz imagined would have to be a disgruntled engineer with high-level access to machinery that has very limited built-in safety measures.

As the recent cyber attack shows, it’s easier for humans to co-opt machinery without any interaction at all. Safety measures for machinery and security for software will be more useful in avoiding robotic crime sprees than trying to program robots with assertive ethical systems.
