When a human is mistaken for a box of veg

SL4P:
A sad bit of reporting…
Blame the AI and the ‘robots’, not the mule who bypassed the safety interlocks or installed the unit without them.

Karen at the shops now has a new axe to grind about self-checkouts.

tggzzz:
From the real world comes an interesting essay/exam question that can and should be asked and answered by students and practitioners of law, philosophy, and engineering - and actuaries.

Compare and contrast the possible unintentional interactions that humans can have between (a) industrial robots and (b) partially/fully self-driving cars.

That is finally beginning to be discussed by insurers, lawyers, and governments. It ought to be at the forefront of engineers' minds.

harerod:
The most important lesson to be taken from this is that time pressure kills. 

A quarter century ago, about five people died on "my" machines every year. I really enjoyed it when a public prosecutor asked for reviews of our designs to find a culprit who was going to pay for widows and their kids. Safety interlocks were part of the design. The operators found many ways to override them: cut cables, pulled wires, screwdrivers rammed into alarm speakers, modifications in switch cabinets opened with copied keys.
I remember one rather unpleasant episode at a German port, when a high ranking manager of a client yelled at me in front of several of his subordinates and told me to override certain safety features now, otherwise we wouldn't sell any more machines. I politely asked him to speak to my manager. That modification never happened.
At any rate, I feel sorry for that guy. Quite frankly, I remember way too many occasions where sheer luck saved old dumb me. I have seen way too many occasions where somebody was out of luck at the wrong time.

Xena E:
I recall a story from a few years ago about an incident in which a fixed robot at a quarry, controlled by an object-recognition system to pick rocks off a conveyor belt and place them into a crusher for road stone, picked up a worker by his head and dropped him into the machine instead.

It could only have happened if there were no physical guarding or lockouts in place.

More recently, my company were called in to explain to an H&S officer what fault modes could cause failure of a coded light-guard system it had designed, after an operative at the industrial facility where the system was installed was injured when he walked into an active cell where the machine was running, palletising production packages. It was subsequently found that the machine could run regardless of the safety system's status.

When we investigated, it was found that the SmartScan units had been displaying an error sequence because their outputs had been bridged by an ignorant and incorrect addition to the safety wiring of the control system by a maintenance electrician. It later transpired that the person who carried out the additional electrical work had only ever been involved in domestic electrical installations and had never been trained in industrial control.
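That bridged-output failure is exactly what dual-channel safety inputs are designed to catch: when the two channels are wired antivalent (one channel high and the other low when the guard is clear), bridging both outputs to the same rail makes the channels agree, which a properly wired monitor must treat as a fault. A minimal sketch of that plausibility check, with an invented function name for illustration (not the API of any real safety relay or PLC library):

```python
def safety_ok(ch_a: bool, ch_b: bool) -> bool:
    """Antivalent dual-channel check (illustrative only).

    With the guard clear and the wiring intact, ch_a reads True and
    ch_b reads False. If both channels ever agree — for example because
    someone bridged both outputs to +24 V — that is a wiring fault, and
    the machine must not be allowed to run.
    """
    return ch_a and not ch_b

# Intact wiring, guard clear: machine may run.
assert safety_ok(True, False)

# Guard broken (beam interrupted): channels swap, machine stops.
assert not safety_ok(False, True)

# Bridged outputs pull both channels high: detected as a fault.
assert not safety_ok(True, True)
```

The point of the antivalent arrangement is that the single most likely unauthorised modification, tying both safety outputs to the supply rail, produces a state the logic can never see in normal operation, so it fails safe instead of silently enabling the machine.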

Unfortunately, the problem is not just the AI nightmare of machines in control that the sensationalists would have you believe; good old-fashioned human ignorance and incompetence is, and will continue to be, the main danger with automated machinery control systems.

MarkS:
It's sad, but seeing as he was an employee of the robot's manufacturer and not of the plant it was installed in, he should have known better than to get into the cage with a powered-on robot. There are worldwide industrial safety regulations specifically to prevent this kind of accident, and South Korea is FAR from a third-world country. He has no excuse. It's sad, but it's entirely his own fault.
