
Robots, pain, and dance

There was a time many years ago when I knew and interacted with a lot of dancers (mostly in the modern genre), and they often talked about pain. It seems to be a feature of any field where you push your body, e.g., sports, dance, combat, etc. This is somewhat unrelated to the post I’d planned on robots and pain, but this morning I found some information on robots and dance in addition to the previous material on pain, and that old memory about dancers and pain popped up out of nowhere.

The article that started this ball rolling is by Kit Eaton for Fast Company and is titled Why Robots Are Learning Our Pain Threshold (from the article),

How do you teach a robot how not to hurt humans? Train one to hit someone in an experiment, to find our pain limit. Sounds infinitely sensible, doesn’t it? Until you remember your dystopian sci-fi and consider the implications. [emphasis mine]

The robot experiments are taking place at the lab of Professor Borut Povse in Slovenia. (Yes, he is probably well aware that he sounds like a Bond villain.) He’s been thinking about the future of human-machine interactions, when our daily lives involve working much more closely with robots than we do now. …

Povse spotted a key problem with this scenario: Machines don’t know how much energy in any given impact would result in pain to a person. Or to put it in layman’s terms, robots don’t know their own strength. Hence he came up with an experiment to solve the problem. Somewhere in Slovenia there’s a robot punching volunteers at a variety of energies, with blunt or sharper “hammers,” so it can work out where the pain threshold is.

The plan is to use the data to inform the design of robots that will operate in close proximity to humans, so that they don’t make sudden movements with too much energy.
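Eaton’s description suggests a simple safety rule: keep any impact’s kinetic energy below the measured pain threshold for the tool being used. Here’s a minimal sketch in Python of how such a cap might work; the threshold values, names, and the energy-to-speed conversion are my own illustrative assumptions, not data or methods from Povse’s experiment.

```python
import math

# Hypothetical pain-threshold energies in joules, by end-effector type.
# These numbers are illustrative assumptions, standing in for the kind
# of data an experiment like Povse's would produce.
PAIN_THRESHOLD_J = {"blunt": 2.0, "sharp": 0.5}

def max_safe_speed(effective_mass_kg: float, tool: str) -> float:
    """Return the highest tool speed (m/s) whose kinetic energy,
    E = 1/2 * m * v^2, stays below the pain threshold for this tool."""
    e_max = PAIN_THRESHOLD_J[tool]
    return math.sqrt(2.0 * e_max / effective_mass_kg)

# Example: an arm with 3 kg of effective mass at the contact point.
print(f"blunt tool speed cap: {max_safe_speed(3.0, 'blunt'):.2f} m/s")
print(f"sharp tool speed cap: {max_safe_speed(3.0, 'sharp'):.2f} m/s")
```

The point of the sketch is only that a sharper tool concentrates force, so its energy budget, and hence its speed cap, must be lower; a real controller would of course have to account for much more than a single effective mass.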

As Eaton goes on to note, robots could also be used to hurt/torture in very precise ways that could evade detection. The article raises these ethical issues with a suggestion that the ethical issues around another ‘robotic programme’, the Predator drone programme (Predator drones are remotely controlled, unmanned planes), have not been handled as well as they could be. Eaton specifically cites an article by Jane Mayer for The New Yorker magazine (The Predator War; What are the risks of the C.I.A.’s covert drone program?). If you’re interested in these kinds of issues, please do read the article. As I don’t want to copy Mayer’s entire piece into this posting, I’m going to focus on the pragmatic aspects of the problems discussed (from the article),

David Kilcullen, a counter-insurgency warfare expert who has advised General David Petraeus in Iraq, has said that the propaganda costs of drone attacks have been disastrously high. Militants have used the drone strikes to denounce the Zardari government—a shaky and unpopular regime—as little more than an American puppet. A study that Kilcullen co-wrote for the Center for New American Security, a think tank, argues, “Every one of these dead non-combatants represents an alienated family, a new revenge feud, and more recruits for a militant movement that has grown exponentially even as drone strikes have increased.” His co-writer, Andrew Exum, a former Army Ranger who has advised General Stanley McChrystal in Afghanistan, told me, “Neither Kilcullen nor I is a fundamentalist—we’re not saying drones are not part of the strategy. But we are saying that right now they are part of the problem. If we use tactics that are killing people’s brothers and sons, not to mention their sisters and wives, we can work at cross-purposes with insuring that the tribal population doesn’t side with the militants. Using the Predator is a tactic, not a strategy.”

Exum says that he’s worried by the remote-control nature of Predator warfare. “As a military person, I put myself in the shoes of someone in FATA”—Pakistan’s Federally Administered Tribal Areas—“and there’s something about pilotless drones that doesn’t strike me as an honorable way of warfare,” he said. [emphasis mine] “As a classics major, I have a classical sense of what it means to be a warrior.” An Iraq combat veteran who helped design much of the military’s doctrine for using unmanned drones also has qualms. He said, “There’s something important about putting your own sons and daughters at risk when you choose to wage war as a nation. We risk losing that flesh-and-blood investment if we go too far down this road.”

It seems to me that from a practical perspective, the use of drones (according to the military strategists quoted in the article) is turning neutral parties into hostile parties at a greater rate than standard warfare tactics would accomplish. At least one of these advisors is also implying that the morale of the parties using the drones is at risk if the means of warfare (the drones) are viewed as less than honourable.

On a possibly less disturbing note, Kit Eaton has another Fast Company article, Robots Dance Their Way Into Uncanny Valley, Next Stop: Your Heart, about a recent demonstration of the HRP-4C robot. From the article,

Now rewind it, squint a little, and watch again: You’ll almost be able to mistake the ‘bot for one of the real dancers on the stage. Uncanny valley, ladies and gentlemen–HRP4C is busy dancing her way in here, and if the trend continues we can imagine future HRPx units dancing out the other side with a realism and finesse that may even be enough to move you emotionally if you saw them performing live.

Here’s one of the videos available (you can find at least one more on YouTube); this one gives you the best grasp of the ‘uncanny valley’,

For those who like definitions, here’s one for ‘uncanny valley’ from the Wikipedia entry,

The uncanny valley is a hypothesis regarding the field of robotics. The theory holds that when robots and other facsimiles of humans look and act almost like actual humans, it causes a response of revulsion among human observers. The “valley” in question is a dip in a proposed graph of the positivity of human reaction as a function of a robot’s lifelikeness.

I think that’s enough on robots and disturbing thoughts about ethics and ‘uncanny valleys’.