FROM THE OFFICE OF THE PRESIDENT:

As a privately held company, Modern Evil is not required to publicly report on any of its operations or activities. This blog is a faint reflection of our interests and opinions. Thank you.

~ Dr. Archibald T. Staph, Ph.D, President

Showing posts with label killer robot. Show all posts

4.12.08

How Ethical Is a Killer Robot?

CATEGORY: War, Ethics, Killer Robots

DIVISION: Modern Evil

COMMENT: The whole point of a killer robot is to be unethical, to give its operator an "arm's-length" alibi and [by extension] free rein to unleash hell as desired. But we get the point of planting an article in a major newspaper, creating a media trail so that when it all goes "wrong" [read: as planned] they can say it wasn't their intention. We understand it and we love it, you sneaky Pentagon devils.

Pentagon Hires British Scientist to Help Build Robot Soldiers That 'Won't Commit War Crimes'

By Tim Shipman in Washington

The US Army and Navy have both hired experts in the ethics of building machines to prevent the creation of an amoral Terminator-style killing machine that murders indiscriminately.

By 2010 the US will have invested $4 billion in a research programme into "autonomous systems", the military jargon for robots, on the basis that they would not succumb to fear or the desire for vengeance that afflicts frontline soldiers.

A British robotics expert has been recruited by the US Navy to advise them on building robots that do not violate the Geneva Conventions.

Colin Allen, a scientific philosopher at Indiana University, has just published a book summarising his views, entitled Moral Machines: Teaching Robots Right From Wrong.

He told The Daily Telegraph: "The question they want answered is whether we can build automated weapons that would conform to the laws of war. Can we use ethical theory to help design these machines?"

Pentagon chiefs are concerned by studies of combat stress in Iraq that show high proportions of frontline troops supporting torture and retribution against enemy combatants.

Ronald Arkin, a computer scientist at Georgia Tech who is working on software for the US Army, has written a report which concludes that robots, while not "perfectly ethical in the battlefield", can "perform more ethically than human soldiers."

He says that robots "do not need to protect themselves" and "they can be designed without emotions that cloud their judgment or result in anger and frustration with ongoing battlefield events".

Airborne drones are already used in Iraq and Afghanistan to launch air strikes against militant targets and robotic vehicles are used to disable roadside bombs and other improvised explosive devices.

Last month the US Army took delivery of a new robot built by an American subsidiary of the British defence company QinetiQ, which can fire everything from bean bags and pepper spray to high-explosive grenades and a 7.62mm machine gun.

But this generation of robots is all remotely operated by humans. Researchers are now working on "soldier bots" that would be able to identify targets and weapons, and to distinguish between enemy forces, such as tanks or armed men, and soft targets, such as ambulances or civilians.

Their software would be embedded with rules of engagement conforming with the Geneva Conventions to tell the robot when to open fire.

Dr Allen applauded the decision to tackle the ethical dilemmas at an early stage. "It's time we started thinking about the issues of how to take ethical theory and build it into the software that will ensure robots act correctly rather than wait until it's too late," he said.

"We already have computers out there that are making decisions that affect people's lives but they do it in an ethically blind way. Computers decide on credit card approvals without any human involvement and we're seeing it in some situations regarding medical care for the elderly," he said, a reference to hospitals in the US that use computer programmes to help decide which patients should not be resuscitated if they fall unconscious.

Dr Allen said the US military wants fully autonomous robots because they currently use highly trained manpower to operate them. "The really expensive robots are under the most human control because they can't afford to lose them," he said.

"It takes six people to operate a Predator drone round the clock. I know the Air Force has developed software which they claim is to train Predator operators. But if the computer can train the human it could also ultimately fly the drone itself."

Some are concerned that it will be impossible to devise robots that avoid mistakes, conjuring up visions of machines killing indiscriminately when they malfunction, like the robot in the film RoboCop.

Noel Sharkey, a computer scientist at Sheffield University, best known for his involvement with the cult television show Robot Wars, is the leading critic of the US plans.

He says: "It sends a cold shiver down my spine. I have worked in artificial intelligence for decades, and the idea of a robot making decisions about human termination is terrifying."

24.3.08

Killer Robot Plans, DIY and You

CATEGORY: Killer Robots, Suicide, DIY

DIVISION: Modern Evil Products, R&D

NOTE: Killer Robots are the backbone of every nefarious scheme - from world domination all the way down to driveway suicide. But giving away the blueprints to the DIY crowd is like showing your hand before the flop. So while we meet with our internal Products Division, we'd like to remind you that our fantastic line of Killer Robots is ICC-Approved and finance-ready for overnight shipping within the continental U.S.

Man, 81, Kills Himself with Shot from 'Suicide Robot'

By Fran Yeoman

An elderly man has killed himself by programming a robot to shoot him in the head after building the machine from plans downloaded from the internet.

Francis Tovey, 81, who lived alone in Burleigh Heads on the Australian Gold Coast, was found dead in his driveway.

According to the Gold Coast Bulletin, he had been unhappy about the demands of relatives living elsewhere in Australia that he should move out of his home and into care.

Notes left by Mr Tovey — who was born in England — revealed that he had scoured the internet for plans before constructing his complex machine, which involved a jigsaw power tool and was connected to a .22 semi-automatic pistol loaded with four bullets. It could fire multiple shots once triggered remotely.

At 7am on Tuesday he set the robot up in the driveway of his £450,000 house and activated it.

His notes suggested that Mr Tovey chose to kill himself in the driveway because he knew there were workmen building a new house next door who would find his body.

The scheme worked, as carpenter Daniel Skewes heard gunshots and ran to Mr Tovey's home. "I thought I heard three shots and when we ran next door he was lying on the driveway with gunshot wounds to the head," Mr Skewes told the GCB.

A neighbour, who did not want to be named, told the newspaper that Mr Tovey had lived at his home on Gabrielle Grove since 1984. "He was a really marvellous man, an ideal neighbour and I will miss him greatly," she said.

"He was born in England, like I was, and we used to enjoy our tea together. He had visitors from England and family interstate from somewhere far away in Australia.

"There was no inkling of anything amiss, it is just very sad."