Military Robotics

Israel Building 4 Ton Killer Robot

Posted 13 Jan 2007 at 15:28 UTC by steve

Roland Piquepaille writes, "According to a short report by AFP, Israel is developing a killer robot plane. This drone is designed for long-range operations -- more than 50 hours and several thousand miles -- while weighing 4 tons at takeoff. This Unmanned Combat Aerial Vehicle (UCAV), dubbed Eitan (Steadfast), will have a wing span of 35 meters -- like a Boeing 737. This [High Altitude] Long-Endurance (HALE) drone is the largest unmanned aircraft designed by Israel. It should be tested in the coming days, but details are scarce, and it might already have flown in 2006." For more details on this new robot aircraft see Roland's blog.

Killer roboticists, posted 13 Jan 2007 at 19:15 UTC by littleprince » (Apprentice)

I feel very sorry for robotics engineers who are tricked into working on these killer military projects. It is always easier to build a machine that kills or destroys than one that brings you a Coke in a bar. There are many killing machines, but still no robotic home maids. Think how many bright minds are building the wrong stuff.

I wonder if there is an open source software license that prohibits using the software/designs for military purposes. Something like the GPL, but prohibiting any military applications of the product.

I wish robotics scientists (especially those working on open source projects) would start using a non-military license just as a protest. I don't think it is going to happen at universities, though, as a large share of their funding comes from the military.

Are engineers being tricked?, posted 13 Jan 2007 at 20:16 UTC by Rog-a-matic » (Master)

This viewpoint is naive and one that denies the existence of good and evil in our world. Sadly, this moral confusion is rampant in our society today.

Anything humans are capable of building will eventually be built. I'm not really happy about that. What matters to me is who accomplishes it first.

WWII is an example. If good people had decided that building fighter planes was immoral, there would have been a different outcome.

Military robots, posted 13 Jan 2007 at 20:22 UTC by motters » (Master)

Firstly, such an aerial goliath -- whether flown autonomously or by human pilots -- is unlikely to be very successful in any aggressive military sense. Large aircraft like this are easily visible to radar and therefore vulnerable to ground-based missile or directed-energy attacks.

Secondly, I share the previous commenter's reticence about the military use of robotics. Whilst I realise this is an inevitable consequence of technological development, I think the introduction of autonomous robots onto the battlefield, and their consequent destructive powers, has the potential to do much damage to the public's perception of robotics in general. The robot ceases to be something cute or amusing and instead becomes the focus of fear and suspicion. There will be growing concern about the possible development of private robotic militias, and those involved in robotics research and development may be required to submit to some form of government regulation or inspection, comparable to the regulation of amateur rocketry societies today.

Killer bots, posted 13 Jan 2007 at 21:33 UTC by steve » (Master)

Someone at a recent DPRG meeting noted that both of the robots I'm currently working on have tank-tracks. He postulated that my next robot would be driving over mountains of human skulls, firing lasers at the last remaining humans (you know, like the one from the opening scenes of the Terminator movie).

Seriously, though, I share a concern about this too. As robots become more autonomous, people may find it acceptable to pass targeting decisions to the robots, in effect allowing them to decide which humans live and die. Some of the UCAVs already being tested in the US (like the X-45) have limited capabilities along this line. This is something new and different from any weapon we've developed before. WWII fighter planes, even H-bombs, generally do what the operators tell them to. They're just tools. When robots transition from being tools to being something more, it would be nice if we've passed on to them our better qualities and left out some of our baser impulses, like the desire to kill everyone who's different than we are. Otherwise, the robots may someday decide that it's us humans who are evil and need to be destroyed.

To answer littleprince's question, there has always been talk of adding clauses to software licenses to prevent one group or another from using the software but, at that point, it is no longer Open Source / Free Software. Here's what Richard Stallman said recently about such restrictions:

"A free program must be available for all kinds of use. That's fundamental. For authors to try to restrict what users do, in their own lives, in their own activities, is completely unacceptable. Any program, unless it serves only trivial purposes, like a game, can be used for evil purposes, just as a pen, or typewriter, or a telephone, can be used for evil purposes.

I don't think that we should allow pens or telephones or typewriters to come with requirements for what you can use them for, nor general purpose programs. That is its own form of tyranny. A program with such restrictions would not be Free Software. We will not allow any such restrictions to be added to any version of GNU GPL.

The most common restriction they propose, is restriction against military use. I would be extremely unhappy if my friend in Venezuela could no longer install new versions of our software on the army's servers. The army of Venezuela may be necessary for resistance against an invasion some day from: you can guess where."


Not the Terminator, but still destructive, posted 13 Jan 2007 at 23:53 UTC by motters » (Master)

I don't believe that we're going to see any kind of Terminator-style robotic takeover, but I think not too many years from now you will see autonomous robots patrolling streets and shooting "insurgents" (or whoever becomes the future bogeyman or hated minority group). This may sound like sci-fi, but it's well within existing technological capabilities, and much development is currently going into building that kind of system. It takes perhaps surprisingly little artificial intelligence to detect pedestrians and figure out where the head region is, then to calculate a trajectory which could be used to aim a weapon.

Another disturbing consequence is that, unlike the multi-million-dollar tanks of today, military-type robots may not be too expensive or complicated to build, and could be manufactured by small groups of people.

Robots should only be used for GOOD!!!, posted 14 Jan 2007 at 05:13 UTC by The Swirling Brain » (Master)

Although we would hope robots would always be used for good, that may just be wishful thinking. Alas, there are bad meanie weenies out there who will make bad robots! (It's already happened!) If it ever were to come down to Asimov's laws, only good people would incorporate them in their robots; bad people, plain and simple, just won't! Welcome to the real world!

Therefore, how can you fully trust any robot? Movies have been made about losing such trust. Don't be fooled into thinking robots will always be used for good. Instead, try to figure out how to prevent bad robots.

When you look at the sky and it is blue you say, it's going to be a nice day, and when you look at the sky and see dark clouds, you say, I think it's going to rain. Can you read the signs of the times...

A war is coming. It's inevitable.

A war is coming? , posted 14 Jan 2007 at 05:47 UTC by Botnerd » (Master)

A war is coming???!!! I hope I don't miss it. It's kind of ironic that Israel is building killer robots. I wonder if World War 3 will involve Israel sending robot planes to blow their enemies up.

Non-Military GPL, posted 14 Jan 2007 at 13:25 UTC by littleprince » (Apprentice)

Steve, I am thinking about an open source license which prohibits use of the software by any military, not just a few selected ones (Venezuela, etc.). Would this type of license follow the spirit of the open-source movement?

I understand that the license would be more of a PR instrument than anything else, as I don't believe it would prevent militaries from using my patent/software if they really decide they need it (especially in developing countries).

As for the sentiments I posted above, I definitely don't share any admiration for the new killer robot being built. It will probably take an invasion of space aliens to unite humankind against a common threat -- and stop people from inventing more ways to kill each other :-) The world is not a fair and safe place, that's for sure :-)

Low cost robotics weapons, posted 14 Jan 2007 at 16:29 UTC by littleprince » (Apprentice)

motters, I totally share your concerns about small groups of people building weaponized robots from off-the-shelf parts at low cost. Flying drones (UAVs) seem to be relatively easy to build. There are plenty of instructions on the Internet.

A guy in New Zealand suggested that it is possible to build a cruise missile at a cost of $5K.

I don't totally agree with him, but I am sure that building a UAV that carries a reasonable amount of explosives for several miles, guided by GPS, is possible for anyone who is determined to do it.

Perceptions, posted 15 Jan 2007 at 12:53 UTC by c6jones720 » (Master)

It's kind of weird: when I show people robots and AI programs I've written, more and more of them make out that I'm doing something I shouldn't!

The perception of the evil robot that will take over the world and kill everybody is already here, and it's ingrained in people's imaginations. Maybe people will never change their minds...

Link to Roland's Blog, posted 15 Jan 2007 at 16:28 UTC by steve » (Master)

Oops, I just noticed the link to Roland's blog was broken. If you haven't read it yet, be sure to check out the details on this story at his site.
