Rise of the Robot Squadrons

Velcroman1 writes 'Taking a cue from the Terminator films, the US Navy is developing unmanned drones that network together and operate in 'swarms.' Predator drones have proven to be one of the most effective — and most controversial — weapons in the military arsenal. And now, these unmanned aircraft are talking to each other. Until now, each drone was controlled remotely by a single person over a satellite link. A new tech, demoed last week by NAVAIR, adds brains to those drones and allows one person to control a small squadron of them in an intelligent, semiautonomous network.'
This discussion has been archived. No new comments can be posted.

  • by Chris Burke ( 6130 ) on Tuesday November 03, 2009 @12:57PM (#29965890) Homepage

    And personally, I'm not especially afraid the armed forces are going to change their tune on that aspect. They most definitely want to have a human being in the firing loop. And I bet part of the reason is that, while we may be close to having machines that can find and attack targets on their own, we're a hell of a long way from having machines that you can usefully reprimand for fucking up. :) But in all seriousness, this seems like a deeply ingrained philosophy in the military that humans should be in charge of the technology.

    • I don't think they want a man in the loop simply because you would have a weapon system that could be subverted by the enemy.

      At worst, a robot weapon system run amok is a hazard (like a minefield) and can be dealt with.

      • by kevinNCSU ( 1531307 ) on Tuesday November 03, 2009 @01:28PM (#29966288)

        That's akin to saying you wouldn't trust your squadmate to cover your ass in battle because he could be subverted by the enemy. The military is going to trust their brothers in arms, who have fought and bled beside them, far more than some piece of code.

        Mainly because, unlike a robot, their buddy isn't going to hang him out to dry without care or regard just because the contractor that put his helmet together didn't properly ensure the security between it and the company that made the chinstrap.

        • Re: (Score:3, Insightful)

          That's exactly what I'm saying. Imagine if your fire support were autonomous but had a remote override; that remote override gets subverted, and now you have your own support fire shooting you in the ass or, worse, not providing the cover it's supposed to.

          I'd rather have a person manning a weapon system BECAUSE he is much more difficult to subvert. Joe in the trench doesn't have a wifi port you can hack.

          Leave the automation to mines.

          • In that case, I think either I misread or you mistyped, and we're in agreement. I thought you said they didn't want a man in the loop because the man could be subverted by the enemy. Probably because you said:

            I don't think they want a man in the loop simply because you would have a weapon system that could be subverted by the enemy.

            To clarify: are you arguing that if there's going to be any computer control of fire support, it should be fully autonomous or not at all, for fear of the operator interface getting hacked? Or are you saying that no such device should be able to make a fire decision without a human at the controls?

      • Re: (Score:2, Insightful)

        by theIsovist ( 1348209 )
        Because, as we all know, minefields have never been difficult to remove after they've outlived their usefulness. Oh wait...
        • Because, as we all know, minefields have never been difficult to remove after they've outlived their usefulness. Oh wait...

          They're not, for the most part. Western doctrine has mandated for quite a while that all minefields be marked physically, and that the locations of individual mines be accurately plotted on a map / range card. We've even looked at creating mines which are self-neutralizing after a set period of time. Mine removal is only an issue when mines are used by guerrilla forces and shitty armies.

          Of course, your statement really had nothing to do with what he was saying, anyway.

          • Re: (Score:3, Insightful)

            by Entropius ( 188861 )

            In other words, mine removal is not an issue when the mines are used by a force with overwhelming military superiority over its opponents, which is in control of the terrain where the mines were planted after hostilities are over, and which can come back and remove them based on its maps.

            In a situation where the mine-user doesn't have overwhelming superiority and the breathing room to accurately document their locations, ensure that that documentation is kept, and remove them after the war, it's not that simple...

          • Re: (Score:3, Informative)

            by theIsovist ( 1348209 )
            You are correct; my statement had nothing to do with his, other than to be a bit of a troll and point out that his analogy was poor at best.

            But as long as we're off topic, please note the following pages on land mine statistics:
            http://www.newint.org/issue294/facts.html [newint.org]
            http://www.redcross.ca/article.asp?id=1945&tid=110 [redcross.ca]
            http://www.unicef.org/sowc96pk/hidekill.htm [unicef.org]

            A couple of key facts:
            2,000 people are involved in landmine accidents every month - one victim every 20 minutes. Around 800 of these w
      • Robots have bugs and glitches requiring timely patches and PRODUCT recalls. They are too complex to blame anybody for, and when they are free to decide more on their own, much of the blame will be gone as well. It'll be like blaming Microsoft for your computer sucking. Bugs happen.

        Blue screen of DEATH gets a new meaning.

        Perhaps at this point, we'll finally get an investigation into whether quality control really exists or whether errors are a form of planned obsolescence... since there would be additional incentive an

        • by c6gunner ( 950153 ) on Tuesday November 03, 2009 @04:20PM (#29968404) Homepage

          You're clearly trolling, but what the hell:

          Robots have bugs and glitches requiring timely patches and PRODUCT recalls

          So do people. Psychiatrists and psychologists exist for a reason.

          Blue screen of DEATH gets a new meaning.

          It's had that meaning for quite a while, seeing as how much of our modern transportation infrastructure is either computer controlled or heavily dependent on computers. Yet, amazingly, the majority of accidents still happen due to driver/pilot error. Thousands of lives could be saved if we'd take control away from people, yet we continue to insist on having human operators because of our paranoid fears of computer malfunctions.

          Seriously, this only illustrates how ethics and courage are not part of the empire mindset, just window dressing. This is how fat lazy cowards can take over the world. On the grander scale, it's no different than traditional cultures going up against the Spanish, Romans, etc., whose goal was conquest and not the honor of a risky act of sacrifice.

          That's right - I'm sure that the Aztecs would have been complaining about the "unfairness" of it all, if they hadn't been scared shitless by the sound of boom-sticks, and I'm sure some spoiled twits back in Spain had notions similar to yours. Idiots have been whining about the advance of military technology for centuries - meanwhile those with a decent IQ and a bit of common sense have gladly embraced new tech as a means to protect lives and be more effective. If you want to clutch on to a Vietnam-era AK while cowering with the Taliban in some shitty little cave, feel free. You can feel all warm and fuzzy about how much "ethics and courage" you're showing as a hellfire missile turns you into pink jello. Me, I'll gladly watch from a distance, happy in the knowledge that every such explosion means I'll have one less flag-draped casket to carry down the tarmac.

          Americans would attack everybody if it didn't cost them anything personally; that IS the reality.

          President Ahmedinejad? I didn't know you had a Slashdot account! I guess being laughed at during your speeches at the UN and Columbia University wasn't enough for you, huh?

    • You turn it off and replace the code. Try doing the same with a human soldier, pilot, etc.

      • I've seen that movie, it's called "The Manchurian Candidate [imdb.com]"
      • Re: (Score:3, Informative)

        by bughunter ( 10093 )

        "Turn it off and replace the code" is easy to type, but in practice it is immensely difficult, to the point of impracticality. It's far more likely to just stop working and be a UXO threat... or be salvage for terrorists (if they don't blow an arm off in the process).

        Sibling to parent post actually got it right; a compromised system is more of a hazard than anything else.

    • Re: (Score:2, Funny)

      by silver69 ( 1481169 )
      I would say "I, for one, welcome our new overlords" but I think it is a little too close to home.
    • Re: (Score:3, Informative)

      by geckipede ( 1261408 )
      This is not universally true: http://www.nationaldefensemagazine.org/archive/2009/October/Pages/FailureToFieldRightKindsofRobotsCostsLives,ArmyCommanderSays.aspx [nationalde...gazine.org]

      There is at least one general who believes that robots should be deployed right now with the ability to fire their own weapons. Quoted from the linked article:

      "There's a resistance saying that armed ground robots are not ready for the battlefield. I'm not of that camp," he told National Defense. That includes the robot autonomously firing the

    • And I bet part of the reason is that, while we may be close to having machines that can find and attack targets on their own, we're a hell of a long way from having machines that you can usefully reprimand for fucking up.

      Well, the issue here is that the US military is going to be fighting human targets for some time, so the delay introduced by a human operator in a bunker doesn't put them at a disadvantage against the targets they are fighting.

      Now, if the US ever went against an enemy whose targeting was based on computer decisions leaving humans out of the loop...

    • by megamerican ( 1073936 ) on Tuesday November 03, 2009 @01:47PM (#29966514)

      It's not necessarily up to the military. Congress has blocked funding on a program that could autonomously fire grenades, much like a minefield, except much easier to set up, program, and dismantle afterwards. Congress, and thus the people, still have the power of the purse to decide whether or not our weapon systems can be autonomous.

      There have been cases where our own drones have been shot down by us because they did not return to a safe mode when instructed to. As of now, that could simply mean that they were in an armed state when they shouldn't have been and couldn't change back.

      A co-worker of mine always jokes that we should be adding requirements that state if the system becomes self aware it should be loyal to the US Constitution. I told him that could cause a lot of trouble for politicians in Washington depending on how it interprets the Constitution.

      • Re: (Score:3, Funny)

        by Anonymous Coward

        Somewhere, in a quiet, dark corner of the Capitol, Ron Paul is grinning an evil grin.

      • Re: (Score:2, Offtopic)

        by Zordak ( 123132 )

        Congress, and thus monied corporations and lobbyists, still have the power of the purse to decide whether or not our weapon systems can be autonomous.

        Looks like you had a typo.

    • Re: (Score:2, Interesting)

      by dgr73 ( 1055610 )
      You could always have swarms and swarms of small, inexpensive machines with no autonomy over target selection, but with preprogrammed attack modes. Things that come to mind are miniature flying darts for anti-personnel work. Once a target has been identified and a valid go-ahead has been given by the operator, the swarm would detach a portion of its strength for a suicide attack. If the target remains valid, it could be reattacked, or a new validation sought (to prevent dummies from sapping the swarms). For ant
    • Re: (Score:3, Insightful)

      by S77IM ( 1371931 )

      Human operators are also cheaper to roll out and maintain than all but the simplest robot AI, and will remain so for the foreseeable* future.

        -- 77IM

      * For certain values of foresight -- I'm sure some AI enthusiast will jump on here and say that realistic, reliable target-acquisition AI should be possible in "about 10 years..."

    • by Znork ( 31774 )

      But in all seriousness, this seems like a deeply ingrained philosophy in the military that humans should be in charge of the technology.

      Perhaps. But even considering such reluctance, some future politicians might not be entirely happy with that; humans may be reluctant, or even refuse, to fire upon their own citizens, and that may be a flaw that highly automated systems can correct.

      Even the nastiest warlords in history were limited in their engagement in atrocities by their ability to get their soldiers marching in their desired direction.

      • Even the nastiest warlords in history were limited in their engagement in atrocities by their ability to get their soldiers marching in their desired direction.

        Really? That's news to me. I mean, yeah, they were sometimes limited by the logistical needs (e.g. food, water, clothing, equipment, fuel) of their armies. Even that was often not an issue, since the really bad ones would just let the weak soldiers die. But anyway, if that's what you're talking about, then OK. Otherwise, I think a citation is in order.

    • we're a hell of a long way from having machines that you can usefully reprimand for fucking up. :)

      Apparently you aren't familiar with the phrase "Disassemble Number Five".

  • All this air stuff is awesome, but the guys on the ground could still use a device that can detect a buried pipe bomb from a safe distance.

    • Unfortunately, the Mark 1 Eyeball cannot be remotely operated.
    • Re: (Score:3, Informative)

      We already have this. And they operate on more or less the same swarm principles. They scale really easily, since they simply communicate with each other to navigate. If one blows up, no loss, and you've found a bomb.

      It's not quite as elegant as a magic bomb detector, but it's just as effective. I saw them demoed at a CS conference a few years back, and the designer said that they sent them off to Iraq and got back the empty husks (they're basically rolling cylinders with a single 'payload' unit that is just
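
      (Illustrative aside: a toy Python sketch of the kind of "communicate with each other to navigate" behavior described above. Each robot reacts only to neighbors within an assumed communication range and steps away from them, so the group spreads out to cover ground. All names and numbers here are invented; the real systems are certainly more sophisticated.)

      import random

      COMM_RANGE, STEP, N = 10.0, 0.5, 12
      # Start the "robots" bunched together, as if tossed out of a vehicle.
      bots = [[random.uniform(0, 5), random.uniform(0, 5)] for _ in range(N)]

      def tick(bots):
          """One time step: every robot repels from neighbors in comm range."""
          moved = []
          for i, (x, y) in enumerate(bots):
              fx = fy = 0.0
              for j, (ox, oy) in enumerate(bots):
                  if i == j:
                      continue
                  dx, dy = x - ox, y - oy
                  d2 = dx * dx + dy * dy
                  if 0.0 < d2 < COMM_RANGE ** 2:  # neighbor "heard": push away
                      fx += dx / d2
                      fy += dy / d2
              moved.append([x + STEP * fx, y + STEP * fy])
          return moved

      for _ in range(200):
          bots = tick(bots)
      print("dispersed:", [(round(x, 1), round(y, 1)) for x, y in bots[:3]])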

    • All this air stuff is awesome, but the guys on the ground could still use a device that can detect a buried pipe bomb from a safe distance.

      Not quite sure of my own reasoning on this yet, but we need to at least recognize the danger of making war too safe for any party. It doesn't seem too far off that we could replace foot soldiers with ground-based drones and station our troops out of the DC metro area, with time after a raid on [insert 3rd world village] to make the kids' soccer game and have some pizza over I

    • We're doing that, too! [defense-update.com]

      Also, this [defense-update.com].

  • by nycguy ( 892403 ) on Tuesday November 03, 2009 @01:00PM (#29965914)
    ...we've still got 75 years left [wikipedia.org]!
  • by StanTheBat ( 1478937 ) on Tuesday November 03, 2009 @01:09PM (#29966026)
    Well, that explains the Starcraft II delay... Blizzard has been busy designing interfaces for the military.
    • You're joking, but I work in R&D for one of the biggest US manufacturers of UAVs, and the DuneII/C&C/WarcraftII/Starcraft paradigm for controlling and commanding "swarms" of UAVs, and for displaying the data they retrieve, is exactly the inspiration we're using for multiple platform systems with one operator. We ultimately envision one pilot commanding tens or even hundreds of Protoss Observers...

      (And for those of you who are FUDding about "skynet" -- 99.9% of the UAVs in the sky are ISR-only, like the Protoss Observer, not weapon platforms. And the ones that do have weapons don't fire at anything without a human issuing at least two orders, and that human is under observation himself. Please stop the FUD. The only functions these craft do autonomously are piloting (i.e., responding to stick commands and short time constant variations in atmospherics) and waypoint-to-waypoint navigation. The rest is done by human pilots and payload operators.)

      And yes, we can't wait for StarcraftII to come out.
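
      (Illustrative aside: a toy Python sketch of that RTS-style paradigm, under invented names. The operator "box selects" a group and issues one waypoint order, and the only thing each craft does on its own is fly waypoint-to-waypoint, matching the split described above.)

      import math

      class UAV:
          """Toy craft: its only autonomy is flying toward its next waypoint."""
          def __init__(self, name, x, y, speed=2.0):
              self.name, self.x, self.y, self.speed = name, x, y, speed
              self.waypoints = []  # queue of (x, y) orders from the operator

          def order(self, x, y):
              self.waypoints.append((x, y))

          def tick(self):
              if not self.waypoints:
                  return
              wx, wy = self.waypoints[0]
              dx, dy = wx - self.x, wy - self.y
              dist = math.hypot(dx, dy)
              if dist <= self.speed:    # waypoint reached: pop it
                  self.x, self.y = wx, wy
                  self.waypoints.pop(0)
              else:                     # step toward the waypoint
                  self.x += self.speed * dx / dist
                  self.y += self.speed * dy / dist

      # One operator, many platforms: select the group, issue a single order.
      squadron = [UAV("uav%d" % i, i * 5.0, 0.0) for i in range(4)]
      for craft in squadron:            # the whole "box selection"
          craft.order(50.0, 80.0)       # one shared rally point
      for _ in range(100):
          for craft in squadron:
              craft.tick()
      for craft in squadron:
          print(craft.name, round(craft.x, 1), round(craft.y, 1))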

      • Sure it's FUD now, but how much of a technological leap will be required for a swarm of autonomous drones to leave a base, independently traverse the intervening terrain, and then independently attack targets based on whatever parameters are fed into them? All without any human intervention other than the initial order? None?

        The fear, at least mine, isn't predicated on the computers suddenly becoming self-aware; it has to do with the concept itself. The more immediate fear is that the safeguards in place
        • Sure it's FUD now, but how much of a technological leap will be required for a swarm of autonomous drones to leave a base, independently traverse the intervening terrain, and then independently attack targets based on whatever parameters are fed into them? All without any human intervention other than the initial order? None?

          Quite a bit, actually - as in, only within the realm of science fiction.

          This "rogue swarm" would need to be aware enough to 1) have a motive to do such a thing in the first place, 2) learn enough about outside systems to 2a) break into an outside network and 2b) research information about its target, and 3) learn how to fuel itself or recharge its batteries, 4) socially engineer some E4 to load a few bombs on board (what, you think these things are kept armed in the hangar?), and 5) manage to elude the ground and air traffic controllers long enough to get off the ground and 6) evade fighter interceptors that will eventually chase after them when they're noted missing.

          Now, it's reasonably arguable that one of these systems could fall into the hands of someone with foul intentions. But so could a tank, or a Harrier Jet, or a nuke. In fact, it's far easier to take control of something that is not remotely piloted, and that has a standard unencrypted interface like a stick, rudder and throttle.

          But to seriously argue that these things could have a mind of their own is ludicrous. Anyone who argues such a position is heedlessly ignorant of how these things are designed, built and operated.

          At the very basic level, they don't have enough processing power on board to be any smarter than a moth. We don't put anything more powerful in them than absolutely necessary because we need to conserve as much mass and power as possible for flight endurance.

          • I appreciate that you are focused on denying their capacity for sentience, and I don't argue that point at all. My point was that we have the technology for drones to fly from point A to B without human interaction, to distinguish between vehicles and stationary targets, and to carry weapon systems that allow them to engage these targets. The autonomy point doesn't revolve around them deciding of their own accord to do it; it has to do with the technology necessary for them to carry out their mission without a
        • Re: (Score:3, Insightful)

          by LWATCDR ( 28044 )

          This ability isn't exactly new.
          Torpedoes at the end of WWII had seekers. A diving sub could fire one that would circle and hit any ship it happened to find. We have had captor mines for years that sit on the seabed and wait for a sub to come by, then sink it. Should we worry about the abuse of weapons?
          Well, heck yes. Ever since we developed the bow, we've needed to worry about people abusing the ability to kill at a distance, but this drone tech isn't revolutionary.

          • Ever since we developed the bow, we've needed to worry about people abusing the ability to kill at a distance, but this drone tech isn't revolutionary

            Right, but you had to be within bow range yourself. Drones allow for no reciprocity. The sub that launched the torpedo was subject to depth charges. The captor mines had to be placed by ship or sub, also at risk. That's the point: the risk with drones is only to the drone, not to the operator. That's the change. I'm not saying that's not a logical progression...
        • Sure it's FUD now, but how much of a technological leap will be required for a swarm of autonomous drones to leave a base, independently traverse the intervening terrain, and then independently attack targets based on whatever parameters are fed into them? All without any human intervention other than the initial order? None?

          That already exists. Cruise missiles, and to a lesser extent, smart bombs.
          Leave the launch platform, fly a preprogrammed route, blow up when you get there.
      • Fair enough, I'll postpone worrying until the day you choose the Protoss carrier as your next inspirational source :)

      • by IdahoEv ( 195056 )

        99.9% of the UAVs in the sky are ISR-only, like the Protoss Observer, not weapon platforms

        Only on Slashdot is the function of a fictitious video game entity used to explain a real-world military system.

        I'm afraid that I don't understand, since I'm not a Starcraft player. Can you use a car analogy instead?

    • So *that's* why the flamethrowers have been asking if we've got any questions about propane accessories!

  • by Monkeedude1212 ( 1560403 ) on Tuesday November 03, 2009 @01:10PM (#29966044) Journal

    No, I don't mean Terminator.

    Did anybody actually watch Stealth? I wish I could unwatch it.

    • It reminds me more of an episode of "Masters of Science Fiction"; Watchbird [imdb.com]

      A scientist creates flying drones that aid in wars, taking out the enemy - specifically anyone who has intent or is in the process of harming US soldiers.
      But the government thinks the 'birds' will help fight crime at home by autonomously flying around in the sky above cities, looking for people committing crimes or even thinking about them.
      It all goes wrong when the 'birds' stop listening to their masters and start killing people.
    • by Ksevio ( 865461 )

      Yeah that was a pretty horrible movie.

      What most people missed (I only saw it because I was projecting it) was an extra scene at the end of the credits showing the wreckage of the plane being looked over by the North Koreans. Then they scan over the damaged AI unit of the plane (the crystally thing) which suddenly starts to glow again. Sequel anyone?

      The biggest problem with that plane, though, seemed to be that it thought it was much better than the people commanding it. Which was probably true in that case.

    • I saw this film. A good film, until the ludicrous, stupid, and mediocre finale.
  • [protoss voice]Carrier has arrived.[/protoss voice] *releases swarm of autonomous drones*
  • I hope those new rats [slashdot.org] don't manage to take over the networked swarm drones!
  • WOW (Score:2, Funny)

    (get it?)
  • by AP31R0N ( 723649 ) on Tuesday November 03, 2009 @01:28PM (#29966286)

    While we're on the subject, let's talk about the difference between drones/UAVs and robots so we use the right words.

    A drone/UAV is controlled remotely by a human. If a UAV is on autopilot flying to the target area, it is function as a robot. With the US military, there is a "man in the loop" for any attack using a UAV. The bomb disposal machines are not robots. They are remote controlled. A land mine would be closer to a robot.

    A robot follows a program and is NOT controlled by a person. An air-to-air heat-seeking missile is a robot. The software tells it what to do.

    An android is a robot in the shape of a human, like the T800.

    Mecha in Robotech and the like... are NOT robots. They are vehicles piloted by people. The transformers are robots that happen to be sapient. Big metal walking thing != robot. Absence of pilot inside != robot.

    The machines in Battle Bots are remote controlled cars with armor and weapons. They are NOT robots. But it would be awesome if they were.

    • I've always thought that autonomous Battlebots would be incredible to watch; better by far than the remote-controlled version.

      • by AP31R0N ( 723649 )

        That would rock indeed. Even if the bots go stupid one third of the time, it'd still be fun to watch the coders and engineers try to get their system running before the kill-o-matic trashes them.

        I'd have categories for weight class, terrain, and whether the AI was onboard or remote. The latter option allows competitors to use a server farm as the brain of the system, rather than lugging it around. For terrain, I'd have indoor/urban, land, water, air, and mixed. Maybe categories for swarms.

        One challenge

    • If a UAV is on autopilot flying to the target area, it is function as a robot.

      And all our base are belong to it.

  • A larger drone... (Score:3, Interesting)

    by gedrin ( 1423917 ) on Tuesday November 03, 2009 @01:39PM (#29966426)
    While useful, isn't this just a larger drone with its parts connected by signals rather than wires? Sure, it's got ablative resilience (one of three drones can go boom and you still have the rest of the formation) and more payload (more drones to carry stuff), but there doesn't seem to be any capacity for communication beyond holding formation and relaying orders from the human controller.
  • ... this for sure will broaden the semantics of hack attack [ccil.org].

    CC.
  • Out of context (Score:4, Informative)

    by Anonymous Coward on Tuesday November 03, 2009 @02:07PM (#29966734)

    The reason why they are calling these UAVs "swarms" is because they are using particle swarm optimization to determine their flight paths and schedules. (The basis for this research was done at my school, Purdue, so I know a lot about it.) The whole 'networking together' idea is not necessarily true either. Each UAV's status is reported to a central machine/server/program that constantly reprocesses the incoming data to determine an optimal order of operations (such as blowing this up, looking at that, etc.). The program considers the situations of all the other drones, in addition to other external data (wind speed, etc.), to determine the optimal result.

    Taken out of context, it sounds a lot like Terminator-type stuff, but it's not really... it's more like optimizing the operations of drones so that they can be controlled by fewer people.
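
    (Illustrative aside: a minimal particle swarm optimization sketch in Python, to show the flavor of the technique named above. This is not the NAVAIR/Purdue code; the cost function, coordinates, and parameters are all invented for the example.)

    import math
    import random

    # Toy cost for a loiter point: distance to a surveillance target, plus a
    # penalty that spikes near a known threat. Both locations are made up.
    TARGET = (40.0, 60.0)
    THREAT = (45.0, 55.0)

    def cost(p):
        dist = math.hypot(p[0] - TARGET[0], p[1] - TARGET[1])
        threat = math.hypot(p[0] - THREAT[0], p[1] - THREAT[1])
        return dist + 100.0 / (1.0 + threat * threat)

    def pso(n=30, iters=200, w=0.7, c1=1.5, c2=1.5):
        # Each particle is a candidate point; the swarm shares one global best.
        pos = [[random.uniform(0, 100), random.uniform(0, 100)] for _ in range(n)]
        vel = [[0.0, 0.0] for _ in range(n)]
        pbest = [p[:] for p in pos]
        gbest = min(pbest, key=cost)
        for _ in range(iters):
            for i in range(n):
                for d in range(2):
                    r1, r2 = random.random(), random.random()
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * r1 * (pbest[i][d] - pos[i][d])
                                 + c2 * r2 * (gbest[d] - pos[i][d]))
                    pos[i][d] += vel[i][d]
                if cost(pos[i]) < cost(pbest[i]):
                    pbest[i] = pos[i][:]
            gbest = min(pbest, key=cost)
        return gbest

    print("best loiter point:", pso())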

  • When you watch the precision [youtube.com] of the people flying Predators and Reapers, one wonders what the incentive would be to give the machines more autonomy.

    There have been armed UAVs that have gone off the reservation [army.mil] and failed to respond to commands or their default programming, which tells them to fly home.

    I'm not sure we want to give something with that kind of bomb load more latitude. You could maybe automate the actual flying, let the autopilot handle the aircraft control, but I'm not really seeing the

    • by Animats ( 122034 )

      When you watch the precision [youtube.com] of the people flying Predators and Reapers, one wonders what the incentive would be to give the machines more autonomy.

      Because they land better semi-autonomously. The USAF flies the things manually with remote officer pilots in the US. The Army uses autoland and enlisted controllers located with the using units. The USAF has a much higher crash rate [theregister.co.uk] than the Army.

    • by BobMcD ( 601576 )

      I agree with this as well. "Human in the loop" should be absolutely required for all ordnance. Despite what we're being told, I'm unconvinced. What I'd want to see and hear is that the weapons system is discrete from the flight system, with separate communications and control. If they're touching, even at the communications level, they can bleed over into one another.

      Remember the parable in Terminator [imdb.com]:

      The Terminator: The Skynet Funding Bill is passed. The system goes on-line August 4th, 1997. Human decisions are removed from strategic defense. Skynet begins to learn at a geometric rate. It becomes self-aware at 2:14 a.m. Eastern time, August 29th. In a panic, they try to pull the plug.

      Sarah Connor: Skynet fights back.

      Step one, remove humans.

      Step two, machine learns.

      Step three, it does something we didn't expect it to do.

  • Because now, one dedicated hacker with his OLPC will be able to take down a whole army. Or even better: make them fly back, acting as if they had been successful, land, and then either detonate right there, or in the face of their best engineers, who just before that downloaded the trojan that will now spread through the whole research facility and then report back to its master.

    Man... killing is always the action of a coward. No exceptions. No sides taken.
    And war is mass murder. Always. Period. No discussion...

  • by Swarm Master ( 1670492 ) on Tuesday November 03, 2009 @04:14PM (#29968306)

    It takes three people to remotely pilot a Predator. There are never enough Predators or Global Hawks in the sky for all the intelligence we would like to gather. We don't have enough people, platforms, and dollars to buy, launch, pilot, and support all the reconnaissance we would like. And while the imaging capabilities on the big unmanned platforms are impressive, they still can't see through mountain ridges or down deep urban canyons. For that you need something that can fly right overhead and get close enough without being seen or heard, and that requires lots of small UASs. But the only way we can get enough of those into the air is to have some way for a single person to manage two or a hundred platforms just as easily as one.

    Swarm may be an unfortunate term, since it can evoke the image of a killer swarm of bees - hence we naturally think of swarms as lethal attack technology. In fact, unmanned attack swarms are still science fiction. The swarming research that is going on (and demonstrated in the article) is all about surveillance and reconnaissance. Even if we get to the point of arming the individual swarming platforms, there will always be a human in the loop making the final decision to fire a weapon. Don't kid yourself: even with all the new technology, it has only gotten more difficult over time to make the decision to engage, not easier. Ask those who do this for a living about the hoops they have to jump through before they can fire a weapon from a Reaper.
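
    (Illustrative aside: a minimal Python sketch of the "human in the loop" gate described above, with invented names, thresholds, and logic. The platform may nominate a target, but every path defaults to "hold" unless an operator explicitly authorizes release; real release chains involve far more steps than this.)

    from dataclasses import dataclass

    @dataclass
    class TargetNomination:
        track_id: str
        confidence: float  # sensor confidence in [0, 1]; threshold below is invented

    def request_release(nomination, operator_decision):
        """Release requires BOTH a machine nomination and an affirmative
        human decision; every other path falls through to 'hold'."""
        if nomination.confidence < 0.9:
            return "hold: low-confidence track, re-task sensors"
        if operator_decision != "release authorized":
            return "hold: no human authorization"
        return "release cleared on " + nomination.track_id

    print(request_release(TargetNomination("T-041", 0.95), "release authorized"))
    print(request_release(TargetNomination("T-042", 0.95), ""))  # machine alone can't fire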

  • Oblig. xkcd (Score:2, Informative)

  • Not Brains! (Score:3, Funny)

    by uvajed_ekil ( 914487 ) on Tuesday November 03, 2009 @07:49PM (#29971758)
    The summary says they added "brains." I disagree, because I ambushed and tried eating one of these new drones, and I did not find it to be satisfying in the least. Quite a letdown, really. Sincerely, Steve the zombie.
  • by MrKaos ( 858439 ) on Wednesday November 04, 2009 @06:44AM (#29976304) Journal

    Peter Singer, former defense policy advisor to President Obama, does a great [abc.net.au] interview for [abc.net.au] Hungry Beast on autonomous military robotics. Quite an interesting interview. It is a video, but it won't start buffering until you hit play.

    He raises a good point about us humans doing things like this and then thinking 'maybe that wasn't such a good idea'. So much for Asimov's laws of robotics.
