Tuesday, September 22, 2009

Pentagon exploring robot killers that can fire on their own - StumbleUpon

http://www.stumbleupon.com/s/#2E7gU8/www.commondreams.org/headline/2009/03/25-11/
This article is about robots being able to make the choice to fire instead of a human. On-board computer programs would make the decision to fire the weapon. This seems good and all; it would mean fewer casualties. But what would happen if these things went crazy because we tried to make them too much like a human? Giving them the decision to fire their weapon is a big deal, since they will never fully understand the situation they are in. What if something happened and the robot should fire its weapon but doesn't, because it was not programmed for that specific situation? A human would diagnose the situation and figure out whether he or she should fire. I feel like there are a lot of things that can go wrong with these machines. What if they all got together, formed a gang, and tried to control us? How much would they cost? How many would be sent into the war? Are there enough technicians to fix them, or would we just scrap them and buy all new ones? I feel like this has its pros and cons, but right now it has too many cons to take a risk on something like this.
