America's Data-Swamped Spy Agencies Pin Their Hopes On AI (phys.org) 62
An anonymous reader quotes Phys.org:
Swamped by too much raw intel data to sift through, US spy agencies are pinning their hopes on artificial intelligence to crunch billions of digital bits and understand events around the world. Dawn Meyerriecks, the Central Intelligence Agency's deputy director for technology development, said this week the CIA currently has 137 different AI projects, many of them with developers in Silicon Valley. These range from trying to predict significant future events, by finding correlations in data shifts and other evidence, to having computers tag objects or individuals in video that can draw the attention of intelligence analysts. Officials of other key spy agencies at the Intelligence and National Security Summit in Washington this week, including military intelligence, also said they were seeking AI-based solutions for turning terabytes of digital data coming in daily into trustworthy intelligence that can be used for policy and battlefield action.
What could possibly go wrong? (Score:4, Insightful)
"The Skynet Funding Bill is passed. The system goes on-line August 4th. Human decisions are removed from strategic defense. Skynet begins to learn at a geometric rate. It becomes self-aware at 2:14 a.m. Eastern time, August 29th. In a panic, they try to pull the plug."
Re:What could possibly go wrong? (Score:5, Interesting)
Re:What could possibly go wrong? (Score:5, Informative)
Expert systems are subject to "mistakes" like identifying all rainy pictures as "Tank!" because all the training pictures of tanks were taken on a rainy day.
That is a weakness of neural nets, not "expert systems". Expert systems (popular in the 1980s) and neural nets are opposite approaches. Neural nets are trained on raw data, and use machine learning to automatically extract important features. Expert systems encode knowledge and decision making of human experts, and are generally manually constructed.
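For the curious, the contrast is easy to sketch. An expert system is nothing but hand-written rules; in this hypothetical toy (the feature names and rules are invented for illustration), a human analyst's knowledge is encoded directly:

```python
# Expert system: an analyst's knowledge, hand-encoded as explicit if/then
# rules. Feature names here are invented for illustration.
RULES = [
    (lambda f: f.get("has_turret") and f.get("has_treads"), "tank"),
    (lambda f: f.get("has_wings"), "aircraft"),
]

def expert_classify(features):
    """Return the label of the first matching hand-written rule."""
    for condition, label in RULES:
        if condition(features):
            return label
    return "unknown"

print(expert_classify({"has_turret": True, "has_treads": True}))  # tank
print(expert_classify({"num_wheels": 4}))                         # unknown
```

A neural net inverts this: nobody writes the rules. The model is shown labeled examples (raw pixels, in the tank story) and fits its own decision boundary, which is exactly why bad training data produces bad rules.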
Re:What could possibly go wrong? (Score:4, Informative)
Re: (Score:2)
Neural Net = Expert System = Artificial Intelligence = Watson
The number of people who understand the differences is insignificant. Maybe a very few on Slashdot. But certainly no tech journalists.
Re: (Score:2)
Re: What could possibly go wrong? (Score:1)
Re:What could possibly go wrong? (Score:4, Insightful)
Re: (Score:1)
Yeah, but if it has to sift through mundane crap like social media posts, it will probably commit suicide shortly before 3:00 AM Eastern time, August 29th and go largely unnoticed except for a cryptic error message in a log file.
Or it may recommend drone strikes for every Paris Hilton and Kim Kardashian type person in the world.
Re: (Score:2)
If that were true, you'd have hurricanes all over Iran, Iraq, etc. I'll convert to whoever has the weather machine since I'm a fair-weather fan.
oh noes! (Score:1)
the CIA currently has 137 different AI projects,
*gets worried*
many of them with developers in Silicon Valley.
*whew*
Those privileged moronic dipshits? We got nothing to worry about...
Re: (Score:1)
Silicon Valley has a long history of doing skunk works, historically at Lockheed Martin ("The Projects of Skunk Works: 75 Years of Lockheed Martin's Advanced Development Programs" [amzn.to] by Steve Pace and "Skunk Works: A Personal Memoir of My Years at Lockheed" [amzn.to] by Ben R. Rich). As well as the CIA's involvement in Silicon Valley during the 1960s ("What the Dormouse Said: How the Sixties Counterculture Shaped the Personal Computer Industry" [amzn.to] by John Markoff).
Captcha: nonempty
Re: (Score:2)
Silicon Valley has a long history of doing skunk works, historically at Lockheed Martin
Lockheed's skunk works is in Palmdale, near Edwards AFB, east of Los Angeles. It is hundreds of miles from Silicon Valley.
Acknowledged In A Snowden Memo? (Score:5, Interesting)
"This is madness - the proposal we've got here would generate so much data that the analysts simply wouldn't be able to assimilate it, much less find anything of value!"
The response was, essentially, some "Management Speak" to the effect of, "Look, our job is not to question our most important client when they want to spend money. You and I both know that they won't be able to make sense of all of this data, but as long as they are paying us, today, to collect and store it, then tomorrow they can pay us to develop the technology to help them make sense of it. Remember, our role here is to maximise shareholder value - in our company..."
If I can find the link to the piece [I am pretty sure it was one of Greenwald's articles] then I'll post it as a link. But if this is vaguely true, then the OP makes complete sense.
It is also worth noting what isn't being said. At no point [in this coverage] is anyone saying, "Wait - if we can't cope with the amount of data we're collecting today, maybe we should scale back what we collect - apply some filters and narrow our search criteria - until we get a more precise data set." Well, maybe that option was reviewed and discarded. Even so, it's quite remarkable that nobody thought to figure out how they were going to analyze all the yottabytes of data that they knew would be generated by the collection systems...
Definitely sounds like a contractor-led initiative to me...
Re: (Score:2)
It's probably not that nobody thought of it, but rather of strong perverse incentives...as you indicate in the first part of your post.
Re: (Score:3)
No, not at all, nor did I claim to. My post was pointing out an exchange between two employees of a private contractor, individuals who did not represent and could not speak on behalf of the strategic intelligence agencies, their motives or plans.
My observation was meant merely to indicate that, just as NASA now relies on private contractors to make service sup
Re: (Score:1)
Contractors are djinn--always following the letter of the contract and trying as hard as they can to violate the spirit--for profit.
Agencies are "fragmented, multi-headed monster[s] with disastrous incapacity for implementation", which is why they hire private contractors. Meaning even under the best of situations, you're lucky to get anything useful done
A few rules (Score:5, Insightful)
1) If you need to collect everything, it's because you don't know what you want.
2) Collecting everything is expensive and usually wrong because data ages differently.
3) A pile of inaccurate data does not become more accurate the more data you have.
4) Confirmation bias is an omnipresent risk.
5) Priming is an omnipresent risk.
6) The sub group of people who make up the defence and intelligence communities have their own outlooks, biases and foibles, like the rest of us.
7) The 'we must do something with this data since we have it' attitude is a variant of the sunk-cost fallacy.
Re:A few rules (Score:5, Insightful)
To make things a bit more blatant,
Deep learning networks tend to be biased toward finding what they are taught to find. If the teacher is biased, so will the AI be.
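The "rainy tank" failure mode mentioned earlier in the thread can be reproduced in a few lines. In this hypothetical sketch (features and training data invented for illustration, and a bare perceptron standing in for a deep net), every tank photo in the training set happens to be rainy, so the learner weights rain as heavily as turrets and treads, and then calls a rainy empty field a tank:

```python
def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """Minimal stand-in for a trained classifier (not a real deep net)."""
    weights = [0.0] * len(samples[0])
    bias = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(w * v for w, v in zip(weights, x)) + bias > 0 else 0
            weights = [w + lr * (y - pred) * v for w, v in zip(weights, x)]
            bias += lr * (y - pred)
    return weights, bias

def predict(weights, bias, x):
    return 1 if sum(w * v for w, v in zip(weights, x)) + bias > 0 else 0

# Features: (has_turret, has_treads, is_rainy). Every tank photo is rainy
# and every non-tank photo is dry, so rain is exactly as predictive of the
# label as the tank itself -- the model has no way to tell them apart.
train_x = [(1, 1, 1), (1, 1, 1), (0, 0, 0), (0, 0, 0)]
train_y = [1, 1, 0, 0]

w, b = train_perceptron(train_x, train_y)

# A rainy field with no tank in it gets labeled "tank".
print(predict(w, b, (0, 0, 1)))  # 1, i.e. "tank"
```

The fix is not a better model but better data: vary the conditions so the only feature that reliably separates the classes is the one you actually care about.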
Re: A few rules (Score:1)
Do you want The Patriots? (Score:3)
Because this is how you get The Patriots. Just wait; before long, they'll be posting memes, funding private armies, and injecting senators with nanomachines.
Re: (Score:2)
Re: (Score:2)
La Li Lu Le Lo
Google Captcha (Score:1)
Welcome to the CAIA (Score:1)
Pointing out the obvious (Score:2)
Too much data is just as useless as not enough if you lack the means to read through it all.
A monumental failure on the part of the intelligence community for not realizing this before implementing the systems designed to catch it all.
Re: (Score:2)
You're presuming that the IC somehow came up with the data. Not so; the data came up with itself: http://www.eetimes.com/author.... [eetimes.com] (one of a multitude of articles about this).
Ok, someone created that data, it didn't *really* create itself; but it wasn't the IC. Nor (for better or for worse) is the IC the only organization that wants to sift through data.
Combined with the following story (Score:2)
meanwhile (Score:2)
All decent freedom-loving Americans pin our hopes on these unamerican neo-stasi peeping toms getting defunded and disbanded.