US Intelligence Wants To Radically Advance Facial Recognition Software
coondoggie writes "Identifying people from video streams or boatloads of images can be a daunting task for humans and computers. But a 4-year development program set to start in April 2014 known as Janus aims to develop software and algorithms that erase those problems and could radically alter the facial recognition world as we know it. Funded by the Office of the Director of National Intelligence's 'high-risk, high-payoff research' group, Intelligence Advanced Research Projects Activity (IARPA) Janus 'seeks to improve face recognition performance using representations developed from real-world video and images instead of from calibrated and constrained collections.'"
Not much need to worry (Score:2, Interesting)
I've worked with current facial recognition systems and they're absolute junk. They can match mug shots taken under perfect lighting, but that's about all. It's a very long way from that to picking people out of some crappy live video stream.
Mind, I worked with whatever's publicly available; maybe the various big brother agencies have better stuff. I wouldn't bet on it, though.
Isn't going to help I expect, but.. (Score:5, Interesting)
Get Poser, or something similar, and start replacing the face pics of all your contacts with pics of Poser models' asses, selected for the best match to the contact's ass. Remember to find an appropriate image for companies and agencies. I'm thinking a Hydra would be appropriate for the NSA, Medusa for the FBI, Mantis for the CIA, etc.
Bonus points for doing r/g stereo of the images, or 3d if the phone supports that directly.
Re:As expected (Score:2, Interesting)
And it's not even useful.
The ability to harass practically anyone will likely prove quite useful to the government.
Wow. And here I thought they had enough shit online tracking us today, along with the manipulative control to harass citizens (you won a free tax audit based on your party affiliation!) at the Federal level. What the hell was I thinking. Clearly the NSA budget is lacking. Yes, we need more of that. I'm sure that's the answer.
What's that? The average citizen unknowingly commits three felonies a day you say? And the government wants even more visibility into that? What could possibly go wrong?
Re:Not much need to worry (Score:4, Interesting)
Every picture on Facebook is scanned by Interpol with facial recognition software. Yes, even the guy in the background who didn't know you took the picture. Interpol, you say? Yep, that's why this whole fake outrage over the NSA is all bunk. They all work together, and have for decades, spying on each other's citizens and then sharing the information to bypass the privacy laws in each country.
Re:Not much need to worry (Score:5, Interesting)
Assume that it's 99.9% accurate per comparison (wildly optimistic). That is, for every 1,000 times you compare a face against a watchlist entry, 1 is incorrectly flagged as matching.
Suppose that you have a list of 100 people you are 'interested' in. If the system is in an airport with 200,000 people per day, that's 20,000,000 comparisons, so you are going to get around 20,000 incorrect matches a day.
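The base-rate arithmetic in that comment can be sketched in a few lines. The per-comparison false-positive rate, watchlist size, and daily passenger count are all just the illustrative numbers from the comment above, not measurements of any real system:

```python
# Back-of-the-envelope false-positive estimate for a watchlist system.
# All three inputs are assumptions taken from the comment, not real data.
false_positive_rate = 0.001   # 99.9% accuracy per face-vs-watchlist comparison
watchlist_size = 100          # people of interest
faces_per_day = 200_000       # passengers scanned per day at the airport

# Each scanned face is compared against every watchlist entry.
comparisons_per_day = faces_per_day * watchlist_size
false_matches_per_day = comparisons_per_day * false_positive_rate

print(f"{comparisons_per_day:,} comparisons -> "
      f"{false_matches_per_day:,.0f} false matches/day")
# 20,000,000 comparisons -> 20,000 false matches/day
```

The point of the exercise: even a tiny per-comparison error rate is swamped by the sheer number of comparisons, so nearly every alarm the system raises is a false one.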
Re:Transitioning from academic to real world ... (Score:3, Interesting)
In the academic world it is perfectly acceptable to use carefully selected or crafted inputs (facial images, in this case) to develop and evaluate your algorithms. You may have separate data sets for development and evaluation; however, careful selection or crafting is OK to simplify the project and avoid issues/variables outside the project's scope.
As a CompSci academic, I am consistently shocked by the fact that we don't really consider the ethics of our research. Some of the research, like the folks who are still interested in chess-playing algorithms, is pretty benign. Other research, like facial recognition, data mining, etc.... not so much. Case in point: there's a great TED Talk [ted.com] by a researcher from Carnegie Mellon in which he demos an iPhone app (paired with some server-side software) his team wrote that uses facial recognition to predict social security numbers in seconds. For those with experience on the academic side, how often have you or your colleagues stopped to consider that your research may be used unethically? Unless you're working in security, I suspect it's probably infrequent, despite the fact that advances in just about every major CS research area could be misused.
To be fair, I don't really know what to do about this problem. Someone is going to do the research. If it isn't me, or you, it'll be someone working in a government research facility... perhaps working for a government that isn't so friendly. I suppose all I'm really saying is that we need to start thinking about the fact that there's a digital arms race going on [zdnet.com]... and we're the ones making the weapons.
It'd be nice if we could have advice from some of the researchers from the dawn of the last arms race, like Oppenheimer. This time, the race isn't about becoming omnipotent, it's about becoming omniscient.
Re:It's like (Score:2, Interesting)
Hey, who are those thugs on those motorcycles over there that just pulled up to the playground?
Nevermind, just ignore them and hope that they go away or don't bother us.
I'm still having fun building this sand castle.
Hey, they started taking pictures.
Don't look at them and maybe they will go away.
Besides, they haven't beat anyone up yet.