
The US Military is Funding an Effort To Catch Deepfakes and Other AI Trickery (technologyreview.com) 70

The Department of Defense is funding a project that will try to determine whether the increasingly real-looking fake video and audio generated by artificial intelligence might soon be impossible to distinguish from the real thing -- even for another AI system. From a report: This summer, under a project funded by the Defense Advanced Research Projects Agency (DARPA), the world's leading digital forensics experts will gather for an AI fakery contest. They will compete to generate the most convincing AI-generated fake video, imagery, and audio -- and they will also try to develop tools that can catch these counterfeits automatically. The contest will include so-called "deepfakes," videos in which one person's face is stitched onto another person's body.

Rather predictably, the technology has already been used to generate a number of counterfeit celebrity porn videos. But the method could also be used to create a clip of a politician saying or doing something outrageous. DARPA's technologists are especially concerned about a relatively new AI technique that could make AI fakery almost impossible to spot automatically. Using what are known as generative adversarial networks, or GANs, it is possible to generate stunningly realistic artificial imagery.
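The adversarial loop behind GANs can be illustrated with a deliberately tiny numeric sketch (all numbers here are invented for illustration; a real GAN trains two neural networks on images, not two scalars):

```python
import random

random.seed(0)

# Toy sketch of the adversarial dynamic behind GANs (illustration only,
# not a real GAN): "real" data cluster around 10.0, the generator starts
# far away, and a crude discriminator draws its decision boundary halfway
# between the real and fake sample means. Each round the generator steps
# toward that boundary, so the boundary itself keeps chasing the real
# data until fake samples look statistically like real ones.

real_mean = 10.0   # centre of the "real" data distribution
gen_mean = 0.0     # the generator's single learnable parameter

for step in range(500):
    real = [random.gauss(real_mean, 1.0) for _ in range(32)]
    fake = [random.gauss(gen_mean, 1.0) for _ in range(32)]
    # Discriminator "training": threshold halfway between the sample means.
    boundary = (sum(real) / 32 + sum(fake) / 32) / 2
    # Generator "training": move toward the current decision boundary.
    gen_mean += 0.2 * (boundary - gen_mean)

print(round(gen_mean, 1))
```

The point of the toy is the dynamic: because the discriminator's boundary is always recomputed against the generator's latest output, every improvement in detection hands the generator a fresh target, which is exactly why DARPA worries that GAN-made fakes may stay ahead of automatic detectors.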

This discussion has been archived. No new comments can be posted.

Comments Filter:
  • This is a big problem. And our society isn't prepared for it.
    Congress needs to move fast to make it a felony to create photo-realistic likenesses of people (user or AI generated) without their permission. If you want to parody someone, all you have to do is cartoon-ize their face a little bit so that it's clear that it's a parody and not real.

    Creating or sharing photo-realistic AI generated porn should be sexual assault.

    • Re: (Score:3, Insightful)

      by Anonymous Coward

      Creating or sharing photo-realistic AI generated porn should be sexual assault.

      I prefer that we leave the definition of sexual assault alone.

      In a rape joke culture, it's already hard enough to get actual sexual assault (i.e.: a physical assault with a sexual intent) taken seriously.

      We don't need something that isn't a physical assault masquerading as a sexual assault, giving people the idea that sexual assault isn't serious.

    • Dallas May raped me. Accusation is guilt. Toss 'em in the Gulag!

    • by Anonymous Coward

      You really need to get some perspective.

      How is making a (technically sophisticated) fake even in the same league as attacking someone?

      What about people who happen to look like famous people? Are they banned from making porn? Twins?

      Get a grip. If you are so fragile that you can't live with the idea of someone using your image in ways you don't like, you clearly have few actual problems to complain about.

      People near you are being attacked, starving and dying of preventable diseases. Your imaginary problems are not

    • Creating or sharing photo-realistic AI generated porn should be sexual assault.

      And while we're at it, let's ban even imagining having sex with someone without his permission! /s

    • by AvitarX ( 172628 )

      I'm not sure.

      I agree with your basic premise, but if it's as easy and realistic as people claim, it will be so prevalent that it won't really even be seen as a problem for the victims, I suspect (maybe I'm naive and optimistic; I'm willing to accept that).

      Today, I'm sure it would be horribly traumatizing, but in a decade I suspect it will culturally be along the lines of someone claiming to have slept with you that hasn't (bullying/harassment, not assault).

  • by Anonymous Coward

    No need to fake it. Our citizenry expects it, because the United States is turning into a full-blown Idiocracy, and both the left and the right are being drawn into it, compelled by moronic (but good-sounding to one group of people) views.

    P.S. The whole point of the GAN paradigm is that it produces ever more convincing fakes by pitting a generator against a discriminator that detects fakes: each time the detector improves, the generator is retrained against the improved version.

  • The thought occurs that an inevitable explosion of fake video and audio recordings will drive the development of encrypted authentication networks that verify that a supposed recording came from a sealed, supposedly tamper-proof recording device from a manufacturer whose production lines, parts suppliers, and design teams are closely monitored by government agencies and nonprofit organizations against the possibility of firmware tampering. Recordings produced by unvetted devices will be automatically assumed by courts and other interested parties to be inherently unreliable and very likely fake in all cases of controversy.

    • by thomst ( 1640045 )

      resistant hypothesized:

      The thought occurs that an inevitable explosion of fake video and audio recordings will drive the development of encrypted authentication networks that verify that a supposed recording came from a sealed, supposedly tamper-proof recording device from a manufacturer whose production lines, parts suppliers, and design teams are closely monitored by government agencies and nonprofit organizations against the possibility of firmware tampering. Recordings produced by unvetted devices will be automatically assumed by courts and other interested parties to be inherently unreliable and very likely fake in all cases of controversy.

      I don't think you have a lot of experience with how courts work in actual practice - or legislatures, either.

      Judges are basically free to accept or reject evidence according to their own rules. In the USA, for instance, some of them still admit latent fingerprint testimony, despite the fact that an AAAS panel of expert forensic scientists has completely debunked [amazonaws.com] the science behind it. Another such AAAS panel also determined that much of the "science" behind forensic arson analysis is [amazonaws.com]

  • by Zorro ( 15797 ) on Wednesday May 23, 2018 @12:48PM (#56659948)

    Supports deep fakes.

    She said so this morning at a Disney press conference.

  • Picture this: A few days before the election, a new video has "Candidate A" speaking at a satanist convention, disparaging the flag, mom, and apple pie. Totally fake, but how can Candidate A fight back?

    Simply put out a new ad, with fake footage of Candidate B saying _exactly_ the same speech... then JFK, then Nixon, then ($pop_idol_of_the_week). Tagline is "I'm Candidate A, and I can make stuff up too."

    A sad state of affairs, but if I ran a political party, I'd crank up a rendering farm for this type of eme

    • by Nidi62 ( 1525137 )

      Picture this: A few days before the election, a new video has "Candidate A" speaking at a satanist convention, disparaging the flag, mom, and apple pie. Totally fake, but how can Candidate A fight back?

      Too obviously and explicitly fake. The best way to do it is something small and subtle, enough to simply cause confusion that, if released a week or so before the election, is fresh enough in everyone's mind to sway independents and undecideds to your preferred candidate. Maybe the candidate doing/talking about doing drugs, or domestic violence, sexual assault, taking bribes, etc. Something that isn't huge but would take a while to investigate and repudiate. No amount of fake video will sway partisans on

    • That doesn't solve the problem, because it just creates a deeply cynical electorate who doesn't believe any facts.

      Because what if Candidate A really was speaking at a satanist convention, and just got caught, and then covers his tracks by making a fake video of Candidate B saying exactly the same speech? Now everyone thinks both were fakes, even though one side is telling the truth, and people elect a satanist who disparages the flag, mom, and apple pie.

      The truth is, we're already there. We don't need c

    • by sinij ( 911942 )
      A more interesting implication is that you could now get away with actually speaking at a satanist convention, disparaging the flag, mom, and apple pie, because you could just claim it is fake news.
  • When our Cells can do it...

    Perhaps a better definition of the singularity.

  • https://en.wikipedia.org/wiki/... [wikipedia.org] Understand...Virtual Child Pornography involved the Supreme Court because digitally created images were used to agitate and turn Muslim suspects held at sites around the world in Psy-Operations. We've got to understand that the whole nature of the way American democracy guards its freedom has been changed.~ John Ashcroft, former US Attorney General
  • I wonder if soon all photographic, video, and audio recording devices will be required to incorporate hardened individual hardware certificates that can sign all the recordings that particular device creates. At least in that case it seems it would be possible to verify that a particular portion of unedited source material was recorded directly by a specific device without any additional manipulation, which could be useful in legal, political, or scientific contexts where verification of origin is required
  • Society went from no video to video surveillance everywhere. Are we now coming full circle, where videos will carry no credibility at all, the equivalent of not having any video surveillance in the first place? Revenge porn: no problem if nobody can prove whether it's real.
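The hardware-signing idea floated in the comments above can be sketched in a few lines. This is a minimal stand-in, not a real scheme: an actual device would hold a private key in tamper-resistant hardware and sign with an asymmetric certificate chain, whereas this sketch uses a shared-secret HMAC purely to stay dependency-free, and every name in it is hypothetical.

```python
import hashlib
import hmac
import os

# Hypothetical per-device recording authentication. A real device would
# use an asymmetric certificate chain rooted in tamper-resistant hardware;
# an HMAC over the raw bytes stands in here to keep the sketch runnable
# with the standard library alone.

DEVICE_SECRET = os.urandom(32)  # would live inside the device's secure element

def sign_recording(raw_bytes: bytes) -> str:
    """Return a tag binding this recording to the device's secret."""
    return hmac.new(DEVICE_SECRET, raw_bytes, hashlib.sha256).hexdigest()

def verify_recording(raw_bytes: bytes, tag: str) -> bool:
    """Check the recording against the tag; any edit invalidates it."""
    expected = hmac.new(DEVICE_SECRET, raw_bytes, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

clip = b"\x00\x01... raw sensor frames ..."
tag = sign_recording(clip)
print(verify_recording(clip, tag))            # the unmodified clip verifies
print(verify_recording(clip + b"edit", tag))  # any tampering breaks the tag
```

Such a tag only proves the bytes left a particular device unmodified; as the replies above note, it cannot prove what happened in front of the lens, and courts remain free to weigh or ignore it.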
