The US Military is Funding an Effort To Catch Deepfakes and Other AI Trickery (technologyreview.com)
The Department of Defense is funding a project that will try to determine whether the increasingly real-looking fake video and audio generated by artificial intelligence might soon be impossible to distinguish from the real thing -- even for another AI system. From a report: This summer, under a project funded by the Defense Advanced Research Projects Agency (DARPA), the world's leading digital forensics experts will gather for an AI fakery contest. They will compete to generate the most convincing AI-generated fake video, imagery, and audio -- and they will also try to develop tools that can catch these counterfeits automatically. The contest will include so-called "deepfakes," videos in which one person's face is stitched onto another person's body.
Rather predictably, the technology has already been used to generate a number of counterfeit celebrity porn videos. But the method could also be used to create a clip of a politician saying or doing something outrageous. DARPA's technologists are especially concerned about a relatively new AI technique that could make AI fakery almost impossible to spot automatically. Using what are known as generative adversarial networks, or GANs, it is possible to generate stunningly realistic artificial imagery.
Sharing Deep-Fake porn should be Sexual Assault (Score:1)
This is a big problem. And our society isn't prepared for it.
Congress needs to move fast to make it a felony to create photo-realistic likenesses of people (user or AI generated) without their permission. If you want to parody someone, all you have to do is cartoon-ize their face a little bit so that it's clear that it's a parody and not real.
Creating or sharing photo-realistic AI generated porn should be sexual assault.
Re:Sharing Deep-Fake porn should be Sexual Assault (Score:5, Insightful)
How about -- in 10 years time nobody will be able to believe anything they read, hear, or see, at which point democracy becomes unworkable. As Jefferson famously said, a properly functioning democracy depends on an informed electorate -- but it's nearly impossible for anyone to be well-informed when every valid piece of information is competing with a cloud of equally-plausible forgeries, and even people operating in good faith can no longer discern what is or isn't real.
We are already seeing this occurring, with segments of society breaking off into their own enclosed media-bubbles, the walls of which become increasingly impermeable to inconvenient facts or reasoning, as any evidence that would traditionally cause them to adjust their worldview now simply gets dismissed as 'fake'. I see no reason to think it won't get worse over time.
Re: (Score:1, Interesting)
it's nearly impossible for anyone to be well-informed when every valid piece of information is competing with a cloud of equally-plausible forgeries, and even people operating in good faith can no longer discern what is or isn't real.
You seem to be describing our current state, not the future.
The solution is simple: A well-informed electorate.
That means you actually have to, you know, work at knowledge, rather than drooling and giggling and clapping like a moron at the 24/7 "news" cycle.
Re:Sharing Deep-Fake porn should be Sexual Assault (Score:5, Interesting)
The solution is simple: A well-informed electorate.
Okay, but that's begging the question -- how do you gain (or maintain) a well-informed electorate when the forgeries have become so realistic that distinguishing between a forgery and an authentic piece of information takes more skill than most people possess? Simply demanding that people become exponentially smarter isn't going to make them so, no matter how severely we chastise them.
Usually when people realize they don't have the skill to make a reliable determination about something for themselves, they look to a knowledgeable expert (or at least, a trustworthy authority figure) to help them make the right determination... but here again we now run into the same problem -- who is actually a valid expert or a trustworthy authority figure, and who is faking it? And even if the person in question is valid and trustworthy, how does the average person verify that what they are hearing that person say are things that person actually said, and not clever forgeries designed to mislead people?
Re: (Score:3)
how does the average person verify that what they are hearing that person say are things that person actually said, and not clever forgeries designed to mislead people?
A public key signature by the person doing the talking ?
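The sign/verify flow being suggested can be sketched with textbook RSA. This is a toy with tiny hard-coded primes and no padding (insecure, illustrative only; the primes, key sizes, and function names are my own choices, not from the thread):

```python
import hashlib

# Toy RSA signature: the speaker signs a hash of their statement with
# a private exponent d; anyone can verify with the public pair (e, n).
# Tiny primes and no padding -- this only illustrates the flow.
p, q = 1000003, 1000033
n = p * q
e = 65537
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent (Python 3.8+)

def digest(msg: bytes) -> int:
    # Reduce mod n so the toy modulus can handle a full SHA-256 hash.
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

def sign(msg: bytes) -> int:
    return pow(digest(msg), d, n)         # requires the private key d

def verify(msg: bytes, sig: int) -> bool:
    return pow(sig, e, n) == digest(msg)  # needs only the public key

speech = b"I never said that."
sig = sign(speech)
print(verify(speech, sig))                       # True
print(verify(b"I definitely said that.", sig))   # False
```

A real deployment would use a standard scheme (e.g. Ed25519 or RSA-PSS from an audited library) rather than bare textbook RSA.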
Re: (Score:1)
You, sir, have just summed up the en
You are pretending like that has ever been true. (Score:2, Insightful)
Go read up on the Spanish-American War, the Opening of Japan, Native American/German-American/Japanese-American Internment, etc. The frothing masses have always been fed whatever narrative makes them back the currently desired course of action since the founding of this 'fair country'. If you go back further you can find the same action going on all over the world, only overshadowed by rebellions when the narrative was disrupted by facts on the ground, usually resource shortages or labor abuses so prevalent
Deepfake porn is the porn of a new generation. It is the porn of my generation. We don't need government intrusion into it. Let the free market decide. You liberals and your laws.
Because today's young people are no longer having sex, it stands to reason that their porn should be fake also.
Re: (Score:3, Insightful)
Creating or sharing photo-realistic AI generated porn should be sexual assault.
I prefer that we leave the definition of sexual assault alone.
In a rape joke culture, it's already hard enough to get actual sexual assault (i.e.: a physical assault with a sexual intent) taken seriously.
We don't need something that isn't a physical assault masquerading as a sexual assault, giving people the idea that sexual assault isn't serious.
Re: (Score:2)
What name do you prefer? "Involuntary pornography" perhaps?
Re: Sharing Deep-Fake porn should be Sexual Assault (Score:1)
Dallas May raped me. Accusation is guilt. Toss 'em in the Gulag!
Re: Sharing Deep-Fake porn should be Sexual Assault (Score:3, Insightful)
You really need to get some perspective.
How is making a (technically sophisticated) fake even in the same league as attacking someone?
What about people who happen to look like famous people? Are they banned from making porn? Twins?
Get a grip. If you are so fragile that you can't live with the idea of someone using your image in ways you don't like, you clearly have few actual problems to complain about.
People near you are being attacked, starving and dying of preventable diseases. Your imaginary problems are not
Re: (Score:2)
Creating or sharing photo-realistic AI generated porn should be sexual assault.
And while we're at it, let's ban even imagining having sex with someone without his permission! /s
Re: (Score:1)
I'm not sure.
I agree with your basic premise, but if it's as easy and realistic as people claim, I suspect it will become so prevalent that it's barely even seen as a problem for the victims (maybe I'm naive and optimistic; I'm willing to accept that).
Today, I'm sure it would be horribly traumatizing, but in a decade I suspect it will culturally be along the lines of someone falsely claiming to have slept with you (bullying/harassment, not assault).
Our politicians already say outrageous things (Score:2, Insightful)
No need to fake it. Our citizenry expects it, because the United States is turning into a full-blown Idiocracy, and both the left and the right are being drawn into it, compelled by moronic (but good-sounding to one group of people) views.
P.S. The whole point of the GAN paradigm is that a generator and a fake-detecting discriminator are trained against each other: every improvement in the detector is immediately used to train the generator to produce more convincing fakes.
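That adversarial loop can be sketched in a few lines. This toy 1-D example (every parameter and name here is my own illustration, not from the thread) trains a linear generator against a logistic discriminator: the "fakes" are noise pushed through the generator, and the generator's update uses the detector's own gradient to fool it better:

```python
import numpy as np

# Toy 1-D GAN: generator G(z) = a*z + c tries to turn N(0,1) noise into
# samples resembling the "real" data N(3,1); discriminator
# D(x) = sigmoid(w*x + b) tries to tell real from generated.
rng = np.random.default_rng(0)
a, c = 1.0, 0.0                   # generator parameters
w, b = 0.1, 0.0                   # discriminator parameters
lr, steps, batch = 0.05, 2000, 64

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for _ in range(steps):
    real = rng.normal(3.0, 1.0, batch)   # "authentic" samples
    z = rng.normal(0.0, 1.0, batch)      # noise for the generator
    fake = a * z + c

    # Discriminator step: logistic regression, label real=1, fake=0.
    d_real, d_fake = sigmoid(w * real + b), sigmoid(w * fake + b)
    w -= lr * (np.mean((d_real - 1) * real) + np.mean(d_fake * fake))
    b -= lr * (np.mean(d_real - 1) + np.mean(d_fake))

    # Generator step: descend -log D(fake), i.e. exploit the improved
    # detector to make the fakes harder to catch.
    d_fake = sigmoid(w * (a * z + c) + b)
    g_grad = -(1 - d_fake) * w           # d(-log D)/d(fake)
    a -= lr * np.mean(g_grad * z)
    c -= lr * np.mean(g_grad)

samples = a * rng.normal(0.0, 1.0, 10000) + c
print(round(float(samples.mean()), 2))   # drifts toward the real mean of 3
```

Real deepfake GANs use deep convolutional networks on images rather than two scalar-parameter models, but the training dynamic is the same.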
Encrypted Authenticity Verification Networks (Score:1)
The thought occurs that an inevitable explosion of fake video and audio recordings will drive the development of encrypted authentication networks that verify that a supposed recording came from a sealed, supposedly tamper-proof recording device from a manufacturer whose production lines, parts suppliers, and design teams are closely monitored by government agencies and nonprofit organizations against the possibility of firmware tampering. Recordings produced by unvetted devices will be automatically assumed by courts and other interested parties to be inherently unreliable and very likely fake in all cases of controversy.
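A hypothetical sketch of the device-side attestation being described: each frame extends a hash chain, and the chain head is tagged with a factory-provisioned device key. Everything here (the key, function names, frame format) is my own illustration; a real scheme would use asymmetric keys in secure hardware so verifiers never hold the signing secret, whereas this sketch uses stdlib HMAC for brevity:

```python
import hashlib
import hmac

# Illustrative only: a real device would keep an asymmetric key in a
# secure element and publish its public half via the manufacturer.
DEVICE_KEY = b"factory-provisioned-secret"

def chain_frames(frames):
    # Hash-chain the frames so no frame can be altered, dropped, or
    # reordered without changing the chain head.
    h = b"\x00" * 32
    for frame in frames:
        h = hashlib.sha256(h + frame).digest()
    return h

def attest(frames):
    # Tag the chain head with the device key.
    return hmac.new(DEVICE_KEY, chain_frames(frames), hashlib.sha256).hexdigest()

def verify(frames, tag):
    return hmac.compare_digest(attest(frames), tag)

clip = [b"frame-1", b"frame-2", b"frame-3"]
tag = attest(clip)
print(verify(clip, tag))                                # True
print(verify([b"frame-1", b"FAKED", b"frame-3"], tag))  # False
```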
Re: (Score:2)
resistant hypothesized:
The thought occurs that an inevitable explosion of fake video and audio recordings will drive the development of encrypted authentication networks that verify that a supposed recording came from a sealed, supposedly tamper-proof recording device from a manufacturer whose production lines, parts suppliers, and design teams are closely monitored by government agencies and nonprofit organizations against the possibility of firmware tampering. Recordings produced by unvetted devices will be automatically assumed by courts and other interested parties to be inherently unreliable and very likely fake in all cases of controversy.
I don't think you have a lot of experience with how courts work in actual practice - or legislatures, either.
Judges are basically free to accept or reject evidence according to their own rules. In the USA, for instance, some of them still admit latent fingerprint testimony, despite the fact that an AAAS panel of expert forensic scientists has completely debunked [amazonaws.com] the science behind it. Another such AAAS panel also determined that much of the "science" behind forensic arson analysis is [amazonaws.com]
Carrie Fisher (Score:5, Funny)
Supports deep fakes.
She said so this morning at a Disney press conference.
The solution to deepfakes is... more deepfakes! (Score:1)
Picture this: A few days before the election, a new video has "Candidate A" speaking at a satanist convention, disparaging the flag, mom, and apple pie. Totally fake, but how can Candidate A fight back?
Simply put out a new ad, with fake footage of Candidate B saying _exactly_ the same speech... then JFK, then Nixon, then ($pop_idol_of_the_week). Tagline is "I'm Candidate A, and I can make stuff up too."
A sad state of affairs, but if I ran a political party, I'd crank up a rendering farm for this type of eme
Re: (Score:3)
Picture this: A few days before the election, a new video has "Candidate A" speaking at a satanist convention, disparaging the flag, mom, and apple pie. Totally fake, but how can Candidate A fight back?
Too obviously and explicitly fake. The best way to do it is something small and subtle, enough to simply cause confusion that, if released a week or so before the election, is fresh enough in everyone's mind to sway independents and undecideds to your preferred candidate. Maybe the candidate doing (or talking about doing) drugs, or domestic violence, sexual assault, taking bribes, etc. Something that isn't huge but would take a while to investigate and repudiate. No amount of fake video will sway partisans on
Re: (Score:1)
That doesn't solve the problem, because it just creates a deeply cynical electorate who doesn't believe any facts.
Because what if Candidate A really was speaking at a satanist convention, and just got caught, and then covers his tracks by making a fake video of Candidate B saying exactly the same speech? Now everyone thinks both were fakes, even though one side is telling the truth, and people elect a satanist who disparages the flag, mom, and apple pie.
The truth is, we're already there. We don't need c
When our Cells can do it... (Score:1)
Perhaps a better definition of the singularity.
On Ashcroft's Wings... (Score:2)
cryptographic signatures (Score:2)
Privacy achieved via deepfakes? (Score:2)
Society went from no video to video surveillance everywhere. Are we now coming full circle, where video carries no credibility at all -- the equivalent of not having any video surveillance in the first place? Revenge porn? No problem, if nobody can prove whether it's real.