AI Education

Explicit Deepfake Scandal Shuts Down Pennsylvania School (arstechnica.com)

An anonymous reader quotes a report from Ars Technica: An AI-generated nude photo scandal has shut down a Pennsylvania private school. On Monday, classes were canceled after parents forced leaders to either resign or face a lawsuit potentially seeking criminal penalties and accusing the school of skipping mandatory reporting of the harmful images. The outcry erupted after a single student created sexually explicit AI images of nearly 50 female classmates at Lancaster Country Day School, Lancaster Online reported. Head of School Matt Micciche seemingly first learned of the problem in November 2023, when a student anonymously reported the explicit deepfakes through a school portal run by the state attorney general's office called "Safe2Say Something." But Micciche allegedly did nothing, allowing more students to be targeted for months until police were tipped off in mid-2024.

Cops arrested the student accused of creating the harmful content in August. The student's phone was seized as cops investigated the origins of the AI-generated images. But that arrest was not enough justice for parents who were shocked by the school's failure to uphold mandatory reporting responsibilities following any suspicion of child abuse. They filed a court summons threatening to sue last week unless the school leaders responsible for the mishandled response resigned within 48 hours. This tactic successfully pushed Micciche and the school board's president, Angela Ang-Alhadeff, to "part ways" with the school, both resigning effective late Friday, Lancaster Online reported.

In a statement announcing that classes were canceled Monday, Lancaster Country Day School -- which, according to Wikipedia, serves about 600 students in pre-kindergarten through high school -- offered support during this "difficult time" for the community. Parents do not seem ready to drop the suit, as the school leaders seemingly dragged their feet and resigned two days after their deadline. The parents' lawyer, Matthew Faranda-Diedrich, told Lancaster Online Monday that "the lawsuit would still be pursued despite executive changes." Classes are planned to resume on Tuesday, Lancaster Online reported. But students seem unlikely to let the incident go without further action to help girls feel safe at school. Last week, more than half the school walked out, MSN reported, forcing classes to be canceled as students and some faculty members called for resignations and additional changes from remaining leadership.

Comments Filter:
  • And now this person is going to end up on the registry. Goodbye future and hello ankle monitor whenever he or she gets out of lockup.
    • Re: (Score:3, Interesting)

      by jythie ( 914043 )
      Which is probably why administrators were reluctant to report anything. Given the area, they tend to see reporting laws as something for getting brown kids into the prison system, NOT something to 'ruin the lives' of little white boys. If it had been a dark skinned kid making deepfakes of white classmates, he would probably be in juvenile detention.
      • Which is probably why administrators were reluctant to report anything. Given the area, they tend to see reporting laws as something for getting brown kids into the prison system, NOT something to 'ruin the lives' of little white boys. If it had been a dark skinned kid making deepfakes of white classmates, he would probably be in juvenile detention.

        Witness the rapist Brock Turner and his dad [stanforddaily.com] who, at sentencing, said the rapist shouldn't have his future ruined for 20 minutes of action:

        His life will never be the one that he dreamed about and worked so hard to achieve. That is a steep price to pay for 20 minutes of action out of his 20 plus years of life.

    • And now this person is going to end up on the registry.

I don't know Pennsylvania's laws, but many US states have special laws for young (typically under 18 or under 21) first-time sex offenders so they aren't automatically put on the sex-offender registry, and which allow them to be removed much earlier if they are required to register.

    • by rsilvergun ( 571051 ) on Monday November 18, 2024 @05:13PM (#64955569)
Kids do not always understand the consequences of their actions or the gravity of them relative to our society. It sounds like a big deal that they did 50 of them, but they could literally take their yearbook, take a picture of it with their iPhone, and then run the picture through any one of several programs to generate the nudes en masse.

      It's one of those things where at a certain point when we as a civilization have made it so easy to commit a crime I have a hard time wanting to bring the full force of law down on somebody.

      Also if our country and our society didn't have such a fucked up opinion of nudity this wouldn't be an issue. The reason this is so damaging is because the assumption is that if you can find nude pictures of a girl then that girl is of low moral character and should be ostracized.
      • Re: (Score:3, Insightful)

        by markdavis ( 642305 )

        >"Kids do not always understand the consequences of their actions or the gravity of them relative to our society"

Indeed. They don't understand a LOT of stuff. Which is why children shouldn't be carrying around phones/tablets/whatever with full internet access. There is an entire world of insanity out there, and allowing kids access to it however, whenever, and wherever is incredibly irresponsible.

        Maybe I am missing something, I read the summary and articles, and there is nothing about what HAPPENED with the ima

        • It just would have taken a little more effort. He could scan in yearbook pictures. Once he's got them on a computer the sky's the limit.

Unless you are going to watch absolutely everything kids do 24/7, then they're going to get into some shit. I think there are probably better solutions than a full-on Orwellian police state for anyone under 18...

          And again none of this would be an issue if we didn't treat women who have pictures of themselves naked as horrible people that need to be punished.
          • >"Unless you are going to watch absolutely everything kids do 24/7"

            They won't have constant unrestricted access to the internet, unless someone GIVES it to them. Usually in the form of an unrestricted phone, tablet, or computer with no supervision. If it was the norm that kids shouldn't have such devices (or that they are special devices only, designed for kids, with appropriate whitelists only), then it wouldn't be a big deal. Instead, I see kids walking around all over with unrestricted devices. It

    • by ihavesaxwithcollies ( 10441708 ) on Monday November 18, 2024 @05:55PM (#64955713)

      And now this person is going to end up on the registry. Goodbye future and hello ankle monitor whenever he or she gets out of lockup.

      Don't forget to say hello to Matt Gaetz. This person will be able to be Attorney General someday.
      A GOP DEI hire includes child molesters.

    • He could grow up to head the DOJ one day. https://www.pbs.org/newshour/a... [pbs.org]

  • Welcome to the new world of vouchers.
  • Where's the abuse? (Score:4, Insightful)

    by Murdoch5 ( 1563847 ) on Monday November 18, 2024 @05:04PM (#64955525) Homepage
If someone generated AI “deep fakes”, then what abuse took place? While those images are generally distasteful, rude, offensive, and invasive, if they're generated images, then they can't rise to the level of abuse, since the majority of the content is fake, and therefore protected under freedom of expression / speech. This idea has been fought before; it's why people can publish child sexual novels, and why organizations like NAMBLA can exist.

Regardless of how anyone feels about the content (I know, as a father of two teenage daughters, I would want the death penalty), the right to free speech and free expression must be upheld. What is the difference between a generated picture of a naked teenager vs. Romeo and Juliet? Would those same parents demand Romeo and Juliet get removed, and people resign?

    Just so there's absolutely no confusion, I'm only coming at this from a free speech / freedom of expression context. I do not defend generating sexual fakes of others.
    • by abEeyore ( 8687599 ) on Monday November 18, 2024 @05:20PM (#64955589)
There are, indeed, laws against generating content depicting the sexual exploitation of minors, even if it is entirely fictitious - and this is not entirely fictitious because they are based on real people. I understand that you feel this skirts close to the edge of so-called "thought crime", but in this case, the violation is not simply in creating the pictures, but in then distributing them.

      The creator's rights - whatever you may imagine them to be - end where the rights of the girls in question begin - and even the most strident libertarian has to agree that the girls - as minors, if for no other reason - have a right to NOT be publicly exploited in such a way.
      • They weren't exploited because the images were generated. Imagine generating a James Bond novel, that had themes and elements from old novels, but was generated enough as to not breach copyrights or trademarks. If you distribute the novel, all you've really done is distributed a ripoff, contrast the Hardy Boys with Nancy Drew, they're effectively the same concept, but different enough as to be unique.

        I'm not going to defend distributing the images because I think it's gross AF, but fundamentally, if th
        • by vux984 ( 928602 )

          But they weren't *entirely generated* because they depicted real recognizable people.

          You might find this interesting...

          https://www.owe.com/resources/... [owe.com]

Any one of the criteria noted can elevate the use of someone's likeness to illegal; this case easily satisfies multiple criteria.

And the fact that it's porn and involves minors makes the arguments even more compelling.

    • by hey! ( 33014 ) on Monday November 18, 2024 @05:33PM (#64955631) Homepage Journal

      if they're generated images, then they can't rise to the level of abuse, since the majority of the content is fake

They are invasions of privacy. While we usually think of privacy as protecting disclosures of sensitive information, that's only one kind of privacy intrusion. The legal and ethical issues of privacy are actually considerably broader. They have to do with a whole host of issues relating to your right to personal autonomy.

For example, if you are in a public place, people are free to observe you, or even in most jurisdictions to film you. But if someone follows you around observing you to the point that it would interfere with a reasonable person's ability to conduct their lives, it's well established in US law [nytimes.com] that that's a privacy intrusion, even though it doesn't involve disclosing privileged information.

      Your neighbor shining a spotlight into your bedroom at night is a privacy intrusion, even if he doesn't *look* into your bedroom, because it interferes with your ability to choose when you sleep.

      Privacy intrusions are actions that interfere with personal liberties. While none of us are entitled to what we would regard as a good reputation with other people, we do have a reasonable expectation of that impression being ours to establish. Suppose your neighbor put up a website in which he published *fiction* in which he depicted you as a child sex abuser, and then he posted QR codes to this all around the neighborhood. You would feel wronged even though he disclaimed the stories as fiction, and you would be right to feel wronged *even if the stories were fiction*, because every time you met a neighbor their reaction to you is colored by that fictional story.

      That's what these sexually explicit deepfakes do: they take away the power of people who are *not* public figures to control their public image. When these girls meet people at school who have seen these deepfakes, those deepfakes will dominate the reaction people have of them in a way that can't be undone, short of moving away. If they're released onto the Internet they may never be able to outrun them.

This is absolutely a privacy violation, and a terrible privacy violation, without argument. Would I be okay with that website existing? No. But would I allow it? Depending on the content, providing it wasn't making factual statements about me, I probably would.

        I say this as a person whose nickname for ~4 years, during high school, was “child porn” because several jocks spread a rumour that I was into it. I just ignored it, and after ~4 years, it died because everyone realized it was clear
        • by hey! ( 33014 )

          It's conceivable to organize society to deal with balancing freedom of expression and privacy in many different ways, but the way *our* society is organized, in general there are prohibitions against prior restraint by the government, but the government absolutely can punish speech which harms other people. This gets complicated with lots of corner cases and things that don't work out quite the way we want as opposed to "absolutist" free speech where nothing you say ever has consequences, but that has its

          • I agree, and to make sure this is clear, my point is that the generated nature of the image is what should be protected. If this person uploaded naked pictures of the people without substantive modification, that without argument, is abuse.
            • by hey! ( 33014 )

              Right. If you created the deepfakes for your own exclusive use, and took reasonable precautions to avoid them falling into other peoples' hands, it'd be icky, but it wouldn't be a privacy intrusion.

              But we're talking about a situation where distribution was the whole point. That distribution is so damaging that it literally can't be repaired.

      • It's also a violation of personal copyright and defamation. And harassment.

Aren't most nude deep fakes just photoshopping someone's head onto someone else's nude body? So isn't the original nude photo considered child porn? The "AI" part is just automating it.

      • That's a good question, I honestly don't know, but if the images are all real and reorganized, that's a different issue!
If someone generated AI “deep fakes”, then what abuse took place? While those images are generally distasteful, rude, offensive, and invasive, if they're generated images, then they can't rise to the level of abuse, since the majority of the content is fake, and therefore protected under freedom of expression / speech.

      Today I learned Michelangelo & Leonardo da Vinci are criminals because they created images of people without clothes on.

      • That's actually a great example, would the parents demand any reference to those works be removed?
      • by PPH ( 736903 )

        because they created images of people without clothes on

        I suppose if the people Michelangelo and da Vinci depicted file a complaint, then yes. They are criminals.

    • This all happened a year ago already. Why shut the school down now?
    • by PPH ( 736903 )

      then they can't rise to the level of abuse

They can, if they depict a particular individual in a manner so as to make them identifiable. The abuse is subjecting that person to a kind of public exposure which they did not consent to (or cannot, being underage).

      it's why people can publish child sexual novels

      Are the characters in these novels identifiable actual people? Or invented personas?

  • Not I.
    Everything that can be used to create porn will be used to create porn.
    Even if something is only hinting at nudity there are those that will find it offensive and call it porn. *
Hormones do lower intelligence in all teenagers.
    And these large artificial neural models that are open for all to use lower the bar substantially for a horny and/or bullied teenager to do this.

    This might not have been done out of spite or with ill intent. "Don't attribute to..."

    *generally speaking of what "AI" models can prod

  • by Anonymous Coward

    I can see little Billy making a bunch of deepfakes, and being smart enough to use an encrypted virtual machine, VPN, or even use a bogus credit card to use a VM service to make the deepfakes. From there, find a way to throw them on the computer of someone he is bullying, either by copying pictures while the computer is unattended or just making a USB drive with the target's name and address on it, throwing it in their backpack, then calling the principal that they were looking at pr0n on their computer.

    Bam

  • by geekmux ( 1040042 ) on Monday November 18, 2024 @06:31PM (#64955813)

    A person generating hyperrealistic nude images without consent using the eeeevil “AI” to do it? Society says burn them at the stake.

Now tell me what happens when society stumbles across an aspiring artist. One who creates hyperrealistic images using that 300-year-old John Wick technology. Also known as a fucking pencil. What happens to said artist when you find dozens of hyperrealistic nude images, created without consent? Is that still a deviant who deserves to be destroyed, or is that merely an aspiring artist?

    (The father in me? Sure. Especially if it were my kid. Question still stands. Because society created it.)

    • by HBI ( 10338492 )

      I had a girlfriend back when I was a teenager who was an excellent artist. She ended up making a living doing airbrushed motorcycle helmets and such, t-shirts, whatever. But pencil drawings were really her forte. She made me look like some kind of Adonis, which I wasn't and am not.

      The naked self-portraits she drew in a mirror were something to behold. Much better than any photographic porn you could imagine. She had a beautiful form, even taking into account the love I still feel for her (she died of c

  • ... and "revenge porn" at the same time: Task a governmental office to create "explicit deepfake" images of every single citizen, make them available for download on a public server (with the image files having random names, and being replaced with new ones once in a while). What possible "damage" could the existence of more such "explicit images" do, then? Today, the "damages" claimed are of the "irrational fear" kind, that somebody seeing such image could "think bad" of the person the picture claims to de
  • But Micciche allegedly did nothing

    I'll venture a guess that he was doing .... something.
