AI Education

Explicit Deepfake Scandal Shuts Down Pennsylvania School (arstechnica.com) 48

An anonymous reader quotes a report from Ars Technica: An AI-generated nude photo scandal has shut down a Pennsylvania private school. On Monday, classes were canceled after parents forced leaders to either resign or face a lawsuit potentially seeking criminal penalties and accusing the school of skipping mandatory reporting of the harmful images. The outcry erupted after a single student created sexually explicit AI images of nearly 50 female classmates at Lancaster Country Day School, Lancaster Online reported. Head of School Matt Micciche seemingly first learned of the problem in November 2023, when a student anonymously reported the explicit deepfakes through "Safe2Say Something," a school reporting portal run by the state attorney general's office. But Micciche allegedly did nothing, allowing more students to be targeted for months until police were tipped off in mid-2024.

Cops arrested the student accused of creating the harmful content in August. The student's phone was seized as cops investigated the origins of the AI-generated images. But that arrest was not enough justice for parents who were shocked by the school's failure to uphold mandatory reporting responsibilities following any suspicion of child abuse. They filed a court summons threatening to sue last week unless the school leaders responsible for the mishandled response resigned within 48 hours. This tactic successfully pushed Micciche and the school board's president, Angela Ang-Alhadeff, to "part ways" with the school, both resigning effective late Friday, Lancaster Online reported.

In a statement announcing that classes were canceled Monday, Lancaster Country Day School -- which, according to Wikipedia, serves about 600 students in pre-kindergarten through high school -- offered support during this "difficult time" for the community. Parents do not seem ready to drop the suit, as the school leaders seemingly dragged their feet and resigned two days after their deadline. The parents' lawyer, Matthew Faranda-Diedrich, told Lancaster Online Monday that "the lawsuit would still be pursued despite executive changes." Classes are planned to resume on Tuesday, Lancaster Online reported. But students seem unlikely to let the incident go without further action to help girls feel safe at school. Last week, more than half the school walked out, MSN reported, forcing classes to be canceled as students and some faculty members called for resignations and additional changes from remaining leadership.


Comments Filter:
  • by Malay2bowman ( 10422660 ) on Monday November 18, 2024 @04:49PM (#64955445)
    And now this person is going to end up on the registry. Goodbye future and hello ankle monitor whenever he or she gets out of lockup.
    • by jythie ( 914043 )
      Which is probably why administrators were reluctant to report anything. Given the area, they tend to see reporting laws as something for getting brown kids into the prison system, NOT something to 'ruin the lives' of little white boys. If it had been a dark skinned kid making deepfakes of white classmates, he would probably be in juvenile detention.
      • Which is probably why administrators were reluctant to report anything. Given the area, they tend to see reporting laws as something for getting brown kids into the prison system, NOT something to 'ruin the lives' of little white boys. If it had been a dark skinned kid making deepfakes of white classmates, he would probably be in juvenile detention.

        Witness the rapist Brock Turner and his dad [stanforddaily.com] who, at sentencing, said the rapist shouldn't have his future ruined for 20 minutes of action:

        His life will never be the one that he dreamed about and worked so hard to achieve. That is a steep price to pay for 20 minutes of action out of his 20 plus years of life.

    • And now this person is going to end up on the registry.

      I don't know Pennsylvania's laws, but many US states have special laws for young (typically under 18 or under 21) first-time sex offenders, so they aren't automatically put on the sex-offender registry, and which allow them to be removed much earlier if they are required to register.

    • Kids do not always understand the consequences of their actions or the gravity of them relative to our society. It sounds like a big deal that they did 50 of them, but they could literally take their yearbook, take a picture of it with their iPhone, and then run the picture through any one of several programs to generate the nudes en masse.

      It's one of those things where at a certain point when we as a civilization have made it so easy to commit a crime I have a hard time wanting to bring the full force of
      • Also if our country and our society didn't have such a fucked up opinion of nudity this wouldn't be an issue. The reason this is so damaging is because the assumption is that if you can find nude pictures of a girl then that girl is of low moral character and should be ostracized.

        Civilized countries have always had restrictions on nudity.

      • >"Kids do not always understand the consequences of their actions or the gravity of them relative to our society"

        Indeed. They don't understand a LOT of stuff. Which is why children shouldn't be carrying around phones/tablets/whatever with full internet access. There is an entire world of insanity out there, and allowing kids access to it however, whenever, and wherever is incredibly irresponsible.

        Maybe I am missing something; I read the summary and articles, and there is nothing about what HAPPENED with the ima

    • And now this person is going to end up on the registry. Goodbye future and hello ankle monitor whenever he or she gets out of lockup.

      Don't forget to say hello to Matt Gaetz. This person could be Attorney General someday.
      A GOP DEI hire includes child molesters.

    • He could grow up to head the DOJ one day. https://www.pbs.org/newshour/a... [pbs.org]

  • Welcome to the new world of vouchers.
  • Where's the abuse? (Score:4, Insightful)

    by Murdoch5 ( 1563847 ) on Monday November 18, 2024 @05:04PM (#64955525) Homepage
    If someone generated AI “deep fakes”, then what abuse took place? While those images are generally distasteful, rude, offensive, and invasive, if they're generated images, then they can't rise to the level of abuse, since the majority of the content is fake, and therefore protected under freedom of expression / speech. This idea has been fought before; it's why people can publish child sexual novels, and why organizations like NAMBLA can exist.

    Regardless of how anyone feels about the content (I know that, as a father of two teenage daughters, I would want the death penalty), the right to free speech and free expression must be upheld. What is the difference between a generated picture of a naked teenager and Romeo and Juliet? Would those same parents demand Romeo and Juliet be removed, and people resign?

    Just so there's absolutely no confusion, I'm only coming at this from a free speech / freedom of expression context. I do not defend generating sexual fakes of others.
    • Re: (Score:3, Insightful)

      by abEeyore ( 8687599 )
      There are, indeed, laws against generating content depicting the sexual exploitation of minors, even if it is entirely fictitious - and this is not entirely fictitious, because the images are based on real people. I understand that you feel this skirts close to the edge of so-called "thought crime", but in this case, the violation is not simply in creating the pictures, but in then distributing them.

      The creator's rights - whatever you may imagine them to be - end where the rights of the girls in question begin
      • They weren't exploited, because the images were generated. Imagine generating a James Bond novel that had themes and elements from old novels, but was different enough as to not breach copyrights or trademarks. If you distribute the novel, all you've really done is distribute a ripoff. Contrast the Hardy Boys with Nancy Drew: they're effectively the same concept, but different enough to be unique.

        I'm not going to defend distributing the images because I think it's gross AF, but fundamentally, if th
    • by hey! ( 33014 )

      if they're generated images, then they can't rise to the level of abuse, since the majority of the content is fake

      They are invasions of privacy. While we usually think of privacy as protecting against disclosures of sensitive information, that's just one kind of privacy intrusion. The legal and ethical issues of privacy are actually considerably broader; they have to do with a whole host of issues relating to your right to personal autonomy.

      For example, if you are in a public place people are free to observe you, or even in most jurisdictions to film you. But if someone follows you around observing you to the point that

      • This is absolutely a privacy violation, and a terrible privacy violation, without argument. Would I be okay with that website existing? No. But would I allow it? Depending on the content, provided it wasn't making factual statements about me, I probably would.

        I say this as a person whose nickname for ~4 years, during high school, was “child porn” because several jocks spread a rumour that I was into it. I just ignored it, and after ~4 years, it died because everyone realized it was clear
        • by hey! ( 33014 )

          It's conceivable to organize society to balance freedom of expression and privacy in many different ways, but the way *our* society is organized, there are in general prohibitions against prior restraint by the government, yet the government absolutely can punish speech which harms other people. This gets complicated, with lots of corner cases and things that don't work out quite the way we want, as opposed to "absolutist" free speech where nothing you say ever has consequences, but that has its

    • Aren't most nude deep fakes just photoshopping someone's head onto someone else's nude body? So isn't the original nude photo considered child porn? The "AI" part is just automating it.

      • That's a good question, I honestly don't know, but if the images are all real and reorganized, that's a different issue!
    • If someone generated AI “deep fakes”, then what abuse took place? While those images are generally distasteful, rude, offensive, and invasive, if they're generated images, then they can't rise to the level of abuse, since the majority of the content is fake, and therefore protected under freedom of expression / speech.

      Today I learned Michelangelo & Leonardo da Vinci are criminals because they created images of people without clothes on.

      • That's actually a great example, would the parents demand any reference to those works be removed?
  • Not I.
    Everything that can be used to create porn will be used to create porn.
    Even if something is only hinting at nudity there are those that will find it offensive and call it porn. *
    Hormones do lower intelligence in all teenagers.
    And these large artificial neural models that are open for all to use lower the bar substantially for a horny and/or bullied teenager to do this.

    This might not have been done out of spite or with ill intent. "Don't attribute to..."

    *generally speaking of what "AI" models can prod

  • by Anonymous Coward

    I can see little Billy making a bunch of deepfakes, and being smart enough to use an encrypted virtual machine, a VPN, or even a bogus credit card with a VM service to make the deepfakes. From there, find a way to plant them on the computer of someone he is bullying, either by copying pictures while the computer is unattended or just making a USB drive with the target's name and address on it, throwing it in their backpack, then telling the principal that they were looking at pr0n on their computer.

    Bam

"I've seen it. It's rubbish." -- Marvin the Paranoid Android

Working...