AI Education

Explicit Deepfake Scandal Shuts Down Pennsylvania School (arstechnica.com) 138

An anonymous reader quotes a report from Ars Technica: An AI-generated nude photo scandal has shut down a Pennsylvania private school. On Monday, classes were canceled after parents forced leaders to either resign or face a lawsuit potentially seeking criminal penalties and accusing the school of skipping mandatory reporting of the harmful images. The outcry erupted after a single student created sexually explicit AI images of nearly 50 female classmates at Lancaster Country Day School, Lancaster Online reported. Head of School Matt Micciche seemingly first learned of the problem in November 2023, when a student anonymously reported the explicit deepfakes through a school portal run by the state attorney general's office called "Safe2Say Something." But Micciche allegedly did nothing, allowing more students to be targeted for months until police were tipped off in mid-2024.

Cops arrested the student accused of creating the harmful content in August. The student's phone was seized as cops investigated the origins of the AI-generated images. But that arrest was not enough justice for parents who were shocked by the school's failure to uphold mandatory reporting responsibilities following any suspicion of child abuse. They filed a court summons threatening to sue last week unless the school leaders responsible for the mishandled response resigned within 48 hours. This tactic successfully pushed Micciche and the school board's president, Angela Ang-Alhadeff, to "part ways" with the school, both resigning effective late Friday, Lancaster Online reported.

In a statement announcing that classes were canceled Monday, Lancaster Country Day School -- which, according to Wikipedia, serves about 600 students in pre-kindergarten through high school -- offered support during this "difficult time" for the community. Parents do not seem ready to drop the suit, as the school leaders seemingly dragged their feet and resigned two days after their deadline. The parents' lawyer, Matthew Faranda-Diedrich, told Lancaster Online Monday that "the lawsuit would still be pursued despite executive changes." Classes are planned to resume on Tuesday, Lancaster Online reported. But students seem unlikely to let the incident go without further action to help girls feel safe at school. Last week, more than half the school walked out, MSN reported, forcing classes to be canceled as students and some faculty members called for resignations and additional changes from remaining leadership.

Comments Filter:
  • by Malay2bowman ( 10422660 ) on Monday November 18, 2024 @04:49PM (#64955445)
    And now this person is going to end up on the registry. Goodbye future and hello ankle monitor whenever he or she gets out of lockup.
    • Re: (Score:2, Interesting)

      by jythie ( 914043 )
      Which is probably why administrators were reluctant to report anything. Given the area, they tend to see reporting laws as something for getting brown kids into the prison system, NOT something to 'ruin the lives' of little white boys. If it had been a dark skinned kid making deepfakes of white classmates, he would probably be in juvenile detention.
      • Re: (Score:3, Interesting)

        by quonset ( 4839537 )

        Which is probably why administrators were reluctant to report anything. Given the area, they tend to see reporting laws as something for getting brown kids into the prison system, NOT something to 'ruin the lives' of little white boys. If it had been a dark skinned kid making deepfakes of white classmates, he would probably be in juvenile detention.

        Witness the rapist Brock Turner and his dad [stanforddaily.com] who, at sentencing, said the rapist shouldn't have his future ruined for 20 minutes of action:

        His life will never be the one that he dreamed about and worked so hard to achieve. That is a steep price to pay for 20 minutes of action out of his 20 plus years of life.

    • And now this person is going to end up on the registry.

      I don't know Pennsylvania's laws, but many US states have special laws for young (typically under-18 or under-21) first-time sex offenders, so they aren't automatically put on the sex-offender registry, and which allow them to be removed much earlier if they are required to register.

      • The reason being that a large proportion of "teenage sex offenders" are just horny teens where (typically) the girl's parents don't approve of the boy, or some superstition says they shouldn't be doing it before they're married, or whatever. This also severely dilutes the effectiveness, if any, of Megan's Law, "we were both 15 and our parents caught us, that's the only reason I'm even on the register".
    • by rsilvergun ( 571051 ) on Monday November 18, 2024 @05:13PM (#64955569)
      Kids do not always understand the consequences of their actions or the gravity of them relative to our society. It sounds like a big deal that they did 50 of them, but they could literally take their yearbook, take a picture of it with their iPhone, and then run the picture through any one of several programs to generate the nudes en masse.

      It's one of those things where, at a certain point, when we as a civilization have made it so easy to commit a crime, I have a hard time wanting to bring the full force of the law down on somebody.

      Also if our country and our society didn't have such a fucked up opinion of nudity this wouldn't be an issue. The reason this is so damaging is because the assumption is that if you can find nude pictures of a girl then that girl is of low moral character and should be ostracized.
      • Re: (Score:3, Insightful)

        by markdavis ( 642305 )

        >"Kids do not always understand the consequences of their actions or the gravity of them relative to our society"

        Indeed. They don't understand a LOT of stuff. Which is why children shouldn't be carrying around phones/tablets/whatever with full internet access. There is an entire world of insanity out there, and allowing kids access to it however, whenever, and wherever is incredibly irresponsible.

        Maybe I am missing something, I read the summary and articles, and there is nothing about what HAPPENED with the ima

        • It just would have taken a little more effort. He could scan in yearbook pictures. Once he's got them on a computer the sky's the limit.

          Unless you are going to watch absolutely everything kids do 24/7, they're going to get into some shit. I think there are probably better solutions than a full-on Orwellian police state for anyone under 18...

          And again none of this would be an issue if we didn't treat women who have pictures of themselves naked as horrible people that need to be punished.
          • >"Unless you are going to watch absolutely everything kids do 24/7"

            They won't have constant unrestricted access to the internet, unless someone GIVES it to them. Usually in the form of an unrestricted phone, tablet, or computer with no supervision. If it was the norm that kids shouldn't have such devices (or that they are special devices only, designed for kids, with appropriate whitelists only), then it wouldn't be a big deal. Instead, I see kids walking around all over with unrestricted devices. It

            • You're willing to subject kids to living under a police state, and hell, I wouldn't be surprised if you're willing to subject adults to that too, rather than have a serious discussion about outmoded gender stereotypes built around a time when women were property and their chastity was part of their value.

              You're completely bypassing my main point, which is that the only reason this crime is so serious is that it devalues the women on a fundamental level. It does that because having a couple of fake pictures o
            • They won't have constant unrestricted access to the internet, unless someone GIVES it to them.

              Or, you know, they mow a few lawns, buy a $50 smartphone from Walmart, and hop onto the neighbor's unsecure wi-fi. It's not that hard to get online these days, and kids face a lot of social pressure to do so.

              • Make internet devices, like tons of other stuff, not sellable to minors. Again, if it was the norm that kids don't have such devices, there would be no social pressure.

                • Prohibition doesn't seem like a realistic solution, if the shockingly high rates of teenage vaping are anything to go by.
              • "Or, you know, they mow a few lawns,"

                How quaint. It's more like "Here's a smartphone, Little Chuckums. You won't be seen as a weirdo by your peers anymore. Now shut up and leave us alone."

        • by AmiMoJo ( 196126 )

          Maybe kids should get relationship and sex education from a young age, so they do understand this stuff.

          I don't mean teaching 6 year olds the mechanics of sex, I mean teach them about things like consent and respect for each others privacy.

          • Maybe, but that doesn't address stalking, grooming, violent text and video, bullying, and tons of other stuff that is all over the internet and that such communication enables.

            Kids should ideally have locked-down devices that operate on a whitelist and allow only non-group communication to/from parent-approved contacts.

      • Teenagers are dumb, but they aren't *that* dumb. This guy absolutely knew that he would be causing a lot of pain for a lot of people. If he didn't, at the very least he should be in therapy.

    • Re: (Score:3, Insightful)

      And now this person is going to end up on the registry. Goodbye future and hello ankle monitor whenever he or she gets out of lockup.

      Don't forget to say hello to Matt Gaetz. This person will be able to be Attorney General someday.
      A GOP DEI hire includes child molesters.

      • That was my reaction to the (so-far) list of appointments, "some of those guys aren't even sex offenders".
      • Allegations do not mean someone is guilty.

        If no charges are brought and the law not involved....I don't think you really have a case.

        Innocent until proven guilty? Is that not a thing with you anymore?

        • What about Trump? All his charges are getting delayed or dropped because he is president; does that mean I will not be able to call him what he is, a felon, cheat, and criminal?
          • What about Trump? All his charges are getting delayed or dropped because he is president; does that mean I will not be able to call him what he is, a felon, cheat, and criminal?

            Well, those 30-something "felonies" in NYC were looking to be vacated anyway; the appeals court had signaled that the prosecutors were almost malfeasant in their actions.... they were actually close to begging just to not have their licenses revoked.

            It appears that most everything was going to be thrown out even if he didn't become pr

            • It appears that most everything was going to be thrown out even if he didn't become president

              You see, you are wrong. You are also a fucking no good liar.

    • He could grow up to head the DOJ one day. https://www.pbs.org/newshour/a... [pbs.org]

    • I'm not so willing to give this guy a free pass because "he's just a kid." Teenagers are very much smart enough to know that their actions can hurt people, and hurt them severely. Too many parents have bought into the crap that "oh my little 17-year-old is so sweet and innocent, he could never do such a horrible thing!" Yeah, yeah they can.

      • I'm not so willing to give this guy a free pass because "he's just a kid." Teenagers are very much smart enough to know that their actions can hurt people, and hurt them severely. Too many parents have bought into the crap that "oh my little 17-year-old is so sweet and innocent, he could never do such a horrible thing!" Yeah, yeah they can.

        No one knows that much about the perpetrator in this case.

        But in general, I suspect a lot of problematic behaviour could have been avoided if we just gave people a wake up call earlier. Recall how Joss Whedon (who still has piles of money) basically lost his career after all the tales of abusive behaviour and sexual exploitation came out. He was actually interviewed about it a while later and it's somewhat enlightening [vulture.com]. Assuming you take him at face value those bad behaviours were in him, but didn't really

        • The people wanting to excuse this guy because he's "too young" also don't have enough information to excuse him based on any kind of real evidence. My point is not to pretend that I know enough about this guy to make a judgment, just that being a minor doesn't, by itself, excuse someone from responsibility for heinous behavior. Now if this kid was autistic, then perhaps there is cause for psychotherapy instead of jail. But on the whole, teenagers are more than aware enough to understand what serious harm to

  • Welcome to the new world of vouchers.
  • Where's the abuse? (Score:3, Insightful)

    by Murdoch5 ( 1563847 ) on Monday November 18, 2024 @05:04PM (#64955525) Homepage
    If someone generated AI “deep fakes”, then what abuse took place? While those images are generally distasteful, rude, offensive, and invasive, if they're generated images, then they can't rise to the level of abuse, since the majority of the content is fake, and therefore protected under freedom of expression / speech. This idea has been fought before; it's why people can publish child sexual novels, and why organizations like NAMBLA can exist.

    Regardless of how anyone feels about the content (and I know, as a father of two teenage daughters, I would want the death penalty), the right to free speech and free expression must be upheld. What is the difference between a generated picture of a naked teenager vs. Romeo and Juliet? Would those same parents demand Romeo and Juliet be removed, and people resign?

    Just so there's absolutely no confusion, I'm only coming at this from a free speech / freedom of expression context. I do not defend generating sexual fakes of others.
    • by abEeyore ( 8687599 ) on Monday November 18, 2024 @05:20PM (#64955589)
      There are, indeed, laws against generating content depicting the sexual exploitation of minors, even if it is entirely fictitious - and this is not entirely fictitious, because they are based on real people. I understand that you feel this skirts close to the edge of so-called "thought crime", but in this case, the violation is not simply in creating the pictures, but in then distributing them.

      The creator's rights - whatever you may imagine them to be - end where the rights of the girls in question begin - and even the most strident libertarian has to agree that the girls - as minors, if for no other reason - have a right to NOT be publicly exploited in such a way.
      • They weren't exploited because the images were generated. Imagine generating a James Bond novel, that had themes and elements from old novels, but was generated enough as to not breach copyrights or trademarks. If you distribute the novel, all you've really done is distributed a ripoff, contrast the Hardy Boys with Nancy Drew, they're effectively the same concept, but different enough as to be unique.

        I'm not going to defend distributing the images because I think it's gross AF, but fundamentally, if th
        • by vux984 ( 928602 )

          But they weren't *entirely generated* because they depicted real recognizable people.

          You might find this interesting...

          https://www.owe.com/resources/... [owe.com]

          Any one of the criteria noted can elevate the use of someone's likeness to illegal; this case easily satisfies multiple criteria.

          And the fact that it's porn and involves minors makes the arguments even more compelling.

          • I bookmarked that site to read later, I'll return tomorrow :)
          • Okay, I've read the article, but it actually goes back to the issue I keep bringing up, the fact the images are generated. It's hard to say how much that impacts the amount of invasion. While I could certainly agree it's some kind of privacy violation, the fact the work is generated, aka fake, is important.

            The argument the generator needs to make is simple: “It might look like X, but it's not; the work was created without any input of a similar character or person.” The fact it happens to
            • by vux984 ( 928602 )

              The example given was that the artist had taken some photos of the gentleman in public, and then created a *painting* of that person. The painting is not the photo, and the artist may have taken any number of creative liberties with his expression of the person in the painting.
              But what matters is that it was recognizable.

              In particular, in that case it wasn't even "mechanically" or "electronically" part of the input; it was merely part of the "input" in the sense that the artist referenced it when creating the pain

              • Right:

                Whether that likeness was a violation, or not, hinged on the remaining criteria. And in that example it was deemed not to be a violation of his rights, even though it was his likeness.

                That's my point! I'm not going to argue against a violation of privacy; if the photos look like X, it's a violation of privacy, it might be an invasion of privacy, but does that rise to the level of abuse?

                In this case it would simply be stretching credibility far past the breaking point to try to argue that some boy at a private school who had and was distributing deepfake porn of a whole bunch of people who were very recognizably his classmates was "pure coincidence". Give me a break.

                I'm suggesting that should be his argument, not that I'd buy that excuse, because I wouldn't. However, regardless of whether I buy the excuse, if I can't present evidence to show he's lying, then how do I know it wasn't some fever-dream AI hallucination generating a likeness using an input of social medi

                • by vux984 ( 928602 )

                  if the photos look like X, it's a violation of privacy, it might be an invasion of privacy, but, does that rise to the level of abuse?

                  Ok, it seems we're not too far apart. We seem to agree it's a violation of privacy, and then you ask if a "violation of privacy" rises to the level of abuse.

                  And I don't know if that has an answer, because we haven't really agreed on the definition of the terms. What is "abuse"?

                  For example, I think people who take their family into the grocery store every day to get a free banana from produce and a free cookie from the bakery and don't buy anything are absolutely 'abusing' the system in place to make the groc

                  • We're really close, I just think you can't point to a painting of a naked flying 5-year-old, and think it's beautiful art, but then point to a naked generated picture of a 5-year-old, and say it's disgusting. The issue is: It's almost pure hypocrisy because you're saying one form of something is absolutely inappropriate, but then saying the same or worse idea is okay, even if that's done through ignorance, or some beloved interest in history.

                    This is why I keep saying free speech / free expression, becau
                    • by vux984 ( 928602 )

                      Providing these pictures were of the girls or boys, and not generated, you would have absolutely crossed that line.

                      I think this is one of the hairs we're splitting. If I feed in a picture of Amy and a bunch of random porn stars and say "give me a picture of Amy doing XYZ," then it's a generated image, AND it's a picture of Amy. To my mind, then, even though it was "generated" and "never happened," it is also a picture of Amy, and a violation of her likeness rights/privacy, and to distribute it would be sexual harassment too (but not 'sexual abuse'). Do you agree with all that?

                      Now, if I simply prompt "give me 1000 a different

      • and this is not entirely fictitious because they are based on real people.

        There are more than 7 billion people on this planet. Any realistic face that is drawn is certain to look like someone. Are you absolutely certain that you want to go full Gestapo down this path?

      • In Australia, depictions of Bart and Lisa Simpson can be considered "child porn". That isn't the case in the US. In the US, there has to be an identified victim.

        I can't see someone getting criminally arrested and/or convicted for copy/pasting a picture of a celebrity's head onto an image of a naked body. And this kid did basically the same thing. It all sounds too much like "art" and a 1st Amendment nightmare. Sure, it can be a school policy not to do that, or distribute those kinds of things. Many have hon

    • by hey! ( 33014 ) on Monday November 18, 2024 @05:33PM (#64955631) Homepage Journal

      if they're generated images, then they can't rise to the level of abuse, since the majority of the content is fake

      They are invasions of privacy. While we usually think of privacy as protecting against disclosures of sensitive information, that's just one kind of privacy intrusion. The legal and ethical issues of privacy are actually considerably broader. They have to do with a whole host of issues relating to your right to personal autonomy.

      For example, if you are in a public place, people are free to observe you, or even, in most jurisdictions, to film you. But if someone follows you around observing you to the point that it would interfere with a reasonable person's ability to conduct their lives, it's well established in US law [nytimes.com] that that's a privacy intrusion, even though it doesn't involve disclosing privileged information.

      Your neighbor shining a spotlight into your bedroom at night is a privacy intrusion, even if he doesn't *look* into your bedroom, because it interferes with your ability to choose when you sleep.

      Privacy intrusions are actions that interfere with personal liberties. While none of us are entitled to what we would regard as a good reputation with other people, we do have a reasonable expectation of that impression being ours to establish. Suppose your neighbor put up a website in which he published *fiction* in which he depicted you as a child sex abuser, and then he posted QR codes to this all around the neighborhood. You would feel wronged even though he disclaimed the stories as fiction, and you would be right to feel wronged *even if the stories were fiction*, because every time you met a neighbor their reaction to you is colored by that fictional story.

      That's what these sexually explicit deepfakes do: they take away the power of people who are *not* public figures to control their public image. When these girls meet people at school who have seen these deepfakes, those deepfakes will dominate the reaction people have of them in a way that can't be undone, short of moving away. If they're released onto the Internet they may never be able to outrun them.

      • This is absolutely a privacy violation, and a terrible privacy violation, without argument. Would I be okay with that website existing? No. But would I allow it? Depending on the content, providing it wasn't making factual statements about me, I probably would.

        I say this as a person whose nickname for ~4 years, during high school, was “child porn” because several jocks spread a rumour that I was into it. I just ignored it, and after ~4 years, it died because everyone realized it was clear
        • by hey! ( 33014 )

          It's conceivable to organize society to deal with balancing freedom of expression and privacy in many different ways, but the way *our* society is organized, in general there are prohibitions against prior restraint by the government, but the government absolutely can punish speech which harms other people. This gets complicated with lots of corner cases and things that don't work out quite the way we want as opposed to "absolutist" free speech where nothing you say ever has consequences, but that has its

          • I agree, and to make sure this is clear, my point is that the generated nature of the image is what should be protected. If this person uploaded naked pictures of the people without substantive modification, that, without argument, is abuse.
            • by hey! ( 33014 )

              Right. If you created the deepfakes for your own exclusive use, and took reasonable precautions to avoid them falling into other peoples' hands, it'd be icky, but it wouldn't be a privacy intrusion.

              But we're talking about a situation where distribution was the whole point. That distribution is so damaging that it literally can't be repaired.

      • It's also a violation of personal copyright and defamation. And harassment.

        • by DarkOx ( 621550 )

          ^^THIS^^

          There are lots of free speech issues and complexity, both legal and moral, when it comes to whether a drawing or generated image is actually CSAM, and what is art/speech, and all that.

          Maybe the answer to everything isn't the criminal justice system? Maybe the place to deal with this is the civil court system.

    • Aren't most nude deep fakes just photoshopping someone's head onto someone else's nude body? So isn't the original nude photo considered child porn? The "AI" part is just automating it.

      • That's a good question, I honestly don't know, but if the images are all real and reorganized, that's a different issue!
    • If someone generated AI “deep fakes”, then what abuse took place? While those images are generally distasteful, rude, offensive, and invasive, if they're generated images, then they can't rise to the level of abuse, since the majority of the content is fake, and therefore protected under freedom of expression / speech.

      Today I learned Michelangelo & Leonardo da Vinci are criminals because they created images of people without clothes on.

      • That's actually a great example, would the parents demand any reference to those works be removed?
      • by PPH ( 736903 )

        because they created images of people without clothes on

        I suppose if the people Michelangelo and da Vinci depicted file a complaint, then yes. They are criminals.

        • because they created images of people without clothes on

          I suppose if the people Michelangelo and da Vinci depicted file a complaint, then yes. They are criminals.

          Oh, for fucks sake. Do NOT do that shit. Please. Bad enough the woke mob wants to persecute your “crimes” from childhood. Last fucking thing we need is for someone to suggest we drag The Hague with us through the time warp.

    • This all happened a year ago already. Why shut the school down now?
    • by PPH ( 736903 )

      then they can't rise to the level of abuse

      They can, if they depict a particular individual in a manner so as to make them identifiable. The abuse is subjecting that person to a kind of public exposure which they did not consent to (or can not, being under age).

      it's why people can publish child sexual novels

      Are the characters in these novels identifiable actual people? Or invented personas?

      • One of the characters in the novels was tightly based around the girl the person violated. I'll look up the case on the weekend, it was mentioned in an old Law & Order SVU episode.
    • While those images are generally distasteful, rude, offensive, and invasive, if they're generated images, then they can't rise to the level of abuse, since the majority of the content is fake, and therefore protected under freedom of expression / speech. This idea has been fought before; it's why people can publish child sexual novels, and why organizations like NAMBLA can exist.

      Freedom of speech is not freedom from consequences, and specifically, for your example, the "idea has been fought before" had the distinction of being a fantasy with fantasy elements.

      If you want to AI generate kiddy porn, go for it.
      If you want to AI generate kiddy porn which looks like a very specific person then expect to spend some time learning the value of free speech from prison.

      If someone generated AI “deep fakes”, then what abuse took place?

      Apparently you think abuse is only abuse if it is physical. You have a lot to learn, both legally and about developing empathy.

      • I don't lack empathy; I'm coming at this from the standpoint that the images are fake, and therefore classified as art. Think about works like Romeo and Juliet, a play where kids have some kind of sexual interest leading up to pseudo-necrophilia. Would you declare that an inappropriate work? If you say yes, but it falls under free speech / free expression, well, now you're on my side, so I guess you lack empathy?

        If you say no, but identify it as free speech / free expression, then you have some kind of in
  • Not I.
    Everything that can be used to create porn will be used to create porn.
    Even if something is only hinting at nudity there are those that will find it offensive and call it porn. *
    Hormones do lower intelligence in all teenagers.
    And these large artificial neural models that are open for all to use lower the bar substantially for a horny and/or bullied teenager to do this.

    This might not have been done out of spite or with ill intent. "Don't attribute to..."

    *generally speaking of what "AI" models can prod

  • by geekmux ( 1040042 ) on Monday November 18, 2024 @06:31PM (#64955813)

    A person generating hyperrealistic nude images without consent using the eeeevil “AI” to do it? Society says burn them at the stake.

    Now tell me what happens when society stumbles across an aspiring artist. One who creates hyperrealistic images using that 300-year-old John Wick technology. Also known as a fucking pencil. What happens to said artist when you find dozens of hyperrealistic nude images, created without consent? Is that still a deviant who deserves to be destroyed, or is that merely an aspiring artist?

    (The father in me? Sure. Especially if it were my kid. Question still stands. Because society created it.)

    • by HBI ( 10338492 ) on Monday November 18, 2024 @07:23PM (#64955939)

      I had a girlfriend back when I was a teenager who was an excellent artist. She ended up making a living doing airbrushed motorcycle helmets and such, t-shirts, whatever. But pencil drawings were really her forte. She made me look like some kind of Adonis, which I wasn't and am not.

      The naked self-portraits she drew in a mirror were something to behold. Much better than any photographic porn you could imagine. She had a beautiful form, even taking into account the love I still feel for her (she died of cervical cancer around 2001). But her pencil drawings were more memorable than the real thing. And now that I'm thinking about it, we were both underage at the time. I have one photograph of her left, her poking her head around a tree somewhat playfully. But I can still remember those drawings...

      Her mom found them after she left home. Wasn't so bothered by the nudes, but apparently she'd written some stuff that sounded pseudo-suicidal in a notebook and that freaked her out. Not that I remember her as being particularly suicidal, just very experimental.

    • by AmiMoJo ( 196126 )

      It's more than just the act of creating them in this instance, it's the act of distributing them. It's an evolving area of the law in many jurisdictions, but generally speaking the harm caused by distributing them is at least an aggravating factor.

      There is also the intent of creating them. In the UK things that would be perfectly legal for most people to own, like a catalogue of children's clothing, can be illegal if collected for the purpose of titillation. In practice I doubt that there would be a prosecu

    • using that 300-year old John Wick technology.

      Hm. Joker technology would have conjured a more interesting image, but John Wick kind of works.

  • ... and "revenge porn" at the same time: Task a governmental office to create "explicit deepfake" images of every single citizen, make them available for download on a public server (with the image files having random names, and being replaced with new ones once in a while). What possible "damage" could the existence of more such "explicit images" do, then? Today, the "damages" claimed are of the "irrational fear" kind, that somebody seeing such image could "think bad" of the person the picture claims to de
  • But Micciche allegedly did nothing

    I'll venture a guess that he was doing .... something.
