AI Education

Microsoft, OpenAI, and a US Teachers' Union Are Hatching a Plan To 'Bring AI into the Classroom' (wired.com) 32

Microsoft, OpenAI, and Anthropic will announce Tuesday the launch of a $22.5 million AI training center for members of the American Federation of Teachers, according to details inadvertently published early on a publicly accessible YouTube livestream. The National Academy for AI Instruction will be based in New York City and aims to equip kindergarten through 12th grade instructors with "the tools and confidence to bring AI into the classroom in a way that supports learning and opportunity for all students."

The initiative will provide free AI training and curriculum to teachers in the second-largest US teachers' union, which represents about 1.8 million workers including K-12 teachers, school nurses and college staff. The academy builds on Microsoft's December 2023 partnership with the AFL-CIO, the umbrella organization that includes the American Federation of Teachers.

Comments Filter:
  • Good idea. We can save a lot of--well, a LITTLE--money by taking the slightly-more-expensive people out of the equation entirely and just having LLMs teach LLMs.
    • by JBMcB ( 73720 )
      Fun fact: feeding the output of an LLM back into itself makes it slowly output complete nonsense. A podcast I listen to fed the transcript of one of its shows into NotebookLM and asked it to create a summary. They then fed that summary back into NotebookLM and asked for another summary. Every iteration introduced nonsense that had nothing to do with the original. The show is a technology and politics podcast, and by the end the summary-of-a-summary was talking about football games and the weather.
      • Re:Fun! (Score:4, Informative)

        by gweihir ( 88907 ) on Tuesday July 08, 2025 @10:53AM (#65505180)

        It is a well-known effect called "model collapse" when you do this with training data. With temporary data ("context") you see something similar, namely that LLMs only ever capture a relatively small part of the input data and add noise and statistical correlations to it. You also see that no fact-checking or consistency checking is being done ("does this make sense?"), because LLMs are incapable of logical reasoning. After a few iterations, crap accumulates, the original meaning vanishes, and things turn to mush.
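
        For illustration, a minimal toy sketch (Python) of how that compounding loss plays out. This is purely a simulation (no real LLM is called, and the word list, keep ratio, and noise tokens are made up), but it shows why each pass keeps less of the original and accumulates off-topic filler:

            import random

            def lossy_summarize(words, keep_ratio=0.6, noise=("football", "weather")):
                # Keep only part of the input and bolt on unrelated filler,
                # standing in for the information loss and drift each pass adds.
                kept = [w for w in words if random.random() < keep_ratio]
                added = [random.choice(noise) for _ in range(max(1, len(words) // 10))]
                return kept + added

            original = ("the podcast discussed technology policy antitrust "
                        "and platform regulation").split()
            text = original
            for i in range(5):
                text = lossy_summarize(text)
                overlap = len(set(text) & set(original)) / len(set(original))
                print(f"pass {i + 1}: {overlap:.0%} of the original vocabulary survives")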

  • Use it for lesson plans and schemes of work then sack it off.
  • But the cloud replaced the nanotech. And the AI still can't do the voices, right?
  • by nightflameauto ( 6607976 ) on Tuesday July 08, 2025 @10:16AM (#65505106)

    I remember when progress felt like a net positive thing. Now it seems like progress is measured as a countdown to the irrelevance of humanity.

    If we eventually get to a point where all school does is teach folks how to properly interact with the AI, and the AIs take over most aspects of actually working, creating art, and most likely living and interacting with others (see Zuck's offer to provide artificial friends since people no longer make real friends), what will be the point of the humans 'under' the AIs? Somewhere, we seem to have lost the thread. Or maybe we were always just meant to be a stepping stone to technology that could replace us.

    • Re: (Score:3, Insightful)

      by buck-yar ( 164658 )
      Have you seen K-12 test scores lately? No wonder AI is being pitched. They can't seem to teach kids the material, so they have to fall back to teaching them how to use a crutch. 25 years ago when I was in school they didn't let us use graphing calculators on math tests because (in their words) we have to learn how to do it ourselves. They've thrown in the towel. How about becoming skilled first, then increasing productivity with these tools? I.e., start using them in 11th/12th grade instead of earlier. There's no way AI in earlier years is going to make people smarter. What little research has been done suggests using these tools results in people's brains turning off. Opposite of what we want.
      • by nightflameauto ( 6607976 ) on Tuesday July 08, 2025 @12:09PM (#65505450)

        Have you seen K-12 test scores lately? No wonder AI is being pitched. They can't seem to teach kids the material, so they have to fall back to teaching them how to use a crutch. 25 years ago when I was in school they didn't let us use graphing calculators on math tests because (in their words) we have to learn how to do it ourselves. They've thrown in the towel. How about becoming skilled first, then increasing productivity with these tools? I.e., start using them in 11th/12th grade instead of earlier. There's no way AI in earlier years is going to make people smarter. What little research has been done suggests using these tools results in people's brains turning off. Opposite of what we want.

        Schools have pretty much stopped being about teaching/learning, and are more of a babysitter for people who need to work full time to keep a roof over their heads. I really don't see how adding AI into the mix will help that situation, but I suppose a dumb population is an easily manipulated population, and that's one thing all of our political leaders can agree is a net positive.

      • What little research has been done suggests using these tools results in people's brains turning off. Opposite of what we want.

        I rather think that might be the point and the purpose. If you were a brogligarch intent on killing all opposition to your plans for world domination, would you really want to perpetuate a citizenry which can think independently, do actual research, analyze situations, draw conclusions, and tell you that you're evil and full of shit? Of course you wouldn't.

        Lest you conclude that I'm a rabid and irrational conspiracy theorist, consider that this would be far from the first time that robber barons created edu

    • by MeNeXT ( 200840 )

      We are already there. AI is the next step. Most people today can't think for themselves and have no interest in doing so. They choose options based on popularity and not their needs. The popular options are always the easiest, which removes functionality and requires assistance or money to make them fit their needs. We see this in our schools today, from the subjects to the tools used.

  • The only way to make LLMs intelligent is to change the definition of intelligent. So let's change humans first, and adapt the definition later to "as dumb as our own children".
  • AI is already there, just no one is openly admitting it...

    Personally I think A1 skills should be taught instead. Not too many people can cook a perfect medium rare steak with a tasty sauce. It's a much more usable life skill.
    • AI is already there, just no one is openly admitting it... Personally I think A1 skills should be taught instead. Not too many people can cook a perfect medium rare steak with a tasty sauce. It's a much more usable life skill.

      If you need A1 sauce, you either have a low-quality steak, which is fine when you just want some meat and don't want to spend on it, or you royally fucked up a seasoning and cooking routine on a good steak, which is not fine at all, and deserves ridicule.

  • by RobinH ( 124750 ) on Tuesday July 08, 2025 @10:41AM (#65505154) Homepage

    Around 2000 we had the dot-com bubble, which was a whole bunch of "irrational exuberance" about over-inflated valuations of companies that were doing *anything* on the web. The poster child of the dot-com crash was Pets.com [wikipedia.org]. In the lead-up to the 2008 real estate bubble and crash I used to hear advertisements on the radio for "interest-only mortgages," and people would say things like "real estate is the one thing they're not making any more of" and "real estate prices always go up." Well, then someone in Detroit famously traded his house for an iPhone just to get rid of it. In the early 2020s people were buying NFTs, and the peak Bored Ape Yacht Club NFT price at the time was 10 times the current price.

    See a pattern? We've all been playing with the new AI technology. The demos are amazing, but we know that outside a couple of niche uses, there's nowhere near as much value there as the market is implying. At best it's similar to 3D printing's usefulness - helpful for making quick prototypes, etc. There's no actual intelligence; no actual reasoning or thinking happening. It's going to take a huge breakthrough to get to AGI, and sure, that might happen tomorrow, but it's more likely to be decades away, or maybe not even in our lifetimes.

    How are people so duped by these companies? Is it just blind optimism? Why are we so predisposed to falling for this hype cycle?

    • by gweihir ( 88907 )

      I have seen a pattern here for a long time, and well before the Web hype.

      How are people so duped by these companies? Is it just blind optimism? Why are we so predisposed to falling for this hype cycle?

      People in general are dumb, cannot fact-check, and try to conform to the expectations of others. That makes them easy to manipulate. Countless examples of that effect are available. For commercial hypes (other than political or religious or moral panics or the like), you always find some scummy people who want to get rich and maybe get power, and hence design and fuel the hype. Many hypes do not take off and nobody really notices. By some d

    • Why are we so predisposed to falling for this hype cycle?

      Because we've been through all those prior hype cycles you listed, and countless others as well. They've worn us down and blunted our collective critical faculties.

  • It was time a bunch of really stupid people went to work on that.

    While this could be done in a way that teaches kids to be careful with AI and to make sure their own skills do not atrophy, that is not how this will go.

  • by Mirnotoriety ( 10462951 ) on Tuesday July 08, 2025 @11:26AM (#65505286)
    In the future, as AI becomes ubiquitous in shaping our access to information, could this lead to a monoculture driven by the orthodoxy established by AI companies? This could result in cultural homogenization, reinforce biases, and diminish our ability to think critically.
  • It already read ALL the schoolbooks of the last 100 years in most languages.

    Some people even complain about it.

  • Great Example (Score:4, Insightful)

    by akw0088 ( 7073305 ) on Tuesday July 08, 2025 @12:24PM (#65505518)
    Great example of wasting taxpayer money dedicated to teachers and teaching on projects that will have minimal impact at maximum cost. If teachers and students want to use AI, they can go to a webpage just as well as anyone else.
  • AI is here -- you can dislike it, but it'd be a mistake to ignore it. I recognize we're worried about AI making us dumber in the same way phones removed our memory for phone numbers. But this tool is incredibly useful and not going anywhere, just like phones.

    The same way we trained kids on computers, they should be taught AI competency. With AI competency comes language, critical thinking, and research -- huge skills applicable everywhere, not just AI.

    And when they need help learning a new subject, they should apply that competency to getting the Math/etc. help they need. Using AI to learn a new subject has been far and away the best use case in my life, and with competency training kids (and teachers) can find the same educational impact.

    It doesn't need to be a detriment.

    • Computers in school are just another example of money wasted. Unless it's a computer class, computers have only been detrimental, and any benefits they provided were not commensurate with their cost. Schools would have been better off hiring more teachers than buying computers.

      • Given that experience, how would you apply that to the current conversation so there's not a repeat?

        How do we improve their success rate while still giving education on essential tools?

  • by KalvinB ( 205500 ) on Tuesday July 08, 2025 @12:31PM (#65505560) Homepage

    Teachers are paid for 40 hours per week. And not well. Current starting salaries are enough to afford to live in a ditch.

    They teach 5 classes per day. They get 1 period free when students cannot come into their classroom. Plus 1 hour before school and 1 hour after school when students can come in.

    The system thinks grading papers and preparing lessons are free services that teachers provide.

    The system will do anything but hire more teachers so that they each can have 3 periods per day and spend the rest of the time grading, prepping, and helping students during their paid hours.

  • OpenAI and Microsoft Bankroll New A.I. Training for Teachers [nytimes.com]: "Microsoft will provide $12.5 million for the A.I. training effort over the next five years, and OpenAI will contribute $8 million in funding and $2 million in technical resources. Anthropic will add $500,000 for the first year of the effort. On Monday, some 200 New York City teachers taking an A.I. workshop at their union headquarters got a glimpse of what the new national effort might look like. A presenter from Microsoft opened by showing an A
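
    (For reference, the figures quoted there sum to the headline number: $12.5 million from Microsoft plus $8 million in funding and $2 million in technical resources from OpenAI is $22.5 million; Anthropic's $500,000 appears to come on top of that.)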

  • Kids are going to red-team anything that's thrown at them. The limits of whatever prompts are used to keep the model safe for kids are going to be thoroughly tested, and cleverly bypassed.

  • Some of these same people are responsible for making kids more ignorant than kids the same ages were decades ago. The people currently failing to teach kids to read and write and do math [heytutor.com] "at grade level" are NOT going to make things better by letting Microsoft talk them into introducing AI to the kids.

    We need to get the computers out of the classrooms altogether. Kids have never needed to "learn computers" at school - the young are ALWAYS more easily adaptable to new tech than the old. Any kid will "get" th
