AI Education

Schools are Now Teaching About ChatGPT and AI So Their Students Aren't Left Behind (cnn.com) 73

Professors now fear that ignoring or discouraging the use of AI "will be a disservice to students and leave many behind when entering the workforce," reports CNN: According to a study conducted by higher education research group Intelligent.com, about 30% of college students used ChatGPT for schoolwork this past academic year and it was used most in English classes. Jules White, an associate professor of computer science at Vanderbilt University, believes professors should be explicit in the first few days of school about the course's stance on using AI and that it should be included in the syllabus. "It cannot be ignored," he said. "I think it's incredibly important for students, faculty and alumni to become experts in AI because it will be so transformative across every industry in demand so we provide the right training."

Vanderbilt is among the early leaders taking a strong stance in support of generative AI by offering university-wide training and workshops to faculty and students. A three-week 18-hour online course taught by White this summer was taken by over 90,000 students, and his paper on "prompt engineering" best practices is routinely cited among academics. "The biggest challenge is with how you frame the instructions, or 'prompts,'" he said. "It has a profound impact on the quality of the response and asking the same thing in various ways can get dramatically different results. We want to make sure our community knows how to effectively leverage this." Prompt engineering jobs, which typically require basic programming experience, can pay up to $300,000.
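A minimal sketch of what that prompt sensitivity looks like in practice (the model name, the prompts, and the pre-1.0 openai Python client used here are illustrative assumptions, not details from the article): the same question is sent twice, once vaguely and once with role, audience, and format spelled out, and the two replies are printed for comparison.

    # Sketch only: compares a vague prompt with a carefully framed one.
    # Assumes the pre-1.0 "openai" Python package and an API key in the
    # OPENAI_API_KEY environment variable; the model name is just an example.
    import os
    import openai

    openai.api_key = os.environ["OPENAI_API_KEY"]

    PROMPTS = [
        # Vague framing: no role, no audience, no constraints.
        "Tell me about recursion.",
        # Careful framing: role, audience, format and length are explicit.
        "You are a CS tutor. Explain recursion to a first-year student in three "
        "short paragraphs, ending with one worked factorial example in Python.",
    ]

    for prompt in PROMPTS:
        reply = openai.ChatCompletion.create(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": prompt}],
            temperature=0.2,
        )
        print("PROMPT:", prompt)
        print("REPLY :", reply.choices[0].message.content[:300], "\n")
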

Although White said concerns around cheating still exist, he believes students who want to plagiarize can still seek out other methods such as Wikipedia or Google searches. Instead, students should be taught that "if they use it in other ways, they will be far more successful...." Some schools are hiring outside experts to teach both faculty and students about how to use AI tools.

This discussion has been archived. No new comments can be posted.

  • Especially students in law school.

    "New York lawyers sanctioned for using fake ChatGPT cases in legal brief"

    https://www.reuters.com/legal/... [reuters.com]

    "A U.S. judge on Thursday imposed sanctions on two New York lawyers who submitted a legal brief that included six fictitious case citations generated by an artificial intelligence chatbot, ChatGPT."

    • by gweihir ( 88907 )

      Yep, that is the one. And that is the one where the extreme screw-up could not be hidden.

    • They weren't sanctioned for using chatGPT, they were sanctioned for making up fake cases. Had they made up fake cases by some other method, they would have still been sanctioned. All they had to do was verify the info that their idiot-who-has-read-all-the-internet spat out, and they would have been fine.

      • by CptJeanLuc ( 1889586 ) on Sunday August 20, 2023 @03:55AM (#63781798)

        It is much harder to verify something which someone (something?) else has written, than writing something yourself. At least when _you_ write something, you know why you are writing it - even if that is "man, at this point I am just inventing stuff". My strong recommendation is to _never_ have ChatGPT or another language model write something for you if the factual content of whatever that is, is somehow important, and you are not willing to check _everything_. At this point, language models would be better for some sort of "I wrote this text, take that text and polish its structure and form, keeping the contents".
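        A rough sketch of that "polish, don't generate" workflow (the helper name and the wording of the instruction are illustrative assumptions, not anything from this thread): every fact stays in the human-written draft, and the model is only asked to tidy the prose.

        # Sketch only: build an instruction that asks a language model to
        # polish structure and wording without touching the facts.
        def build_polish_prompt(draft: str) -> str:
            return (
                "Polish the structure, grammar and flow of the text below. "
                "Do not add, remove or alter any factual claims, names or numbers. "
                "Return only the revised text.\n\n" + draft
            )

        if __name__ == "__main__":
            draft = "Our team shiped the release friday, it fix three bugs and add logging."
            print(build_polish_prompt(draft))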

        I got to hand it to the AI language models ... they lie so credibly I am not surprised a lot of people get charmed by their eloquence and professional language, sort of like the lure of the alpha demagogue. I have on occasions asked medium-complexity factual questions to Google Bard (yeah ... it's Bard ... I did not bother with registering with ChatGPT and handing over my soul, but testing Bing which uses ChatGPT also did not impress me), and it got it absolutely wrong - every - single - time.

        • I got to hand it to the AI language models ... they lie so credibly I am not surprised a lot of people get charmed by their eloquence and professional language, sort of like the lure of the alpha demagogue.

          Yes they do, and I think that we need to come to an understanding of exactly why a lot of people are taken in by this bullshit.

          Because bullshit it is.

          I'll make no claims to infallibility, but when I read the stuff generated by ChatGPT, it is a 5-alarm fire of fluff even if it isn't spouting bullshit. And I have trouble figuring out why anyone would fall for it.

          I have on occasions asked medium-complexity factual questions to Google Bard (yeah ... it's Bard ... I did not bother with registering with ChatGPT and handing over my soul, but testing Bing which uses ChatGPT also did not impress me), and it got it absolutely wrong - every - single - time.

          I threw some softballs at ChatGPT, things I already knew the factual answers to. Took enough time and the answers were vague enough and often plai

        • by vivian ( 156520 )

          I don't think lawyers can just make stuff up as they go along - they have to find case law that supports their arguments.
          I would have thought it's quite easy to use something like ChatGPT to ferret out relevant case law, compared to doing all the research yourself.
          My understanding of how law research works is that you have to dig through a vast archive of legal precedents and current law statutes to find stuff relevant and favourable to your case - a difficult job since indexes obviously can't contain eve

        • It is much harder to verify something which someone (something?) else has written, than writing something yourself.

          Does this advice change when you get information from a Google search or ask a real person?

          I did not bother with registering with ChatGPT and handing over my soul, but testing Bing which uses ChatGPT also did not impress me

          Bing chat is watered down garbage completely different from either GPT-3.5 or GPT-4. Minimum usable ChatGPT IMO is GPT-4.

          and it got it absolutely wrong - every - single - time.

          That's been my experience with Bing chat as well. I found it to be completely worthless.

        • by CAIMLAS ( 41445 )

          I use ChatGPT to create the basic text and format for something I need to produce quickly.

          I don't write as quickly as I'd like - but I can edit others' writing like nobody else's business. My creative process, required for original writing, is weak... my criticism and cynicism are high. That's a win.

  • by gweihir ( 88907 ) on Saturday August 19, 2023 @09:17PM (#63781416)

    So their students can partake in the latest mindless hype. Well, at least the smart ones will eventually see how incapable chat-AI actually is.

    • Well, it's no different from drug-education or sex-education.

      We know that abstinence-only education does not work. What works is to teach how the thing actually works, and all the pitfalls and dangers of using it.
      • by gweihir ( 88907 )

        Well, it's no different from drug-education or sex-education.
        We know that abstinence-only education does not work. What works is to teach how the thing actually works, and all the pitfalls and dangers of using it.

        That abysmal and predictable failure is still being used? Talk about utterly dysfunctional fake "teaching" that does much more harm than good. Fortunately, it seems to be an aberration only found in the US and other theocracies.

        • Yeah. It's migrated over here in Australia as well. An author of a popular sex-education book for children was targeted because US-influenced theocratic groups claim that any attempt to educate children on how the reproductive system works is "pedophilia".

          They literally think teaching children about how their body works is pedophilia. It's like these dickheads haven't gone through sex-ed class before. There was nothing sexual about it. It was "here's how your organs work, here's how reproduction happens, here are all the diseases you can catch." Christians are forever stuck in juvenile mentality about the human body.
          • In 2023, a Chinese mother complained that [title unknown] ([author unknown], 2014) was the textbook for the local school.

            In 2023, Australian parents complained that a Youtube sex-education series mentioned homosexual oral sex (possibly involving under-age males).
            Other parents complained that "Welcome to Sex" (M Kang & Y Stynes, 2023) was sold in department stores.

            In 2018, an Australian mother complained that "The Amazing True Story of How Babies Are Made" (Fiona Katauskas, 2015) was sold in department stores.

          • Yeah. It's migrated over here in Australia as well. An author of a popular sex-education book for children was targeted because US-influenced theocratic groups claim that any attempt to educate children on how the reproductive system works is "pedophilia".

            If there is one thing I've learned in life, it's that people who obsess over pedophilia are the ones getting turned on by children. The number of fundamentalists and evangelicals getting busted for kiddie diddling is a testament to that.

            They literally think teaching children about how their body works is pedophilia. It's like these dickheads haven't gone through sex-ed class before. There was nothing sexual about it. It was "here's how your organs work, here's how reproduction happens, here are all the diseases you can catch." Christians are forever stuck in juvenile mentality about the human body.

            The brand of christianity you refer to is juvenile, but there is worse going on.

            They see these books and illustrations as erotic devices because it sexually arouses them.

            https://www.npr.org/2022/06/02... [npr.org]

            And it ain't just Southern Baptists.

            • by gweihir ( 88907 )

              If there is one thing I've learned in life, it's that people who obsess over pedophilia are the ones getting turned on by children. The number of fundamentalists and evangelicals getting busted for kiddie diddling is a testament to that.

              These people are basically hating themselves and project that outwards as a flawed survival strategy. And as that approach does not work to get the urges under control, children get abused by them. Also works for homosexuality: The most rabid anti-homosexual hate mongers are often actually homosexuals that get caught sooner or later.

          • "Christians are forever stuck in juvenile mentality about the human body." And yet their "Good Book" is filled with all kinds of perverted and graphical sexual acts including rape, not to mention the extreme gore and violence and they encourage their kids to read this filth. Seriously, these nutballs must've ridden to school in a bus so short that their faces were pressed into the windshield.
            • by gweihir ( 88907 )

              There was something about Utah banning the Bible as unfit for children. Ah here: https://www.bbc.com/news/world... [bbc.com]
              Obviously that is a half-measure and obviously religion should be banned for children as that is the actual root of the problem.

              In other news these religious fundamentalists are even more bereft of reason than flat-earthers.

              • "There was something about Utha banning the Bible as unfit for children. Ah here" But muh religous freedum! In this day and age this is an instant career death for any politician to publicly think this, even in the UK and especialy in the US. I suspect that most Christians in the US are the "Sunday" variety who only know what the pastor tells them at the pulpit. The more diehard, the ones who actually read the Bible, are the "LA LA LA I CAN'T HEAR YOU!" type that read through all the nasty things and d
          • by gweihir ( 88907 )

            How repulsive. I do not get where Christians get that deep existential fear of equipment that everybody has in some version and that was, according to their dogma, designed by "God". Yes, I know why most religions push this crap, that is simple: People that do not understand reproduction do not have the means of controlling it. And hence the natural urges assure tons of new victims for the malicious meme infection that religion is. But why are the followers going along with something this obviously stupid? May

      • by Alumoi ( 1321661 )

        OK, here's ChatGPT 101:
        1. open website/app
        2. ask your question
        3. copy the answer verbatim
        4. if everything is OK goto step 1
        5. if something is wrong blame the site/app creator

        • by gweihir ( 88907 )

          What if something is wrong in the answer but the person using this mental prosthesis does not realize?

          • by Alumoi ( 1321661 )

            Right, my bad.
            4. don't check anything, just publish/say it.
            5. if nobody complains goto step 1
            6. if something is wrong, blame and sue the site/app creator.

    • by CaptQuark ( 2706165 ) on Sunday August 20, 2023 @01:12AM (#63781650)

      People are always nervous when technology changes.

      In High School we were just barely allowed to use calculators in math class. The students a year before were still using slide rules and log-table books. The worry then was "What if the batteries in the calculator die? A slide rule never runs out of power." But our teacher saw the advantage of not needing a huge book of four-digit log tables for trigonometric calculations.

      In college our English professor refused to allow papers that were printed on a dot matrix printer. She accepted only typed papers or those printed on printers with true descenders. I made a little pocket money typing students' papers on my approved printer. A dollar a page wasn't much, but back then I could type about ten pages an hour. Corrections and reprints were almost free money for me.

      People said that using spell check on homework should not be allowed as it didn't teach students how to spell from memory and to proofread their own work. OK, I might agree with that observation slightly. I've been corrected three times in this post alone, but AI will soon be viewed in the same way. It seems to me to be an inevitable progression from spell check, to Wikipedia citations, to AI suggesting content. You still have to review the suggested content to make sure the output is factual and expressing what you are trying to communicate.

      • by evanh ( 627108 )

        Calculators work exactly as advertised. Chatbots are crap at everything.

        • by gweihir ( 88907 )

          Well, chatbots are pretty good at misleading people and selling their hallucinations. Hmm. Ideal used-car salesperson or politician?

      • People are always nervous when technology changes.

        Some people are. I was in the last class in school to learn on a slide rule. Next year we switched to calculators. I even keep a slide rule in the garage and use it.

        But I can find people who still don't believe in electronic calculators 50 plus years after they were introduced.

        To me, the AI is just jumbling words together in a manner that sounds impressive to some. And also to me, it is not intelligence. It's the sort of thing that IMO, is very impressive and smart sounding to the same people who get taken in by grifters and pretend populists.

        • by gweihir ( 88907 )

          People are always nervous when technology changes.

          Some people are. I was in the last class in school to learn on a slide rule. Next year we switched to calculators. I even keep a slide rule in the garage and use it.

          Slide rules are awesome for teaching! Best way ever to make people really understand addition, multiplication and logarithms/exponentiation. They should have kept them until past logarithms and exponentiation and only then go to calculators.
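          A quick sketch of that connection (the numbers here are arbitrary examples): multiplying two values is the same as adding their base-10 logarithms, which is what physically sliding the scales does.

          import math

          a, b = 3.7, 21.0
          log_sum = math.log10(a) + math.log10(b)   # "slide" the C scale along the D scale
          product = 10 ** log_sum                   # read the product back off the D scale
          print(f"{a} x {b} = {a * b}; via logs: 10^{log_sum:.4f} = {product:.4f}")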

          But I can find people who still don't believe in electronic calculators 50 plus years after they were introduced.

          To me, the AI is just jumbling words together in a manner that sounds impressive to some. And also to me, it is not intelligence. It's the sort of thing that IMO, is very impressive and smart sounding to the same people who get taken in by grifters and pretend populists.

          Pretty good comparison. And no, what we have in AI today is not A(G)I. Not even a glimmer of actual intelligence in there. Still just pushing 0 and 1 with no understanding and insight whatsoever. I say that as somebody that actually expected AGI to eventually become a reality in my lifetime when I studied CS 35 years back. Now I am a PhD-level engineer and despite following the AI field for all that time, there is nothing there that I can see that would justify the name. The term "AI" is basically a Big Lie and always has been (https://en.wikipedia.org/wiki/Big_lie).

          • Some people are. I was in the last class in school to learn on a slide rule. Next year we switched to calculators. I even keep a slide rule in the garage and use it.

            Slide rules are awesome for teaching! Best way ever to make people really understand addition, multiplication and logarithms/exponentiation. They should have kept them until past logarithms and exponentiation and only then go to calculators.

            Slide rules changed me from a not very good math student to one who did most of the math in my head. Something clicked when I worked my first simple problem on one.

            Pretty good comparison. And no, what we have in AI today is not A(G)I. Not even a glimmer of actual intelligence in there. Still just pushing 0 and 1 with no understanding and insight whatsoever. I say that as somebody that actually expected AGI to eventually become a reality in my lifetime when I studied CS 35 years back. Now I am a PhD-level engineer and despite following the AI field for all that time, there is nothing there that I can see that would justify the name. The term "AI" is basically a Big Lie and always has been (https://en.wikipedia.org/wiki/Big_lie).

            AI is just the latest buzzword. A lame compilation of web scraping and the software to sortakinda put the data into plain language. And it has no idea if it is correct or bullshit

            It really does remind me of bullshit merchants - and yeah, that big lie issue.

            So much like Learn to code for everyone, (pointless)

            Indeed. Complete waste of time.

            or Women in STEM (what if they don't want to go into STEM careers?)

            I usually have 2-3 women in my 30-student-strong CS course.

  • I'm old enough (42) to maybe have been in the last group of elementary students where calculator usage was strongly discouraged. The younger guys I work with (all engineers: mechanical, electrical, and chemical) are almost incapable of doing arithmetic without a calculator due to their habitual use of one doing school work.

        This is a rather important thing in the field where you are trying to solve complex electrical problems in locations where cell phones are not allowed.

        So yes, a calculator will solve arithmetic (or nowadays may be a full CAS on your phone) problems accurately and quickly, but you are trading off understanding the arithmetic and the ability to work efficiently without the tool.

        • by gweihir ( 88907 )

          So yes, a calculator will solve arithmetic (or nowadays may be a full CAS on your phone) problems accurately and quickly, but you are trading off understanding the arithmetic and the ability to work efficiently without the tool.

          And that is the point. Any engineer should be able to use a slide-rule or a book of tables competently. The new tool does not obsolete the old one. Especially for plausibility checking, digital calculators are completely unusable.

    • by VeryFluffyBunny ( 5037285 ) on Sunday August 20, 2023 @02:51AM (#63781734)
      They're missing the point. The cold, hard truth is that students getting ChatGPT to research, plan, & synthesise academic assignments for them means that they aren't required to think sufficiently hard & in the right ways, i.e. generatively, about the subject matter. The resulting learning is even less likely to be deep, durable, or transferable than current practices. But yeah, the students will be able to turn in better quality assignments.

      The whole point of academic assignment writing is to get the students to develop their subject matter knowledge, e.g. writing on a topic helps students to identify what they do & don't know, how well they understand it, & identify any misunderstandings that they may have. The written output is a byproduct that'll never be read again. The aim is to improve the student, not the work.

      On the other hand, one way that using generative LLMs could be helpful is in generating worked examples of the researching, planning, & synthesising process for students to learn from, but since the vast majority of lecturers don't know how to teach the process of writing, I'm not holding my breath.
      • But TFA? Yeah, the "OMG!!! DON'T GET LEFT BEHIND!!!" tone and the outrageous salaries quoted for low-skilled work make for pretty poor reporting. The reality is, we'll end up being paid less, not more, & there'll be fewer jobs around when much of our work becomes automated & de-skilled.
        • The fear of getting left behind is a common trap academics fall for. When I taught in grad school, the old professors all thought it was so important to allow students to take notes on laptops. Otherwise, they would be resisting change and holding students back! They were so afraid of being seen as luddites that all their students were just dicking around on social media and cheating right in front of them.

          • Re: laptops, it's even worse than that. Even if they only use their laptops for note taking, it's less effective than handwritten notes. See: https://pubmed.ncbi.nlm.nih.go... [nih.gov]

            Ideally, students should be taught the most effective ways to take notes & how to use them for revision. It can make the difference between passing comfortably & failing.
            • by gweihir ( 88907 )

              I found that too. I always took notes on paper. Some years ago my new boss thought that was "inefficient" and I tried note-taking on a computer. Guess what, the cognitive load is much higher doing it on a computer and the quality of both the notes and the interviews I was conducting dropped. I am back to taking notes on paper. An interface optimized over several thousand years is just way better than some newfangled crap that is not even 100 years old.

              • The rationale suggested by many researchers is that typing is faster than handwriting. So fast, in fact, that most students can type what they hear verbatim, which only requires superficial processing & therefore results in poorer memory of the content (See Craik & Lockhart's Levels of Processing Model: https://en.wikipedia.org/wiki/... [wikipedia.org]).

                In contrast, handwriting is so slow that students have to come up with ways of selecting, shortening, & summarising the content in order to be able to keep up
                • by gweihir ( 88907 )

                  Probably true. I remember especially math lessons where you would have to note down the proofs and what the lecturer said and that was just infeasible. Summarizing required understanding. And that made all the difference.

                  I did find that cognitive load is a problem too, especially when taking notes with substandard MS "Office" crap. These interrupt, change fonts, require complex interactions all the time and that just does not work. Just using a text-editor is better.

                  • Another thing they've found is that performance on tests on electronic devices, e.g. desktops, laptops, & tablets, is poorer than in paper format. More specifically, they found that in reading comprehension it was the depth of understanding that suffered, i.e. being able to connect the sub-points/ideas to get the general point/idea. So while digital formats are OK for learning & memorising the constituent parts of a field, they're detrimental to developing the kinds of understanding that education typically aims for
                    • by gweihir ( 88907 )

                      I have not noticed that yet, but I have done only two online tests so far and all others on paper. But interesting point, I will have a look when I grade the next online one whether I see some noticeable differences.

        • "OMG!!! DON'T GET LEFT BEHIND!!!" It's becoming obvious that this is another flavor of "Won't someone pleeeze think of the children!", with the same mindlessness and love of simpleminded slogans and soundbites and quick instant goodfeels behind it. So what happen's if little Chuckums does "fall behind" and get further and further distanced from the moving bandwagon of "left behind" bleaters?
      • by rea1l1 ( 903073 )

        There is no doubt that a lot of students will abuse this technology resulting in ultimately learning less.

        There is also a certain segment that will learn better due to better explanations from the AI, especially with college students studying under research professors who don't speak the language they are teaching in and couldn't care less how well their students learn the material.

        The solution is extremely simple: grade nothing but in-person tests.

    • So their students can partake in the latest mindless hype. Well, at least the smart ones will eventually see how incapable chat-AI actually is.

      Ain't that the truth. How's that "Women in STEM" and "everyone learns to code" thing working out for those trendy folks?

      It might have succeeded if they had used blockchain, amirite? 8^/

  • AI is a fad, ten years from now .. and let me tell you right now you can come back to this comment .. bookmark it. Nobody will be using AI any more than 3D television. You will never be able to use AI, in any form, in your real job. You'd get fired within a day, it's as simple as that. AI is a gimmick, a side show at the county fair. The human brain can do any real world task AI can do, except better.

    This comment generated by AI.

    • I bought my 3d tv with FTX coins, decorated my place with tulips and invited my bored ape pals over to watch my 3d tv with me.

    • AI is a fad, ten years from now ..

      My dad still thinks the Internet is a fad. Any day now, people will come to their senses.

      You will never be able to use AI, in any form, in your real job.

      I already use AI, every day, in my real job.

      You'd get fired within a day

      I get paid to solve problems, not to "do my own work".

      If I can solve problems faster by stealing code, cut-and-pasting from Stackoverflow, or using ChatGPT, my boss is happy.

      • by MeNeXT ( 200840 )

        If I can solve problems faster by stealing code, cut-and-pasting from Stackoverflow, or using ChatGPT, my boss is happy.

        Until the day you are called out on copyright infringement.

      • by Alumoi ( 1321661 )

        I already use AI, every day, in my real job.

        Spam farm or customer support, right?

    • AI is already reducing workloads to the point where one LLM-supported writer can now do the work of 5-10 in the same time. You think PR & marketing agencies, who churn out copy in huge volumes, care whether a human sub-contractor or a machine generated it?

      I can already see a whole range of tasks that generative LLMs can substantially reduce the workload of.

      In other areas, I've read about plans to replace film & TV extras with AI generated characters, thereby substantially reducin
      • You think PR & marketing agencies, who churn out copy in huge volumes, care whether a human sub-contractor or a machine generated it?

        That's going to depend on whether they want to own the results. If someone can prove that ChatGPT created a marketing firm's successful ad campaign, then that firm will have no legal recourse when its competitors use the ads for their own campaigns.

        • What, repeat their ads again? That argument doesn't make any sense. Ad agencies steal everything from everyone & change it a little bit to avoid lawsuits.
  • by tiqui ( 1024021 ) on Saturday August 19, 2023 @11:02PM (#63781522)

    "educators" will teach students "cutting edge" stuff.... that will be obsolete long before they enter the workplace.

    GOOD teachers/professors will teach their students how to read, write, and REASON. Well-educated, self-motivated, LITERATE people can teach themselves (or easily find others to teach them) any newfangled whiz-bangy stuff they need to learn years later when they actually need to learn it and use it and it's the current new thing. THIS used to be called "education".

  • I can expect some against-anything-remotely-modern sap telling the public school class "The Bible is God's ChatGPT!"
  • asking the same thing in various ways can get dramatically different results

    They're teaching the students the modern equivalent of casting magic spells, basically. And I'm sure the 'magic' won't change at all between now and when the students graduate. We have reached the end of creation, after all. This is the way things will alllllllways be.

  • by Opportunist ( 166417 ) on Sunday August 20, 2023 @05:01AM (#63781874)

    We used to teach to the test, i.e. look at the test, then teach the kids these questions, learn them by heart, regurgitate the results and have good results because they basically knew what was expected from them for this test. They still knew jack shit, but they got good grades.

    That didn't help with essays and such because, well, it's kinda hard to learn essays by heart.

    That loophole is now being closed, but this time, they don't even have to learn the subject anymore. They pretty much now learn how to tell their ghost writer what to write.

    It's kinda like teaching kids how to google instead of teaching them how to find the answer. And with the same result, they will take whatever that thing spits out, whether true or not, and copy it.

    Because they STILL don't learn how to gauge information for veracity and how to debunk bullshit.

  • Now we'll start getting articles that say "Should everyone be an AI programmer?"

  • At least they're not teaching kids ChatCRT. Or ChatLGBTP
  • Why not just go straight to the final solution and eliminate the middleman? Just enroll clones of chatgpt, not actual students. Use the classes to train the clones. Then send them to work for IBM,
