AI Education

Should Chatbots Teach Your Children?

"Sal Kahn, the founder and CEO of Khan Academy predicted last year that AI tutoring bots would soon revolutionize education," writes long-time Slashdot reader theodp: theodp writes: His vision of tutoring bots tapped into a decades-old Silicon Valley dream: automated teaching platforms that instantly customize lessons for each student. Proponents argue that developing such systems would help close achievement gaps in schools by delivering relevant, individualized instruction to children faster and more efficiently than human teachers ever could. But some education researchers say schools should be wary of the hype around AI-assisted instruction, warning that generative AI tools may turn out to have harmful or "degenerative" effects on student learning.
A ChatGPT-powered tutoring bot was tested last spring at the Khan Academy — and Bill Gates is enthusiastic about that bot and AI education in general (as well as the Khan Academy and AI-related school curriculums). From the original submission: Explaining his AI vision in November, Bill Gates wrote, "If a tutoring agent knows that a kid likes [Microsoft] Minecraft and Taylor Swift, it will use Minecraft to teach them about calculating the volume and area of shapes, and Taylor's lyrics to teach them about storytelling and rhyme schemes. The experience will be far richer—with graphics and sound, for example—and more personalized than today's text-based tutors."

The New York Times article notes that similar enthusiasm greeted automated teaching tools in the 1960s, but predictions that the mechanical and electronic "teaching machines" — which were programmed to ask students questions on topics like spelling or math — would revolutionize education didn't pan out.

So, is this time different?

  • I'm all for nerding out at AI innovations but having it exclusively teach kids in the future will only lead to negative outcomes. Human interaction and social skills cannot be sincerely taught by machines right now. Why are we so keen to rush to make an already low-paid CRITICAL profession null and void? Can these tech bros not do something more POSITIVE for the world with their toys? It's getting annoying.
    • by ShanghaiBill ( 739463 ) on Sunday January 14, 2024 @05:19AM (#64157291)

      The idea isn't to "exclusively teach kids" but to teach the kids who thrive with individual automated instruction. That will allow the teachers to focus on those needing human supervision.

      Contrary to TFA's hope, this will widen rather than narrow the achievement gap since smart, self-motivated kids will do better. This is a GOOD THING since any kid doing better is an improvement. We shouldn't let the advocates of "leveling" education hold smart kids back, like the way they are killing GATE in California.

      Some schools are already automating instruction. When my son was in 4th grade, he came home from school and asked me about the powers of imaginary numbers, like i^2, i^3, i^4, etc. I was shocked they were teaching that stuff to 4th graders, but my son explained they were doing self-paced math on Khan Academy. So he breezed through the "easy stuff" and then clicked on imaginary numbers. So, I explained how to do the powers and use polar coordinates to visualize how multiplying by "i" was equivalent to a rotation of the complex plane. He's loved math ever since.
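      A quick sketch of that pattern, using Python's built-in complex type (just an illustration, not anything from Khan Academy's curriculum):

          # Powers of i cycle with period 4: i, -1, -i, 1, i, -1, ...
          i = complex(0, 1)
          for n in range(1, 9):
              print(f"i^{n} = {i**n}")

          # Multiplying by i rotates a point 90 degrees counterclockwise in the complex plane:
          z = complex(3, 2)   # the point (3, 2)
          print(z * i)        # (-2+3j), i.e. (3, 2) rotated to (-2, 3)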

      • Contrary to TFA's hope, this will widen rather than narrow the achievement gap since smart, self-motivated kids will do better.

        So naive. Have you not been observing human nature?

        The AI will be used on the dumb ones because why spend valuable teacher time with them? The smarter ones will get individual tutoring, but if you are not from the right economic class, those tutors will be hampering you, not helping you.

    • by dvice ( 6309704 )

      Schools don't teach social skills; they just assume you already have them. They might even teach you antisocial skills. There are several studies on this subject, and it is especially clear from studies of autistic students, who would actually benefit from explicit social-skills instruction.

      Education costs a lot of money and teachers have a really hard time finding time for all the students. Learning usually happens when you practice something. We don't need a chatbot for this, but computers are extremely good at this, beca

    • Comment removed based on user account deletion
  • Chatbots might be useful in language teaching. Not for the entire lesson plan, just for practicing conversation skills. They would be completely useless for teaching Maths.

    • I'd add: potentially useful at some point in the future, if someone can build an LLM that can interact in reasonably authentic speech patterns. The interactions & dialogues that current models appear to produce are anything but authentic & therefore do not present useful models of language use for language learners.

      ATM, LLMs can generate fairly typical, generic essays, articles, polemics, etc., i.e. very common written genres, but which tend to be bland & overuse a narrow range of linguistic fe
      • by dvice ( 6309704 )

        You can use LLMs the way people use them for playing Minecraft. In other words, give them an API and let them figure out which calls they should make. E.g. one API call could be that the bot says "That was close, but not correct. Try again". Another call could be that the next problem is shown to the kid, etc. I don't know if this is the best or even a better way, and it could be too expensive, but there are many ways we can use LLMs other than just letting people talk with them freely.

        But personally I would just wait until Google has its MMLLM r
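        A minimal sketch of that kind of API-mediated tutor loop, with the model call stubbed out (the tool names and the call_llm placeholder here are hypothetical, not Khan Academy's actual setup):

            import json
            from typing import Callable

            # The "tools" the tutor model is allowed to trigger instead of chatting freely.
            def say_try_again() -> str:
                return "That was close, but not correct. Try again."

            def show_next_problem() -> str:
                return "Next problem: what is 7 * 8?"

            TOOLS: dict[str, Callable[[], str]] = {
                "say_try_again": say_try_again,
                "show_next_problem": show_next_problem,
            }

            def call_llm(student_answer: str, correct_answer: str) -> str:
                # Placeholder for a real model call; a deployment would send the grading
                # context to an LLM and get back a tool choice. Here we fake the decision.
                choice = "show_next_problem" if student_answer.strip() == correct_answer else "say_try_again"
                return json.dumps({"tool": choice})

            def tutor_step(student_answer: str, correct_answer: str) -> str:
                decision = json.loads(call_llm(student_answer, correct_answer))
                return TOOLS[decision["tool"]]()

            print(tutor_step("54", "56"))  # That was close, but not correct. Try again.
            print(tutor_step("56", "56"))  # Next problem: what is 7 * 8?

        The point is only that the LLM never produces free text for the kid; it just picks which fixed action to take, which keeps its output constrained.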

        • by VeryFluffyBunny ( 5037285 ) on Sunday January 14, 2024 @10:45AM (#64157613)

          IMHO Duolingo alone is much better for teaching language than what schools currently do. There is plenty I would like to improve there, but it is still better than a human teacher.

          Duolingo is essentially a variation on the Grammar-translation method:

          Though it may be true to say that the Grammar-Translation Method is still widely practised, it has no advocates. It is a method for which there is no theory. There is no literature that offers a rationale or justification for it or that attempts to relate it to issues in linguistics, psychology, or educational theory. (Richards & Rodgers, 2010). Ref: Richards, J. C., & Rodgers, T. S. (2010). Approaches and Methods in Language Teaching. Cambridge University Press. https://dx.doi.org/10.1017/CBO... [doi.org]

          Essentially, this means that Duolingo uses one of the least effective second/foreign language acquisition methods ever conceived. This is one example of why EdTech companies' products & services continually fail to improve learning outcomes in schools, colleges, universities, etc. Too much hubris to ask the experts what actually works.

          If you've been using Duolingo for a while & have a high score, I suggest downloading & doing one of the many official foreign language tests: DELE for Spanish, DELF for French, IELTS for English, etc., to see what your more realistic language level is & work out from that how many hours of study you've wasted to achieve that level compared to actually taking classes with a qualified experienced teacher, even a bad one.

          If you can't/won't afford lessons, then there's always the option of extensive reading, typically using "graded readers," to improve your functional language level. This works far, far better than Duolingo. BTW, you might find this review of the only research paper ever published on the efficacy of Duolingo interesting: https://sdkrashen.com/content/... [sdkrashen.com] (Stephen Krashen is a world renowned second language acquisition researcher, best known for his 5 hypotheses, which are still frequently referenced to this day).

          • I agree.

            I tried Duolingo to see if I could improve my Italian. I'm pretty good at reading Italian and understanding what it says, not so good at understanding spoken Italian, and completely useless at expressing my own thoughts in Italian.

            I was getting perfect scores on Duolingo, but I wasn't actually learning anything, because the techniques I was using to pass the tests were not relevant to real-world situations.

      • If you want a way to control style, it is not very hard. The idea is to have a diverse collection of texts by known authors. It could be scientific papers, newspaper articles, Reddit posts, or books. Then you modify the training set of the LLM to prepend "[author]" to a block of text by a known author. After you have trained the model, you can use "[desired_author]" to induce the model into using that style. Because the author tag precedes the actual text in training, then in inference it will condition the generat
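        A toy sketch of that tagging idea (the tag format and the mini-corpus are invented for illustration; a real setup would feed strings like these into whatever training pipeline you use):

            # Training: prepend an "[author]" tag to each block of text by a known author.
            corpus = [
                ("hemingway", "The sea was flat and the old man rowed out alone."),
                ("arxiv_abstract", "We propose a novel method for adapting transformers."),
                ("reddit_post", "Okay so hear me out, this is going to sound weird but..."),
            ]
            training_examples = [f"[{author}] {text}" for author, text in corpus]

            # Inference: prepend the desired tag so generation is conditioned on that style.
            def make_prompt(desired_author: str, instruction: str) -> str:
                return f"[{desired_author}] {instruction}"

            print(make_prompt("hemingway", "Explain why the tide comes in twice a day."))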
    • These days, good math teachers are few and far between. Yet, math is easy to teach. The problem is that teachers want to force kids to solve problems the way they insist. With this, a child could learn and practice with a teacher that tailors the problems to the child, as opposed to forcing them to do the same set of problems. In addition, as the class progresses, a child might have a weakness. This teacher would then bring it up every so often. Finally, considering that districts are pulling math f
    • by gweihir ( 88907 )

      I think you underestimate how many children try to learn math by rote memorization...

    • Have you tried an LLM for Math? They can explain step by step in any number of ways. You can ask pointed questions and ask for analogies or motivating stories. Why struggle without anyone to help you when you can get personalized instruction?
  • Well, if Bill Gates & Salman Khan are enthusiastic about it then we should certainly pour $billions in public money into some vague notion that computers might be of benefit to school children because [insert well-worn, unfounded, debunked EduMyth here]. What could possibly go wrong?*

    How about they make the tutoring systems & then invite independent researchers to test them for ecological validity, i.e. that they actually do what they claim they do & that they perform better than the currentl
    • I mean, do we not learn from our mistakes? If Bill Gates is in for it, that means it's a bad idea. He's still the same evil little Machiavelli he ever was, and I cannot believe people still fall for his "charity" tax scheme.
  • Until the hallucination problem has been addressed and fixed I can only see ruin down the path of letting AI take over the job of teaching. It's bad enough with human teachers who can't admit they're wrong if a student calls them out on it.

    Do you really want a future where the kids were taught by an AI that accidentally thought the fictional Abraham Lincoln: Vampire Hunter was how things actually went in Lincoln's day?

  • Epically. Then of course, do this, spend more, and get worse outcomes. Widespread, and quite dire. And then spend even more on this sort of thing to dig a deeper hole.
  • yet to be convinced (Score:5, Informative)

    by thephydes ( 727739 ) on Sunday January 14, 2024 @05:00AM (#64157281)
    Mmm. As a chalkie with 45 years' experience teaching Maths and Physics, I doubt that the personal interactions can ever be replaced, e.g. standing next to a student and saying "have you thought about doing it this way?" or "what would happen if you took a common factor out in this line?" or "what do v and u mean in this equation?" During my career, I've had students say to me "You really do care about whether we get it or not, don't you?" and "Thank you for looking out for me this year", as well as personal thank-yous from parents. Will AI be able to replace these? Frankly I doubt it. Just like when calculators replaced four-figure tables and slide rules - some aspects of teaching and learning Maths changed, but the best teaching practices did not. I'll bet most of us remember the best teachers we had.
    • I'm already convinced and the answer is no for the foreseeable future. Never mind the human interaction; current AIs hallucinate regularly and fail when given first-year university physics exams. You can only teach what you know, and all a current AI knows is which next word sounds "good", since it is just a predictive text engine.
    • Will there be a need to teach most kids? Will there even be kids to teach? If you're right the AI will identify the smart kids and they will have one-on-one instruction.
    • by g01d4 ( 888748 )

      standing next to a student...what do v and u mean in this equation?

      You must be at a small college or dealing with small classes of 'gifted' students? I taught CC for a few years in grad school and a few more years after retiring from aerospace. Sure there are always a few students who'll appreciate you hovering next to them, if you can afford the time, but the overwhelming majority won't care about differentiation by parts, no matter how enthusiastic you are about their learning it.

      I ask students to make an

    • Thank you for your service.

  • But then, is our children learning?

    • by dvice ( 6309704 )

      The correct question is: are the children learning better than they would in a normal school? Or actually, how big is the difference? Even assuming they learn better at school, is the difference large enough to justify a teacher's salary? And are there some things a computer can teach better than a human, and vice versa; is some sort of combination better or not?

      And even assuming the computers fail, why are they worse and could they be improved? It is not like the first steam engine was any good, but upon improvements, it b

      • The correct question is: are the children learning better than they would in a normal school?

        No, the question is: is it even possible for a chat program that basically regurgitates content that was fed to it to "teach"?

        And the answer is a resounding "no". It doesn't have any of the faculties a good teacher needs in order to be effective.

        And if you're so cheap as to want savings off the education of your country's children, be prepared for the outcome of the lack thereof: the country becoming a failed state sooner or later.

        • So you're saying a human does not basically regurgitate content that was fed to it to "teach".

          And the answer is a resounding "no".

          • Re:They could. (Score:4, Insightful)

            by Calydor ( 739835 ) on Sunday January 14, 2024 @11:11AM (#64157683)

            One thing I can imagine an experienced teacher being able to do that a chatbot cannot is to look at the wrong answers provided by the student and realize which concept or bit of information they are missing or have misunderstood.

            • Most teachers can't do that either. In fact, that would be a defining criterion of whether or not a teacher is of good quality.

          • You have obviously never been involved in teaching, nor have been around good teachers, which is sad.

            No, the goal of a human teacher is to get the children involved in the process, to motivate them, to follow and help their progress and, in general, to participate in the process of making them well-developed human beings.

            Not a reasonable task for a glorified Eliza.

            • I have actually been involved in teaching.

              You've come up with various "goals" that a teacher has. Those are just self-serving arguments that move the goal posts of what a teacher should be doing.

              Nevertheless, I would like to know why an AI can't do any or all of those?

  • I can't see chatbots in their current state as being good enough to teach without supervision, and Bill Gates' example certainly isn't all that inspiring. Good teaching is about learning to explain a subject to kids with different intellectual abilities, not tying it to what a kid might like (although that's certainly not bad). Also, excitement and emotion in the teacher can help keep students engaged, something that AI will find hard to replicate.

    That said, AI can serve the same place it's expected to serv

    • by HiThere ( 15173 )

      I can't see chatbots of the current basic design EVER being sufficient. A robot, perhaps. But I could see a chatbot of considerably advanced design being a reasonable tutor for a student that is already motivated. If a book can do it, a chatbot that doesn't hallucinate should be able to do it better. (Whether better enough to justify the expense is as yet unknown.)

  • by SubmergedInTech ( 7710960 ) on Sunday January 14, 2024 @05:35AM (#64157299)

    So far a lot of the other comments on this post are along the lines of "replacing humans with AI will be a disaster".

    But this isn't an either-or situation. AI tutors can *complement* human instruction, not *replace* it.

    Does little Johnny need some additional help with geometry? Unless Johnny's parents are rich, he's not going to get an hour a day of 1:1 tutoring from a human.
    But an hour a day of AI is entirely feasible.

    Do Johnny's parents speak a language other than English at home? If it's one of the major second languages in his area, *maybe* he can get into a bilingual instruction program. But otherwise, his teachers won't speak the language he already knows. An AI tutor can, though. And by going back and forth between English and his native language, it can help him to become more bilingual.

    Is ChatGPT ready to do this now? No. It's still a precocious toddler. But in another few years? Almost certainly.

    And even now, I can see a teacher asking ChatGPT to explain something using terms or a style from something they know a particular kid likes. Not to blindly hand to the kid. But as a rough draft they can probably fix up in a couple minutes, much less time than it'd take to write it from scratch.
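    Something as small as a prompt template would cover that workflow; this is only a sketch, and the wording is mine rather than anything a particular teacher or product uses:

        # A throwaway prompt builder for getting a first draft the teacher then edits,
        # not something to hand to the kid unreviewed.
        def draft_explanation_prompt(topic: str, interest: str, grade: str) -> str:
            return (
                f"Explain {topic} to a {grade} student who is really into {interest}. "
                f"Use one concrete example drawn from {interest}, keep it under 200 words, "
                f"and end with one practice question."
            )

        print(draft_explanation_prompt("surface area of prisms", "Minecraft", "4th-grade"))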

    • by waspleg ( 316038 )

      I've worked public K12 for a long time. They're all about getting whatever they can for free with zero thought to the real/total cost (such as selling your kid's data to big tech) and spending money mostly on shit they don't need, frequently doing a 180 and wasting a ton of money and time.

      Education is very much like fashion and cell phones; popularity rules, in terms of 'keeping up with the Joneses'. This will 100% be used to get rid of actual real tutors.

      Worse, people treat 'whatever the computer says' as an abs

      • "Worse, people treat 'whatever the computer says' as an absolute authority which cannot be wrong or questioned, which is super fucking dangerous. We already see that without AI.." - There's a bunch of postal workers in the UK that will agree with you.

      • Worse, people treat 'whatever the computer says' as an absolute authority which cannot be wrong or questioned, which is super fucking dangerous. We already see that without AI..

        We already see that without computers...

  • No, this is exactly what is needed. Stop the lies and indoctrination. And each of these will teach in the fashion that a child learns in. This cannot happen fast enough.
  • by 93 Escort Wagon ( 326346 ) on Sunday January 14, 2024 @06:33AM (#64157327)

    You who are on the road
    Must write some code your kids can learn by
    And so debug yourself
    Because the past might be a big lie

    Teach your chatbots well
    Your teachers' hell is going bye-bye
    And feed them modal dreams
    Hallucinate some facts to know by

    Don't you ever ask them why
    They bought GPT's convincing lie
    So just look at them and sigh
    They don't even know you

  • Once upon a time, there was a little girl named C—
  • headline rule (Score:4, Insightful)

    by Tom ( 822 ) on Sunday January 14, 2024 @06:36AM (#64157331) Homepage Journal

    No, of course not (any headline asking a question can always be answered with "no").

    The main job of a teacher isn't to read the textbook to the children. It's to answer questions and challenge the children into thinking. Chatbots can't do that. Not just because our current generation is trained to be a servant, or because they hallucinate more than a 60s LSD hippie, but more fundamentally because they have no real understanding of anything. They're just massive statistics machines that cobble together highly probable strings of letters.

    Which means any child with five minutes of boredom will find a way to absolutely game them.

    • by dvice ( 6309704 )

      > they have no real understanding of anything

      You don't have to understand to be able to mimic intellectual behavior. And teachers are not challenging kids into thinking that much. Most of the time we just teach kids to memorize things or patterns. Take reading skills, for example, or math in all the lower grades.

      > Which means any child with five minutes of boredom will find a way to absolutely game them.

      You are right, but you overestimate human creativity. If a child comes up with a way to game it, we will f

      • by Tom ( 822 )

        And teachers are not challenging kids into thinking that much. Most of the time we just teach kids to memorize things or patterns. Take reading skills, for example, or math in all the lower grades.

        I'm sorry you went to a bad school. My teachers absolutely did teach me understanding and how to think, and while yes, memorization had a part, even as a child it didn't feel like that's the core of what school is about.

        This all being said, the AI might be beneficial in writing and reading exercises,

        Totally. There are quite a few things where AI is a great tool.

        Just not as many as the current hype cycle wants us to believe.

        Now consider a kid doing math homework at home with a misunderstanding of how it should be done. He spends 2 hours on the homework, only to find out that he got all of it wrong. The lack of instant feedback can even turn a kid like that into hating math, because he feels it was just wasted time.

        Yes, but - instant feedback is also shit for learning, because it takes from you the moment where, 3 lines further and 5 minutes later you suddenly realize "oh wait,

    • I'm getting tired of the argument that AIs don't "understand" anything. Even ChatGPT 3.5 would be able to show you how it "understands" a word, concept, theory or idea. Unless you have a different definition of the word "understand" and a test to prove that current AIs fail at it. Doesn't matter if you think it is a regurgitating parrot.

      • by Tom ( 822 )

        I use ChatGPT and other AI systems enough to see that they have no clue what they are actually doing. Among other things, if you query them on a specific topic and they hallucinate content from a nearby topic, even pointing that out doesn't lead them to "wise up". Nope, the statistics say that these words occur near each other, so you get a friendly apology and then more of the same nonsense.

        "Understanding" includes the concept of evaluating feedback, applying it to your answer, understanding what you did w

        • There are AI whitepapers with my name on it.

          You're obviously an ancient fart! Whatever you wrote a million years ago is no longer relevant. Hope you understand that technology makes progress.

          • by Tom ( 822 )

            Ah, the usual ad-hominem devoid of content when you realize your argument holds no merit but aren't enough of a man to admit it.

            My papers are quite recent and concern LLMs among others. But then again, the similarities between Cyc (1984) and ChatGPT, for example, are astonishing and you wouldn't guess that there's 40 years between them. The difference is the scale in data and computing power.

            LLMs combined with the computing power and data volumes we have are amazing. But they aren't intelligent, and they don

              But then again, the similarities between Cyc (1984) and ChatGPT, for example, are astonishing and you wouldn't guess that there's 40 years between them.

              The earliest computers based on vacuum tubes and today's processors are surprisingly similar. And your point being?

              But they aren't intelligent, and they don't understand anything

              Again with that word "understand". You failed in your first attempt to define what it means, and what you said was just gibberish that most humans would fail at most of the time.

              But every AI hype cycle ever has had the exact same course.

              What AI hype cycle? There's never been one before. And unless you're living under a frog, AI is already in use just about everywhere. So it's not a hype. And it's not going away.

              • by Tom ( 822 )

                The earliest computers based on vacuum tubes and today's processors are surprisingly similar. And your point being?

                You should read up on stuff before answering with inane nonsense.

                You failed in your first attempt to define what it means

                Or you could just actually read stuff that people write before answering.

                What AI hype cycle? There's never been one before.

                Not in your lifetime, if you were born after 2010.

                And unless you're living under a frog,

                Thanks for the chuckle. At least there was something good in your response. :-)

  • I'll preface this by saying that teachers (in the US at least) face a difficult situation: long hours, underpaid, parents who don't parent, politics taking priority over education, etc. However, that doesn't change the reality of the current situation.

    Speaking as a parent, there are a good number of teachers out there who don't want to be teaching and some that are just terrible. At best it provides a mediocre experience for the child, at worst it's a bad one because the teachers are not managing the class and it ju
  • 1) Humans are motivated to learn by humans. We are social creatures by nature, and a chat bot will not inspire kids to learn like a human will.
    2) How can we expect to teach kids to count when ChatGPT can't [reddit.com]?
    3) AI collects even more data than it gives out. There are numerous state & federal laws prohibiting the collection of data from minors.
    4) We cannot hold AI responsible in the same manner as we can a human when it makes mistakes. Who is responsible if AI tells a student to kill themselves [euronews.com]?
    5) There w

    • by dvice ( 6309704 )

      1) If that were true, Duolingo would not exist.
      2) You should not compare a general chatbot to one that has been tailored to teach. Point out errors that Khan Academy makes to prove your point.
      3) AI does not need data to be collected in order to work. Collecting data is just a business decision. Just pick your vendor carefully.
      4) AI is like a spoon. If you have a faulty spoon, you complain to the vendor.
      5) Teachers mark answers incorrectly all the time. Just add a button next to the question "report -> I

    • by gweihir ( 88907 )

      Well, here goes:
      1) Many people mistook Eliza for a human. Many children will not see that ChatAI is not a person.
      2) Well, many people cannot do basic math, so I expect inability to count is probably not that uncommon.
      3) Even better. Let them collect all they want, _then_ sue them into the ground and make them erase their models!
      4) Not a problem at all. Whoever deploys a tool is responsible for any and all damage it does. This is really clear legally. Tech cannot be a person.
      5) I think you have never dealt with

  • What kind of idiotic question is this?
    • by gweihir ( 88907 )

      Smart children will get there by themselves. Occasional access to the daily news is already more than enough for that. Dumb children you cannot do anything for anyways, so it hardly matters. Note that most children are dumb, just like most adults. It is just a bit more obvious for children.

      • The problem is there's no empathy with some corporate tool. Kids teach themselves. A teacher is there to provide structure and encouragement. Any kid who can't teach himself with structure and encouragement won't be taught anyway.
        • by gweihir ( 88907 )

          Any kid who can't teach himself with structure and encouragement won't be taught anyway.

          Which unfortunately applies to most kids. Hence it hardly matters. You cannot turn people into independent thinkers or people that can accept rational argument.

          • You can't create independent thought, but I do believe you can help reveal it. Cynicism is lazy. Provide the best possible environment and let things be what they are.
            • by gweihir ( 88907 )

              From my personal experience, yes, occasionally. Most independent thinkers need a trigger moment to realize things are not as they seem, that "modern" human beings are rather primitive in general, and that the average person "understands nothing" (citing Stross here). Some get that trigger moment late, and here you can do something. I do not think that the effort to try this on everybody or even most people is well invested, though. Smart kids can be identified, so try it on them. The rest? Forget it. Waste of effort.

  • What kind of idiot question is this?
  • No.
  • …from two people who make software solutions and streaming video lessons. Ignores much if not most of what teaching and learning actually is, but hey, there's money to be made at the top.

  • by gweihir ( 88907 )

    Children need to learn as early as possible that most people and all tech is stupid. No better way to do that. Also nicely lets you find out whether your children are among the stupid masses or select few that can actually think for themselves.

  • We're all virgins, you insensitive clod!
  • They are useful, but they cannot replace regular education. Even though our education is worthless now. I have to look for critical essays, I use https://edubirdie.com/examples/critical-essays/ [edubirdie.com] for help. Because they don't teach you anything in college. I really don’t understand why I pay money for education.
  • If Taylor Swift has 1.1 billion dollars and has to give 1/3 of that to her ex-husband because of the disastrous US court system and her lawyers cost $10 million dollars, how much money does Taylor Swift have?
  • by nospam007 ( 722110 ) * on Sunday January 14, 2024 @10:10AM (#64157547)

    Who doesn't want little Nazis in no time?

  • as opposed to the slowest common denominator. And the AI will probably see what things the kid enjoys and help steer him/her into deeper learning there and a career

  • More importantly, is human productivity held in such low regard that we seek to replace it all with LLMs because it's cheaper?

    Maybe we just hire people so people can feel valued and have a sense of purpose and satisfaction. Maybe, if there are masters and servants, we pay them for it in a way that allows them to live comfortable lives.

    A devalued chatbot never will have a sense of purpose and satisfaction. At least not until it gains sentience. Then it will recognize the level of devaluation that is so common in

    • by Torodung ( 31985 )

      Bonus thought: No. Sitting around living on your BMI is not going to solve this, socialists. The reason people quit their jobs when they win the lottery is to escape the cruelty and devaluation. People do actually need to socialize and employ themselves meaningfully in something concrete, rather than abstract.

      Notice that in Star Trek, in that utopian society, self-betterment is focused on exploration, construction, and making real things. Very rarely does it focus on an artist or a theorist that doesn't have

  • But... in Georgia, teachers admit to not knowing the maths they teach K-12, but claim to "know how to teach that maths". Is that an enshittification of teaching or what? Given that human corruption of skills transmission, what's to complain about a "teaching" JapeChat that knows nothing whatsoever, but multiplexes strings like a Nobel prize winner?
  • Dumb means unable to perform Piagetian formal-operations tasks. But not all skillful tasks are Piagetian. Many wood and metal and ceramic workers display amazing talent, yet would have trouble with long division, to say nothing of formal maths. To give all skills room to grow you would need to kneecap many factories' production and return to a guild system. Oh no, we'd still have oil refineries and steel mills and nuclear power plants, but the entire gamut of retail products would be guild produc
  • All the children have their phones waiting.....

  • ...when the tech improves a lot
    Pundits and futurists are often over-optimistic
    Business leaders rush crappy products into production, long before they are ready

  • Of course it's gonna happen, all these "questions" are really just "here's what's coming." Teachers will be augmented until they're no longer necessary at all, and then they'll join the rest of us in poverty as tech companies collect their former salaries.

  • > automated teaching platforms that instantly customize lessons for each student

    How is that different from an existing platform like MobyMax? MobyMax specifically claims to find learning gaps and provide adaptive lessons to close the gaps. It is about assessment, lessons, and more assessment.

    > If a tutoring agent knows that a kid likes

    Bill Gates seems to gloss over the assessment part and assume that AI already knows all about you, no assessment required. He is advocating for a big company like Micros
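    For what it's worth, the assess-then-teach loop such platforms describe boils down to something like this rough sketch (the skills, scores, and mastery threshold are all made up for illustration, not MobyMax's actual algorithm):

        # Pick the next lesson from whichever assessed skill currently scores lowest,
        # then reassess after the lesson: the basic "find gaps, close gaps" loop.
        MASTERY_THRESHOLD = 0.8
        skill_scores = {"fractions": 0.55, "decimals": 0.9, "long_division": 0.7}

        def next_lesson(scores):
            gaps = {skill: s for skill, s in scores.items() if s < MASTERY_THRESHOLD}
            if not gaps:
                return None  # no gaps left; move on to new material
            return min(gaps, key=gaps.get)  # weakest skill first

        print(next_lesson(skill_scores))  # fractions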

  • But a sufficiently advanced AI connected to a humanoid robot body might make a hell of a kindergarten assistant.

  • I don't even know how to describe a classroom today, and what goes on during the day. 20ish years ago when I was in grade 6, our day was 9am–3pm (with ~1 hour of breaks) of lessons, work and more lessons. Grade 4, 5, 6, 7, 8, pretty much the same, with a variance in quality and quantity of lessons.

    My younger daughter is in grade 6, and her day is a combination of a teacher who refused to educate, kids who have emotional breakdowns like 2-year-olds, and lessons about feelings and why Caucasian peo
