How 'Hour of Code' Will Teach Students About Issues with AI (code.org)
Launched in 2013, "Hour of Code" is an annual tradition created by the education non-profit Code.org (which provides free coding lessons to schools). Its FAQ describes the December event for K-12 students as "a worldwide effort to celebrate computer science, starting with 1-hour coding activities," and over 100 million schoolkids have participated over the years.
This year's theme will be "Creativity With AI," and the "computer vision" lesson includes a short video (less than 7 minutes) featuring a Tesla Autopilot product manager from its computer vision team. "I build self-driving cars," they say in the video. "Any place where there can be resources used more efficiently I think is a place where technology can play a role. But of course one of the best, impactful ways of AI, I hope, is through self-driving cars." (The video then goes on to explain how lots of training data ultimately generates a statistical model, "which is just a fancy way of saying, a guessing machine.")
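The video's "guessing machine" line can be made concrete with a toy example (my own illustration, not material from the lesson; all names here are hypothetical): a model that "trains" by counting which label most often accompanied each input, and "predicts" by guessing the label it saw most often.

    from collections import Counter, defaultdict

    class GuessingMachine:
        """Toy model: training tallies labels per input; prediction guesses the majority."""

        def __init__(self):
            self.counts = defaultdict(Counter)  # input -> Counter of labels seen with it

        def train(self, examples):
            for features, label in examples:
                self.counts[features][label] += 1

        def predict(self, features):
            seen = self.counts.get(features)
            if not seen:
                return None  # never saw this input, so the machine can't guess
            return seen.most_common(1)[0][0]  # the statistically likeliest label

    machine = GuessingMachine()
    machine.train([("round", "ball"), ("round", "ball"),
                   ("round", "orange"), ("long", "bat")])
    print(machine.predict("round"))  # "ball" -- the majority guess from the training data

More (and more varied) training data sharpens the guesses, which is the point of the video's framing.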
The 7-minute video is part of a larger lesson plan (with a total estimated time of 45 minutes) in which students tackle a fun story problem: if a sports arena's scoreboard is showing digital numbers, what series of patterns would a machine-vision system have to recognize to identify each digit? (Students are asked to collaborate in groups.) And it's just one of seven 45-minute lessons, each one accompanied by a short video. (The longest video is 7 minutes and 28 seconds, and all seven videos, if watched back-to-back, would run for about 31 minutes.)
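For readers who want to see the underlying idea, the scoreboard puzzle reduces to matching which of a digit's seven segments are lit against a lookup table. Below is a minimal Python sketch of that matching step (the segment names and representation are my own; the lesson itself is an unplugged group activity):

    # A seven-segment digit lights a distinct subset of segments a-g,
    # laid out like this:
    #
    #     aaa
    #    f   b
    #     ggg
    #    e   c
    #     ddd
    #
    # "Recognizing" a digit is then just matching the lit-segment pattern
    # against a lookup table.
    SEGMENTS_TO_DIGIT = {
        frozenset("abcdef"):  0,
        frozenset("bc"):      1,
        frozenset("abged"):   2,
        frozenset("abgcd"):   3,
        frozenset("fgbc"):    4,
        frozenset("afgcd"):   5,
        frozenset("afgedc"):  6,
        frozenset("abc"):     7,
        frozenset("abcdefg"): 8,
        frozenset("abcdfg"):  9,
    }

    def read_digit(lit_segments):
        """Return the digit whose segment pattern matches, or None if no digit does."""
        return SEGMENTS_TO_DIGIT.get(frozenset(lit_segments))

    print(read_digit("bc"))       # 1
    print(read_digit("abcdefg"))  # 8
    print(read_digit("ag"))       # None -- no digit lights only these two segments

A real machine-vision system would first have to decide from pixels which segments are lit, which is where the statistical "guessing machine" from the video comes in, but the pattern-matching core is this simple.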
Not all the lessons involve actual coding, but the goal seems to be familiarizing students (starting at the 6th-grade level) with today's artificial intelligence and the issues it raises. The second-to-last lesson is titled "Algorithmic Bias" — with a video including interviews with an ethicist at OpenAI and professors focused on AI from MIT and Stanford. And the last lesson — "Our AI Code of Ethics" — challenges students to assemble documents and videos on AI-related "ethical pitfalls," and then pool their discoveries into an educational resource "for AI creators and legislators everywhere."
This year's installment is being billed as "the largest learning event in history." And it's scheduled for the week of December 4 so it coincides with "Computer Science Education Week" (a CS-education event launched in 2009 by the Association for Computing Machinery, with help from partners including Intel, Microsoft, Google, and the National Science Foundation).
Re: (Score:1)
Protesting against futility seems...futile. No one is getting convinced to code by this crap. It has about the same utility as femininehygiene.org trying to convince men to use tampons and pads. There have always been women who coded, but they are a small minority of the total, which won't change regardless of how much encouragement you offer, or how many jobs you relabel as being part of software development that just aren't.
Try asking a STEM-inclined woman of college age about coding and take note of the answer.
Re:code.org are disgusting sexists.Boycott them. (Score:4, Insightful)
Otherwise I agree.
Re: code.org are disgusting sexists.Boycott them. (Score:2)
Having recently witnessed multiple instances of "white men" losing their positions and/or jobs because they were "white men", why should "white men" support such initiatives? It certainly isn't in our interest.
Now we're getting fucked; but we aren't the ones who were disadvantaging others. Those people have theirs.
Me? I have to eat and put a roof over my head just like everyone else.
Re: (Score:2)
This was a sit-down and shut up, or you will be fired, type of training. Ultimately, I made the mistake of questioning a similar statement at a different training, and was fired.
Re: (Score:1)
Nothing is as fragile as a white male ego.
What is it about white males that makes their egos so fragile, quonset? Do white females and Indian males have less fragile egos? Please, by all means, speak up and share more of these bigoted, prejudicial tropes. Tell us how you really feel about white males.
Re: (Score:1)
code of ethics (Score:2)
hey kids, read this code of ethics, it contains all the rules that you must follow but governments and large corporations can ignore. now sit back quietly and enjoy a nice school-provided, sugar-filled snack and some pills while we work on replacing you with ai
Re: code of ethics (Score:2)
No it won't (Score:2)
self-congratulatory and self-serving group (Score:3)
Their stuff is mostly drag-and-drop (think Scratch) kind of fluff; the reason they have an 'hour of code' is that what they offer is so shallow. It's all hype and no substance. I know, because I was forced to participate in it at my school after know-nothing team 'leaders' and admins thought "Oooh, shiny!" and "marketing photo op!" -- I wouldn't see marketing all year in my CS/Robotics classes, but come 'hour of code' suddenly they show up -- wtf.
The sooner code.org goes away, the better -- they're a bunch of self-promoters. Plus, because they have a certain ideological leaning, a lot of schools feel an allegiance to them -- what schools should focus on is which curriculum WORKS, not the appearance of the curriculum.
Re: (Score:2)
Tesla? (Score:3)
Really? They're using someone from Tesla to talk/teach about AI? The company whose cars keep slamming into emergency vehicles with lights on [cbsnews.com] when autopilot is engaged? The company whose cars are phantom braking [makeuseof.com] and people who paid thousands of dollars for the software are suing? The company whose cars randomly accelerate [zdnet.com]?
This is the company you want to bring in? As much as it pains me to say it, Microsoft would be a better choice.
Re: Tesla? (Score:2)