Professors Rejecting Classroom Technology

CowboyRobot writes "The January edition of Science, Technology & Human Values published an article titled Technological Change and Professional Control in the Professoriate, which details interviews with 42 faculty members at three research-intensive universities. The research concludes that faculty have little interest in the latest IT solutions. 'I went to [a course management software workshop] and came away with the idea that the greatest thing you could do with that is put your syllabus on the Web and that's an awful lot of technology to hand the students a piece of paper at the start of the semester and say keep track of it,' said one. 'What are the gains for students by bringing IT into the class? There isn't any. You could teach all of chemistry with a whiteboard. I really don't think you need IT or anything beyond a pencil and a paper,' said another."
  • by Joe_Dragon ( 2206452 ) on Monday February 11, 2013 @03:07PM (#42862755)

    And the professors who don't want to teach give big lectures that at times come straight out of the textbook and get slept through.

  • by koan ( 80826 ) on Monday February 11, 2013 @03:08PM (#42862771)

    I wonder if there is some element of fear of job loss associated with it.

  • by brian1078 ( 230523 ) on Monday February 11, 2013 @03:11PM (#42862835) Homepage

    They only interviewed 42 faculty members for this study? That seems like too small a sample to come to any kind of conclusion.

    Faculty at the large public research university I work at have embraced the technology that has been provided to them.

  • by cab15625 ( 710956 ) on Monday February 11, 2013 @03:12PM (#42862863)
    An alternative perspective is that the research faculty want the hopeless cases to realize as soon as possible that their niche is not in the subject the professor teaches, and are teaching primarily to the better students. Why do you think med schools in North America still want students to jump through the hoop of first-year chemistry? Is it because every MD out there needs to know how to titrate? Or is it because, if you can't even learn something as trivial as titration, the med schools know that your chances of safely learning about surgery, anaesthetics, and prescription medication (including doses) are almost zero?
  • It depends (Score:2, Insightful)

    by Anonymous Coward on Monday February 11, 2013 @03:13PM (#42862869)

    Honestly, it really depends on the subject and the lesson whether or not technology is going to help. Technology for the sake of technology is money that could have been used on things that matter.

    I teach English and I'll use technology, but it's mostly technology that's a decade old and only for certain things. In fact I tend to avoid using it because I'm then at the mercy of the hardware to be functioning when I need it and I can't shuffle my lesson around if I need to.

  • by helixcode123 ( 514493 ) on Monday February 11, 2013 @03:15PM (#42862907) Homepage Journal

    I don't see technology as inhabiting much of the universe of effective teaching. A good teacher with deep subject understanding and good communication skills is always going to be better than a crappy teacher festooned with the latest IT.

  • English (Score:5, Insightful)

    by mspohr ( 589790 ) on Monday February 11, 2013 @03:16PM (#42862941)

    My wife teaches English (composition) at a local university and she used "Blackboard" for the syllabus, supplementary reading material, and communication with the students. She also put up a few short lectures (a combination of slides and voice-over narration) on a few of the important topics in her classes.
    I think this is about the limit of the possible use of technology for this type of class, where learning depends on sitting with a student and their paper and working on how to make it better. I think that technology is over-sold in education.

  • So what? (Score:5, Insightful)

    by Tridus ( 79566 ) on Monday February 11, 2013 @03:17PM (#42862951) Homepage

    I'm not sure I care. I had classes with lots of fancy tech, and classes with next to none where everything was done on paper. It made no particular difference to how good the class was, or what I got out of it.

    Occasionally there's a good reason for it (submitting 50 pages of code by printing it out really makes no sense at all), but in my experience most of the time the technology costs a lot of money and doesn't really add anything of value. If the prof actually wants to teach and knows how to do it, the class is going to be good even if he's using stone tablets. If he considers teaching to be that thing he has to do in between research projects, it's going to suck no matter how much tech you throw at it.

    They could probably get better outcomes if instead of spending the money on tech, they spent it on instructors who want to teach so the professors that don't can go do the research they actually want to do instead. Everyone is happier that way.

  • by MBGMorden ( 803437 ) on Monday February 11, 2013 @03:19PM (#42862989)

    I think you answered your own dilemma there; at my university most of the CS professors equate programming with writing out algorithms on paper.

    To some degree they're right. Computer Science isn't Software Engineering, just as Physics isn't the same as Mechanical Engineering. It's really about data structures and algorithms more than it is about software. You must learn programming languages, but mostly as a vehicle for demonstrating concepts.

    I think some of the confusion would be lessened if they called it Computational Science rather than "Computer" Science.

    That said, in the modern world I would expect some level of online presence from everything. I think a lot of the "collaborative learning environment" stuff like online discussion forums is a bit of a waste (people will just use existing communications technologies if they want to collaborate), but at a minimum putting a syllabus online isn't much work. Being able to check your grades isn't a bad idea either.

  • by CastrTroy ( 595695 ) on Monday February 11, 2013 @03:20PM (#42863015)
    Most of the CS profs aren't really programmers but true computer scientists, and really computer science has very little to do with computers, or programming. Also, most of the professors have probably been around for a long time, and know what works and what doesn't. They want you to hand in hard copies of stuff so that they don't have to deal with any excuses about how the system lost your assignment. The only problem I would really have with handing in hard copies is that nobody uses floppies anymore, which is what I used to hand in my assignments on, and USB sticks and SD cards are a little too expensive to be passing around to teachers for assignments. They really should make low-capacity SD cards really cheap so that people can use them for passing data around in cases where you might not get the SD card back.
  • by nebular ( 76369 ) on Monday February 11, 2013 @03:20PM (#42863017)

    The professors don't grasp the tech because they haven't used it themselves. They don't see how much more information they can present to students with these tools. Chemistry can be taught using only a whiteboard, but if you put some of that information in an easily accessible and dynamic format that can be used outside the classroom then you can cover so much more.

    It's not about them rejecting technology, it's about them rejecting an overhaul of their teaching methods to best use the tools at their disposal.

    The old adage is "Those who can't, teach", but I would say it's more like "Those who can't adapt, teach".

  • by Anonymous Coward on Monday February 11, 2013 @03:22PM (#42863079)

    Why? What do "today's tech/IT settings" bring to the table that is of actual benefit to the learning environment? How does a big CMS and computers help teach a university course? I'm not saying there aren't uses and benefits, but that is the question that is posed. Your summary dismissal of the university system does not remotely answer that question and in fact lends pretty heavy evidence that formal education is sorely lacking in today's tech/IT settings. It seems to me that the university system is exactly cut out for today's needs...people with little grasp of critical thinking, literature, culture, history, logic and reasoning, writing, debate. The games played in the media and in politics wouldn't work if the people demanded better. But they don't know better precisely because many people have tried to use a degree as a job training program and we've apparently let them, so long as the tuition gets paid. That's the problem.

    Technology should serve a purpose. You seem to think that purpose should serve technology.

  • by jythie ( 914043 ) on Monday February 11, 2013 @03:23PM (#42863087)
    I think it would be more accurate to say the old college system is not cut out for the needs of today's vendor commission expectations.
  • by Sarten-X ( 1102295 ) on Monday February 11, 2013 @03:24PM (#42863115) Homepage

    It's not that they aren't comfortable with computers, but rather that they know the computers' failings.

    Sure, that online testing package is nice, but it can't prevent cheating like a proctored in-person test can. Posting syllabi is nice and all, but students use that as a way to just read the book before the exam rather than attend class. Having a real-time chat for office hours is a nice shiny toy, but it's not really useful for demonstrations or sketches.

    Then, of course, to actually use any of those features, there's a time investment required to learn the specific mechanism the system uses. Your CS professors already know how to put a video online, should they choose to do it. Learning to do it through the fancy new system is just a waste of time. It's not a new capability to them like it is to professors in other departments who may not know how to set up their own content server. It's just the same old crap, with the same old problems, but now it takes longer to do it.

    Last I knew, my alma mater's CS professors each just ran their own server, configured however they liked. Some used them extensively, and some didn't.
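
    Incidentally, "running your own server" for course files can be as small as this. A minimal sketch using only the Python standard library (the directory name here is made up, and a real deployment would at least want TLS and some access control):

        # Serve a directory of course materials over HTTP on port 8000.
        import functools
        from http.server import HTTPServer, SimpleHTTPRequestHandler

        # SimpleHTTPRequestHandler serves files relative to the given directory.
        handler = functools.partial(SimpleHTTPRequestHandler, directory="course_materials")
        HTTPServer(("", 8000), handler).serve_forever()  # browse at http://localhost:8000/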

  • by gstoddart ( 321705 ) on Monday February 11, 2013 @03:25PM (#42863137) Homepage

    At my university, the CS department are, counter-intuitively, some of the most reluctant to use our online capabilities and classroom presentation tech.

    I don't find that counter-intuitive -- the longer you work with technology, the less you want to use it for the sake of using it. And there are lots of students who would simply read the syllabus and then show up for the exam thinking they've got it covered, without knowing what the professor actually taught in class.

    I'd say about half of the CS profs still want everything handed in hard-copy and don't even post their syllabi online.

    Supposedly, Donald Knuth had his secretary print out his emails.

    You would think programmers would be more comfortable with computers.

    If it helps solve the problem, sure; if it's just busy work, not so much. Sometimes technology doesn't really add anything but extra steps of little value.

    I find at work someone is always pushing us to do all of our work in some form of social media like SharePoint. It's not something that helps me get my work done (in fact it usually makes it harder); it's something that the people in charge of these systems can point to and bray about the adoption of. A discussion thread is more trouble than it's worth for most things, I find.

  • by Overzeetop ( 214511 ) on Monday February 11, 2013 @03:27PM (#42863161) Journal

    Imagine this: you have a notebook of your course content - basically an outline and examples - that you've used for years. Each year, you walk into class, grab a marker, and go to town on the whiteboard. Nobody can get ahead of you, everybody has to concentrate on what you're saying or miss the details, and you can actively let your theories blossom in front of them. By the third or fourth time you've taught the class, you spend almost no time at all preparing. Each class can get a customized window of your knowledge that suits them. If you make an error, you just say "oops," erase the last mark with your sleeve, change it on the board, and everybody fixes it with a pencil. Done.

    Now, in the name of "connectedness" and "interactivity" you are expected to produce a full picture book of your entire semester's class work and examples, all worked to the nth degree. Everybody is supposed to download them and you just point at the board as your slides go by. There's no way to correct them on the fly, and any corrections you make require everyone to update their local copy. Those that take notes have to insert the new slides and just hope the pagination doesn't change, or they have to redo the whole back half of the presentation. Everybody is working from their laptop or their tablet, so nobody is really "taking notes" - even the good equipment sucks at it - and half are off checking Facebook or playing games.

    It's no wonder profs are loath to incorporate this stuff into their lectures - more work for them, less interaction from the students. The whole idea of having a professor is getting a customized version of the class. Otherwise you could just go out and buy the (e-)text, take the exam, and skip college altogether. It's not a business presentation where nobody gives a shit and pretty slides make up for the lack of real content. It's actual learning.

    College professors aren't, in general, very high on my list of respected professions, but I've got to side with them in this case. There are lots of things IT can do to help out, but in the classroom the experience should be very human and very hands on. /rant

  • by dbc ( 135354 ) on Monday February 11, 2013 @03:27PM (#42863177)

    It's always a tough sell to get someone to buy into a major change in methodology for a marginal improvement that is not clearly demonstrable. The only way to sell any new technology is to clearly demonstrate a marked advantage to adopting it, with a demonstration that is clear and eye-opening. Thus it was ever so.

    My translation of the summary is "I made my pitch, but people keep asking me: 'Why bother?', I shouldn't have to answer that! They are so mean! WAAAHHHH"

  • by Ziggitz ( 2637281 ) on Monday February 11, 2013 @03:28PM (#42863193)

    Students are the ones who are to gain from IT in the class room, not professors. Easily accessible and detailed syllabus online? Professor already has it memorized. Easy access to slides and notes from classes? Doesn't help the professor. Online study material? Again, does nothing for the professor. Online submission of coursework? Professor might actually take longer to grade it or even have to print it out to hardcopy, or else learn to use a software solution to mark the paper. Professors aren't motivated to use it because it means changing their existing process and they see no direct benefit to themselves.

  • by conorpeterson ( 2718139 ) on Monday February 11, 2013 @03:29PM (#42863207)

    I'll say this as a cynical adjunct: the instructors who are the most integrated with CMS are the instructors who are likeliest to be replaced by a MOOC. Not to discount online learning, but since I prefer it the old-fashioned way I've changed my approach to emphasize the strengths of conventional classroom instruction. My IT needs are a lab, projector, audio system, LAN file share for course materials and submissions, and a whiteboard - anything more is likely to be more trouble than it's worth.

  • by KalvinB ( 205500 ) on Monday February 11, 2013 @03:34PM (#42863297) Homepage

    People who make a living with technology know what it's good for.

    That's why they use it more sparingly (and to greater benefit) than instructors who fully embrace a bunch of expensive junk with no actual educational value.

    Whiteboard, projector, laptop, document camera. That's my ideal set of technology for a classroom.

  • by Hozza ( 1073224 ) on Monday February 11, 2013 @03:37PM (#42863363)

    I have to say, I agree with them that the best way to teach is often writing everything by hand on a whiteboard. Why? It's the best way to create interaction. Talking over a PowerPoint presentation is only slightly better than just giving people a book to read. Working everything out by hand in the lecture lets the students see how you work through the problem, and, critically, they see you make mistakes. Spotting these mistakes and either correcting them for you, or seeing how you approach going back and correcting them, is one of the most important things for the students to learn. In their later careers it's often more important than the actual content of the lecture itself.

    So, yes, it's helpful if a course has a good website, and some simple CMS may be useful too, but it is absolutely critical that many of the lectures are still done by hand.

  • by Phillip2 ( 203612 ) on Monday February 11, 2013 @03:39PM (#42863407)

    The idea that technology necessarily improves the way we do things is the fallacy in your argument. In practice, many people avoid this technology because it is really not worth the hassle for the didactic gain that it brings.

    Want to use a whiteboard? Take a pen. Want to use an "innovative" tablet approach? Well, make sure the battery is charged, take your gear to the lecture theatre, and discover that it doesn't work in the lecture theatre you are in.

    The second point is that most "e-learning environments" are lowest common denominator. I once asked how big a file I could upload. "Pretty big," came the answer, "I think the limit is 60MB or so." Not so useful when I want to upload a 7GB ISO, or a 100MB data set. Use of these environments is largely limited to uploading your PowerPoints, because uploading your PowerPoints is all that they will do reliably.

  • by Jim Hall ( 2985 ) on Monday February 11, 2013 @03:53PM (#42863655) Homepage

    I am an IT Director / CIO for a small liberal arts university, and I've discussed this issue on my blog [umn.edu] about IT leadership in higher ed. What many of us in technology sometimes forget is that technology is fairly new to the workforce, and that includes faculty. Remember, the PC was only introduced to office desktops in the 1980s (unseen mainframes in server rooms don't count). If people enter the workforce in their 20s and retire in their 60s, that's a 40-year work generation. So computers have only been part of the workplace for less than a work generation. There are still a lot of people out there who remember doing their work without technology.

    And faculty are less likely than, say, accountants to embrace change. Accountants realized that they could use the computer to add up the numbers and create a spreadsheet to track the income & expenses. People in sales used the computer to write letters and other communication. But for faculty, their job is teaching and for that they have relied on a chalkboard (or whiteboard) for pretty much their entire careers, going back to undergrad. PowerPoint was a stretch for some faculty, but PowerPoint isn't much more than a "captured" version of their whiteboard talk, so many faculty took to PowerPoint as a means of delivering lectures.

    One of the faculty at my university often uses the phrase "Technology should be like a rock; it should be that simple to use." And there's a lot to that. Faculty want technology that is easy to use. They don't want to tinker with technology, they don't want to try the latest thing. Faculty only want technology when it supports what they need to do for instruction.

    And that's where we in IT see things differently, of course. For us, technology isn't just our job, it's often our passion. We got involved with technology as a career path (programming, desktop support, server admin, databases, etc) because we were pretty much doing that already (building web pages, building our own computers, installing our own OS, etc) and what better job than to get paid doing what you love? So campus technology folks are going to gravitate to the latest technology: the Raspberry Pi, smartboards, video capture, and the like. And we get confused when the faculty don't want to use it, as TFA mentions.

    Faculty will adopt technology when they need it to do the job of teaching. The article includes some quotes along those lines.

    "I went to [a course management software workshop] and came away with the idea that the greatest thing you could do with that is put your syllabus on the Web and that's an awful lot of technology to hand the students a piece of paper at the start of the semester and say keep track of it." What makes it easier for faculty to focus on teaching? Learning how to put a PDF on the web (or a course management tool like Moodle) when they've never done that before, or printing out a syllabus and asking the students not to lose it.

    "What are the gains for students by bringing IT into the class? There isn't any. You could teach all of chemistry with a whiteboard. I really don't think you need IT or anything beyond a pencil and a paper."

    One quote highlighted when faculty were interested in using classroom technology: "They're undergraduates - you need to attract their attention before you can teach them anything." That helps the faculty in the job of teaching students, which is the most important thing. In this case, using some technology in the classroom may help get the attention of students, which the professor says you need to do "before you can teach them anything."

    I'd also remind anyone working in campus technology to remember three important questions when trying to effect change on campus:

    1. Is it the right change to make?
    2. Are the right people behind the change?
    3. Is the campus ready for this change?
  • by starfishsystems ( 834319 ) on Monday February 11, 2013 @04:03PM (#42863837) Homepage
    I think you've captured the essential value debate right here.

    It's okay if a person's goal in life is to be the equivalent of a factory race-car driver, taking the new software around the track, putting it through its paces, competing against others to determine which strategies and deployments and use cases are the most viable. There's a place in the world for that sort of talent, just as there's a place for people who want to occupy themselves with filmmaking or graphic arts.

    But using a tool is not the same as engineering it, and engineering is not the same as science, and science is not the same as math, and math is not the same as philosophy. I'd argue that a substantial part of an undergraduate education involves developing an awareness of these distinctions. What's important are the ideas and modes of thought that support a particular discipline. So, for example, science undergrads are not exposed to number theory because it will have direct application in their careers. Number theory is a way of opening a conversation about the essential nature of abstraction.

    Now, if someone wants to come along and make a really cool documentary about number theory, with powerful animations and interviews with contemporary mathematicians and a sound track to die for, more power to them. But please, let's not confuse the vehicle with the journey.
  • by Anonymous Coward on Monday February 11, 2013 @04:13PM (#42863983)

    I'm a CS Prof and the online course support software we are supposed to use would get a very poor grade if any of my students wrote it.

    I suspect the reason many of us prefer not to use such technology is because it impedes, not enhances, the student experience. It's hard to find information, damn difficult to edit it, tricky to make the marks add up right, and shows the wrong people information they should not see (like other students' marks).

    Have you tried reading etextbooks as opposed to paper books? It is harder to find information when you do not know exactly what you want (there's no easy way to flip through the material), and they also discourage the prolonged reading necessary to develop sound understanding, as opposed to quick answers to the question at hand.

    It reminds me of the old chestnut of the Americans spending $1m to make a pen that would write in space, while the Soviets used pencils. Not all new technology is better or makes a job easier.

  • by SilverJets ( 131916 ) on Monday February 11, 2013 @04:54PM (#42864583) Homepage

    Gee whatever did the professors and students do for all those decades of university courses before the invention of computer networks?

  • Reminds me of my first programming class, many many years ago - before a lot of you were born. It was a pseudo-assembly course, with a make-believe assembly language with 13 instructions, including add, subtract, multiply and divide. 36 or 39 started the course; 13 of us took the final, and three of us thought it was a Mickey Mouse course, while the other 10 were barely treading water.

    We figured it was a weed-out for the folks who read You Can Make Big Money With Computers on the inside of a matchbook cover.

    I'd be shocked, shocked I tell you, if a lot of folks taking the first computer class weren't there because a) they confused it with playing games, or b) you can make big money with computers, and it costs less and is less yucky than medical school.

                    mark
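
    For flavor, here is a minimal sketch of what such a make-believe teaching assembly might have looked like. The course's actual 13 instructions aren't recorded above, so this little accumulator machine and its opcodes are invented purely for illustration:

        # Hypothetical toy "assembly" interpreter: one accumulator, a few ops.
        def run(program, memory):
            acc = 0
            for op, *args in program:
                if op == "LOAD":    acc = memory[args[0]]    # acc <- mem[addr]
                elif op == "ADD":   acc += memory[args[0]]
                elif op == "SUB":   acc -= memory[args[0]]
                elif op == "MUL":   acc *= memory[args[0]]
                elif op == "DIV":   acc //= memory[args[0]]  # integer division, toy-style
                elif op == "PRINT": print(acc)
                elif op == "HALT":  break
            return acc

        # (3 + 4) * 2 = 14
        run([("LOAD", 0), ("ADD", 1), ("MUL", 2), ("PRINT",), ("HALT",)], [3, 4, 2])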

  • by dj245 ( 732906 ) on Monday February 11, 2013 @05:11PM (#42864841) Homepage

    Most of the CS profs aren't really programmers but true computer scientists, and really computer science has very little to do with computers, or programming. Also, most of the professors have probably been around for a long time, and know what works and what doesn't. They want you to hand in hard copies of stuff so that they don't have to deal with any excuses about how the system lost your assignment. The only problem I would really have with handing in hard copies is that nobody uses floppies anymore, which is what I used to hand in my assignments on, and USB sticks and SD cards are a little too expensive to be passing around to teachers for assignments. They really should make low-capacity SD cards really cheap so that people can use them for passing data around in cases where you might not get the SD card back.

    I think you missed the point entirely. A hard copy is a paper copy. The point of the hard copy is that you "open" it instantly. No inserting a CD and hoping that the student wrote the CD correctly, that their CD writer is compatible with your CD reader, that their media isn't garbage. No juggling a stack of flash cards or USB sticks and trying to figure out whose is whose. No having to deal with that guy who didn't cough up the money for version X of the software, while your version Y has several small but annoying compatibility bugs. No having to juggle dozens of emails with attachments for each assignment.

    Printed paper. The student's name is somewhere on the first page. You can start reading it instantly. Unless they really screwed up and used tiny or unreadable fonts, it is compatible with your eyes. Paper size is basically standard, and you can stack up all the papers and keep them together easily. Everybody can spend their time more productively doing better things.

  • by N0Man74 ( 1620447 ) on Monday February 11, 2013 @05:17PM (#42864915)

    I was going to mod you, but I couldn't find +1 Sad.

  • by UnknownSoldier ( 67820 ) on Monday February 11, 2013 @05:19PM (#42864963)

    > Posting syllabi is nice and all, but students use that as a way to just read the book before the exam rather than attend class.

    And who is paying for the class? We're not in kindergarten anymore where you need mandatory attendance for mommy and daddy.

    I've had my share of shitty teachers where it was more efficient for me to just read and do the exercises in the textbook than to waste my time listening to a prof who couldn't teach.

    The better teachers find ways to engage students by asking them questions rather than simply spewing useless facts.

  • by SillyHamster ( 538384 ) on Monday February 11, 2013 @05:26PM (#42865055)

    No juggling a stack of flash cards or USB sticks and trying to figure out whose is whose.

    ... and hoping that their anti-virus and your anti-virus was kept up to date.

  • by ceoyoyo ( 59147 ) on Monday February 11, 2013 @05:48PM (#42865367)

    Actually, many undergrads act like they're in kindergarten, and mommy and daddy and/or the taxpayers who are paying appreciate a little bit of nannying. Most professors hate it.

    If you actually don't need to go to class, then don't. I didn't, for certain classes. But don't expect the professor to go out of his way to help you. You've got the textbook, Google and the whole Internet. If you're so good why do you need the professor's notes and lectures online and packed up nicely for you?

    Yes, better teachers find ways to engage students... in class.

  • by Sarten-X ( 1102295 ) on Monday February 11, 2013 @06:00PM (#42865577) Homepage

    Better still are the teachers whose questions are spurred by the students' classroom experiences, who reinforce the knowledge while simultaneously encouraging curiosity, but that enriching experience will be lost on the students who decide in the first two sessions that participation isn't worth their time.

    You the student aren't paying the professor to teach the class. You're paying the university for the privilege of learning from the class that they're paying for. It's not really the professor's problem whether you get your money's worth or not, but it is his problem to determine whether you've adequately learned the material or not. Sure, you might be able to answer some exam questions to cover university-mandated bullet points, but the exam can't really cover all the details of the course material.

    The great lie of education is that the diploma means you know something. Rather, it just means you've demonstrated to a group of experts in a particular field that you should also be considered an expert in that field to a particular degree of mastery. Of course, each of those experts may set their own requirements for proof, within the limits upon which the group as a whole has agreed.

    This is not to say there aren't shitty teachers out there, or even ones whose teaching style doesn't work for some particular student. That's no excuse for missing material out of one's own arrogance. The student who skips class isn't entitled to credit if they hate their professor, any more than an employee who doesn't show up at work is entitled to a salary if they hate their boss.

  • Re:English (Score:4, Insightful)

    by nbauman ( 624611 ) on Monday February 11, 2013 @06:17PM (#42865833) Homepage Journal

    Not sure about English composition, but there are other subjects that can benefit from technology: visualisation, learning with feedback outside the classroom, gamification... and other than just improving learning effectiveness, could you think of a way where technology could help a teacher effectively teach a class of 1000 rather than 30 or so? Or reduce the cost of learning so you can justify the expense for a far larger group? I can... and I am not the only one. We're not there yet, though.

    Teach a class of 1000 rather than 30? In a class of 30, a teacher can get to know every student by the end of the year. Students get to know each other. A class of 1000 is an assembly line. It's a mob. What's your measure of success? Students per dollar?

    I took a class in modern poetry, and I still remember a guy who was a car designer, who was taking classes in his retirement. He would tell us obscure things about poems by Wallace Stevens and Ezra Pound that were in the news when the poems were written. In my freshman humanities course, one guy was an atheist. One guy was a Jesuit-educated Catholic. There were Marxists and army veterans. After a while you could get to know how these people approached the world.

    I also took lecture classes of 300 in physics. The teacher basically read his notes. He answered questions, but it wasn't the same.

    Humans evolved in the last 100,000 years or whatever to deal with each other in family-sized groups of about 6 to 30. You can't have the same kind of communications and interactions in groups much larger than that.

  • by Sentrion ( 964745 ) on Monday February 11, 2013 @07:16PM (#42866515)

    Who in academia is going to understand better than professors that skills are usually made redundant by technological advance? Education is in high demand and salaries for professors have never been better. Why jeopardize that by replacing themselves with technology? After all, they know all too well that if they did it right, a small boardroom of top-tier professors could teach a whole nation with the right technology and eliminate the need for tens of thousands of workers drawing upper-middle-class salaries. It sure would be great for the few on top, but the majority probably know that it wouldn't be them.

    Managers may buy machines that replace workers, but they won't invest in computers to replace management. The same is true for physicians, lawyers, Professional Engineers, politicians, salesmen, accountants, real estate agents, stock brokers, and licensed tradesmen (plumbers, electricians, etc.). Only the most protected, organized, and proactively defended professions and trades will be able to withstand the dual effects of modernization (automation and information technology) and globalization (chasing cheap labor to the four corners of the earth). To succeed, these professionals MUST convince their clientele, customers, bosses, managers, government officials, and the public at large that their job cannot be replaced by inhuman technology. The decision makers must be made to believe that their job requires a human touch, face time, or "intuition". Workers who wish to maintain their middle-class status and lifestyle need to establish strong and politically connected labor unions or trade organizations and/or pay for legislation that requires that their particular job can only be performed by a licensed individual with a specified level of education and experience. Those workers who do not will become, or already have become, just another redundant commodity in a global labor pool of struggling masses. Relying primarily on years of experience and above-average intellect to do a job essential to human civilization will not be enough to ensure viability.

    Going forward, aspiring professionals who wish to rise above the masses will need to be businessmen and think like a shrewd Fortune 500 executive. Insist that as the top [insert title] at your organization it is essential that you be given a seat on the board of directors and granted an officer position in the company. When times are good, demand a major portion of the windfall profits plus a portion now of the anticipated future earnings. When times are lean, demand pay increases and "retention bonuses" to motivate you to stick it out with the firm. If the company is failing and gets bailed out by the government, demand a bonus to compensate you for your successful lobbying efforts. If the budget for your project increases, demand that your compensation increase proportionally. Whenever anybody asks you to settle for a lesser role, title, or compensation, stomp your feet, slam the table, and insist loudly with a serious facial expression that the company will fail without your unique genius and expertise in your field.

    Always maintain a personal website for your part-time consulting "side business" and list as many outrageous claims as possible that cannot be verified or substantiated. Publish a "list price" for turn-key solutions that is several times higher than you are actually paid to do similar projects for your current employer. Publish an hourly consulting rate that is five to ten times your equivalent hourly wage based on your salary at 40 hours per week. And every quarter, increase your prices at a rate that matches the rising cost of healthcare, education, or energy, whichever is higher at the moment. When you go to your employer, customers, or shareholders and they balk at the increase in compensation you are demanding to do your day job, just remind them of how much they are saving compared to what you earn consulting on the side. Show them your website and remind them periodically of how well your side gig is doing.

  • by 24-bit Voxel ( 672674 ) on Monday February 11, 2013 @08:01PM (#42866853) Journal

    You make a good point, but as a college instructor myself for about 10 years now I know the real reason.

    It's extra work. End of story. Nobody wants to do extra work for nothing.

  • by starfishsystems ( 834319 ) on Monday February 11, 2013 @08:46PM (#42867161) Homepage
    That's fine. Your points are all valid, but they don't address the underlying issue of resource scarcity. (Nor do mine, directly.) We face the perennial challenge of how to divide our scarce resources between explorations in breadth and in depth.

    Classically, we've found depth to be the critical dimension. This goes back to Plato and beyond, though reexamined by Kant, Fichte and Hegel. If you neglect to understand a subject in depth, you may well fail to capture some of its essential properties. Any interdisciplinary synthesis made on that basis will then be flawed. Therefore synthesis is a final step in applying knowledge, not a preliminary one.

    For example, Ph.D studies - not just in the sciences but in all fields - are explicitly framed as exercises in depth. Thesis supervisors routinely have to point this out to grad students, in order to redirect their very natural tendency to go off exploring in all directions. I went through this stage myself. Probably everybody does.

    Sure, specialization creates arbitrary barriers between disciplines. So does modularity create arbitrary barriers between components. So does all individuation of subject from object, agency from action, et cetera. All dualistic thinking has this particular shortcoming. We accept that because we gain a powerful analytical tool in return. And sometimes we forget that we have made such a choice.

    The converse, I have to point out, is not "hard-earned wisdom" but the default way that people function when they impose no particular discipline on their studies. We don't need universities to teach that. It comes for free, as part of the human condition.

    And so we come around again to the question of classroom technology. It's easy to indulge in endless fiddling with bits and pieces of technology in the name of education. Occasionally, such fiddling may produce a valuable new synthesis of ideas. Stephen Wolfram sincerely believes this about MathWorld - that there's no telling what might happen if you facilitate mathematical exploration and let human nature take its course. I have nothing against it, only against claims that undisciplined exploration is the best way or the only way to conduct a search for knowledge.

    Look, everyone wants to be in on the synthesis part. But anyone can dabble in multiple subjects. What makes you think you're qualified to make any real contribution if you have no depth of experience?
  • by rtb61 ( 674572 ) on Tuesday February 12, 2013 @02:17AM (#42868795) Homepage

    Chemistry is also where computers can be used most effectively. Simulations are the best use of computers in education: tweaking the formulas that a simulation is built on, and breaking complicated outputs down into their component algorithms to see how the outcomes are affected by altering those algorithms. It's a newer style of teaching that simply could not be done with pen and paper. Students can work with large, very complex models and learn more, because they have ready access to each part of the puzzle that creates the whole; whichever part or group of parts you look at, you analyse it and learn it while grasping it more effectively as part of the whole. That makes for better fundamental learning.
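
    Even a toy example makes the point. Below is a minimal sketch (Python, with made-up parameter values) of the kind of simulation described above: first-order kinetics d[A]/dt = -k[A], integrated step by step, where a student can tweak the rate constant or the time step and immediately see how the output changes:

        # Euler integration of first-order decay: d[A]/dt = -k[A].
        def simulate_first_order(k=0.5, a0=1.0, dt=0.01, t_end=10.0):
            times, concs = [0.0], [a0]
            a, t = a0, 0.0
            while t < t_end:
                a += -k * a * dt      # the rate law, exposed for students to modify
                t += dt
                times.append(t)
                concs.append(a)
            return times, concs

        # Halving k roughly doubles the half-life (ln 2 / k) -- easy to verify:
        for k in (0.25, 0.5, 1.0):
            _, concs = simulate_first_order(k=k)
            print(f"k={k}: [A] remaining after 10 s = {concs[-1]:.4f}")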
