Education

Twenty Years of Dijkstra's Cruelty 727

WatersOfOblivion writes "Twenty years ago today, Edsger Dijkstra, the greatest computer scientist to never own a computer, hand wrote and distributed 'On the Cruelty of Really Teaching Computer Science' (PDF), discussing the then-current state of Computer Science education. Twenty years later, does what he said still hold true? I know it is not the case where I went to school, but have most schools corrected course and are now being necessarily cruel to their Computer Science students?" Bonus: Dijkstra's handwriting.

  • by JoshDM ( 741866 ) on Tuesday December 02, 2008 @11:07AM (#25959567) Homepage Journal

    Dijkstra's Cruel Font link [tinyurl.com], so we at least get something recent(-ish) out of this article.

  • by Anonymous Coward on Tuesday December 02, 2008 @11:25AM (#25959839)

    I study at the University of Eindhoven, where Dijkstra was a professor. He certainly left his mark on the curriculum here, with constructing programs by proof using Hoare triples and much more.

    A lot of the staff here see him as sort of a role model, with the same proofs, handwriting and abrasive personality.

    But over the last year most of those subjects have been removed, or reworked into something more manageable. They were so disconnected from reality (proving Horner's scheme in so many ways gets old quickly).

    Dijkstra's quote "Computer Science is no more about computers than astronomy is about telescopes" is in my opinion just wrong. Astronomers need to use a telescope and understand its operation.

  • Re:Hmmm... (Score:5, Interesting)

    by DarenN ( 411219 ) on Tuesday December 02, 2008 @11:27AM (#25959871) Homepage

    There's nothing exceptionally wrong with Java as a starting language

    Yes, there is. It insulates the student from some important concepts, and because it's so aggressively object oriented, even the standard "Hello World" program requires quite a bit of glossing over by the teacher.

    As a result, it tends to get waved away as "magic" or "this will be explained later" but there's so much waved away that the students get disconnected. For instance, to simply output a line to a command line in Java you're looking at
    System.out.println("output");
    whereas with C++ (for instance) you have
    cout << "output" << endl;
    As someone who teaches this stuff, I find the second easier to explain in detail, and it doesn't rely on saying "don't worry what System.out is".

    The other prime example when teaching object orientation is garbage collection. Students who learned in Java are significantly harder to teach about dynamic memory and the necessity of cleaning up after themselves than those who learned in languages that don't abstract this away (see the sketch below). It's much easier to switch from C/C++ to Java than the other way around.

    The standard way of teaching basic programming is procedural, then functional, then object oriented, then onwards. Using Java to teach in that cycle is nuts. How useful that cycle IS is another question, of course :)
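
    For illustration, a minimal C sketch (a toy example, not tied to any particular course) of the kind of explicit allocation and cleanup that Java's garbage collector hides from students:

    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        size_t n = 10;
        int *values = malloc(n * sizeof *values);  /* explicit allocation */
        if (values == NULL)
            return 1;                              /* allocation can fail */

        for (size_t i = 0; i < n; i++)
            values[i] = (int)(i * i);
        printf("last value: %d\n", values[n - 1]);

        free(values);                              /* the cleanup step Java students never see */
        return 0;
    }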

  • Re:Hmmm... (Score:4, Interesting)

    by MadnessASAP ( 1052274 ) <madnessasap@gmail.com> on Tuesday December 02, 2008 @11:32AM (#25959943)

    I think they should teach low level first: teach students assembly first and work up from there. They don't need to create anything fancy in assembly, just make sure that they understand how a computer works and does things, rather than the abstracted model that higher-level languages give you.

  • by chrb ( 1083577 ) on Tuesday December 02, 2008 @11:33AM (#25959965)

    My old university dropped C and replaced it with Java for all CS courses apart from Operating Systems. I asked one of the professors why. He said many students complained that dealing with pointers was too hard, that the rise of Java and Java programming jobs meant C was obsolete and pointless, that the issue of programming languages came up on prospective student visit days, and that in order to be "commercially attractive" to these potential students they had to switch. We even used to do assembly language programming in the first year; now, the replacement course teaches students how to use Eclipse for Java programming.

    Several years later I was back tutoring, and I was very disappointed to find out that I had to explain pointers and pointer arithmetic to people who were almost at the end of their Computer Science degree, and who didn't understand why their code was crashing with "null references" when "Java was supposed to get rid of all that memory stuff!".
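
    The kind of pointer material in question is not exotic. Here is a minimal C sketch of pointers and pointer arithmetic (a generic illustration, not from any specific course):

    #include <stdio.h>

    int main(void) {
        int a[4] = {10, 20, 30, 40};
        int *p = a;                 /* a pointer holds an address, here &a[0] */

        printf("%d\n", *p);         /* dereference: prints 10 */
        printf("%d\n", *(p + 2));   /* pointer arithmetic: prints 30 */

        int *q = NULL;              /* a null pointer refers to nothing */
        if (q != NULL)              /* without this check, dereferencing q */
            printf("%d\n", *q);     /* would crash: the "null reference" case above */

        return 0;
    }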

  • Re:Hmmm... (Score:5, Interesting)

    by swordgeek ( 112599 ) on Tuesday December 02, 2008 @11:47AM (#25960201) Journal

    Dijkstra's comments were right on the mark, and fairly obvious to people outside of CS. They were only contentious within the field, for some odd reason.

    The thing is, Computing Science should be approached in the same manner as most other science fields: a BSc in computing should be about theory, research, and pushing the state of the art. A modicum of programming is probably necessary to accomplish that, but programming should be understood in the abstract, without the emphasis on 'this command, this language.' Learning to be a programmer should either (a) be a division of computer engineering, or (b) probably not be a degree at all; more like a one- or two-year college certificate.

    Chemistry, Physics, Biology, Math, and so forth, are all degrees aimed at research and study, not commercial production. Why not computing?

  • by mbone ( 558574 ) on Tuesday December 02, 2008 @11:55AM (#25960343)

    I am very glad I never had him as a professor.

    By the way, I am a physicist, and IMHO all of the best physicists develop a physical intuition about their physics, and that is certainly true with those who deal with quantum mechanics. Listen to the recorded lectures of Richard Feynman, for example.

  • Re:Hmmm... (Score:3, Interesting)

    by argiedot ( 1035754 ) on Tuesday December 02, 2008 @11:58AM (#25960383) Homepage
    I strongly disagree. I see this attitude everywhere, and while it may work for people who are already really motivated, you will kill off even the least interest in a student who could later have found it all fascinating.

    You have to engage students. They aren't automatons who will simply take in any information at the same rate. I understand that you're talking about undergraduate college, and that the system is probably different depending on which country you live in, but you have to start at a level where it is easily possible for a student to _do_ things, and that's how you pique someone's interest and sustain it.

    I've been through two computer science courses (one introductory, one advanced) in college as part of my Math degree, and through a year in school of being taught C, and while to me it was interesting (I had a computer much earlier than any of my classmates - back when I was just out of primary - so I had an advantage), quite a few of my classmates were obviously struggling with the "This seems so useless, just learning a bunch of stuff for no reason." idea.

    I know, they're in a Math course and all that, but there are a lot of people who would probably be capable of first-rate programming, yet will likely be scared away by having to write ten cryptic lines of code just for Hello World.

    Besides, Computer Science is about Computer Science anyway, not just the instruction set of one particular processor.
  • Back to the future (Score:5, Interesting)

    by Bucc5062 ( 856482 ) <bucc5062@gmai l . c om> on Tuesday December 02, 2008 @11:59AM (#25960403)

    As I read through his writings it brought me back to my time at Moravian College circa 1979. I had just started taking CS classes, and in that same year Dr. Brown, head of the CS department, pulled out all the IBM mainframe systems and installed a PDP-11/45. Gone were the COBOL courses, replaced by C, RATFOR, Pascal, Fortran, et al. I loved it and hated it at the same time.

    Like the essay describes, Dr. Brown taught us programming before we really saw the computer. His focus was not on language, but on concept. As he so well put it to us, once done with our intro class we could work anywhere in any language. I believed it then and found it to be a true statement. At the end of that intro class he took the last three weeks and taught sorting algorithms. The catch was that each sort was analyzed in a different language. I chuckle when I read posts from youngsters that say "I learned Java, or C++, in college". I learned programming in college, then went on to figure out what language suited my economic and intellectual needs.

    Cruelty in Computer Science? I am grateful for that kind of cruelty to this day. Since college I have had to adjust my knowledge as times and needs changed. I have had the pleasure of working with RPG, COBOL, Java, FORTRAN, and even the bastard child Visual Basic. Unlike some, I do not look down on any language, for each has its benefits for the task. What I do dislike is working on code written by people who thought that "Learn to Code Java in Three Weeks" made them a programmer, or that language X is the best and only language out there.

    Dr. Dijkstra says "Universities should not be afraid to teach radical novelties". What things could be discovered if that concept were embraced again?

  • Exactly (Score:3, Interesting)

    by chfriley ( 160627 ) on Tuesday December 02, 2008 @12:00PM (#25960409) Homepage

    That is spot on. It is the difference between an electrical engineer and an electrician. They don't do the same things and getting an EE to wire your house is as stupid as getting an electrician to design a CPU. Or you don't go to the engineer at GM who designed the engine of the car to change your oil (although given the state of Detroit that might change).

    There is a difference between a CS degree and an IT, Software Engineering, or some other more hands-on degree. Yes, a CS or Comp Eng degree should have coding; that is a must, because you need to implement a stack or queue or linked list to really understand it. Ditto for LISP or Prolog in AI, or knowing a bit of C if you are into OS design (e.g. for Unix/Linux, so you can understand it). But the focus is on theory so that you can code efficiently (or tell the IT people: look, your coding is good, but the design is O(n^2) and you could code it so that it is O(n) or whatever; see the sketch below). Or to tell them P != NP or whatever.

    I was just starting grad school in CS at UCSD when he wrote that paper, and there are two separate fields here, just like a builder vs. an architect, or plenty of other examples. One is not better than the other, just different, with a different focus; it depends on what the person is interested in and wants to be skilled at. It is not widely understood, though. When people find out I was a CS person, I get, "Oh, can you fix Windows for me?"
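
    For illustration, a toy C sketch of that O(n^2)-versus-O(n) distinction, using a made-up running-totals example (not from the parent comment):

    #include <stdio.h>

    #define N 5

    int main(void) {
        int x[N] = {3, 1, 4, 1, 5};
        int totals[N];

        /* O(n^2): recompute each running total from scratch */
        for (int i = 0; i < N; i++) {
            totals[i] = 0;
            for (int j = 0; j <= i; j++)
                totals[i] += x[j];
        }

        /* O(n): same result, reusing the previous total */
        int running = 0;
        for (int i = 0; i < N; i++) {
            running += x[i];
            totals[i] = running;
        }

        printf("last running total: %d\n", totals[N - 1]);
        return 0;
    }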

  • Re:The Text (Score:3, Interesting)

    by WhiplashII ( 542766 ) on Tuesday December 02, 2008 @12:24PM (#25960815) Homepage Journal

    More than not being "great", he seems to be rather foolish...

    1) His main premise is that "software engineering" cannot exist because software cannot be proved correct, only proved wrong. Well, I got news for ya - rocket engineering is the same way. So is electronics. So are bridges! Or do you think that having the SRB on the shuttle burn through the main tank was by design?

    2) He goes further to say that foolish mortals (unlike himself) learn by analogy, and so can't handle the truth, etc. Then, hilariously, he goes on to say that the only true way to look at programming is as deriving a formula! Imagine that, a mathematician describing engineering as deriving a formula! No comfortable analogies here...

    3) Then he talks about how computers are "symbol manipulators". OK, but that is not very useful - computers are really devices that get things done that we want done. Some people want photons in pretty patterns on their screens. Some people want the control surfaces of aircraft actuated in ways that save time/money/lives. But a computer/program is useless without the output mechanism.

    Some of his conclusions are good - lines of code is an anti-metric. But in general, this paper was awful!

  • Re:Recursive Descent (Score:3, Interesting)

    by russotto ( 537200 ) on Tuesday December 02, 2008 @12:27PM (#25960891) Journal

    Dijkstra was a genius and made many contributions to Comp. Sci. But his suggestion that a program (really a program design) should be accompanied by a formal proof has problems at both ends of the development cycle: how do you prove that the formal specification is what the customer wanted, and how do you prove that the code actually implements the design?

    When I was in the CS curriculum at the University of Maryland, shortly after the publication of this paper, one of the mandatory freshman CS courses took this approach.

    The fatal problem of the approach quickly became apparent: the formal specification had to be as detailed as the program, and the process of proving that the program met the specification was just as complex _and error-prone_ as the process of writing the program in the first place.

  • by 91degrees ( 207121 ) on Tuesday December 02, 2008 @01:13PM (#25961669) Journal
    I've frequently toyed with the idea of a low-level C-like language (maybe a hacked version of an existing C implementation) that supports these. Integers would behave as structures, so you could access my_int.carry as though carry were a boolean member. I have a problem with what to do when passing an integer as a parameter, though. 1D and 2D vectors as built-in types seem like a fairly obvious extension. I'd also want bitwise casts and a built-in ability to separate the mantissa and exponent from floats. It has never got past the "wouldn't it be good if..." stage, though.

    I thought C had support for atomic operations. Am I wrong there?
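
    For what it's worth, standard C gets partway there: C11 added <stdatomic.h> for atomic operations (optional, but widely supported), <math.h> has frexp() for splitting a double into mantissa and exponent, and GCC/Clang expose carry/overflow detection through the non-standard builtin __builtin_add_overflow. A small sketch (the builtin is compiler-specific, not portable C):

    #include <limits.h>
    #include <math.h>
    #include <stdatomic.h>
    #include <stdbool.h>
    #include <stdio.h>

    int main(void) {
        /* C11 atomics */
        atomic_int counter = 0;
        atomic_fetch_add(&counter, 1);

        /* mantissa/exponent of a double: x == mant * 2^exp, mant in [0.5, 1) */
        int exp;
        double mant = frexp(3.14159, &exp);
        printf("mantissa %f, exponent %d\n", mant, exp);

        /* carry/overflow on unsigned addition, GCC/Clang extension */
        unsigned int sum;
        bool carry = __builtin_add_overflow(UINT_MAX, 1u, &sum);
        printf("carry %d, sum %u, counter %d\n", carry, sum, (int)counter);

        return 0;
    }

    None of this gives the my_int.carry-style syntax, which is presumably the appeal of the hypothetical language described above.
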
  • by Theovon ( 109752 ) on Tuesday December 02, 2008 @01:15PM (#25961727)

    Towards the end of the essay, an introductory course is described where the student, as programmer, is required to build a formal mathematical definition of his program and prove that his program conforms to the definition.

    At The Ohio State University, we teach precisely this in the form of the "Resolve" programming language. For every function and every block of code, one must provide both the procedural code and a set of logical constraints that describe the effects of the process. For instance, if a function's job is to reverse the order of items in a list, then the internals of the function are accompanied by a logical constraint that tracks the movement of items from one list to another and ensures that the operation is consistent at every step: for instance, that the sizes of the two lists sum to a constant equal to the size of the input, and that concatenating the lists in a particular way yields the original list. (I'm summarizing.)

    An active area of research here is automatic provers that take your code and your logical definitions and actually prove whether or not your code and definitions match.
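
    That style of specification is hard to convey without Resolve itself, but here is a rough, much weaker approximation in plain C: the reverse operation plus its postcondition, checked at runtime with assert rather than proved statically (an illustrative sketch only, not how Resolve actually works):

    #include <assert.h>
    #include <stdio.h>

    /* Postcondition: for all i, out[i] == in[n-1-i]; the input is untouched. */
    static void reverse(const int *in, int *out, int n) {
        for (int i = 0; i < n; i++)
            out[i] = in[n - 1 - i];
    }

    int main(void) {
        int in[5] = {1, 2, 3, 4, 5};
        int out[5];
        reverse(in, out, 5);

        for (int i = 0; i < 5; i++)
            assert(out[i] == in[5 - 1 - i]);   /* runtime check of the postcondition */

        printf("reversed, first element now %d\n", out[0]);
        return 0;
    }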

  • Mathematics (Score:3, Interesting)

    by ProteusQ ( 665382 ) <dontbother@nowher[ ]om ['e.c' in gap]> on Tuesday December 02, 2008 @01:15PM (#25961731) Journal

    I haven't read this essay before, and have only had time to skim part of it now, but Dijkstra's criticism of mathematicians has merit, IMHO. [I have an MS in Math, so I don't claim to be an expert.] It's been over 40 years since the introduction of nonstandard analysis (including hyperreals and later surreal numbers), but it's still not a mainstream topic. In fact, it boggles my mind why a professor or department would choose to teach their students "hard" analysis (Bartle or Royden) instead of "soft" analysis (Rudin) -- "soft" analysis uses topological results to arrive at key theorems faster, while "hard" uses Real Analysis as if it were the only option on the market. Not that there's anything wrong with using Rudin and Royden in tandem, but to ignore topological results altogether smacks of willful ignorance. To paraphrase Stephen King: do you expect brownie points for being ignorant of your own discipline?

    The light on the horizon is the development of proof via programming, as covered by /. I'm sure there will be 50+ years of mathematicians screaming and kicking to avoid its introduction into the mainstream, but that will change the first time a computed proof that could not have been developed in one lifetime via the usual methods earns someone an Abel Prize. Until then, I suspect Dijkstra's point will still stand.

  • by DarkOx ( 621550 ) on Tuesday December 02, 2008 @01:35PM (#25962087) Journal

    No, C is a great teaching language if what you want is to teach the science of computing and how the machines do their work. You have to do some work with students in at the very least C and some assembly if you want them to get an understanding of computer design and architecture. You don't have to do these things if you want to teach software engineering. They are different fields, or at least different disciplines within a larger field, like an oncologist is a type of medical doctor and so is an endocrinologist, but they have very different educational backgrounds after some shared fundamentals.

  • by OrangeTide ( 124937 ) on Tuesday December 02, 2008 @01:41PM (#25962183) Homepage Journal

    I've done nothing but C (not even C++) programming for the last decade in various full time and consulting positions.

    Linux is all C, and the job market for Linux kernel, driver and system developers has been pretty active for many years now. Using QNX and VxWorks is all C programming too (not counting the tiny bits of assembly you have to stick into your programs).

    This is why I think it's important that people learn some assembly language once they are past the basic syntax of C. They don't have to become experts in assembly, but being able to write (and debug!) a few basic programs in some assembly would be a good experience. Like a hello world (one that makes the system call directly, and one that calls a puts function), a string reverse, and maybe a linked-list bubble sort. If you can get those three done in assembly after you've been exposed to C, it should make pointers (and arrays) a lot easier to understand.

    I believe being able to debug is as important as being able to program.
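
    For reference, here is one way the string-reverse exercise might look in C, as a pointer-heavy starting point before attempting it in assembly (a generic sketch, not from any particular course):

    #include <stdio.h>
    #include <string.h>

    /* Reverse a string in place using two pointers walking toward each other. */
    static void reverse(char *s) {
        char *lo = s;
        char *hi = s + strlen(s);     /* points at the terminating '\0' */
        while (lo < hi) {
            --hi;                     /* step back onto the last unswapped char */
            char tmp = *lo;
            *lo = *hi;
            *hi = tmp;
            ++lo;
        }
    }

    int main(void) {
        char buf[] = "hello, world";
        reverse(buf);
        printf("%s\n", buf);          /* prints "dlrow ,olleh" */
        return 0;
    }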

  • The Truth (Score:3, Interesting)

    by huckamania ( 533052 ) on Tuesday December 02, 2008 @01:46PM (#25962253) Journal

    "Dijkstra's Cruelty" is the nickname UT CS students gave to the course his wife taught. It was a required course when I was there, and it was two semesters long. I think it was called "Software Development" but should have been called "Fantasyland Development". Total waste of time and energy. I never saw him on campus, except maybe at graduation.

    The best classes were given by people who either were working in the real world or had some experience in the real world and were trying to get their masters or doctorate. The 'professors' going for or having tenure were the worst.

  • by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Tuesday December 02, 2008 @02:05PM (#25962577) Journal

    Last I checked, no one had quite figured out how to teach computer science -- your first course in college would serve mainly to flunk those who didn't know, or couldn't learn very quickly on their own, because the course itself certainly isn't going to teach you much.

    However, the specific things Dijkstra is talking about are not problems.

    Consider his insistence that "bugs" should instead be called "errors", because this is less metaphorical and closer to the truth -- placing the blame where it belongs (on the programmer), and allowing a lower tolerance: a program cannot be "mostly correct"; it is either correct (with no errors) or incorrect (it has errors).

    This is to ignore one of the most important things to understand about modern computer science: Culture, or, more importantly, communication.

    "Error", even in context in computer science, could mean several things. It could be programmer error, it could be an error in the rest of the software, it could be a hardware error, or an error status returned from any of these, even user error.

    "Bug" has a very specfic meaning. When you say you've "squashed a bug", there is very little doubt as to what you mean.

    Yes, it was initially attached to a metaphor. But as Dijkstra so clearly points out, computer science is a radical novelty -- and such radical novelties tend to borrow words from older vocabulary, from other languages, from fiction, and invent some of their own. They do this not because of intellectual laziness, but because they need a vocabulary of their own -- yes, jargon serves a purpose.

    And yet, it is not such a radical novelty that old ideas will have no effect. Certainly, some understanding of mathematics will be beneficial -- and although calculus may not be needed most of the time, the way in which it forces you to think will exercise the same parts of your brain that tough programming will.

    I would argue, though, that computer science is not only informed by mathematics. It is also informed by softer sciences -- philosophy, creative writing, and languages.

    Certainly, parts of it are very much a hard science -- the best sorting algorithm for a given situation, for instance -- but these are often confined to small, easily replaced cases. After all, it should not be difficult to swap sorting algorithms in most programs (see the sketch at the end of this comment).

    Other parts are very much a creative work -- an attempt to find the clearest possible way to express an idea, both to the computer, and to the next programmer. It is here that we most often talk about elegance.

    Both are necessary -- without the creative structuring of a program, and proper communication of that structure to all involved, we would not be able to swap sorting algorithms at will. Without a clear understanding of how the performance of a program is impacted by various choices, that beautiful, creative code may never run.

    And that is why it is so challenging to teach. A good programmer is going to be using both halves of his (or her) brain, if not at the same time, then certainly throughout developing a given program.

    And that is also why it's not going to be easy to learn.

    If Dijkstra were alive, he could learn a thing or two from Why The Lucky Stiff.
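
    A minimal C sketch of why swapping sort algorithms is cheap when the interface stays small: the standard library's qsort takes the ordering as a function pointer, so the comparison policy and the algorithm behind it can be replaced independently of the calling code (a generic illustration):

    #include <stdio.h>
    #include <stdlib.h>

    /* The only coupling between caller and sort is this comparator signature. */
    static int cmp_int(const void *a, const void *b) {
        int x = *(const int *)a;
        int y = *(const int *)b;
        return (x > y) - (x < y);
    }

    int main(void) {
        int v[] = {5, 2, 9, 1, 7};
        size_t n = sizeof v / sizeof v[0];

        /* Swapping in a different sort means changing only this call site,
           as long as the replacement accepts the same (array, count, size, cmp) shape. */
        qsort(v, n, sizeof v[0], cmp_int);

        for (size_t i = 0; i < n; i++)
            printf("%d ", v[i]);
        printf("\n");
        return 0;
    }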

  • Re:The Text (Score:5, Interesting)

    by 1729 ( 581437 ) <slashdot1729@nOsPAM.gmail.com> on Tuesday December 02, 2008 @02:17PM (#25962783)

    Actually, Dijkstra spent a lot of time showing how to prove a program's correctness.

    He did. In fact, he spent more time proving the program correct than it took to write, test, run, debug, and fix the program, and then the proof still had to be checked for correctness. I learned the Dijkstra techniques for proving code. Even something as painfully simple as proving that a loop invariant holds and that the loop terminates was mind-numbingly difficult and tedious, and it still fails to be correct in the presence of concurrency. Somehow the program proof advocates lost sight of Gödel's incompleteness theorems.

    I'm not an advocate of Dijkstra's approach. However, does the Incompleteness Theorem really come into play here? I can't think of any useful algorithm for which I wouldn't be able to formally describe and verify the pre- and post-conditions. Can you think of any naturally-arising examples of algorithms for which undecidability might be an issue?
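
    For readers who never saw the style being argued about, here is a toy version of the kind of invariant reasoning involved, written as C with the invariant checked by assert at runtime (Dijkstra's method is a pencil-and-paper proof, not runtime checking; this only sketches the shape of the argument):

    #include <assert.h>
    #include <stdio.h>

    /* Quotient and remainder by repeated subtraction.
       Invariant: n == q*d + r and r >= 0, maintained by every iteration.
       Termination: r decreases by d > 0 each time, so the loop must stop. */
    int main(void) {
        const int n = 17, d = 5;
        int q = 0, r = n;

        while (r >= d) {
            assert(n == q * d + r && r >= 0);       /* invariant before the step */
            r -= d;
            q += 1;
        }
        assert(n == q * d + r && 0 <= r && r < d);  /* postcondition */

        printf("%d = %d*%d + %d\n", n, q, d, r);
        return 0;
    }

    The termination argument rests on the decreasing, non-negative quantity r, which plays the "variant function" role in Dijkstra-style proofs.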

  • Re:The Text (Score:5, Interesting)

    by raddan ( 519638 ) on Tuesday December 02, 2008 @02:36PM (#25963099)
    1. Computers are physical machines. The bounds of those machines are, in many cases, not fully understood, and in other cases, too complex for a single person to understand fully.

    2. True, but you're forgetting about the execution domain. Dijkstra points out that computers are simply "symbolic manipulators", and this is certainly true, but that does not make them general-purpose symbolic manipulators in the same way that a human is. A programmer must go to great lengths to ensure that, e.g., the number 1/10 or pi is preserved throughout the calculation chain, and doing so is computationally expensive. Sometimes prohibitively so (see the sketch at the end of this comment). This is where engineering comes in, because if there's one thing engineers are really good at, it's deciding when something is "good enough" or not.

    3. Sure, if you fully understand the phenomena. Are you telling me that your computational model fully accounts for turbulence?

    What Dijkstra does not seem to understand is that engineering does not eschew mathematics. Engineers use the same theoretical knowledge that mathematicians and physicists do— they use the same analytical tools. Engineering is, rather, a superset of those analytical tools. It includes some new tools in order to deal with the complexity of certain tasks that are above the ability of most normal humans to solve. It is remarkably good at this.

    Throwing out engineering because it will never solve the problem fully is like throwing the baby out with the bath water. Better solutions will emerge— functional programming, for instance, is very promising in many ways. I've read Dijkstra before, and I have great respect for him particularly because of his actual experience building large software systems. But this paper makes him sound like a bitter old man; maybe he didn't like the direction the field was moving.
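
    The 1/10 point is easy to see concretely. A minimal C sketch (assuming the usual IEEE 754 double type) showing that ten additions of 0.1 do not give exactly 1:

    #include <stdio.h>

    int main(void) {
        double sum = 0.0;
        for (int i = 0; i < 10; i++)
            sum += 0.1;                   /* 0.1 has no exact binary representation */

        printf("sum        = %.17g\n", sum);                      /* slightly below 1 */
        printf("sum == 1.0 : %s\n", sum == 1.0 ? "yes" : "no");   /* "no" */
        return 0;
    }

    Whether that last bit of error matters is exactly the "good enough" judgment described above.
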
  • by shutdown -p now ( 807394 ) on Tuesday December 02, 2008 @02:44PM (#25963251) Journal

    And, regardless of what else you do, learn Haskell. You'll probably never actually use it, but that knowledge will be useful elsewhere; and, on top of that, you'll know what true elegance in programming language design is.

  • Re:The Text (Score:2, Interesting)

    by Anonymous Coward on Tuesday December 02, 2008 @02:58PM (#25963509)

    Except that there is great doubt regarding Ada's abilities and achievements.

    See Doron Swade's The Difference Engine for more detail.

    And the original poster did say "greatest computer scientist". Even if you grant Ada what the legends claim, she would still be outclassed by Turing, as would we all.

  • by TheRaven64 ( 641858 ) on Tuesday December 02, 2008 @03:10PM (#25963705) Journal
    Smalltalk (or Self), Haskell, Lisp, Prolog and Erlang are all languages that fit into the 'learning these will make you a better programmer even if you never use them' category. I'd also throw an assembly language into that pile (pretty much any one will do, although if you learn B5000 asm you will develop a passionate hatred for all modern architectures).
  • by V.11.1997 ( 1284252 ) on Tuesday December 02, 2008 @05:11PM (#25965903)

    Edsger Dijkstra, the greatest computer scientist to never own a computer

    Dijkstra did own a computer. He bought a Macintosh:

    "Even after he succumbed to his UT colleagues' encouragement and acquired a Macintosh computer, he used it only for e-mail and for browsing the World Wide Web".

    Source: U. Texas CS Department (In Memoriam): http://www.cs.utexas.edu/users/EWD/MemRes(USLtr).pdf [utexas.edu]

  • Re:The Text (Score:1, Interesting)

    by Anonymous Coward on Tuesday December 02, 2008 @09:25PM (#25969557)

    A watchdog timer is a timer that resets a CPU system if a timeout is reached. It is a way of attempting to achieve reliability in the presence of less reliable hardware.

    Watchdog timers are routinely used as a defense against the possibility of software design errors. A watchdog is a simple way to guarantee that if the system crashes for ANY reason, your vital hard real-time control loops get restarted. If you engineer the system carefully, it won't even skip one single sense-compute-response cycle (see the sketch at the end of this comment).

    Also:

    Do you believe it doesn't? The atmosphere stops gamma rays from hitting your equipment. These gamma rays change the value of variables (as in, the gamma ray flips the DFF circuit to the opposite value in your CPU). That means you cannot rely on variables to make sure your loop exits. Therefore loops are bad.

    1. You basically can't build CPUs without protection against SEUs these days. At current process nodes they don't even work at sea level if they don't have ECC protection on all the caches, registers, and even (or so I've heard) datapath circuits.

    2. If you are designing a control system where you must run your control loop at a given frequency or Something Bad Happens, leaving for loops out doesn't get you off the SEU hook. Yes, SEUs could corrupt a loop index variable, but they can just as easily crash the CPU itself (that is, corrupt its internal state to something inconsistent such that it ceases making forward progress executing the user program, or worse, flails around doing strange things). If you're concerned about SEUs interrupting your control function, YOU MUST INCLUDE A RAD-HARDENED HARDWARE WATCHDOG. No ifs, ands, or buts.

    3. Simple defensive programming techniques can reduce the probability of unintended infinite loops to nearly zero.

    I have a multimillion dollar budget that says I am not - how about you?

    For all your posturing, you sure don't seem to be thinking all that clearly about these issues. I don't even work on anything which has to be as reliable as space hardware and I can easily poke holes in the things you're saying.
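
    For readers unfamiliar with the pattern, here is a bare-metal-style C sketch of a control loop that services ("pets") a hardware watchdog once per cycle. The register address, key value, and helper functions are hypothetical placeholders; every real part defines its own:

    #include <stdint.h>

    #define WDOG_RELOAD_REG  ((volatile uint32_t *)0x40001000u)  /* hypothetical address */
    #define WDOG_RELOAD_KEY  0xA5A5A5A5u                         /* hypothetical key value */

    static void read_sensors(void)       { /* application-specific */ }
    static void compute_control(void)    { /* application-specific */ }
    static void drive_actuators(void)    { /* application-specific */ }
    static void wait_for_next_tick(void) { /* block until the next control period */ }

    void control_main(void)
    {
        for (;;) {
            read_sensors();
            compute_control();
            drive_actuators();

            /* Pet the watchdog exactly once per completed cycle. If the loop ever
               hangs or the CPU stops making progress, the reload never happens,
               the watchdog expires, and the hardware resets the system. */
            *WDOG_RELOAD_REG = WDOG_RELOAD_KEY;

            wait_for_next_tick();
        }
    }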

  • Re:The Text (Score:2, Interesting)

    by blonkm ( 1402997 ) on Tuesday December 02, 2008 @09:58PM (#25969907)
    Well, undecidability, maybe no. But provability, yes. There are plenty of algorithms that yield results, but we haven't the faintest idea why. The models don't fit into regular mathematics, or at least not any mathematics that we understand or have developed yet. I am talking about neural nets and genetic algorithms. Say a sample neural net/genetic algorithm can be trusted and 'verified' to result in a useful solution perhaps 90% of the time, measured on a large number of tests. How are you going to prove mathematically that this 90% will hold over time, if you don't even know how to describe the solution (the solution is developed by the program itself over time)? It's inherent to chaos theory that you cannot predict what will happen at that level of 'randomness'. But please prove me wrong! (pun intended)
