
Twenty Years of Dijkstra's Cruelty

WatersOfOblivion writes "Twenty years ago today, Edsger Dijkstra, the greatest computer scientist to never own a computer, hand wrote and distributed 'On the Cruelty of Really Teaching Computer Science' (PDF), discussing the then-current state of Computer Science education. Twenty years later, does what he said still hold true? I know it is not the case where I went to school, but have most schools corrected course and are now being necessarily cruel to their Computer Science students?" Bonus: Dijkstra's handwriting.
  • The Text (Score:5, Informative)

    by eldavojohn ( 898314 ) * <eldavojohn@gm a i l . com> on Tuesday December 02, 2008 @09:46AM (#25959273) Journal
    For those of you looking for some old fashioned HyperText Markup Language, here is a transcription of the 892 KB PDF to HTML by Javier Smaldone [utexas.edu].

    While the handwriting is a novelty (and the PDF is actually small for a PDF), I question how long that server is going to last.

    Also (and yes this is nitpicking), I must contest this:

    Edsger Dijkstra, the greatest computer scientist to never own a computer

    I submit for consideration Alan Turing [wikipedia.org] who may have designed the ACE and worked on the earliest computer (The Manchester Mark I [wikipedia.org]) although never really owned it or any other computer. I think a lot of people identify him as not only a hero of World War II but also the father of all computers.

    • Re:The Text (Score:4, Insightful)

      by dkleinsc ( 563838 ) on Tuesday December 02, 2008 @10:11AM (#25959629) Homepage

      I'd think Ada Lovelace would have a better claim there, given that not only did she never own a computer, they didn't even exist and there was at the time no such thing as a "program".

    • Re: (Score:3, Informative)

      The Dijkstra wiki link said he owned a Mac for email and web browsing. How can they expect us to read the links if they aren't going to read them themselves?

    • Re: (Score:3, Interesting)

      by WhiplashII ( 542766 )

      More than not being "great", he seems to be rather foolish...

      1) His main premise is that "software engineering" cannot exist because software cannot be proved correct, only proved wrong. Well, I got news for ya - rocket engineering is the same way. So is electronics. So are bridges! Or do you think that having the SRB on the shuttle burn through the main tank was by design?

      2) He goes further to say that foolish mortals (unlike himself) learn by analogy, and so can't handle the truth, etc. Then, hilario

      • Re:The Text (Score:5, Insightful)

        by theaveng ( 1243528 ) on Tuesday December 02, 2008 @11:50AM (#25961279)

        No.

        (1) His argument is that to discuss "software engineering" is as silly as to discuss "algebra engineering" or "formula engineering" in math courses. A program is simply a formula to be executed - nothing more - says the Computer Science professor.

        (2) Programs manipulate numbers. Mathematical formulae manipulate numbers. It's an entirely reasonable conclusion for him to reach that a program is merely a formula.

        (3) Putting pretty pictures on screen or manipulating airplane surfaces (my specialty) is still just formula execution.

        • Re: (Score:3, Insightful)

          by WhiplashII ( 542766 )

          I agree that is what he is saying - I disagree that it is a reasonable thing to say ;)

          A program is nothing without the computer it runs on. I could agree that if you were building programs for the purpose of knowledge (like he does, presumably), you are not an engineer - you are a mathematician. But that is not what computers are! Computers are a way to make something happen. By using generic computers running programs, we can more easily make complicated stuff happen.

          The airplane surface you actuate i

          • The Truth (Score:3, Interesting)

            by huckamania ( 533052 )

            Dijkstra's Cruelty is the nickname UT cs students gave to the course his wife taught. It was a required course when I was there and it was 2 semesters long. I think it was called "Software Development" but should have been called "Fantasyland Development". Total waste of time and energy. I never saw him on campus, except maybe at graduation.

            The best classes were given by people who either were working in the real world or had some experience in the real world and were trying to get their masters or

            • Re:The Truth (Score:5, Insightful)

              by colmore ( 56499 ) on Tuesday December 02, 2008 @01:21PM (#25962835) Journal

              I think this is a clear case of computer science and software engineering (without going into Dijkstra's assessment of that term) being different beasts.

              Both the theoretical and immediately practical implementation of software are interesting and important, but they're studied in different ways by different people and trying to mash the two together tends to create more conflict than interdisciplinary synergy.

          • Re:The Text (Score:5, Insightful)

            by hypnotik ( 11190 ) on Tuesday December 02, 2008 @12:47PM (#25962277) Homepage

            I think you are missing his point, perhaps intentionally.

            The vast majority of software is engineered: engineers are trying to get a specific outcome; they are not trying to calculate something.

            Computer programs, he argues, are nothing more than long proofs. Each function you write is equivalent to a predicate in logical calculus, or a function in mathematics.

            If you were only interested in outcome, you could write a program that multiplies two numbers together as a long series of "if" statements. But you'd most likely miss some possible values for inputs.

            However, if you were interested in it being correct for all possible inputs, you would use the mathematical operation * or use a loop to calculate the correct answer.

            I think that is the argument he is making, and as a University professor, I tend to agree. I've seen some of my students test an array by using an if-statement for every single element of the array, whereas a loop would have been infinitely more suitable.

            Only one should be deemed correct. But if you adopt the "engineering" and "outcome" point of view, both are correct.
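
            To make that concrete, here is a toy C sketch of the two styles (my own illustration, not from TFA). The case-by-case version is "correct" only for the outcomes its author anticipated; the loop is correct for every non-negative input, because the reasoning behind it covers all of them:

                #include <stdio.h>

                /* "Outcome" style: enumerate the cases you thought of.
                   Anything outside the table is silently wrong. */
                int mul_by_cases(int a, int b)
                {
                    if (a == 2 && b == 2) return 4;
                    if (a == 2 && b == 3) return 6;
                    if (a == 3 && b == 3) return 9;
                    return -1; /* unhandled input */
                }

                /* "Proof" style: the loop invariant (acc == a * i) holds on
                   every iteration, so the result is right for all b >= 0. */
                int mul_by_loop(int a, int b)
                {
                    int acc = 0;
                    for (int i = 0; i < b; i++)
                        acc += a;
                    return acc;
                }

                int main(void)
                {
                    printf("%d %d\n", mul_by_cases(2, 3), mul_by_loop(2, 3)); /* 6 6 */
                    printf("%d %d\n", mul_by_cases(5, 7), mul_by_loop(5, 7)); /* -1 35 */
                    return 0;
                }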

            • Re:The Text (Score:4, Insightful)

              by orclevegam ( 940336 ) on Tuesday December 02, 2008 @01:03PM (#25962527) Journal
              This reminds me of the problem physicists have grappled with for a long time between the energy view of the world and the forces view. Some things are simpler to model as a set of interacting forces, others as a set of energy transformations. Both models are equally correct, but some things are easier to model in one than the other. Likewise, with the algorithm versus outcome ways of modeling programming you see something similar: a problem can be modeled with either one, but some problems are easier to model with one than the other.
        • Re:The Text (Score:5, Interesting)

          by raddan ( 519638 ) on Tuesday December 02, 2008 @01:36PM (#25963099)
          1. Computers are physical machines. The bounds of those machines are, in many cases, not fully understood, and in other cases, too complex for a single person to understand fully.

          2. True-- but you're forgetting about the execution domain. Dijkstra points out that computers are simply "symbolic manipulators", and this is certainly true, but that does not make them general-purpose symbolic manipulators in the same way that a human is. A programmer must go to great lengths to ensure that, e.g., the number 1/10 or pi is preserved throughout the calculation chain, and doing so is computationally expensive. Sometimes prohibitively so. This is where engineering comes in, because if there's one thing engineers are really good at, it's deciding when something is "good enough" or not (see the sketch at the end of this comment).

          3. Sure, if you fully understand the phenomena. Are you telling me that your computational model fully accounts for turbulence?

          What Dijkstra does not seem to understand is that engineering does not eschew mathematics. Engineers use the same theoretical knowledge that mathematicians and physicists do; they use the same analytical tools. Engineering is, rather, a superset of those analytical tools: it includes new tools for dealing with the complexity of tasks that are beyond the ability of most normal humans to solve. It is remarkably good at this.

          Throwing out engineering because it will never solve the problem fully is throwing the baby out with the bath water. Better solutions will emerge; functional programming, for instance, is very promising in many ways. I've read Dijkstra before, and I have great respect for him, particularly because of his actual experience building large software systems. But this paper makes him sound like a bitter old man; maybe he didn't like the direction the field was moving.
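
          To make my point (2) concrete, here's a small C illustration of my own (not from Dijkstra): the machine does not preserve 1/10 the way a human symbolic manipulator would, and the engineering move is to define "good enough" explicitly.

              #include <stdio.h>
              #include <math.h>

              int main(void)
              {
                  /* 1/10 has no exact binary representation, so the error
                     accumulates: a thousand additions of 0.1 don't make 100. */
                  double sum = 0.0;
                  for (int i = 0; i < 1000; i++)
                      sum += 0.1;

                  printf("sum == 100.0? %s (sum = %.17g)\n",
                         sum == 100.0 ? "yes" : "no", sum);

                  /* The engineering answer: decide what "good enough" means. */
                  printf("within 1e-9 of 100? %s\n",
                         fabs(sum - 100.0) < 1e-9 ? "yes" : "no");
                  return 0;
              }
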
      • Re:The Text (Score:5, Insightful)

        by 1729 ( 581437 ) <slashdot1729&gmail,com> on Tuesday December 02, 2008 @11:55AM (#25961347)

        More than not being "great", he seems to be rather foolish...

        1) His main premise is that "software engineering" cannot exist because software cannot be proved correct

        Actually, Dijkstra spent a lot of time showing how to prove a program's correctness. See his "A Discipline of Programming", for example.

    • Re:The Text (Score:4, Funny)

      by Potor ( 658520 ) <farker1@gmai l . com> on Tuesday December 02, 2008 @02:41PM (#25964283) Journal
      What about Beowulf? I mean, that's over a millennium ago, and he certainly left his mark on computer scientists.
  • by account_deleted ( 4530225 ) on Tuesday December 02, 2008 @09:49AM (#25959303)
    Comment removed based on user account deletion
    • by chrb ( 1083577 ) on Tuesday December 02, 2008 @10:33AM (#25959965)

      My old university dropped C and replaced it with Java for all CS courses apart from Operating Systems. I asked one of the professors why. He said many students complained that dealing with pointers was too hard; that the rise of Java and Java programming jobs meant C was obsolete and pointless; that the issue of programming languages came up on prospective student visit days; and that in order to be "commercially attractive" to these potential students they had to switch. We even used to do assembly language programming in the first year - now, the replacement course teaches students how to use Eclipse for Java programming.

      Several years later I was back tutoring, and I was very disappointed to find out that I had to explain pointers and pointer arithmetic to people who were almost at the end of their Computer Science degree, and who didn't understand why their code was crashing with "null references" when "Java was supposed to get rid of all that memory stuff!".
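
      For what it's worth, the whole mystery fits in a few lines of C (a toy example of mine, not from any course): a Java "null reference" is just a pointer that doesn't point at anything, and following it is the crash.

          #include <stdio.h>
          #include <stdlib.h>

          struct node { int value; struct node *next; };

          int main(void)
          {
              struct node *head = malloc(sizeof *head);
              if (head == NULL)        /* malloc can fail: check it */
                  return 1;
              head->value = 42;
              head->next  = NULL;      /* end of list: points nowhere */

              /* Java's NullPointerException is this in disguise:
                 following a pointer with no object behind it. */
              struct node *p = head->next;
              if (p != NULL)
                  printf("%d\n", p->value);
              else
                  printf("p is NULL; p->value would crash here\n");

              free(head);
              return 0;
          }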

      • Re: (Score:3, Insightful)

        almost at the end of their Computer Science degree, and who didn't understand why their code was crashing with "null references" when "Java was supposed to get rid of all that memory stuff!".

        If they can almost finish a degree and still be surprised when Java dies with a null reference, then the problem isn't that they were taught to program using Java, it's that they were taught Java very badly...

      • by OrangeTide ( 124937 ) on Tuesday December 02, 2008 @12:41PM (#25962183) Homepage Journal

        I've done nothing but C (not even C++) programming for the last decade in various full time and consulting positions.

        Linux is all C and the job market for Linux kernel, driver and system developers has been pretty active for many years now. Using QNX and vxWorks is all C programming too (not counting the tiny bits of assembly you have to stick into your programs).

        This is why I think it's important that people learn some assembly language once they are past the basic syntax of C. They don't have to become experts in assembly, but being able to write (and debug!) a few basic programs in it would be good experience: a hello world (one that calls the system call directly, and one that calls a puts function), a string reverse, and maybe a linked-list bubble sort. If you can get those three done in assembly after you've been exposed to C, it should make pointers (and arrays) a lot easier to understand. (The string reverse is sketched in C below.)

        I believe being able to debug is as important as being able to program.
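
        Here's that string reverse in C rather than assembly (my sketch, assuming NUL-terminated strings); the point is that every line maps almost one-to-one onto loads, stores, and pointer increments:

            #include <stdio.h>
            #include <string.h>

            /* Reverse a string in place: two pointers walk toward each
               other, swapping as they go. Each statement is a couple of
               machine instructions, which is why doing it in assembly
               first makes the pointer version feel obvious. */
            void reverse(char *s)
            {
                if (*s == '\0')
                    return;                 /* empty string: nothing to do */
                char *lo = s;
                char *hi = s + strlen(s) - 1;
                while (lo < hi) {
                    char tmp = *lo;
                    *lo++ = *hi;
                    *hi-- = tmp;
                }
            }

            int main(void)
            {
                char buf[] = "hello world";
                reverse(buf);
                printf("%s\n", buf);        /* prints "dlrow olleh" */
                return 0;
            }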

  • by noldrin ( 635339 ) on Tuesday December 02, 2008 @09:58AM (#25959423)
    Sounds like a typical computer science professor. Mine usually couldn't use a computer at all. And yes, mine were generally very cruel. Giving examples that months later they figure out were wrong, making us code with pen and pencil, teaching fake assembly languages and fake operating systems.

    I'm glad I left, because I can actually use a computer now, unlike many of the coders I come across. If you like computers, don't go into computer science. That is for people who enjoy math and theory.
    • by Malc ( 1751 ) on Tuesday December 02, 2008 @10:19AM (#25959745)

      Why is fake assembly and fake OS cruel? It's computer science, not a vocational tech course. They've presumably tried to bypass the issues of real-world systems that distract you from learning the point. Once you've got the basic concepts, any OS and any language become approachable - why would you want to learn something specific that would be out-of-date in short measure? Seems rather myopic to me.

      • by SatanicPuppy ( 611928 ) * <Satanicpuppy@nosPAm.gmail.com> on Tuesday December 02, 2008 @10:31AM (#25959921) Journal

        In a practical field where you can do real assembly and work on a real OS, you ask why doing fake make-work is cruel?

        Theory is fine, but theory shouldn't trump practical application in a field where practical applications are everywhere.

        • Re: (Score:3, Informative)

          by MikeBabcock ( 65886 )

          University vs. College.

          Comp Sci. is not a trades course; go to a local community college and take a technology or programming course if you want real-world examples.

          Computer Science is about learning to understand computing, whether you use real or completely fictional interfaces.

        • Fake assembly languages and operating systems are typically taught because they illustrate the concepts in a general way. I was taught a simple, made-up, register-to-register assembly language in a couple of modules. Since then I've done a bit of PowerPC and a bit of x86 assembly coding for things that couldn't be expressed in C (atomic operations and vector stuff), and I am very glad that I wasn't forced to learn x86 asm back then. It would have confused the issues and resulted in my learning less.
      • by SageinaRage ( 966293 ) on Tuesday December 02, 2008 @10:39AM (#25960067)
        It's the equivalent of a biology class detailing the possibilities of life by examining chemical interactions, without ever examining an actual living organism.
      • by Darth_Burrito ( 227272 ) on Tuesday December 02, 2008 @12:20PM (#25961821)

        Rule Number 1 of Computer Science - Don't reinvent the wheel. Everyone who invents their own education-focused language that's only used at their school is violating the first rule of computer science. At my university, the first year of programming is taught in a Java-like variant of C++ called Resolve C++. Why is it bad?

        1. It hurts students applying for co-ops/internships because none of the employers know what it means.
        2. It hurts students because there is about a thousandth of a percent as much information available about Jo-bob's made-up educational language as there is for something like Java.
        3. It hurts students because they don't feel like they can go home and make useful projects in weird language XYZ in their own time.
        4. It wastes time because you are rolling your own instead of reusing something tried and true. As the needs of the modern world change, will your made up language be able to change with it, or will it stagnate like everything else in academia?
        5. By rolling your own when a veritable shitstorm of acceptable solutions exist, you are setting a horrible example for your students.
        6. Most importantly, while students are not always right, they absolutely hate being forced to learn some professor's pet project; if something useful is only slightly harder, why piss them off?
    • Boy do you need to go back to school. Edsger wrote more and better stuff in his lifetime than anyone here on Slashdot. Did you ever get directions from Google Maps or Mapquest? Thank Edsger -- his shortest path algorithm is what they all use, and by the way, he wrote it before most of you were born. You know the semaphores used in the multi-CPU Linux kernels? Yep, you owe Edsger for them, too. And programming languages like C, Pascal, etc.? He helped write the first ALGOL compiler, the great-grand-daddy of them all, once again before most of you were born.
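
      For anyone who has never actually seen it, here's the O(n^2) adjacency-matrix version of that shortest-path algorithm in C (my own sketch with made-up edge weights; real mapping services use fancier variants of the same idea):

          #include <stdio.h>
          #include <limits.h>

          #define N 5
          #define INF INT_MAX

          /* Dijkstra's algorithm: repeatedly settle the unvisited node
             with the smallest tentative distance, then relax its edges. */
          void dijkstra(int g[N][N], int src, int dist[N])
          {
              int done[N] = {0};
              for (int i = 0; i < N; i++) dist[i] = INF;
              dist[src] = 0;

              for (int k = 0; k < N; k++) {
                  int u = -1;
                  for (int i = 0; i < N; i++)
                      if (!done[i] && (u < 0 || dist[i] < dist[u]))
                          u = i;
                  if (dist[u] == INF) break;   /* the rest are unreachable */
                  done[u] = 1;
                  for (int v = 0; v < N; v++)  /* 0 means "no edge" */
                      if (g[u][v] && dist[u] + g[u][v] < dist[v])
                          dist[v] = dist[u] + g[u][v];
              }
          }

          int main(void)
          {
              int g[N][N] = {           /* arbitrary example weights */
                  {0, 4, 1, 0, 0},
                  {4, 0, 2, 5, 0},
                  {1, 2, 0, 8, 0},
                  {0, 5, 8, 0, 3},
                  {0, 0, 0, 3, 0},
              };
              int dist[N];
              dijkstra(g, 0, dist);
              for (int i = 0; i < N; i++)
                  printf("dist(0 -> %d) = %d\n", i, dist[i]);
              return 0;
          }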

      Just because he eschewed the run-break-fix approach so beloved of the folks who are spewing billions of lines of error-laden code into the world today doesn't make him irrelevant: he forgot more about writing code than most folks here will ever learn. And yes, he advocated developing code formally, and he liked to do it with pen and paper.

      So learn about who you're making snide comments about, and show some respect. When people are still using any algorithm you came up with 30 years from now, you will have the right to say something about Edsger Dijkstra.

      • by EWAdams ( 953502 ) on Tuesday December 02, 2008 @12:40PM (#25962169) Homepage

        ... in this article he's insisting that a carpenter has to be a physicist before he should be allowed to build a house. (Yes, that's arguing from analogy -- deal with it.)

        The fact is that the world needs a hell of a lot of running code in a hurry. Millions of lines of it. We don't have the luxury of treating a realtime airline-pricing-optimization manager as a lovely formal system that we can write out in pencil. We have to get it up and running, then fix bugs and add features as time permits, because of a phenomenon that Dijkstra doesn't take into account: IT'S NEEDED *NOW*.

        I also think he's being unfair by suggesting that modern educational institutions are anything like as hidebound as medieval ones. First, medieval universities were not intended for inquiry in the first place; they were intended to prepare young men for the priesthood -- i.e. to teach them doctrine, which was not subject to inquiry. No institution except maybe a seminary is as restrictive as that these days. Second, it doesn't seem to have occurred to him that learning by analogy is how people learn *effectively.* He may decry teaching children about arithmetic by using apples because it's not a formal system, but a five-year-old doesn't have enough knowledge to know what a formal system IS. Starting a five-year-old with Principia Mathematica is just pointless. And your basic coding grunt who wants to build websites doesn't need to be taught JavaScript as a formal system either.

  • Cruel to be kind (Score:5, Insightful)

    by MosesJones ( 55544 ) on Tuesday December 02, 2008 @09:59AM (#25959433) Homepage

    The aim of a really good degree (as opposed to a lecture-driven, box-ticking one) is to be cruel: you want to feel that your head is going to explode and that your subject really is an absolute bitch.

    Then you graduate and find out the real world is easier than the theory.

    Cruelty is important in a good education to make you achieve in the real world. An easy flow-through degree gets you the cert but gives you unrealistic expectations of how hard the real world can be.

    Personally, my degree was a mind-bending bitch of mumbling lecturers and impossible (literally, in some cases) questions that covered everything from quantum mechanics "basics" and abstract computing theory through to how to dope a transistor.

    It was cruel, it was unusual... it was great.
     

    • by 140Mandak262Jamuna ( 970587 ) on Tuesday December 02, 2008 @10:51AM (#25960281) Journal
      The cruelty is not being inflicted upon the students; it is inflicted on the teachers. All these dumb kids got through high schools with endemic grade inflation, not knowing the first thing about anything but with some supreme self-confidence. Teaching them anything is difficult. They can't figure out simple free-body diagrams in freshman physics, get all confused by the p orbitals and electrons in chemistry, and can't differentiate a polynomial if their life depended on it, but they all come and sit in the computer science class and expect to be taught using innovative, interesting, creative techniques that require no effort on their part. Being a college prof dealing with freshmen is a punishment, I tell you.
  • by JoshDM ( 741866 ) on Tuesday December 02, 2008 @10:07AM (#25959567) Homepage Journal

    Dijkstra's Cruel Font link [tinyurl.com], so we at least get something recent(-ish) out of this article.

  • Comment removed (Score:5, Insightful)

    by account_deleted ( 4530225 ) on Tuesday December 02, 2008 @10:25AM (#25959835)
    Comment removed based on user account deletion
  • Recursive Descent (Score:3, Insightful)

    by tomhath ( 637240 ) on Tuesday December 02, 2008 @10:26AM (#25959853)
    "Right from the beginning, and all through the course, we stress that the programmers task is not just to write down a program, but that his main task is to give a formal proof that the program he proposes meets the equally formal functional specification."

    Dijkstra was a genius and made many contributions to Comp. Sci. But his suggestion that a program (really a program design) should be accompanied by a formal proof has problems at both ends of the development cycle: how do you prove that the formal specification is what the customer wanted, and how do you prove that the code actually implements the design?

    I've seen automatic testing products that claim to do both, but in order to make them work you have to specify every variable and every line of code as the "requirements", then compare what the tool thinks the result should be to the output of the program. And yes, the vendor suggested that the business analysts write the formal requirements; you can imagine how well that worked.

    • Re: (Score:3, Interesting)

      by russotto ( 537200 )

      Dijkstra was a genius and made many contributions to Comp. Sci. But his suggestion that a program (really a program design) should be accompanied by a formal proof has problems at both ends of the development cycle: how do you prove that the formal specification is what the customer wanted, and how do you prove that the code actually implements the design?

      When I was in the CS curriculum in the University of Maryland shortly after the publication of this paper, one of the mandatory freshman CS courses took

  • by Ukab the Great ( 87152 ) on Tuesday December 02, 2008 @10:27AM (#25959873)

    "Right from the beginning, and all through the course, we stress that the programmer's task is not just to write down a program, but that his main task is to give a formal proof that the program he proposes meets the equally formal functional specification."

    Where exactly do semi-formalized, poorly-thought-out specifications, handed to you half-written on a napkin and constantly subject to change, fit into the programmer's task and Dijkstra's world?

  • GIGO (Score:4, Insightful)

    by TheWoozle ( 984500 ) on Tuesday December 02, 2008 @10:32AM (#25959947)

    The glaring hole in Dijkstra's argument is that most software is built to automate what used to be manual processes, and it therefore has to mimic a human-centric process... which is inherently illogical, inefficient, and rife with nonsense.

    In the world outside the ivory tower, programmers do not have the freedom to create completely logical, functionally complete programs that can be reduced to a mathematical proof.

    Next time your boss comes to you with a project to write an application, show him this paper, explain that what he's asking for is "medieval thinking", and then see if you can explain why you should keep your job if you don't want to do the work.

  • Self-confidence (Score:4, Insightful)

    by dgenr8 ( 9462 ) on Tuesday December 02, 2008 @10:45AM (#25960177) Journal

    All I have to do is read one paper like this to be reminded why I stayed out of academia. Ah the smugness, the hypocrisy, the great irony. A "radical novelty" this essay is not.

    There are plenty of truths out there yet to be discovered. Unfortunately most academics lack the self-confidence to go looking for them and instead find clever new ways to twist ideas around.

  • by SageinaRage ( 966293 ) on Tuesday December 02, 2008 @10:47AM (#25960205)

    The thing that confuses me the most about this paper is his hatred for using anthropomorphic metaphors to describe the functioning of a program. It confuses me partly because his examples of why it's bad don't seem to actually address his complaint, or anything like it. But also because, the more I think about it, the metaphors seem to fit very well.

    Program components may not be living things, but they do run autonomously. They perform actions, frequently without any input from me. They seem to do things and interact with other components on their own - why not describe it as such? There's also the fact that he doesn't give any alternate way of describing what's going on inside a program.

    • by Max Webster ( 210213 ) on Tuesday December 02, 2008 @11:30AM (#25960927)

      I think the problem is the false assumptions and analogies that get introduced by these lines of thinking. If a network is "this guy talking to that guy", your thinking will be constrained by what you know about human conversation. If there's a problem, someone can talk louder, slower, etc. and the analogy holds. But if the solution involves something that has no equivalent in the day-to-day world, how are you going to conceptualize it?

      My pet peeve, which descends from this same idea, is the teaching of object orientation. A car is a type of vehicle; a truck is a type of vehicle; they both have wheels; maybe the number of wheels is different; maybe each has a different turning radius or procedure for backing up.

      Great, now you understand inheritance, polymorphism, member functions, etc. In practice, however, you use object orientation to avoid zillions of "if" statements, special-case code, and large blocks of almost-but-not-quite duplicated code (see the sketch below).

      In my experience, someone who learned OO by simple analogies is likely to be locked into thinking "I have to write every program by making a tree of classes, like with the cars and trucks", rather than "there are different ways to structure the code, and a good OO way will get rid of a lot of branching, magic numbers, and redundant code".
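
      To illustrate, here's a toy of my own in plain C, where the mechanism is visible: the "OO" is nothing but a table of function pointers, and the payoff is that the calling code has no branching on type at all.

          #include <stdio.h>
          #include <stddef.h>

          /* Each "type" carries its own behavior... */
          struct shape {
              const char *name;
              double (*area)(const struct shape *);
              double w, h;
          };

          static double rect_area(const struct shape *s) { return s->w * s->h; }
          static double tri_area(const struct shape *s)  { return s->w * s->h / 2.0; }

          int main(void)
          {
              struct shape shapes[] = {
                  { "rectangle", rect_area, 3.0, 4.0 },
                  { "triangle",  tri_area,  3.0, 4.0 },
              };

              /* ...so the caller needs no if/switch on the type: adding a
                 shape adds a row here, not a branch in every function. */
              for (size_t i = 0; i < sizeof shapes / sizeof shapes[0]; i++)
                  printf("%s: %.1f\n", shapes[i].name, shapes[i].area(&shapes[i]));
              return 0;
          }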

  • by Qbertino ( 265505 ) <moiraNO@SPAMmodparlor.com> on Tuesday December 02, 2008 @10:48AM (#25960225)

    There are some good quotes attributed to him, but one in particular goes to show how very wrong even experts can be:

    It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration.

    In my experience this is utter, arrogant rubbish. Not being able to teach good programming to people who know Basic stems only from the inability of people like Dijkstra to teach. It is one of the reasons I usually steer clear of all too academic IT environments and programming languages. Like Ruby or Java.

    • by russotto ( 537200 ) on Tuesday December 02, 2008 @11:09AM (#25960557) Journal

      It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration.

      In my experience this is utter arrogant rubbish.

      You have been trolled (by Dijkstra).

    • Re: (Score:3, Insightful)

      by Fallingcow ( 213461 )

      I like the part where he complained about kindergarten teachers bootstrapping kids into the unnatural, abstract world of mathematics with *gasp* addition of one pile of apples to another.

      He's clearly never tried to teach a 5-year-old about math. Kids can get stuck at very low levels of understanding if you don't guide them up the ladder of abstraction a bit, and examples like that are key.

  • by russotto ( 537200 ) on Tuesday December 02, 2008 @10:50AM (#25960269) Journal

    I don't think I really understand what Dijkstra is getting at here. Can someone explain it to me with a car analogy?

  • by mbone ( 558574 ) on Tuesday December 02, 2008 @10:55AM (#25960343)

    I am very glad I never had him as a professor.

    By the way, I am a physicist, and IMHO all of the best physicists develop a physical intuition about their physics, and that is certainly true with those who deal with quantum mechanics. Listen to the recorded lectures of Richard Feynman, for example.

  • Sorta Kinda Yeah (Score:4, Insightful)

    by Aaron_Pike ( 528044 ) on Tuesday December 02, 2008 @10:55AM (#25960345) Homepage
    He makes some good points, and I agree with a lot of Dijkstra's philosophies, there. I'm totally digging on the concept of radical novelty; I'm a big fan of teaching by cognitive dissonance.

    However, I can't get behind the man's call to teach without compiling and running programs. Well, to be fair, I'd have no problems teaching a freshman college course that way, but I'm teaching teenagers. I want the students to be able to unleash what they have wrought (or at least see that they've created a program that works). That's the bait that hooks them deeper into the coolness that is computer science or software engineering or programming or pattern engineering or thinkyness or whatever you care to call it.

  • Back to the future (Score:5, Interesting)

    by Bucc5062 ( 856482 ) <bucc5062@[ ]il.com ['gma' in gap]> on Tuesday December 02, 2008 @10:59AM (#25960403)

    As I read through his writings it brought me back to my time at Moravian College, circa 1979. I had just started taking CS classes, and in that same year Dr. Brown, head of the CS department, pulled out all the IBM mainframe systems and installed a PDP 11/45. Gone were the COBOL courses, replaced by C, RATFOR, Pascal, Fortran, et al. I loved it and hated it at the same time.

    Like the presentation, Dr. Brown taught us programming before we really saw the computer. His focus was not on language, but on concept. As he so well put it to us, once done with our intro class we could work anywhere in any language. I believed it then and found it to be a true statement. At the end of that intro class he took the last three weeks and taught sort algorithms. The catch was that each sort was analyzed in a different language. I chuckle when I read posts of youngsters that say "I learned Java, or C++ in college". I learned programming in college, then went on to figure out what language suited my economic and intellectual needs.

    Cruelty in Computer Science? I am grateful for that kind of cruelty to this day. Since college I have had to adjust my knowledge as times and needs change. I have had the pleasure of working with RPG, COBOL, Java, FORTRAN, and even the bastard child Visual Basic. Unlike some, I do not look down at any language for each has its benefits for the task. What I do dislike is working on code written by persons who thought that "Learn to Code Java in three Weeks" made them a programmer; that language X is the best and only language out there.

    Dr. Dijkstra says "Universities should not be afraid to teach radical novelties". What things could be discovered if that concept were embraced again?

  • by Plekto ( 1018050 ) on Tuesday December 02, 2008 @11:32AM (#25960965)

    Fantastic article. I especially like this part:

    (1) The business community, having been sold to the idea that computers would make their lives easier, is mentally unprepared to accept that they only solve the easier problems at the price of creating much harder ones.

    So true. I deal with this every day. Despite the high-tech wizardry around us, business still runs pretty much the same; management is just all too happy to throw its problems at someone else. I can't remember how many times I've had a client, boss, or manager ask me for something impossible and tell me "fix it" or "make it work".

  • by Device666 ( 901563 ) on Tuesday December 02, 2008 @11:37AM (#25961043)

    "I mean, if 10 years from now, when you are doing something quick and dirty, you suddenly visualize that I am looking over your shoulders and say to yourself 'Dijkstra would not have liked this', well, that would be enough immortality for me." - Edsger Wybe Dijkstra

    A lot of software engineers like to work with new technologies, new paradigms, new code design patterns, new software development methodologies, and other forms of complexity. Quality Assurance rules by checklists and testing, only fixing symptoms. Every coder has their own ideology about what correct code is, or the correct way to write it: correctness proven by superficial, subjective quality standards, beautifully crafted hacks included.

    Edsger found ways to mathematically prove programs correct, requiring a very high level of math skill (the same level needed to prove the correctness of a mathematical theorem). This is a utopian, objective quality standard: stuff for the real hard-core developers who have plenty of time.

    But most don't have the time. In the end, time-to-market is key. Swift hackers remain the heroes of business, crafting the applications that get used in the real world.

    However, there are some pragmatic things I thank Edsger Wybe Dijkstra for: the invention of the stack; his low opinion of the GOTO statement; the shortest-path algorithm, also known as Dijkstra's algorithm; Reverse Polish Notation and the related shunting yard algorithm; the Banker's algorithm; the concept of operating system rings; and the semaphore construct for coordinating multiple processors and programs (a small P/V sketch in C follows the quotes). His charismatic remarks about what we would typically consider software engineering are entertaining and humbling. Examples:

    • "The use of COBOL cripples the mind; its teaching should, therefore, be regarded as a criminal offence."
    • "APL is a mistake, carried through to perfection. It is the language of the future for the programming techniques of the past: it creates a new generation of coding bums."
    • "The question of whether Machines Can Think... is about as relevant as the question of whether Submarines Can Swim"
    • "The competent programmer is fully aware of the limited size of his own skull. He therefore approaches his task with full humility, and avoids clever tricks like the plague"
    • "Write a paper promising salvation, make it a 'structured' something or a 'virtual' something, or 'abstract', 'distributed' or 'higher-order' or 'applicative' and you can almost be certain of having started a new cult."
    • "The required techniques of effective reasoning are pretty formal, but as long as programming is done by people that don't master them, the software crisis will remain with us and will be considered an incurable disease. And you know what incurable diseases do: they invite the quacks and charlatans in, who in this case take the form of Software Engineering gurus. "
    • "FORTRAN, 'the infantile disorder', by now nearly 20 years old, is hopelessly inadequate for whatever computer application you have in mind today: it is now too clumsy, too risky, and too expensive to use."
    • "In the good old days physicists repeated each other's experiments, just to be sure. Today they stick to FORTRAN, so that they can share each other's programs, bugs included."
    • "It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration."
    • "Programming is one of the most difficult branches of applied mathematics; the poorer mathematicians had better remain pure mathematicians."
    • "We can found no scientific discipline, nor a hearty profession, on the technical mistakes of the Department of Defense and, mainly, one computer manufacturer."
  • False dichotomy (Score:5, Insightful)

    by Tony ( 765 ) on Tuesday December 02, 2008 @11:38AM (#25961073) Journal

    Most of y'all are presenting a false dichotomy. It's not "Either learn abstract formalism OR learn practical languages." You can do both, you know.

    I have met too many people who think that, because they can write some tangled, fucked-up C++, they are software engineers. Never mind the fact that they couldn't learn LISP, Objective-C, Java, or any number of other useful languages, as they don't know the first thing about actual computing.

    Whether you teach Java or C++ doesn't matter. Sure, you need classes on practical application of your knowledge. But if you ignore what Dijkstra says here, you're going to end up with a bunch of code monkeys who have to test every element of the set, rather than test the rules of the set.

    In my experience, those who started off learning theory, then learned how to apply that theory in practical situations, are far better programmers than those who are taught "practical" languages.

    There's some very good advice in that paper. Calling him "out of touch" is a bit shortsighted.

  • by Theovon ( 109752 ) on Tuesday December 02, 2008 @12:15PM (#25961727)

    Towards the end of the essay, an introductory course is described where the student, as programmer, is required to build a formal mathematical definition of his program and prove that his program conforms to the definition.

    At The Ohio State University, we teach precisely this in the form of the "Resolve" programming language. For every function and every block of code, one must provide both the procedural code and a set of logical constraints that describe the effects of the process. For instance, if a function's job is to reverse the order of items in a list, then the internals of the function are accompanied by a logical constraint that tracks the movement of items from one list to another and ensures that the operation is consistent at every step: for instance, that the sizes of the two lists always sum to the size of the input, and that concatenating the lists in a particular way yields the original list. (I'm summarizing; a rough illustration follows.)
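
    Resolve checks such constraints statically; as a rough dynamic stand-in (my own toy C, not actual Resolve), you can watch the same invariant hold at every step with assert:

        #include <assert.h>
        #include <stdio.h>

        #define N 5

        /* Reverse src into dst one element at a time, asserting the
           Resolve-style invariant at every step: the two lengths always
           sum to the input length, and dst holds the first `moved`
           elements of src in reverse order. */
        void reverse_checked(const int *src, int *dst)
        {
            int remaining = N, moved = 0;
            while (remaining > 0) {
                dst[moved] = src[remaining - 1];
                moved++; remaining--;

                assert(moved + remaining == N);   /* sizes sum to input size */
                for (int i = 0; i < moved; i++)   /* reversal holds so far */
                    assert(dst[i] == src[N - 1 - i]);
            }
        }

        int main(void)
        {
            int src[N] = {1, 2, 3, 4, 5}, dst[N];
            reverse_checked(src, dst);
            for (int i = 0; i < N; i++)
                printf("%d ", dst[i]);            /* prints 5 4 3 2 1 */
            printf("\n");
            return 0;
        }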

    An active area of research here is automatic provers that take your code and your logical definitions and actually prove whether or not they match.

  • Mathematics (Score:3, Interesting)

    by ProteusQ ( 665382 ) <dontbother&nowhere,com> on Tuesday December 02, 2008 @12:15PM (#25961731) Journal

    I haven't read this essay before, and have only had time to skim part of it now, but Dijkstra's criticism of mathematicians has merit, IMHO. [I have an MS in Math, so I don't claim to be an expert.] It's been over 40 years since the introduction of nonstandard analysis (including hyperreals and later surreal numbers), but it's still not a mainstream topic. In fact, it boggles my mind why a professor or department would choose to teach their students "hard" analysis (Bartle or Royden) instead of "soft" analysis (Rudin) -- "soft" analysis uses topological results to arrive at key theorems faster, while "hard" uses Real Analysis as if it were the only option on the market. Not that there's anything wrong with using Rudin and Royden in tandem, but to ignore topological results altogether smacks of willful ignorance. To paraphrase Stephen King: do you expect brownie points for being ignorant of your own discipline?

    The light on the horizon is the development of proof via programming, as covered by /. I'm sure there will be 50+ years of mathematicians screaming and kicking to avoid its introduction into the mainstream, but that will change the first time a computed proof that could not have been developed in one lifetime via the usual methods earns someone an Abel Prize. Until then, I suspect Dijkstra's point will still stand.

  • I can relate... (Score:5, Insightful)

    by gillbates ( 106458 ) on Tuesday December 02, 2008 @01:17PM (#25962781) Homepage Journal

    But I think his arguments are centered around a misunderstanding of terms. It's simple academic dishonesty to which he objects:

    • Computer Science is a discipline of mathematics, a true science.
    • Computer Engineering is an engineering discipline, concerned with how to use the principles of computer science to create working systems. Provable correctness is a must; you can't just respin a board until it works. Generally speaking, things have to work right the first time.
    • Software Engineering is a vocational discipline, which requires some knowledge of computer science, in the same way a construction foreman needs to know basic math. Software engineering requires both an understanding of programming and the corporate structure for producing software. But it is more a collection of specific pragmatic methods than an exact science.
    • Computer Programming is a vocational discipline. It also requires a basic understanding of computer science, but for some jobs, notably business applications, it is enough to merely understand the language du jour. Regardless of how terrible the code is, from a business perspective someone who produces code that can be shipped in a short period of time is a good programmer. The corporate bean counters couldn't care less about things like correctness and maintainability; they are more interested in the state of accounts receivable. The programmer who helps out the accounts-receivable side will be better liked regardless of the quality of his code, and probably promoted to management.
  • by volkris ( 694 ) on Tuesday December 02, 2008 @01:37PM (#25963125)

    IMO there needs to be a starker contrast between computer science and computer engineering, just as there's a contrast between "real" engineering and, say, physics.

    Those who just want to be able to program, who are focused purely on employability right out of college, can look for computer engineering courses teaching the popular programming languages. These people can be fine programmers, ready to start... so long as the language's popularity hasn't changed by the time they graduate. It would be a sort of advanced vocational program, just like any other engineering.

    But the real scientists, those who want to experience and express code on deeper levels, should be looking for something very different, that which Dijkstra describes. Just as a scientist and an engineer can work on the same project contributing different skills, the computer scientist has his place even in the real world.

    The two really are different mentalities, and it seems that the mixture of the two leads to situations that are non-ideal for either.
