Education

Twenty Years of Dijkstra's Cruelty

WatersOfOblivion writes "Twenty years ago today, Edsger Dijkstra, the greatest computer scientist to never own a computer, hand wrote and distributed 'On the Cruelty of Really Teaching Computer Science' (PDF), discussing the then-current state of Computer Science education. Twenty years later, does what he said still hold true? I know it is not the case where I went to school, but have most schools corrected course and are now being necessarily cruel to their Computer Science students?" Bonus: Dijkstra's handwriting.
  • engineering (Score:2, Insightful)

    by Lord Ender ( 156273 ) on Tuesday December 02, 2008 @10:56AM (#25959399) Homepage

    Regardless of the state of Computer Science, what most students studying the subject really are after is software engineering. The world doesn't need more people arguing over P=NP; it needs people who can build (and manage projects to build) software systems which solve real-world problems.

  • Re:Hmmm... (Score:4, Insightful)

    by ExtraT ( 704420 ) on Tuesday December 02, 2008 @10:58AM (#25959419)

    I agree that teachers disconnected from reality are bad, but the alternative is even worse. Look at what too much bitching got us: they teach Java as the primary programming language in universities nowadays! How sadistically cruel is that?

  • by noldrin ( 635339 ) on Tuesday December 02, 2008 @10:58AM (#25959423)
    Sounds like a typical computer science professor. Mine usually couldn't use a computer at all. And yes, mine were generally very cruel: giving examples that they figured out months later were wrong, making us code with pen and paper, teaching fake assembly languages and fake operating systems.

    I'm glad I left, because I can actually use a computer now, unlike many of the coders I come across. If you like computers, don't go into computer science. That is for people who enjoy math and theory.
  • Cruel to be kind (Score:5, Insightful)

    by MosesJones ( 55544 ) on Tuesday December 02, 2008 @10:59AM (#25959433) Homepage

    The aim of a really good degree (as opposed to a lecture-driven, box-ticking one) is to be cruel: you want to feel that your head is going to explode and that your subject really is an absolute bitch.

    Then you graduate and find out the real world is easier than the theory.

    Cruelty is important in a good education to make you achieve in the real world. An easy flow-through degree gets you the cert but gives you unrealistic expectations of how hard the real world can be.

    Personally, my degree was a mind-bending bitch of mumbling lecturers and impossible (literally, in some cases) questions that covered everything from quantum mechanics "basics" and abstract computing theory through to how to dope a transistor.

    It was cruel, it was unusual... it was great.
     

  • Re:Hmmm... (Score:5, Insightful)

    by fbjon ( 692006 ) on Tuesday December 02, 2008 @11:02AM (#25959477) Homepage Journal
    There's nothing exceptionally wrong with Java as a starting language, though I may be biased since that's what we had. In any case, my uni has now switched to Python, which is probably even better.
  • by mangu ( 126918 ) on Tuesday December 02, 2008 @11:05AM (#25959527)

    > CS programs producing students who know loads and loads of theory and can't write a damn line of actual code.

    I agree with that, but it isn't only in CS courses that programming should be taught.

    The problem I see in current engineering and science courses is that they don't teach numerical analysis. Engineers and scientists today try to do everything in Matlab or Excel, except for those who do postgraduate courses, who often try to do things in Fortran.

    Programming languages are tools that anyone involved with advanced uses of computers should learn to use. If you are a professional you should know how to use professional tools.

  • Re:The Text (Score:4, Insightful)

    by dkleinsc ( 563838 ) on Tuesday December 02, 2008 @11:11AM (#25959629) Homepage

    I'd think Ada Lovelace would have a better claim there, given that not only did she never own a computer, they didn't even exist and there was at the time no such thing as a "program".

  • Re:Hmmm... (Score:5, Insightful)

    by RobKow ( 1787 ) on Tuesday December 02, 2008 @11:13AM (#25959645)

    I'd have to say in recruiting software engineers I have much more of a problem with theory-light code monkeys than I do with non-coders that are well-versed in CS theory. With the former you wind up with people who can't leave whatever language they're most familiar with and don't really understand why what they're doing works (cargo cult programming). It's easier to teach good coding practices in the field than it is CS theory.

    My technical interviews aren't full of riddles or obscure CS theory questions, but I ask a series of pointed questions to see if the candidate has a good familiarity with the various language families (not just particular languages), common data structures (they should at least have encountered them, even if they need to look them up to implement them), and can talk in terms of pseudocode and algorithms instead of just library functions and language idiom. Language experience is a plus, but definitely not required.

  • by Malc ( 1751 ) on Tuesday December 02, 2008 @11:19AM (#25959745)

    Why are fake assembly and a fake OS cruel? It's computer science, not a vocational tech course. They've presumably tried to bypass the issues of real-world systems that distract you from learning the point. Once you've got the basic concepts, any OS and any language become approachable -- why would you want to learn something specific that would be out of date in short order? Seems rather myopic to me.

  • by thermian ( 1267986 ) on Tuesday December 02, 2008 @11:21AM (#25959781)

    I don't think you can lay the blame for students knowing less on the department that they attend.

    Mine taught a good mix of coding and theory, but we still had morons who didn't do enough coding to actually learn their craft well, and people who learned the coding but didn't learn enough theory to get decent course grades.

    The point is, while at university studying computer science or any other subject, it is your own responsibility to teach yourself around the subjects you are introduced to in the classroom.

    I was taught using Java and Delphi, and yet finished my degree as a pretty competent C coder, in spite of never having attended a class on that language.

    I also studied a great deal more around the subjects than many of my peers. Those who did the same tended to do well on graduation; I myself went on to more years of poverty as a Ph.D. student.

  • Re:engineering (Score:5, Insightful)

    by Lord Ender ( 156273 ) on Tuesday December 02, 2008 @11:22AM (#25959787) Homepage

    I don't know what you think FrontPage has to do with anything. Perhaps you're just trolling?

    Software engineers should understand use case analysis, user interface design, project management and finance, and many other important subjects that "computer science" curricula ignore while beating students over the head with detailed theory. Understanding issues of scalability is good (though in the engineering world, actual testing is often used for practical reasons), but we don't need four years of that while ignoring more important topics.

    I'm not saying exhaustive study of the mathematical theory of computation is bad. I'm saying students are badly served at most universities by focusing on that at the expense of other topics.

  • by mmkkbb ( 816035 ) on Tuesday December 02, 2008 @11:26AM (#25959849) Homepage Journal

    Honestly, it's more important to learn how to learn new languages than to learn any specific one. The specific language will change far more often than the concepts it embodies.

  • Recursive Descent (Score:3, Insightful)

    by tomhath ( 637240 ) on Tuesday December 02, 2008 @11:26AM (#25959853)
    "Right from the beginning, and all through the course, we stress that the programmers task is not just to write down a program, but that his main task is to give a formal proof that the program he proposes meets the equally formal functional specification."

    Dijkstra was a genius and made many contributions to Comp. Sci. But his suggestion that a program (really a program design) should be accompanied by a formal proof has problems at both ends of the development cycle: how do you prove that the formal specification is what the customer wanted, and how do you prove that the code actually implements the design?

    I've seen automated testing products that claim to do both, but in order to make them work you have to specify every variable and every line of code as the "requirements", then compare what the tool thinks the result should be to the output of the program. And yes, the vendor suggested that the business analysts write the formal requirements; you can imagine how well that worked.
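
    To be fair, the sort of proof Dijkstra had in mind is tractable at small scale. A toy sketch in Java (the invariants are just comments here, checked by nothing -- which is rather the problem at industrial scale):

        // Spec: returns a[0] + a[1] + ... + a[a.length - 1].
        static int sum(int[] a) {
            int s = 0;
            int i = 0;
            // Invariant: s == a[0] + ... + a[i-1], with 0 <= i <= a.length.
            while (i < a.length) {
                s = s + a[i];
                i = i + 1;
                // The invariant holds again at this point.
            }
            // On exit i == a.length, so the invariant gives exactly the spec.
            return s;
        }

    Scaling that discipline from ten lines to ten million is where the argument starts.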

  • Re:Hmmm... (Score:5, Insightful)

    by amorsen ( 7485 ) <benny+slashdot@amorsen.dk> on Tuesday December 02, 2008 @11:27AM (#25959883)

    > i.e. CS programs producing students who know loads and loads of theory and can't write a damn line of actual code.

    That's because CS programs are misnamed. Most coding should be done by engineers, not scientists. A master's in physics doesn't necessarily qualify you to build bridges either.

  • by SatanicPuppy ( 611928 ) * <SatanicpuppyNO@SPAMgmail.com> on Tuesday December 02, 2008 @11:31AM (#25959921) Journal

    In a practical field where you can do real assembly and work on a real OS, you ask why doing fake make-work is cruel?

    Theory is fine, but theory shouldn't trump practical application in a field where practical applications are everywhere.

  • GIGO (Score:4, Insightful)

    by TheWoozle ( 984500 ) on Tuesday December 02, 2008 @11:32AM (#25959947)

    The glaring hole in Dijkstra's argument is that most software is built to automate what used to be manual processes, and it therefore has to mimic a human-centric process... which is inherently illogical, inefficient, and rife with nonsense.

    In the world outside the ivory tower, programmers do not have the freedom to create completely logical, functionally complete programs that can be reduced to a mathematical proof.

    Next time your boss comes to you with a project to write an application, show him this paper, explain that what he's asking for is "medieval thinking", and then see if you can explain why you should keep your job if you don't want to do the work.

  • by SageinaRage ( 966293 ) on Tuesday December 02, 2008 @11:39AM (#25960067)
    It's the equivalent of a biology class detailing the possibilities of life by examining chemical interactions, without ever examining any actual living organisms.
  • Self-confidence (Score:4, Insightful)

    by dgenr8 ( 9462 ) on Tuesday December 02, 2008 @11:45AM (#25960177) Journal

    All I have to do is read one paper like this to be reminded why I stayed out of academia. Ah the smugness, the hypocrisy, the great irony. A "radical novelty" this essay is not.

    There are plenty of truths out there yet to be discovered. Unfortunately most academics lack the self-confidence to go looking for them and instead find clever new ways to twist ideas around.

  • Re:Hmmm... (Score:3, Insightful)

    by camperdave ( 969942 ) on Tuesday December 02, 2008 @11:46AM (#25960185) Journal
    He's right, though. Writing the code is secondary. What matters is algorithms, data structures, and proper analysis. Anyone with a computer and an internet connection can learn how to write a line of code. Heck, a modern integrated development environment almost writes the code for you. However, knowing what to write is a lot more important than knowing how to write it. One can write iterative code for traversing a linked list, or one can write recursive code. Knowing which is best can be what separates the hacker from the script kiddie.
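
    For instance, here are both traversals as a throwaway Java sketch (the names are made up). Knowing that the recursive one costs a stack frame per node -- and when that matters -- is the part no IDE will type for you:

        class Node { int value; Node next; }

        class ListDemo {
            // Iterative: constant stack space no matter how long the list is.
            static boolean containsIter(Node head, int target) {
                for (Node n = head; n != null; n = n.next)
                    if (n.value == target) return true;
                return false;
            }

            // Recursive: one stack frame per node, so a long enough list
            // overflows the stack (Java does not eliminate tail calls).
            static boolean containsRec(Node head, int target) {
                if (head == null) return false;
                return head.value == target || containsRec(head.next, target);
            }
        }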
  • by SageinaRage ( 966293 ) on Tuesday December 02, 2008 @11:47AM (#25960205)

    The thing that confuses me the most about this paper is his hatred for using anthropomorphic metaphors to describe the functioning of a program. It confuses me partly because his examples of why it's bad don't seem to actually address his complaint, or anything like it. But also, because the more I think about it, they seem to fit very well.

    Program components may not be living things, but they do run autonomously. They perform actions, frequently without any input from me. They seem to do things and interact with other components on their own - why not describe it as such? There's also the fact that he doesn't give any alternate way of describing what's going on inside a program.

  • by Qbertino ( 265505 ) <moiraNO@SPAMmodparlor.com> on Tuesday December 02, 2008 @11:48AM (#25960225)

    There are some good quotes attributed to him, but one in particular goes to show how very wrong even experts can be:

    It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration.

    In my experience this is utter, arrogant rubbish. Not being able to teach good programming to people who know BASIC stems only from the inability of people like Dijkstra to teach. It's one of the reasons I usually steer clear of all too academic IT environments and programming languages, like Ruby or Java.

  • by 140Mandak262Jamuna ( 970587 ) on Tuesday December 02, 2008 @11:51AM (#25960281) Journal
    The cruelty is not being inflicted upon the students; it is inflicted on the teachers. All these dumb kids got through high schools with endemic grade inflation, not knowing the first thing about anything but armed with supreme self-confidence. Teaching them anything is difficult. They can't figure out simple free-body diagrams in freshman physics, get all confused by the p orbitals and electrons in chemistry, and can't differentiate a polynomial if their life depended on it, but they all come and sit in the computer science class and expect to be taught using innovative, interesting, creative techniques that require no effort on their part. Being a college prof dealing with freshmen is a punishment, I tell you.
  • Re:Hmmm... (Score:5, Insightful)

    by doti ( 966971 ) on Tuesday December 02, 2008 @11:52AM (#25960295) Homepage

    You serious?

    To me, System.out.println looks way more reasonable than this "cout << endl" thing.

  • Sorta Kinda Yeah (Score:4, Insightful)

    by Aaron_Pike ( 528044 ) on Tuesday December 02, 2008 @11:55AM (#25960345) Homepage
    He makes some good points, and I agree with a lot of Dijkstra's philosophies there. I'm totally digging on the concept of radical novelty; I'm a big fan of teaching by cognitive dissonance.

    However, I can't get behind the man's call to teach without compiling and running programs. Well, to be fair, I'd have no problems teaching a freshman college course that way, but I'm teaching teenagers. I want the students to be able to unleash what they have wrought (or at least see that they've created a program that works). That's the bait that hooks them deeper into the coolness that is computer science or software engineering or programming or pattern engineering or thinkyness or whatever you care to call it.

  • by tucuxi ( 1146347 ) on Tuesday December 02, 2008 @12:11PM (#25960591)

    I think both approaches (top-down and bottom-up) make sense. You learn C very fast if you can think of it as a high-level assembler. And learning assembler teaches you a lot about what computers are all about and what they can and cannot do.

    But learning algorithms in C or other low-level, manage-your-own-memory languages is unnecessarily painful and error-prone. The algorithm exists independently of any specific language incarnation. Learn the algorithms in a language that makes it easy to concentrate on the problem and not get lost in a thousand small implementation details.

    You reach enlightenment when you can bridge the gap from very low level to the highest levels. But it is folly to try to do everything at once.

  • by Amazing Quantum Man ( 458715 ) on Tuesday December 02, 2008 @12:13PM (#25960621) Homepage

    Oh, I don't know....

    When I was in college, we (the students) were pushing for the CIS department to offer a course in...

    wait for it...

    VAX assembly.

    That's real useful now, isn't it?

  • Re:Hmmm... (Score:5, Insightful)

    by edwinolson ( 116413 ) on Tuesday December 02, 2008 @12:24PM (#25960807) Homepage

    I find it ironic that, to establish your argument that Java hides implementation details, you used a C++ example employing operator overloading such that the mere existence of functions is utterly concealed.

  • That sounds like just about every biology class I've ever heard of.

  • by EastCoastSurfer ( 310758 ) on Tuesday December 02, 2008 @12:25PM (#25960851)

    There is nothing wrong with learning Java, although that's not what you should be doing for 4 years of a CS degree. During my degree some classes used Java, some C, Python, whatever the teacher felt like that semester (or whatever the class required). It was the student's responsibility to learn the language on their own time. Class time was for learning about the theory that underpins *all* languages, and other big topics that span multiple languages, like OO.

  • Re:The Text (Score:5, Insightful)

    by chunkyq ( 995864 ) on Tuesday December 02, 2008 @12:26PM (#25960873)
    Turning? TurNing?!
    Woe be to us, for all is certainly lost.
  • by Max Webster ( 210213 ) on Tuesday December 02, 2008 @12:30PM (#25960927)

    I think the problem is the false assumptions and analogies that get introduced by these lines of thinking. If a network is "this guy talking to that guy", your thinking will be constrained by what you know about human conversation. If there's a problem, someone can talk louder, slower, etc. and the analogy holds. But if the solution involves something that has no equivalent in the day-to-day world, how are you going to conceptualize it?

    My pet peeve, that descends from this same idea, is from the teaching of object orientation. A car is a type of vehicle; a truck is a type of vehicle; they both have wheels; maybe the number of wheels is different; maybe each has a different turning radius or procedure for backing up.

    Great, now you understand inheritance, polymorphism, member functions, etc. However, in practice, you use object orientation to avoid zillions of "if" statements, special case code, large blocks of almost-but-not-quite duplicated code.

    In my experience, someone who learned OO by simple analogies is likely to be locked into thinking "I have to write every program by making a tree of classes like with the cars and trucks", rather than "there are different ways to structure the code, and a good OO way will get rid of a lot of branching, magic numbers, and redundant code".
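
    A contrived Java sketch of the difference (the formulas and names are invented for illustration):

        // The branchy version: every new vehicle type means another branch
        // in every function that cares about vehicles.
        static double brakingDistance(String kind, double speed) {
            if (kind.equals("car")) return speed * speed / 20.0;
            else if (kind.equals("truck")) return speed * speed / 12.0;
            else throw new IllegalArgumentException("unknown kind: " + kind);
        }

        // The OO version: the variation lives in the types, so adding a
        // vehicle adds a class instead of touching every if-chain.
        interface Vehicle { double brakingDistance(double speed); }
        class Car implements Vehicle {
            public double brakingDistance(double s) { return s * s / 20.0; }
        }
        class Truck implements Vehicle {
            public double brakingDistance(double s) { return s * s / 12.0; }
        }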

  • by Plekto ( 1018050 ) on Tuesday December 02, 2008 @12:32PM (#25960965)

    Fantastic article. I especially like this part:

    (1) The business community, having been sold to the idea that computers would make their lives easier, is mentally unprepared to accept that they only solve the easier problems at the price of creating much harder ones.

    So true. I deal with this every day. Despite the high-tech wizardry around us, business still runs pretty much the same; it's just that management is all too happy to throw its problems at someone else. I can't remember how many times in the past I've had a client, boss, or manager ask me for something that is impossible and tell me to "fix it" or "make it work".

    Boy, do you need to go back to school. Edsger wrote more and better stuff in his lifetime than anyone here on Slashdot. Did you ever get directions from Google Maps or MapQuest? Thank Edsger -- his shortest path algorithm is what they all use, and by the way, he wrote it before most of you were born. You know the semaphores used in the multi-CPU Linux kernels? Yep, you owe Edsger for them, too. And programming languages like C, Pascal, etc.? He helped write the first Algol compiler, the great-granddaddy of them all, once again before most of you were born.

    Just because he eschewed the run-break-fix approach so beloved of the folks who are spewing billions of lines of error-laden code into the world today doesn't mean much: he had forgotten more about writing code than most folks here have ever learned. And yes, he advocated developing code formally, and he liked to do it with pen and paper.

    So learn about who you're making snide comments about, and show some respect. When people are still using any algorithm you came up with 30 years from now, you will have the right to say something about Edsger Dijkstra.

  • by Device666 ( 901563 ) on Tuesday December 02, 2008 @12:37PM (#25961043)

    "I mean, if 10 years from now, when you are doing something quick and dirty, you suddenly visualize that I am looking over your shoulders and say to yourself 'Dijkstra would not have liked this', well, that would be enough immortality for me." - Edsger Wybe Dijkstra

    A lot of software engineers like to work with new technologies, new paradigms, new code design patterns, new software development methodologies, and other forms of complexity. Quality Assurance rules by checklists and testing, only fixing symptoms. Every coder has his own ideology about what correct code is, or the correct way to write it. Correctness is judged by superficial, subjective quality standards, beautifully crafted hacks included.

    Edsger found ways to mathematically prove programs correct, which requires a very high level of math skill (the same level needed to prove a mathematical theorem). This is a utopian, objective quality standard -- stuff for the real hard-core developers who have plenty of time.

    But most haven't the time. In the end, time-to-market is key. Swift hackers remain the heroes of business, crafting the applications that get used in the real world.

    However, there are some practical things I thank Edsger Wybe Dijkstra for: the invention of the stack; his low opinion of the GOTO statement; the shortest-path algorithm, also known as Dijkstra's algorithm; Reverse Polish Notation and the related shunting-yard algorithm; the Banker's algorithm; the concept of operating system rings; and the semaphore construct for coordinating multiple processors and programs. His charismatic remarks about what we would typically consider software engineering are entertaining and humbling. Examples:

    • "The use of COBOL cripples the mind; its teaching should, therefore, be regarded as a criminal offence."
    • "APL is a mistake, carried through to perfection. It is the language of the future for the programming techniques of the past: it creates a new generation of coding bums."
    • "The question of whether Machines Can Think... is about as relevant as the question of whether Submarines Can Swim"
    • "The competent programmer is fully aware of the limited size of his own skull. He therefore approaches his task with full humility, and avoids clever tricks like the plague"
    • "Write a paper promising salvation, make it a 'structured' something or a 'virtual' something, or 'abstract', 'distributed' or 'higher-order' or 'applicative' and you can almost be certain of having started a new cult."
    • "The required techniques of effective reasoning are pretty formal, but as long as programming is done by people that don't master them, the software crisis will remain with us and will be considered an incurable disease. And you know what incurable diseases do: they invite the quacks and charlatans in, who in this case take the form of Software Engineering gurus. "
    • "FORTRAN, 'the infantile disorder', by now nearly 20 years old, is hopelessly inadequate for whatever computer application you have in mind today: it is now too clumsy, too risky, and too expensive to use."
    • "In the good old days physicists repeated each other's experiments, just to be sure. Today they stick to FORTRAN, so that they can share each other's programs, bugs included."
    • "It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration."
    • "Programming is one of the most difficult branches of applied mathematics; the poorer mathematicians had better remain pure mathematicians."
    • "We can found no scientific discipline, nor a hearty profession, on the technical mistakes of the Department of Defense and, mainly, one computer manufacturer."
  • False dichotomy (Score:5, Insightful)

    by Tony ( 765 ) on Tuesday December 02, 2008 @12:38PM (#25961073) Journal

    Most of y'all are presenting a false dichotomy. It's not "Either learn abstract formalism OR learn practical languages." You can do both, you know.

    I have met too many people who think that, because they can write some tangled, fucked-up C++, they are software engineers. Never mind the fact that they couldn't learn LISP, Objective-C, Java, or any number of other useful languages, as they don't know the first thing about actual computing.

    Teaching Java or C++ doesn't matter. Sure, you need classes on practical application of your knowledge. But if you ignore what Dijkstra says here, you're going to end up with a bunch of code monkeys who have to test every element of the set, rather than test the rules of the set.

    In my experience, those who started off learning theory, then learned how to apply that theory in practical situations, are far better programmers than those who are taught "practical" languages.

    There's some very good advice in that paper. Calling him "out of touch" is a bit shortsighted.

  • by orclevegam ( 940336 ) on Tuesday December 02, 2008 @12:39PM (#25961093) Journal
    Agreed, with the exception that I think assembly should be taught prior to Java or even C. With C, a pointer is a rather abstract concept, but with assembly it's a very concrete and easy-to-understand part of the language.
  • Re:The Text (Score:1, Insightful)

    by Anonymous Coward on Tuesday December 02, 2008 @12:47PM (#25961225)

    > Some people want photons in pretty patterns on
    > their screens.

    Symbols (photons). Symbols (pretty patterns).

    > Some people want the control surfaces of
    > aircraft actuated in ways that save
    > time/money/lives

    Symbols (control surfaces). Symbols (actuated).
    Symbols (time). Symbols (money).

  • by TheRaven64 ( 641858 ) on Tuesday December 02, 2008 @12:49PM (#25961265) Journal
    And if you want to really understand a language, you should learn a lower-level and a higher-level one. If you want to write good Java code, spend some time studying C and Smalltalk. If you want to write good C, learn an assembly language or two and something like Java.
  • Re:The Text (Score:5, Insightful)

    by theaveng ( 1243528 ) on Tuesday December 02, 2008 @12:50PM (#25961279)

    No.

    (1) His argument is that to discuss "software engineering" is as silly as to discuss "algebra engineering" or "formula engineering" in math courses. A program is simply a formula to be executed - nothing more - says the Computer Science professor.

    (2) Programs manipulate numbers. Mathematical formulae manipulate numbers. The conclusion he reached -- that a program is merely a formula -- is entirely reasonable.

    (3) Putting pretty pictures on screen or manipulating airplane surfaces (my specialty) is still just formula execution.

  • Re:The Text (Score:5, Insightful)

    by 1729 ( 581437 ) <.moc.liamg. .ta. .9271todhsals.> on Tuesday December 02, 2008 @12:55PM (#25961347)

    > More than not being "great", he seems to be rather foolish...

    > 1) His main premise is that "software engineering" cannot exist because software cannot be proved correct

    Actually, Dijkstra spent a lot of time showing how to prove a program's correctness. See his "A Discipline of Programming", for example.

  • by Bozdune ( 68800 ) on Tuesday December 02, 2008 @12:57PM (#25961381)

    Maybe they should choose a different line of work?

  • Re:Hmmm... (Score:3, Insightful)

    by Hatta ( 162192 ) on Tuesday December 02, 2008 @01:02PM (#25961471) Journal

    That's because it's computer science and not engineering. The problem is that people conflate the two. If you want a coder, hire a coder not a scientist.

  • by moderatorrater ( 1095745 ) on Tuesday December 02, 2008 @01:09PM (#25961597)

    > The C-first approach leads to an early focus on low-level details. It also obscures programming style and design issues by forcing the student to face many technical difficulties to express anything interesting.

    Expressing interesting things doesn't happen in a CS course, at least not in the ones where you're learning the language. It takes new CS students hours to implement the simplest linked list because it's not familiar to them. I learned low-level programming first, and I'm finding that it's the best way to teach my sister-in-law, who's a beginning CS student. They're trying to teach her object-oriented features before they teach arrays or loops. Objects are constructs built on top of the other programming concepts and should be taught as such. It was only after showing her how to use low-level features that she was able to start doing any semblance of OO programming.

    People get so caught up trying to teach the "right" way to program that they don't teach how to program first, which is a mistake. Students need to learn the power and wonder of while, for, and regular functions before you can teach them the power of object-oriented programming. Computer science is unfamiliar and strange; let students learn the simple things before throwing the advanced concepts at them.

    I guess what I'm saying is that a good course would teach programming with plain functions first, and object-oriented programming later in the same course.
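
    Concretely, something on this order is all a first lesson needs -- a loop, a condition, a function -- before anyone says the word "class" (a toy Java sketch; the names are mine):

        // Counts how many entries of values equal target.
        static int countMatches(int[] values, int target) {
            int count = 0;
            for (int v : values) {
                if (v == target) count++;
            }
            return count;
        }

    Once that is second nature, "an object bundles data like values with functions like countMatches" is an easy step instead of a mystery.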

  • by 0xABADC0DA ( 867955 ) on Tuesday December 02, 2008 @01:17PM (#25961763)

    > You're right. Also, Stroustrup has clearly pointed out (in other arguments) that C is not the best way to learn C++ (or OO in general):

    Somebody quoting Stroustrup on the topic of computer languages... seriously? C++ is like legalese -- it's impenetrable to read, full of unintended consequences, and even though it's spelled out in excruciating detail what it says is still open to interpretation.

    Not only is C the first subset of C++ that programmers should learn, it is the only subset of C++ they should learn.

    And I argue that C actually teaches people more about object orientation than most other languages, because it teaches them in no uncertain terms why you should use objects. It's harder to realize why you don't just make fields public until you've seen global variables in a C program.

    Then Java teaches you how to do OO where you are not allowed to 'cheat' by replacing methods at runtime, or calling methods that don't exist, etc. And JavaScript takes all that and gives you LISP power.
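
    To put the public-fields point in code, a deliberately contrived Java sketch:

        // Public field: any code anywhere can scribble on it,
        // which is exactly the trouble with a C global.
        class Account { public int balance; }

        // Private field: the "never negative" rule lives in one place,
        // so there is one place to look when it breaks.
        class SafeAccount {
            private int balance;
            public void withdraw(int amount) {
                if (amount < 0 || amount > balance)
                    throw new IllegalArgumentException("bad amount: " + amount);
                balance -= amount;
            }
            public int getBalance() { return balance; }
        }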

  • by Fallingcow ( 213461 ) on Tuesday December 02, 2008 @01:18PM (#25961795) Homepage

    I like the part where he complained about kindergarten teachers bootstrapping kids into the unnatural, abstract world of mathematics with *gasp* addition of one pile of apples to another.

    He's clearly never tried to teach a 5-year-old about math. Kids can get stuck at very low levels of understanding if you don't guide them up the ladder of abstraction a bit, and examples like that are key.

  • by Darth_Burrito ( 227272 ) on Tuesday December 02, 2008 @01:20PM (#25961821)

    Rule Number 1 of Computer Science: don't reinvent the wheel. Everyone who invents their own education-focused language that's only used at their school is violating the first rule of computer science. At my university, the first year of programming is taught in a Java-like variant of C++ called Resolve C++. Why is it bad?

    1. It hurts students applying for co-ops/internships because none of the employers know what it means.
    2. It hurts students because there is about a thousandth of a percent as much information available about Jo-bob's made-up educational language as there is for something like Java.
    3. It hurts students because they don't feel like they can go home and build useful projects in weird language XYZ on their own time.
    4. It wastes time because you are rolling your own instead of reusing something tried and true. As the needs of the modern world change, will your made-up language change with them, or will it stagnate like everything else in academia?
    5. By rolling your own when a veritable shitstorm of acceptable solutions exists, you are setting a horrible example for your students.
    6. Most importantly, while students are not always right, they absolutely hate being forced to learn some professor's pet project, and if something useful is only slightly harder to teach, why piss them off?
  • by TheRaven64 ( 641858 ) on Tuesday December 02, 2008 @01:23PM (#25961865) Journal
    Java makes a terrible teaching language for a number of reasons.
    • It isn't a good language for teaching object-orientation because it isn't a pure OO language so you have to understand the difference between objects and intrinsics.
    • It isn't a good introduction to programming, because you need a fairly complex program (a class with a static method) just for hello world -- see the sketch after this list.
    • It isn't a good introduction to data structures because all objects are references and all intrinsics are not, making aliasing difficult to teach.
    • It isn't a good introduction to computer memory, since it adopts the Smalltalk memory model which hides almost everything from the programmer (great for using, bad for teaching).

    There are lots of other reasons that I am too lazy to list here. Learning Java is not bad, but learning it as a first language does not make your life easy. A good introduction to programming course should cover half a dozen languages, as case studies, rather than attempting to use one to teach all of programming.
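
    For reference, this is the minimal Java hello world the second point is complaining about -- a class, a static method, a string array parameter, and a two-deep member access, all before a single concept of interest appears:

        public class Hello {
            public static void main(String[] args) {
                System.out.println("Hello, world");
            }
        }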

  • by Garse Janacek ( 554329 ) on Tuesday December 02, 2008 @01:25PM (#25961905)

    > almost at the end of their Computer Science degree, and who didn't understand why their code was crashing with "null references" when "Java was supposed to get rid of all that memory stuff!"

    If they can almost finish a degree and still be surprised when Java dies with a null reference, then the problem isn't that they were taught to program using Java, it's that they were taught Java very badly...

  • Re:The Text (Score:3, Insightful)

    by WhiplashII ( 542766 ) on Tuesday December 02, 2008 @01:31PM (#25961999) Homepage Journal

    I agree that is what he is saying - I disagree that it is a reasonable thing to say ;)

    A program is nothing without the computer it runs on. I could agree that if you were building programs for the purpose of knowledge (like he does, presumably), you are not an engineer - you are a mathematician. But that is not what computers are! Computers are a way to make something happen. By using generic computers running programs, we can more easily make complicated stuff happen.

    The airplane surface you actuate is engineered - but not by itself, it is engineered along with the software used to control it. The vast majority of software is engineered - engineers are trying to get a specific outcome, they are not trying to calculate something. Any calculation is incidental to the primary purpose.

    As a mathematician, he thinks the math is the most important part of it. Physics guys think the same -- theory over substance in transistor design, for example. (Chemical-mechanical polishing was a bad word for a long time!) Engineers just do what works -- we don't care if it is perfect.

    People should not be studying abstract programming in college in order to learn applied engineering for industry. That's why the CS degree is so disparaged - it is great if you want to work in a university thinking up cool math, but it is not very good at building real stuff.

  • by rocker_wannabe ( 673157 ) on Tuesday December 02, 2008 @01:33PM (#25962023)

    It is really scary how few people really understood what Dijkstra was saying. My best guess is that most people learn programming backwards, IMHO -- starting with the languages and only later learning higher-level logical constructs like state machines, or even lists and maps -- and that this has permanently skewed their perspective.

    My experience, after over 10 years as a system test engineer on software systems, is that poor software really does come from a lack of discipline in starting with mathematical constructs before writing the code. I generally worked with very experienced programmers who didn't make a lot of language errors, like improper use of pointers, but who were still tripped up by things like when to free allocated memory -- the kind of errors that wouldn't occur very often if the discipline of using a mathematical construct like a hierarchical state machine were enforced. For instance, instead of starting with a proven state machine library, they would just set flags and create an ad hoc state machine that was riddled with problems.
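
    A minimal Java sketch of what I mean by an explicit state machine (the states and names are invented for illustration):

        // The legal states and transitions live in one place, instead of
        // being implied by a handful of boolean flags scattered around.
        enum ConnState { IDLE, CONNECTING, OPEN, CLOSED }

        class Connection {
            private ConnState state = ConnState.IDLE;

            void connect() {
                if (state != ConnState.IDLE)
                    throw new IllegalStateException("connect() in " + state);
                state = ConnState.CONNECTING;
            }

            void opened() {
                if (state != ConnState.CONNECTING)
                    throw new IllegalStateException("opened() in " + state);
                state = ConnState.OPEN;
            }

            void close() {
                state = ConnState.CLOSED; // legal from any state
            }
        }

    The ad-hoc flag version admits states nobody ever listed; this version makes an illegal transition fail loudly the first time it happens.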

  • by EWAdams ( 953502 ) on Tuesday December 02, 2008 @01:40PM (#25962169) Homepage

    ... in this article he's insisting that a carpenter has to be a physicist before he should be allowed to build a house. (Yes, that's arguing from analogy -- deal with it.)

    The fact is that the world needs a hell of a lot of running code in a hurry. Millions of lines of it. We don't have the luxury of treating a realtime airline-pricing-optimization manager as a lovely formal system that we can write out in pencil. We have to get it up and running, then fix bugs and add features as time permits, because of a phenomenon that Dijkstra doesn't take into account: IT'S NEEDED *NOW*.

    I also think he's being unfair by suggesting that modern educational institutions are anything like as hidebound as medieval ones. First, medieval universities were not intended for inquiry in the first place; they were intended to prepare young men for the priesthood -- i.e. to teach them doctrine, which was not subject to inquiry. No institution except maybe a seminary is as restrictive as that these days. Second, it doesn't seem to have occurred to him that learning by analogy is how people learn *effectively.* He may decry teaching children about arithmetic by using apples because it's not a formal system, but a five-year-old doesn't have enough knowledge to know what a formal system IS. Starting a five-year-old with Principia Mathematica is just pointless. And your basic coding grunt who wants to build websites doesn't need to be taught JavaScript as a formal system either.

  • by TheRaven64 ( 641858 ) on Tuesday December 02, 2008 @01:47PM (#25962269) Journal

    > Dijkstra felt that computer science is about computers as much as astronomy is about telescopes -- IOW, not much

    Spoken like someone who has never met an astronomer. Any half-competent astronomer has a detailed grasp of optics and will have built at least one reflecting telescope (and maybe a refracting one as a hobby). Telescopes are very important to astronomy - the subject would barely exist without them - but they are not the focus of it. Similarly, computer science would be a very dull subject to study if computers didn't exist, and understanding computers is very important to understanding computer science. Viewing computer science as the study of computers, however, makes as much sense as viewing astronomy as the study of telescopes.

  • Re:The Text (Score:5, Insightful)

    by hypnotik ( 11190 ) on Tuesday December 02, 2008 @01:47PM (#25962277) Homepage

    I think you are missing his point, perhaps intentionally.

    > The vast majority of software is engineered - engineers are trying to get a specific outcome, they are not trying to calculate something.

    Computer programs, he argues, are nothing more than long proofs. Each function you write is equivalent to a predicate in logical calculus, or a function in mathematics.

    If you were only interested in outcome, you could write a program that multiplies two numbers together as a long series of "if" statements. But you'd most likely miss some possible values for inputs.

    However, if you were interested in it being correct for all possible inputs, you would use the mathematical operation * or use a loop to calculate the correct answer.

    I think that is the argument he is making, and as a university professor, I tend to agree. I've seen some of my students test an array by using an if-statement for every single element of the array, whereas a loop would have been infinitely more suitable.

    Only one should be deemed correct. But if you adopt the "engineering" and "outcome" point of view, both are correct.
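
    In Java, the two answers I see from students look roughly like this (a toy example, restricted to small non-negative n to keep it short):

        // The "outcome" version: correct only for the inputs
        // somebody happened to think of.
        static int timesThree(int n) {
            if (n == 0) return 0;
            if (n == 1) return 3;
            if (n == 2) return 6;
            throw new IllegalArgumentException("unhandled input: " + n);
        }

        // The "correct" version: one argument covers every n >= 0.
        static int timesThreeLoop(int n) {
            int result = 0;
            for (int i = 0; i < n; i++) result += 3;
            return result;
        }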

  • Re:The Text (Score:4, Insightful)

    by orclevegam ( 940336 ) on Tuesday December 02, 2008 @02:03PM (#25962527) Journal
    This reminds me of the problem physicists have grappled with for a long time between the energy view of the world and the forces view. Some things are simpler to model as a set of interacting forces, others as a set of energy transformations. Both models are of course equally correct, but some things are easier to model in one than the other. Likewise with the algorithm versus outcome ways of modeling programming: a problem can be modeled with either one, but some problems are easier to model with one than the other.
  • Re:Alan Turing (Score:3, Insightful)

    by Anonymous Coward on Tuesday December 02, 2008 @02:06PM (#25962597)
    If Britain had not dismantled the machines they made during the war, and hadn't forced Turing to take hormonal treatment for being homosexual (thereby indirectly killing him), it would most likely still be a superpower to this day.
  • I can relate... (Score:5, Insightful)

    by gillbates ( 106458 ) on Tuesday December 02, 2008 @02:17PM (#25962781) Homepage Journal

    But I think his arguments are centered around a misunderstanding of terms. It's simple academic dishonesty to which he objects:

    • Computer Science is a discipline of mathematics, a true science.
    • Computer Engineering is an engineering discipline, concerned with how to use the principles of computer science to create working systems. Provable correctness is a must; you don't get to respin a board until it works. Generally speaking, things have to work right the first time.
    • Software Engineering is a vocational discipline, which requires some knowledge of computer science, in the same way a construction foreman needs to know basic math. Software engineering requires both an understanding of programming and the corporate structure for producing software. But it is more a collection of specific pragmatic methods than an exact science.
    • Computer Programming is a vocational discipline. It also requires a basic understanding of computer science, but for some jobs, notably business applications, it is enough to merely understand the language du jour. Regardless of how terrible the code is, from a business perspective someone who produces code which can be shipped in a short period of time is a good programmer. The corporate bean counters couldn't care less about things like correctness and maintainability, and are more interested in the state of accounts receivable. The programmer who helps out the accounts-receivable side will be better liked regardless of the quality of his code, and probably promoted to management.
  • Re:The Text (Score:3, Insightful)

    by colmore ( 56499 ) on Tuesday December 02, 2008 @02:18PM (#25962793) Journal

    You're reducing his argument to a tautology in order to defend it. I don't think you want to do that.

  • Re:The Truth (Score:5, Insightful)

    by colmore ( 56499 ) on Tuesday December 02, 2008 @02:21PM (#25962835) Journal

    I think this is a clear case of computer science and software engineering (without going into Dijkstra's assessment of that term) being different beasts.

    Both the theoretical and immediately practical implementation of software are interesting and important, but they're studied in different ways by different people and trying to mash the two together tends to create more conflict than interdisciplinary synergy.

    > Then again, maybe it's just because my team has alternated between making me want to kill them and making me want to kill myself for the better part of the semester...

    You're ready for a job in the software industry!

  • by volkris ( 694 ) on Tuesday December 02, 2008 @02:37PM (#25963125)

    IMO there needs to be a starker contrast between computer science and computer engineering, just as there's a contrast between "real" engineering and, say, physics.

    Those who just want to be able to program, who are focused purely on employability right out of college, can look for computer engineering courses teaching the popular programming languages. These people can be fine programmers, ready to start... so long as language popularity hasn't shifted by the time they graduate. It would be a sort of advanced vocational program, just like any other engineering.

    But the real scientists, those who want to experience and express code on deeper levels, should be looking for something very different, that which Dijkstra describes. Just as a scientist and an engineer can work on the same project contributing different skills, the computer scientist has his place even in the real world.

    The two really are different mentalities, and it seems that the mixture of the two leads to situations that are non-ideal for either.

  • by SatanicPuppy ( 611928 ) * <SatanicpuppyNO@SPAMgmail.com> on Tuesday December 02, 2008 @02:56PM (#25963469) Journal

    Typical snobbery. As long as you're learning the theory, why shouldn't you do something practical? I'm sick to death of academics who work their asses off to create some wild assed "teaching language" or "teaching OS" just to prevent you from ever actually getting to touch the real thing.

  • by HonIsCool ( 720634 ) on Tuesday December 02, 2008 @04:06PM (#25964685)
    Everyone who is arguing against Dijkstra and saying that programming is about engineering, not writing formulas or proofs: remember your words when the software patent debate pops up again...
  • Re:The Text (Score:3, Insightful)

    by Metasquares ( 555685 ) <{moc.derauqsatem} {ta} {todhsals}> on Tuesday December 02, 2008 @04:09PM (#25964747) Homepage

    Just because you can abstract something to the degree that it is the same as anything else in the universe does not mean it is proper to do so. It would be like programming in an OOP language by upcasting everything to Object before working with it.

    I admire some of Dijkstra's contributions. Certainly his shortest-path algorithm was a great accomplishment. But he was very opinionated, and some of his opinions aren't necessarily the best strategies for others to employ in their own work.

  • Re:The Text (Score:4, Insightful)

    by WhiplashII ( 542766 ) on Tuesday December 02, 2008 @04:31PM (#25965099) Homepage Journal

    When designing avionics software, the halting problem is a real problem ;-}

    So to get around that, you write all software as a (very long) series of if statements in an interrupt routine. So what happens is the interrupt fires, and the list of instructions starts to execute... eventually, the interrupt fires again, restarting the process.

    The reason to do this: by putting instructions higher in the list, those instructions run first -- so you are able to put the "don't kill us" calculations before the "avoid turbulence" calculations. As an example of this saving lives, the computer in the Apollo 11 lunar lander ran into trouble just a few feet above the lunar surface -- pretty much a worst-case scenario. Because the computer was designed like this, although they lost some functions they did not lose the ability to land the spacecraft -- and Neil Armstrong made history as the first man to walk on the moon rather than die on it.

    In many software engineering applications, you must consider the failure modes of the hardware you are using - not the theory of programming formulas. Software that is proven correct but then doesn't run properly is useless.

    The reason I bring this up is that you assumed that using a gazillion if statements is a bad approach, because you are optimizing expression efficiency and treating the program as a formula. This is not the proper response in many cases, and an engineer would know that - that's why you are proving my point.
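
    Translated very loosely out of interrupt-land into plain Java (a sketch of the structure only -- the real thing lives in an interrupt routine, not a method call, and the names are invented):

        interface Task { void run(); }

        class CyclicExecutive {
            // Each tick, run tasks in priority order within a fixed budget.
            // If the tick overruns, the cheap-to-lose work at the bottom of
            // the list is what gets shed -- "don't kill us" runs first.
            static void tick(Task[] tasksByPriority, long budgetNanos) {
                long deadline = System.nanoTime() + budgetNanos;
                for (Task t : tasksByPriority) {
                    if (System.nanoTime() >= deadline) break; // shed the rest
                    t.run();
                }
            }
        }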

  • by ivan256 ( 17499 ) on Tuesday December 02, 2008 @05:13PM (#25965921)

    > It's harder to realize why you don't just make fields public until you've seen global variables in a C program.

    That's an interesting statement, considering the set of reasons you shouldn't use global variables only slightly overlaps with the set of reasons you shouldn't make your object attributes public. Additionally, in introductory CS classes (anything in the first year), it's not obvious to the student at all why you shouldn't use global variables. Early C students learn about scoping, are told that it is important to avoid the global scope, and are taught *how* to avoid the global scope, but it isn't until you start to learn higher level computer science and basic level software engineering concepts that it starts to become clear why you avoid the global scope.

    It is important that C teaches you why you should use objects, but it is equally important that C teaches you why you shouldn't (though arguably there are other, better languages for teaching the latter). A good software engineer has learned that there is a time and place for both object oriented solutions and procedural solutions. The worst engineers I've ever worked with have insisted that one of the two methodologies was always the answer.

  • Re:The Text (Score:3, Insightful)

    by Intron ( 870560 ) on Tuesday December 02, 2008 @06:50PM (#25967581)

    > OK, perhaps I should have mentioned "in the context of avionics systems." A watchdog timer is a timer that resets a CPU system if a timeout is reached. It is a way of attempting to achieve reliability in the presence of less reliable hardware.

    A watchdog timer is a timer that is periodically reset. If it times out, the system does whatever it is designed to do in that case, which is not necessarily to reset the CPU. The use in high-availability systems is usually to transfer control to a standby when the primary system has failed.
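
    In rough Java terms (a toy sketch, obviously not avionics-grade -- the action on timeout is whatever the designer chose, such as failing over to a standby):

        class Watchdog {
            private volatile long lastPet = System.currentTimeMillis();

            Watchdog(long timeoutMillis, Runnable onTimeout) {
                Thread t = new Thread(() -> {
                    while (true) {
                        try { Thread.sleep(timeoutMillis / 4); }
                        catch (InterruptedException e) { return; }
                        if (System.currentTimeMillis() - lastPet > timeoutMillis) {
                            onTimeout.run(); // e.g. switch to the standby
                            return;
                        }
                    }
                });
                t.setDaemon(true);
                t.start();
            }

            // The supervised system calls this periodically to prove it is alive.
            void pet() { lastPet = System.currentTimeMillis(); }
        }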

    > Do you believe it doesn't? The atmosphere stops gamma rays from hitting your equipment. These gamma rays change the value of variables (as in, the gamma ray flips the DFF circuit to the opposite value in your CPU). That means you cannot rely on variables to make sure your loop exits. Therefore loops are bad.

    Hogwash. If you can't depend on variables or CPU registers, then you can't depend on your if-statements branching to the correct location. Your example makes no sense.

    > I have a multimillion dollar budget that says I am not - how about you?

    The CEO of AIG has a bigger budget than you, so is he even more correct?

  • by theaveng ( 1243528 ) on Wednesday December 03, 2008 @07:38AM (#25973567)

    What's wrong with raising dropout rates in computer programming? When I studied electrical engineering, we started off with Calculus, Linear Algebra (matrices), and Physics -- i.e., pretty boring stuff for most people.

    People fled engineering in droves because they didn't want to do the math. Starting with "hard" or "boring" courses weeds out who should be engineers and who should do something else. If computer programmers are not willing to do the math required to create solid software, maybe they should be weeded out too, so that what's left are the individuals who truly belong in the profession.

  • Re:Hmmm... (Score:1, Insightful)

    by Anonymous Coward on Thursday December 04, 2008 @12:19PM (#25989905)

    Are you cracked out? cout is an object, so you still need to explain OOP, and you have to explain that you're calling a member method <<, which also requires explaining operator overloading, etc etc etc.

    In my opinion the C++ version requires more because you really do need to explain operator overloading, whereas the Java version is bog-standard dot-notation method invocation. Obviously we don't agree on that point, but everyone should be able to agree that they both require quite a bit of 'splaining if the person has never seen a programming language of any kind.
