Twenty Years of Dijkstra's Cruelty 727
WatersOfOblivion writes "Twenty years ago today, Edsger Dijkstra, the greatest computer scientist to never own a computer, hand wrote and distributed 'On the Cruelty of Really Teaching Computer Science' (PDF), discussing the then-current state of Computer Science education. Twenty years later, does what he said still hold true? I know it is not the case where I went to school, but have most schools corrected course and are now being necessarily cruel to their Computer Science students?" Bonus: Dijkstra's handwriting.
The Text (Score:5, Informative)
While the handwriting is a novelty (and the PDF is actually small for a PDF), I question how long that server is going to last.
Also (and yes this is nitpicking), I must contest this:
Edsger Dijkstra, the greatest computer scientist to never own a computer
I submit for consideration Alan Turing [wikipedia.org], who designed the ACE and worked on one of the earliest stored-program computers (the Manchester Mark I [wikipedia.org]), although he never really owned it or any other computer. I think a lot of people identify him as not only a hero of World War II but also the father of all computers.
Re:The Text (Score:4, Insightful)
I'd think Ada Lovelace would have a better claim there, given that not only did she never own a computer, but computers didn't even exist yet, and at the time there was no such thing as a "program".
Re: (Score:3, Informative)
The Dijkstra wiki link says he owned a Mac for email and web browsing. How can they expect us to read the links if they aren't going to read them themselves?
Re: (Score:3, Interesting)
More than not being "great", he seems to be rather foolish...
1) His main premise is that "software engineering" cannot exist because software cannot be proved correct, only proved wrong. Well, I got news for ya - rocket engineering is the same way. So is electronics. So are bridges! Or do you think that having the SRB on the shuttle burn through the external tank was by design?
2) He goes further to say that foolish mortals (unlike himself) learn by analogy, and so can't handle the truth, etc. Then, hilario
Re:The Text (Score:5, Insightful)
No.
(1) His argument is that to discuss "software engineering" is as silly as to discuss "algebra engineering" or "formula engineering" in math courses. A program is simply a formula to be executed - nothing more - says the Computer Science professor.
(2) Programs manipulate numbers. Mathematical formulae manipulate numbers. It's entirely reasonable for him to conclude that a program is merely a formula.
(3) Putting pretty pictures on screen or manipulating airplane surfaces (my specialty) is still just formula execution.
Re: (Score:3, Insightful)
I agree that is what he is saying - I disagree that it is a reasonable thing to say ;)
A program is nothing without the computer it runs on. I could agree that if you were building programs for the purpose of knowledge (like he does, presumably), you are not an engineer - you are a mathematician. But that is not what computers are! Computers are a way to make something happen. By using generic computers running programs, we can more easily make complicated stuff happen.
The airplane surface you actuate i
The Truth (Score:3, Interesting)
Dijkstra's Cruelty is the nickname UT CS students gave to the course his wife taught. It was a required course when I was there and it was 2 semesters long. I think it was called "Software Development" but should have been called "Fantasyland Development". Total waste of time and energy. I never saw him on campus, except maybe at graduation.
The best classes were given by people who either were working in the real world or had some experience in the real world and were trying to get their masters or
Re:The Truth (Score:5, Insightful)
I think this is a clear case of computer science and software engineering (without going into Dijkstra's assessment of that term) being different beasts.
Both the theoretical and immediately practical implementation of software are interesting and important, but they're studied in different ways by different people and trying to mash the two together tends to create more conflict than interdisciplinary synergy.
Re:The Text (Score:5, Insightful)
I think you are missing his point, perhaps intentionally.
The vast majority of software is engineered - engineers are trying to get a specific outcome, they are not trying to calculate something.
Computer programs, he argues, are nothing more than long proofs. Each function you write is equivalent to a predicate in logical calculus, or a function in mathematics.
If you were only interested in outcome, you could write a program that multiplies two numbers together as a long series of "if" statements. But you'd most likely miss some possible values for inputs.
However, if you were interested in it being correct for all possible inputs, you would use the mathematical operation * or use a loop to calculate the correct answer.
I think that is the argument he is making, and as a university professor, I tend to agree. I've seen some of my students test an array by using an if-statement for every single element of the array, whereas a loop would have been infinitely more suitable.
Only one should be deemed correct. But if you adopt the "engineering" and "outcome" point of view, both are correct.
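As a minimal C sketch (mine, not the parent's) of the contrast being drawn - enumerating cases gets you a particular outcome, while the loop is the version you can actually argue is correct for every non-negative input:

#include <stdio.h>

/* "Outcome" style: right only for the pairs someone remembered to list. */
int multiply_by_cases(int a, int b) {
    if (a == 2 && b == 3) return 6;
    if (a == 4 && b == 5) return 20;
    /* ...and so on for every case anyone thought of... */
    return 0; /* silently wrong for everything else */
}

/* "Correct for all inputs" style: repeated addition, arguable via a loop invariant. */
int multiply_by_loop(int a, int b) {
    int result = 0;
    for (int i = 0; i < b; i++) /* invariant: result == a * i */
        result += a;
    return result;              /* at exit i == b, so result == a * b */
}

int main(void) {
    printf("%d %d\n", multiply_by_cases(2, 3), multiply_by_loop(7, 9));
    return 0;
}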
Re:The Text (Score:4, Insightful)
When designing avionics software, the halting problem is a real problem ;-}
So to get around that, you write all software as a (very long) series of if statements in an interrupt routine. So what happens is the interrupt fires, and the list of instructions starts to execute... eventually, the interrupt fires again, restarting the process.
The reason to do this: By putting instructions higher in the list, those instructions are run first - so you are able to put the "don't kill us" calculations before the "avoid turbulence" calculations. As an example of this saving lives, the computer in the Apollo 11 moon lander ran into trouble just a few feet above the lunar surface - pretty much a worst-case scenario. Because the computer was designed like this, although they lost some functions they did not lose the ability to land the spacecraft - and Neil Armstrong made history as the first man to walk on the moon rather than die on it.
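A toy model of that scheme (all names invented; a real system is driven by a hardware timer interrupt rather than a simulated flag):

#include <stdbool.h>
#include <stdio.h>

static volatile bool frame_tick = true;        /* in real hardware, set by a timer interrupt */

/* Stubs standing in for real control calculations. */
static void keep_aircraft_flyable(void)  { puts("critical: attitude/stall control"); }
static void damp_turbulence(void)        { puts("secondary: ride-quality damping"); }
static void refresh_cabin_displays(void) { puts("lowest: cabin displays"); }

int main(void) {
    for (int frame = 0; frame < 3; frame++) {  /* a real system loops forever */
        if (frame_tick) {
            frame_tick = false;                /* the next tick restarts the list from the top */
            /* Higher-priority work sits earlier in the list, so if the next
               interrupt arrives early, only the low-priority tail is lost. */
            keep_aircraft_flyable();
            damp_turbulence();
            refresh_cabin_displays();
        }
        frame_tick = true;                     /* simulate the next timer tick */
    }
    return 0;
}

The point is purely structural: whatever sits earlier in the list is guaranteed to run; whatever sits later is what you're willing to lose.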
In many software engineering applications, you must consider the failure modes of the hardware you are using - not the theory of programming formulas. Software that is proven correct but then doesn't run properly is useless.
The reason I bring this up is that you assumed that using a gazillion if statements is a bad approach, because you are optimizing for expression efficiency and treating the program as a formula. That is not the proper response in many cases, and an engineer would know that - so you are actually proving my point.
Re:The Text (Score:4, Informative)
what problems are you talking about?
Achieving higher than normal reliability.
I have... No clue - It's used in vital logic for failover between redundant controllers.
OK, perhaps I should have mentioned "in the context of avionics systems." A watchdog timer is a timer that resets a CPU if a timeout is reached. It is a way of attempting to achieve reliability in the presence of less reliable hardware.
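As a toy software model of the idea (entirely invented; a real watchdog is a hardware counter that pulls the reset line):

#include <stdio.h>
#include <stdbool.h>

static int watchdog = 3;                    /* ticks remaining until reset */
static void pet_watchdog(void) { watchdog = 3; }

static bool do_work(int step) {
    if (step >= 4) return false;            /* pretend the software hangs here */
    printf("step %d done\n", step);
    return true;
}

int main(void) {
    for (int step = 0; step < 8; step++) {
        if (do_work(step))
            pet_watchdog();                 /* progress made: reload the countdown */
        if (--watchdog == 0) {              /* a tick passed with too little progress */
            puts("watchdog expired: resetting CPU");
            break;                          /* the real thing resets the hardware */
        }
    }
    return 0;
}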
No clue
Obviously ;)
Do you believe that flying above the atmosphere changes your variables?
Do you believe it doesn't? The atmosphere stops gamma rays from hitting your equipment. Those gamma rays change the values of variables (as in, a gamma ray flips a D flip-flop in your CPU to the opposite value). That means you cannot rely on variables to make sure your loop exits. Therefore, loops are bad.
I think you are just spouting random gibberish.
I have a multimillion dollar budget that says I am not - how about you?
Re:The Text (Score:5, Interesting)
2. True-- but you're forgetting about the execution domain. Dijkstra points out that computers are simply "symbolic manipulators", and this is certainly true, but that does not make them general-purpose symbolic manipulators in the same way that a human is. A programmer must go to great lengths to ensure that, e.g., the number 1/10 or pi is preserved throughout the calculation chain, and doing so is computationally expensive. Sometimes prohibitively so (see the sketch after point 3). This is where engineering comes in, because if there's one thing engineers are really good at, it's deciding when something is "good enough" or not.
3. Sure, if you fully understand the phenomena. Are you telling me that your computational model fully accounts for turbulence?
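To make the 1/10 point concrete, a small illustration (my example, not the poster's): 0.1 has no exact binary floating-point representation, so "preserving" it through a calculation chain takes deliberate effort.

#include <stdio.h>

int main(void) {
    double sum = 0.0;
    for (int i = 0; i < 10; i++)
        sum += 0.1;                /* ten "exact" tenths */
    printf("%.17f\n", sum);        /* prints 0.99999999999999989, not 1 */
    printf("%d\n", sum == 1.0);    /* 0: the accumulated error is visible */
    return 0;
}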
What Dijkstra does not seem to understand is that engineering does not eschew mathematics. Engineers use the same theoretical knowledge that mathematicians and physicists do— they use the same analytical tools. Engineering is, rather, a superset of those analytical tools. It includes some new tools in order to deal with the complexity of certain tasks that are above the ability of most normal humans to solve. It is remarkably good at this.
Throwing out engineering because it will never solve the problem fully is like throwing the baby out with the bath water. Better solutions will emerge— functional programming, for instance, is very promising in many ways. I've read Dijkstra before, and I have great respect for him particularly because of his actual experience building large software systems. But this paper makes him sound like a bitter old man; maybe he didn't like the direction the field was moving.
Re:The Text (Score:5, Insightful)
More than not being "great", he seems to be rather foolish...
1) His main premise is that "software engineering" cannot exist because software cannot be proved correct
Actually, Dijkstra spent a lot of time showing how to prove a program's correctness. See his "A Discipline of Programming", for example.
Re:The Text (Score:5, Interesting)
Actually, Dijkstra spent a lot of time showing how to prove a program's correctness.
He did. In fact, he spent more time proving the program correct than it took to write, test, run, debug, and fix the program, and then the proof still had to be checked for correctness. I learned the Dijkstra techniques for proving code. Even something as painfully simple as proving that a loop invariant holds and that the loop terminates was mind-numbingly difficult and tedious, and it still fails to be correct in the presence of concurrency. Somehow the program proof advocates lost sight of Gödel's incompleteness theorems.
I'm not an advocate of Dijkstra's approach. However, does the Incompleteness Theorem really come into play here? I can't think of any useful algorithm for which I wouldn't be able to formally describe and verify the pre- and post-conditions. Can you think of any naturally-arising examples of algorithms for which undecidability might be an issue?
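For anyone who hasn't seen the style being described, here's a minimal sketch (my own toy example, not Dijkstra's) of invariant/variant reasoning for a trivial loop - even this small case shows how much annotation the proof obligations add:

#include <assert.h>
#include <stdio.h>

/* Computes q, r with a == q*b + r and 0 <= r < b, for a >= 0 and b > 0. */
static void divmod(int a, int b, int *q, int *r) {
    int quot = 0, rem = a;
    /* Invariant: a == quot*b + rem and rem >= 0.
       Variant:   rem decreases by b > 0 each iteration, so the loop terminates. */
    while (rem >= b) {
        rem -= b;
        quot += 1;
        assert(a == quot * b + rem && rem >= 0);  /* runtime check of the invariant */
    }
    *q = quot;
    *r = rem;  /* invariant plus negated guard give: a == quot*b + rem, 0 <= rem < b */
}

int main(void) {
    int q, r;
    divmod(17, 5, &q, &r);
    printf("17 = %d*5 + %d\n", q, r);
    return 0;
}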
Re:The Text (Score:5, Funny)
What do you think this is, wikipedia?[citation needed]
Re:The Text (Score:5, Insightful)
Woe be to us, for all is certainly lost.
Re:The Text (Score:5, Funny)
My forte is made out of cushions from my mom's couch.
Re:The Text (Score:4, Funny)
Not that I come up from the basement very often, but I've never seen my mom bake bread :)
Re:Mine was certainly cruel to us (Score:5, Interesting)
My old university dropped C and replaced it with Java for all CS courses apart from Operating Systems. I asked one of the professors why - he said many students complained that dealing with pointers was too hard, and that the rise of Java and Java programming jobs meant C was obsolete and pointless, that the issue of programming languages came up on prospective student visit days, and that in order to be "commercially attractive" to these potential students they had to switch. We even used to do assembly language programming in the first year - now, the replacement course teaches students how to use Eclipse for Java programming.
Several years later I was back tutoring, and I was very disappointed to find out that I had to explain pointers and pointer arithmetic to people who were almost at the end of their Computer Science degree, and who didn't understand why their code was crashing with "null references" when "Java was supposed to get rid of all that memory stuff!".
Re: (Score:3, Insightful)
If they can almost finish a degree and still be surprised when Java dies with a null reference, then the problem isn't that they were taught to program using Java, it's that they were taught Java very badly...
Re:Mine was certainly cruel to us (Score:4, Interesting)
I've done nothing but C (not even C++) programming for the last decade in various full time and consulting positions.
Linux is all C, and the job market for Linux kernel, driver and system developers has been pretty active for many years now. Using QNX and VxWorks is all C programming too (not counting the tiny bits of assembly you have to stick into your programs).
This is why I think it's important that people learn some assembly language once they are past the basic syntax of C. They don't have to become experts in assembly, but being able to write (and debug!) a few basic programs in some assembly would be a good experience. Like a hello world (one that calls the system call, and one that calls a puts function), string reverse, and maybe a linked list bubble sort. If you can get those three done in assembly after you've been exposed to C, it should make pointers (and arrays) a lot easier to understand.
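Not the assembly itself, but a C sketch of the distinction drawn above between going through the library and going straight to the system call - the thing the assembly exercise makes unavoidable:

#include <stdio.h>
#include <unistd.h>

int main(void) {
    puts("hello via the C library");                      /* buffered stdio call */
    fflush(stdout);                                       /* so the two lines don't reorder */
    write(STDOUT_FILENO, "hello via write(2)\n", 19);     /* raw POSIX system call */
    return 0;
}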
I believe being able to debug is as important as being able to program.
Re:Mine was certainly cruel to us (Score:4, Funny)
It's certainly not pointerless!
Re:Mine was certainly cruel to us (Score:5, Funny)
I love C. It's terse and really useful for optimising performance but it's really not a good teaching language.
C - all the power and flexibility of assembly language combined with the readability and maintainability of assembly language.
And I say that as someone who loves C.
Re: (Score:3, Interesting)
No, C is a great teaching language if what you want is to teach the science of computing and how the machines do their work. You have to do some work with students in, at the very least, C and some assembly if you want them to get an understanding of computer design and architecture. You don't have to do these things if you want to teach software engineering. They are different fields, or at least different disciplines within a larger field, like an Oncologist is a type of medical doctor and so is an Endocri
Re:Mine was certainly cruel to us (Score:4, Insightful)
Honestly, it's more important to learn how to learn new languages than to learn any specific one. The specific languages will change far more often than the concepts they represent.
Re:Mine was certainly cruel to us (Score:5, Informative)
You're right. Also, Stroustrup has clearly pointed out (in other contexts) that C is not the best way to learn C++ (or OO in general):
BEGIN EXCERPT from http://www.research.att.com/~bs/new_learning.pdf [att.com] :
One conventional answer to the question ''Which subset of C++ should I learn first?'' is ''The C subset of C++.'' In my considered opinion, that's not a good answer. The C-first approach leads to an early focus on low-level details. It also obscures programming style and design issues by forcing the student to face many technical difficulties to express anything interesting.
Re:Mine was certainly cruel to us (Score:5, Insightful)
The C-first approach leads to an early focus on low-level details. It also obscures programming style and design issues by forcing the student to face many technical difficulties to express anything interesting.
Expressing interesting things doesn't happen in a CS course, at least the ones where you're learning the language. It takes new CS students hours to implement the simplest linked list because it's not familiar to them. I learned low level first and I'm finding that it's the best way to teach my sister-in-law, who's a beginning CS student. They're trying to teach object-oriented features before they teach arrays or loops. Objects are constructs on top of the other programming concepts and should be taught as such. It was only after showing her how to use low-level features that she was able to start doing any semblance of OO programming.
People get so caught up trying to teach the "right" way to program that they don't teach how to program first, which is a mistake. Students need to learn the power and wonder of while, for, and regular functions before you can teach them the power of object-oriented programming. Computer science is unfamiliar and strange; let students learn the simple things before throwing the advanced concepts at them.
I guess what I'm saying is that a good course would teach functional programming before teaching object oriented programming later in the same course.
Re:Mine was certainly cruel to us (Score:5, Insightful)
You're right. Also Stroustrup had clearly pointed (in other argument lines) that C is not the better way to learn C++ (or OO in general):
Somebody quoting Stroustrup on the topic of computer languages... seriously? C++ is like legalese -- it's impenetrable to read, full of unintended consequences, and even though it's spelled out in excruciating detail what it says is still open to interpretation.
Not only is C the first subset of C++ that programmers should learn, it is the only subset of C++ they should learn.
And I argue that C actually teaches people more about object orientation than most other languages, because it teaches them in no uncertain terms why you should use objects. It's harder to realize why you don't just make fields public until you've seen global variables in a C program.
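A hypothetical sketch of that point in C (names invented): a global anyone can scribble on, versus the same state hidden behind functions, which is essentially an object built by hand:

#include <stdio.h>

/* Anyone, anywhere, can scribble on this, and nobody can tell who did. */
int balance = 0;

/* Hiding the state behind functions makes every change go through one door. */
static int account_balance = 0;
void account_deposit(int amount) {
    if (amount > 0) account_balance += amount;  /* invariants can be enforced here */
}
int account_get_balance(void) { return account_balance; }

int main(void) {
    balance = -1000000;            /* nothing stops this */
    account_deposit(-1000000);     /* this, the hand-made "object" can refuse */
    printf("%d %d\n", balance, account_get_balance());
    return 0;
}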
Then Java teaches you how to do OO where you are not allowed to 'cheat' by replacing methods at runtime, or calling methods that don't exist, etc. And JavaScript takes all that and gives you LISP power.
Re: (Score:3, Insightful)
There is nothing wrong with learning Java, although that's not what you should be doing for 4 years during a CS degree. During my degree some classes used Java, some C, Python, whatever the teacher felt like that semester (or was required because of the class). It was the student's responsibility to learn the language on their own time. Class time was for learning about the theory that underpins *all* languages and other big topics that span multiple languages like OO, etc....
Re:Mine was certainly cruel to us (Score:5, Insightful)
There are lots of other reasons that I am too lazy to list here. Learning Java is not bad, but learning it as a first language does not make your life easy. A good introduction to programming course should cover half a dozen languages, as case studies, rather than attempting to use one to teach all of programming.
Cruel and couldn't use a computer (Score:5, Insightful)
I'm glad I left, 'cause I can actually use a computer now, unlike many of the coders I come across. If you like computers, don't go into computer science. That is for people who enjoy math and theory.
Re:Cruel and couldn't use a computer (Score:5, Insightful)
Why is fake assembly and fake OS cruel? It's computer science, not a vocational tech course. They've presumably tried to bypass the issues of real-world systems that distract you from learning the point. Once you've got the basic concepts, any OS and any language become approachable - why would you want to learn something specific that would be out-of-date in short measure? Seems rather myopic to me.
Re:Cruel and couldn't use a computer (Score:4, Insightful)
In a practical field where you can do real assembly and work on a real OS, you ask why doing fake make-work is cruel?
Theory is fine, but theory shouldn't trump practical application in a field where practical applications are everywhere.
Re: (Score:3, Informative)
University vs. College.
Comp Sci. is not a trades course; go to a local community college and take a technology or programming course if you want real-world examples.
Computer Science is about learning to understand computing, whether you use real or completely fictional interfaces.
Re: (Score:3, Insightful)
That sounds like just about every biology class I've ever heard of.
Re:Cruel and couldn't use a computer (Score:4, Insightful)
Rule Number 1 of Computer Science - Don't reinvent the wheel. Everyone who invents their own education-focused language that's only used at their school is violating the first rule of computer science. At my university, the first year of programming is taught in a Java-like variant of C++ called Resolve C++. Why is it bad?
Re: (Score:3, Insightful)
Oh, I don't know....
When I was in college, we (the students) were pushing for the CIS department to offer a course in...
wait for it...
VAX assembly.
That's real useful now, isn't it?
Dijkstra couldn't use a computer?!? (Score:5, Insightful)
Boy do you need to go back to school. Edsger wrote more and better stuff in his lifetime than anyone here on Slashdot. Did you ever get directions from Google Maps or Mapquest? Thank Edsger -- his shortest path algorithm is what they all use, and by the way, he wrote that before you were born, most of you. You know the semaphores used in the multi-CPU Linux kernels? Yep, you owe Edsger for them, too. And programming languages like C, Pascal, etc.? He helped write the first Algol compiler, the great-grand-daddy of them all, once again before most of you were born.
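For anyone who has never actually looked at it, here's a compact sketch of the shortest-path algorithm in question - my own O(V^2) adjacency-matrix version, not Dijkstra's code:

#include <stdio.h>
#include <limits.h>
#include <stdbool.h>

#define V 5
#define INF INT_MAX

void dijkstra(int graph[V][V], int src, int dist[V]) {
    bool done[V] = { false };
    for (int i = 0; i < V; i++) dist[i] = INF;
    dist[src] = 0;
    for (int count = 0; count < V; count++) {
        /* Pick the closest not-yet-finalized vertex... */
        int u = -1;
        for (int v = 0; v < V; v++)
            if (!done[v] && (u == -1 || dist[v] < dist[u])) u = v;
        if (dist[u] == INF) break;       /* remaining vertices are unreachable */
        done[u] = true;
        /* ...and relax the edges leaving it. */
        for (int v = 0; v < V; v++)
            if (graph[u][v] && dist[u] + graph[u][v] < dist[v])
                dist[v] = dist[u] + graph[u][v];
    }
}

int main(void) {
    int graph[V][V] = {
        {0, 4, 1, 0, 0},
        {4, 0, 2, 5, 0},
        {1, 2, 0, 8, 0},
        {0, 5, 8, 0, 3},
        {0, 0, 0, 3, 0},
    };
    int dist[V];
    dijkstra(graph, 0, dist);
    for (int i = 0; i < V; i++) printf("0 -> %d: %d\n", i, dist[i]);
    return 0;
}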
Just because he eschewed the run-break-fix approach so beloved of the folks who are spewing billions of lines of error-laden code into the world today doesn't mean he was out of touch: he had forgotten more about writing code than most folks here have ever learned. And yes, he advocated developing code formally, and he liked to do it with pen and paper.
So learn about who you're making snide comments about, and show some respect. When people are still using any algorithm you came up with 30 years from now, you will have the right to say something about Edsger Dijkstra.
I have a lot of respect for him, but... (Score:5, Insightful)
The fact is that the world needs a hell of a lot of running code in a hurry. Millions of lines of it. We don't have the luxury of treating a realtime airline-pricing-optimization manager as a lovely formal system that we can write out in pencil. We have to get it up and running, then fix bugs and add features as time permits, because of a phenomenon that Dijkstra doesn't take into account: IT'S NEEDED *NOW*.
I also think he's being unfair by suggesting that modern educational institutions are anything like as hidebound as medieval ones. First, medieval universities were not intended for inquiry in the first place; they were intended to prepare young men for the priesthood -- i.e. to teach them doctrine, which was not subject to inquiry. No institution except maybe a seminary is as restrictive as that these days. Second, it doesn't seem to have occurred to him that learning by analogy is how people learn *effectively.* He may decry teaching children about arithmetic by using apples because it's not a formal system, but a five-year-old doesn't have enough knowledge to know what a formal system IS. Starting a five-year-old with Principia Mathematica is just pointless. And your basic coding grunt who wants to build websites doesn't need to be taught JavaScript as a formal system either.
Cruel to be kind (Score:5, Insightful)
The aim of a really good degree (as opposed to a lecture-driven box-ticking one) is to be cruel: you want to feel that your head is going to explode and that your subject really is an absolute bitch.
Then you graduate and find out the real world is easier than the theory.
Cruelty is important in a good education to make you achieve in the real world. An easy flow-through degree gets you the cert but gives you unrealistic expectations of how hard the real world can be.
Personally my degree was a mind-bending bitch of mumbling lecturers and impossible (literally, in some cases) questions that covered everything from quantum mechanics "basics" and abstract computing theory through to how to dope a transistor.
It was cruel, it was unusual... it was great.
Handwriting? How about a Font! (Score:5, Interesting)
Dijkstra's Cruel Font link [tinyurl.com], so we at least get something recent(-ish) out of this article.
Recursive Descent (Score:3, Insightful)
Dijkstra was a genius and made many contributions to Comp. Sci. But his suggestion that a program (really a program design) should be accompanied by a formal proof has problems at both ends of the development cycle: how do you prove that the formal specification is what the customer wanted, and how do you prove that the code actually implements the design?
I've seen automatic testing products that claim to do both, but in order to make them work you have to specify every variable and every line of code as the "requirements", then compare what the tool thinks the result should be to the output of the program. And yes, the vendor suggested that the business analysts write the formal requirements; you can imagine how well that worked.
Re: (Score:3, Interesting)
When I was in the CS curriculum in the University of Maryland shortly after the publication of this paper, one of the mandatory freshman CS courses took
Real-world cruelty (Score:3, Funny)
"Right from the beginning, and all through the course, we stress that the programmer's task is not just to write down a program, but that his main task is to give a formal proof that the program he proposes meets the equally formal functional specification."
Where exactly do semi-formalized, poorly thought-out specifications, handed to you half-written on a napkin and constantly subject to change, fit into the programmer's task and Dijkstra's world?
GIGO (Score:4, Insightful)
The glaring hole in Dijkstra's argument is that most software is built to automate what used to be manual processes, and it therefore has to mimic a human-centric process... which is inherently illogical, inefficient, and rife with nonsense.
In the world outside the ivory tower, programmers do not have the freedom to create completely logical, functionally complete programs that can be reduced to a mathematical proof.
Next time your boss comes to you with a project to write an application, show him this paper, explain that what he's asking for is "medieval thinking", and see if you can then explain why you should keep your job if you don't want to do it.
Self-confidence (Score:4, Insightful)
All I have to do is read one paper like this to be reminded why I stayed out of academia. Ah the smugness, the hypocrisy, the great irony. A "radical novelty" this essay is not.
There are plenty of truths out there yet to be discovered. Unfortunately most academics lack the self-confidence to go looking for them and instead find clever new ways to twist ideas around.
Anthropomorphic Descriptions (Score:3, Insightful)
The thing that confuses me the most about this paper is his hatred for using anthropomorphic metaphors to describe the functioning of a program. It confuses me partly because his examples of why it's bad don't seem to actually address his complaint, or anything like it. But also, because the more I think about it, they seem to fit very well.
Program components may not be living things, but they do run autonomously. They perform actions, frequently without any input from me. They seem to do things and interact with other components on their own - why not describe it as such? There's also the fact that he doesn't give any alternate way of describing what's going on inside a program.
Re:Anthropomorphic Descriptions (Score:4, Insightful)
I think the problem is the false assumptions and analogies that get introduced by these lines of thinking. If a network is "this guy talking to that guy", your thinking will be constrained by what you know about human conversation. If there's a problem, someone can talk louder, slower, etc. and the analogy holds. But if the solution involves something that has no equivalent in the day-to-day world, how are you going to conceptualize it?
My pet peeve, that descends from this same idea, is from the teaching of object orientation. A car is a type of vehicle; a truck is a type of vehicle; they both have wheels; maybe the number of wheels is different; maybe each has a different turning radius or procedure for backing up.
Great, now you understand inheritance, polymorphism, member functions, etc. However, in practice, you use object orientation to avoid zillions of "if" statements, special case code, large blocks of almost-but-not-quite duplicated code.
In my experience, someone who learned OO by simple analogies is likely to be locked into thinking "I have to write every program by making a tree of classes like with the cars and trucks". Rather than "there are different ways to structure the code, and a good OO way will get rid of a lot of branching, magic numbers, and redundant code".
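A small invented example of that contrast, written in plain C to stay language-neutral: the same dispatch as a growing branch-on-type versus one function per kind, which is the payoff (fewer branches) rather than the car/truck taxonomy:

#include <stdio.h>

/* Branch-on-type style: every new shape means editing this function. */
enum kind { CIRCLE, SQUARE };
double area_switch(enum kind k, double size) {
    if (k == CIRCLE) return 3.14159265358979 * size * size;
    if (k == SQUARE) return size * size;
    return 0.0;
}

/* "OO by hand": each value carries its own behaviour; no central branch. */
struct shape { double size; double (*area)(double); };
double circle_area(double r) { return 3.14159265358979 * r * r; }
double square_area(double s) { return s * s; }

int main(void) {
    struct shape shapes[] = { {1.0, circle_area}, {2.0, square_area} };
    for (int i = 0; i < 2; i++)
        printf("%f\n", shapes[i].area(shapes[i].size));
    printf("%f\n", area_switch(CIRCLE, 1.0));
    return 0;
}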
Dijkstra is the typical head-up-arse CS crack (Score:4, Insightful)
There are some good quotes attributed to him, but one in particular goes to show how very wrong even experts can be:
It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration.
In my experience this is utter arrogant rubbish. Not being able to teach good programming to people who know BASIC only stems from the inability of people like Dijkstra to teach. It's one of the reasons I usually steer clear of all-too-academic IT environments and programming languages. Like Ruby or Java.
Re:Dijkstra is the typical head-up-arse CS crack (Score:5, Funny)
You have been trolled (by Dijkstra).
Re: (Score:3, Insightful)
I like the part where he complained about kindergarten teachers bootstrapping kids into the unnatural, abstract world of mathematics with *gasp* addition of one pile of apples to another.
He's clearly never tried to teach a 5-year-old about math. Kids can get stuck at very low levels of understanding if you don't guide them up the ladder of abstraction a bit, and examples like that are key.
I don't understand (Score:3, Funny)
I don't think I really understand what Dijkstra is getting at here. Can someone explain it to me with a car analogy?
What a pompous windbag (Score:4, Interesting)
I am very glad I never had him as a professor.
By the way, I am a physicist, and IMHO all of the best physicists develop a physical intuition about their physics, and that is certainly true with those who deal with quantum mechanics. Listen to the recorded lectures of Richard Feynman, for example.
Re: (Score:3, Funny)
That pompous windbag single-handedly brought computing into the modern age, much like American Pie brought the fun part of college life into movies.
I've got Karen Allen, Kevin Bacon, and the estate of John Belushi on Lines 1 through 4, and they'd like a word with you...
Sorta Kinda Yeah (Score:4, Insightful)
However, I can't get behind the man's call to teach without compiling and running programs. Well, to be fair, I'd have no problems teaching a freshman college course that way, but I'm teaching teenagers. I want the students to be able to unleash what they have wrought (or at least see that they've created a program that works). That's the bait that hooks them deeper into the coolness that is computer science or software engineering or programming or pattern engineering or thinkyness or whatever you care to call it.
Back to the future (Score:5, Interesting)
As I read through his writings it brought me back to my time at Moravian College circa 1979. I had just started taking CS classes, and in that same year Dr. Brown, head of the CS department, pulled out all the IBM mainframe systems and installed a PDP 11/45. Gone were the COBOL courses, replaced by C, RATFOR, Pascal, Fortran, et al. I loved it and hated it at the same time.
Like the presentation, Dr. Brown taught us programming before we really saw the computer. His focus was not on language, but on concept. As he so well put it to us, once done with our intro class we could work anywhere in any language. I believed it then and found it to be a true statement. At the end of that intro class he took the last three weeks and taught sort algorithms. The catch was that each sort was analyzed in a different language. I chuckle when I read posts of youngsters that say "I learned Java, or C++ in college". I learned programming in college, then went on to figure out what language suited my economic and intellectual needs.
Cruelty in Computer Science? I am grateful for that kind of cruelty to this day. Since college I have had to adjust my knowledge as times and needs change. I have had the pleasure of working with RPG, COBOL, Java, FORTRAN, and even the bastard child Visual Basic. Unlike some, I do not look down at any language for each has its benefits for the task. What I do dislike is working on code written by persons who thought that "Learn to Code Java in three Weeks" made them a programmer; that language X is the best and only language out there.
Dr. Dijkstra says "Universities should not be afraid to teach radical novelties". What things could be discovered if that concept were embraced again?
The more things change... (Score:3, Insightful)
Fantastic article. I especially like this part:
(1) The business community, having been sold to the idea that computers would make their lives easier, is mentally unprepared to accept that they only solve the easier problems at the price of creating much harder ones.
So true. I deal with this every day. Despite the high-tech wizardry around us, business still runs pretty much the same; it's just that management is all too happy to throw its problems at someone else. I can't remember how many times in the past I've had a client, boss, or manager ask me for something that is impossible and tell me "fix it" or "make it work".
what's left of Edsger's cruelty (Score:4, Insightful)
"I mean, if 10 years from now, when you are doing something quick and dirty, you suddenly visualize that I am looking over your shoulders and say to yourself 'Dijkstra would not have liked this', well, that would be enough immortality for me." - Edsger Wybe Dijkstra
A lot of software engineers like to work with new technologies, new paradigms, new code design patterns, new software development technologies, and other forms of complexity. Quality Assurance rules by checklists and testing, only fixing symptoms. Every coder has his or her own ideology of what correct code is, or the correct way to write it. Correctness is proven by superficial, subjective quality standards, beautifully crafted hacks included.
Edsger found ways to mathematically prove programs correct, requiring a very high level of math skill (the same level needed to prove the correctness of a mathematical theory). This is a utopian, objective quality standard: stuff for the real hard-core developers who have plenty of time.
But most don't have the time. In the end, time-to-market is key. Swift hackers remain the heroes of business, crafting applications that get used in the real world.
However, there are some pragmatic things I thank Edsger Wybe Dijkstra for: invention of the stack, his low opinion of the GOTO statement, the shortest-path algorithm also known as Dijkstra's algorithm, the shunting-yard algorithm for Reverse Polish Notation, the Banker's algorithm, the concept of operating system rings, and the semaphore construct for coordinating multiple processors and programs. His charismatic remarks about what we would typically consider software engineering are entertaining and humbling, examples:
False dichotomy (Score:5, Insightful)
Most of y'all are presenting a false dichotomy. It's not "Either learn abstract formalism OR learn practical languages." You can do both, you know.
I have met too many people who think that, because they can write some tangled, fucked-up C++, they are software engineers. Never mind the fact that they couldn't learn LISP, Objective-C, Java, or any number of other useful languages, as they don't know the first thing about actual computing.
Teaching Java or C++ doesn't matter. Sure, you need classes on practical application of your knowledge. But if you ignore what Dijkstra says here, you're going to end up with a bunch of code monkeys who have to test every element of the set, rather than test the rules of the set.
In my experience, those who started off learning theory, then learned how to apply that theory in practical situations, are far better programmers than those who are taught "practical" languages.
There's some very good advice in that paper. Calling him "out of touch" is a bit shortsighted.
At Ohio State, they teach this: "Resolve" (Score:3, Interesting)
Towards the end of the essay, an introductory course is described where the student, as programmer, is required to build a formal mathematical definition of his program and prove that his program conforms to the definition.
At The Ohio State University, we teach precisely this in the form of the "Resolve" programming language. For every function and every block of code, one must provide both the procedural code and a set of logical constraints that describe the effects of the process. For instance, if a function's job is to reverse the order of items in a list, then the internals of the function are accompanied by a logical constraint that tracks the movement of items from one list to another and ensures that the operation is totally consistent at every step -- for instance, that the sizes of the two lists always sum to a constant equal to the size of the input, and that concatenating the lists in a particular way yields the original list. (I'm summarizing.)
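I don't know Resolve syntax offhand, but a rough C analogue of that kind of checked constraint (my sketch, with a runtime assert standing in for the prover) might look like this for an array reversal:

#include <assert.h>
#include <stdio.h>

#define N 5

int main(void) {
    int src[N] = {1, 2, 3, 4, 5};
    int dst[N];
    int moved = 0;
    while (moved < N) {
        dst[moved] = src[N - 1 - moved];    /* move one item, back to front */
        moved++;
        /* Constraint: what has been moved so far is exactly the reverse of the
           corresponding tail of the input, and moved + remaining == N. */
        for (int i = 0; i < moved; i++)
            assert(dst[i] == src[N - 1 - i]);
        assert(moved + (N - moved) == N);
    }
    for (int i = 0; i < N; i++) printf("%d ", dst[i]);
    printf("\n");                           /* 5 4 3 2 1 */
    return 0;
}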
An active area of research here is on automatic provers that take your code and your logical definitions and actually prove whether your code and definitions match or not.
Mathematics (Score:3, Interesting)
I haven't read this essay before, and have only had time to skim part of it now, but Dijkstra's criticism of mathematicians has merit, IMHO. [I have an MS in Math, so I don't claim to be an expert.] It's been over 40 years since the introduction of nonstandard analysis (including hyperreals and later surreal numbers), but it's still not a mainstream topic. In fact, it boggles my mind why a professor or department would choose to teach their students "hard" analysis (Bartle or Royden) instead of "soft" analysis (Rudin) -- "soft" analysis uses topological results to arrive at key theorems faster, while "hard" uses Real Analysis as if it were the only option on the market. Not that there's anything wrong with using Rudin and Royden in tandem, but to ignore topological results altogether smacks of willful ignorance. To paraphrase Stephen King: do you expect brownie points for being ignorant of your own discipline?
The light on the horizon is the development of proof via programming, as covered by /. I'm sure there will be 50+ years of mathematicians screaming and kicking to avoid its introduction into the mainstream, but that will change the first time a computed proof that could not have been developed in one lifetime via the usual methods earns someone an Abel Prize. Until then, I suspect Dijkstra's point will still stand.
I can relate... (Score:5, Insightful)
But I think his arguments are centered around a misunderstanding of terms. It's simple academic dishonesty to which he objects:
Computer engineering vs computer science (Score:4, Insightful)
IMO there needs to be a starker contrast between computer science and computer engineering, just as there's a contrast between "real" engineering and, say, physics.
Those who just want to be able to program, who are focused purely on employability right out of college, can look for computer engineering courses teaching the popular programming languages. These people can be fine programmers, ready to start... so long as the language popularity hasn't changed by the time they graduate. It would be a sort of advanced vocational program, just like any other engineering.
But the real scientists, those who want to experience and express code on deeper levels, should be looking for something very different, that which Dijkstra describes. Just as a scientist and an engineer can work on the same project contributing different skills, the computer scientist has his place even in the real world.
The two really are different mentalities, and it seems that the mixture of the two leads to situations that are non-ideal for either.
Re:Hmmm... (Score:4, Insightful)
I agree that teachers disconnected from reality are bad, but the alternative is even worse. Look at what too much bitching got us: they teach Java as the primary programming language in universities nowadays! How sadistically cruel is that?
Re:Hmmm... (Score:5, Interesting)
There's nothing exceptionally wrong with Java as a starting language
Yes, there is. It insulates the student from some concepts that are important, and because it's so aggressively object orientated even the standard "Hello World" program requires quite a bit of glossing over by the teacher.
As a result, it tends to get waved away as "magic" or "this will be explained later" but there's so much waved away that the students get disconnected. For instance, to simply output a line to a command line in Java you're looking at
System.out.println("output");
whereas with C++ (for instance) you have
cout << "output" << endl;
As someone who's teaching this stuff, the second is easier to explain in detail and doesn't rely on saying "don't worry what System.out is".
The other prime example when teaching object orientation is garbage collecting. Students who learn in Java are significantly more difficult to teach about dynamic memory and the necessity of cleaning up after themselves than those who've learned in other languages that don't abstract this away. It's much easier to switch from C/C++ to Java than the other way around.
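A minimal C illustration (mine) of the "cleaning up after yourself" that Java abstracts away:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void) {
    char *copy = malloc(32);          /* explicit allocation... */
    if (!copy) return 1;
    strcpy(copy, "no garbage collector here");
    printf("%s\n", copy);
    free(copy);                       /* ...needs an explicit, matching release */
    return 0;
}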
The standard way of teaching basic programming is procedural, then functional, then object orientated then onwards. Using Java to teach in that cycle is nuts. How useful that cycle IS is another question, of course :)
Re:Hmmm... (Score:5, Insightful)
you serious?
to me, system.out.println looks way more reasonable than this "cout << endl" thing.
Re:Hmmm... (Score:5, Insightful)
I find it ironic that, to establish your argument that Java hides implementation details, you used a C++ example employing operator overloading such that the mere existence of functions is utterly concealed.
Re:Hmmm... (Score:4, Interesting)
I think they should teach low level first: teach students assembly and work up from there. They don't need to create anything fancy in assembly, just make sure that they understand how a computer works and does things rather than the abstracted model that higher-level languages give you.
Re: (Score:3, Interesting)
You have to engage students. They aren't automatons who will simply take in any information at the same rate. I understand that you're talking about undergraduate college, and that the system is probably different depending on which country you live in, but you have to start at a level where it is easily possib
Re: (Score:3, Insightful)
Maybe they should choose a different line of work?
Starting low level vs meet-in-the-middle (Score:4, Insightful)
I think both approaches (top-down and bottom-up) make sense. You learn C very fast if you can think of it as a high-level assembler. And learning assembler teaches you a lot about what computers are all about and what they can and cannot do.
But learning algorithms in C or other low-level, manage-your-own-memory languages is unnecessarily painful and error-prone. The algorithm exists independently of a specific language incarnation. Learn the algorithms in a language that makes it easy to concentrate on the problem and not get lost in a thousand small implementation details.
You reach enlightenment when you can bridge the gap from very low level to the highest levels. But it is folly to try to do everything at once.
Re:Hmmm... (Score:5, Funny)
Of course I can write a line of code!
Behold, in all its glory:
printf("hello world");
Re:Hmmm... (Score:5, Funny)
cat > hello.c
printf("hello world");
^D
gcc hello.c
hello.c:1: error: expected declaration specifiers or '...' before string constant
hello.c:1: warning: data definition has no type or storage class
hello.c:1: warning: conflicting types for built-in function 'printf'
Re:Hmmm... (Score:5, Informative)
cat > hello.pl
printf("hello world\n");
^D
perl hello.pl
hello world
Re: (Score:3, Funny)
Meh. Write-only Perl line noise.
How can that be a real program without about 10 lines of module and class declarations around it?
Professionals should know their tools (Score:5, Insightful)
I agree with that, but it isn't only in CS courses that programming should be taught.
The problem I see in current engineering and sciences courses is that they don't teach numerical analysis. Engineers and scientists today try to do everything in Matlab or Excel, except for those that do postgraduate courses, who often try to do things in Fortran.
Programming languages are tools that anyone involved with advanced uses of computers should learn to use. If you are a professional you should know how to use professional tools.
Re:Professionals should know their tools (Score:5, Insightful)
I don't think you can lay the blame for students knowing less on the department that they attend.
Mine taught a good mix of coding and theory, but we still had morons who didn't do enough coding to actually learn their craft well, and people who learned the coding but didn't learn enough theory to get decent course grades.
The point is, while at university studying computer science or any other subject it is your own responsibility to teach yourself around the subjects you are introduced to in the classroom.
I was taught using Java and Delphi, and yet finished my degree as a pretty competent C coder, in spite of never having attended a class on that language.
I also studied a great deal more around the subjects than many of my peers. Those who did the same as me tended to do well on graduation; I went on to more years of poverty as a Ph.D. student myself.
Re:Professionals should know their tools (Score:5, Funny)
Our school had 3 separate Java classes, 3 separate C classes, and 3 separate C++ classes: all in 3 different departments.
Silly. This can't be true. Everyone knows that there are no classes in C.
Re:Hmmm... (Score:5, Insightful)
I'd have to say in recruiting software engineers I have much more of a problem with theory-light code monkeys than I do with non-coders that are well-versed in CS theory. With the former you wind up with people who can't leave whatever language they're most familiar with and don't really understand why what they're doing works (cargo cult programming). It's easier to teach good coding practices in the field than it is CS theory.
My technical interviews aren't full of riddles or obscure CS theory questions, but I ask a series of pointed questions to see if the candidate has a good familiarity with the various language families (not just particular languages), common data structures (they should at least have encountered them, even if they need to look them up to implement them), and can talk in terms of pseudocode and algorithms instead of just library functions and language idiom. Language experience is a plus, but definitely not required.
Re:Hmmm... (Score:5, Insightful)
i.e. CS programs producing students who know loads and loads of theory and can't write a damn line of actual code.
That's because CS programs are misnamed. Most coding should be done by engineers, not scientists. A Master in Physics doesn't necessarily qualify you to build bridges either.
Re:Hmmm... (Score:5, Interesting)
Dijkstra's comments were right on the mark, and fairly obvious to people outside of CS. They were only contentious within the field, for some odd reason.
The thing is, Computing Science should be approached in the same manner as most other science fields: A BSc in computing should be about theory, research, and pushing the state of the art. A modicum of programming is probably necessary to accomplish that, but programming should be understood in the abstract--without the emphasis on 'this command, this language.' Learning to be a programmer (a) should be a division of computer engineering, or (b) probably not be a degree at all. More like a one- or two-year college certificate.
Chemistry, Physics, Biology, Math, and so forth, are all degrees aimed at research and study, not commercial production. Why not computing?
Exactly (Score:3, Interesting)
That is spot on. It is the difference between an electrical engineer and an electrician. They don't do the same things and getting an EE to wire your house is as stupid as getting an electrician to design a CPU. Or you don't go to the engineer at GM who designed the engine of the car to change your oil (although given the state of Detroit that might change).
There is a difference between a CS degree and an IT or Software Engineering some other more hands-on degree. Yes, a CS or Comp Eng degree should hav
Re: (Score:3)
I'm curious where you are finding these students. In my experience, it is much more common to find people who think they know loads of theory, but can't write a line of code. In reality, these people usually can't write a line of proof either.
The best theoreticians aren't always the best coders, but they're usually able to code pretty well wh
Re: (Score:3, Insightful)
That's because it's computer science and not engineering. The problem is that people conflate the two. If you want a coder, hire a coder not a scientist.
Re:engineering (Score:5, Insightful)
I don't know what you think FrontPage has to do with anything. Perhaps you're just trolling?
Software engineers should understand use case analysis, user interface design, project management and finance, and many other important subjects that "computer science" curricula ignore while beating students over the head with the details of theory. Understanding issues of scalability is good (though actual testing is often used in the engineering world for practical reasons), but we don't need four years of that while ignoring more important topics.
I'm not saying exhaustive study of the mathematical theory of computation is bad. I'm saying students are badly served at most universities by focusing on that at the expense of other topics.