The Art of Don E. Knuth 61

Snuffy writes "Salon has a nice feature on our hero. It's a full-length feature, talking about his long-term project, the writing of The Art of Computer Programming."
Comments Filter:
  • by rde ( 17364 ) on Thursday September 16, 1999 @01:06AM (#1678645)
    They just migrate to Windows.

    But seriously, folks... I was reading O'Reilly's new tome on Perl Algorithms and I was struck by a point mentioned in this article; no-one cares about efficiency any more. Just let your PIII do the work.
    We've all heard (and probably given) rants on how VB et al will be the death of the program, but it's good to have Knuth around to remind us that programming can still be an art form.
  • Anyone besides me think Knuth looks like Robert Fripp of Crimso?

    I'll go back to my sugar now. :)
  • I find it amusing that there is even a programming book with the word "Art" in the title. Everybody in industry wants to turn this into a "Science", and from there, into a mechanized form that can be stamped out that much faster by the code monkeys. Of course, if programming is accepted as an art form, I wonder where that puts Windows?
  • I wish I had a job where Knuth's work was of immediate relevance. I haven't had to sort anything myself since college.

    Sometimes I think things are just too easy these days :) I got excited yesterday morning when I used mod for the first time in YEARS. I think that means I'm desperate for some real work!

  • by NettRom ( 39971 ) on Thursday September 16, 1999 @01:28AM (#1678650) Homepage
    But seriously, folks... I was reading O'Reilly's new tome on Perl Algorithms and I was struck by a point mentioned in this article; no-one cares about efficiency any more. Just let your PIII do the work.
    I'm never too concerned about efficiency when I'm writing Perl scripts for my Linux server at home; it'll never make much of a difference. And in some cases, what you do is so tiny and simple that doing it more efficiently will remove, say, 1ms from the execution time. Firing up the interpreter and loading any libraries et al. take more time than actually running the script. But I'm digressing.

    I too find it slightly surprising to hear this. Maybe I'm just one of the strange ones, those who strive for perfection, meaning that if I believe I can find a more efficient way to do task A, I'll probably research it. Hopefully it'll help me later; maybe I'll write stuff that runs on web servers with high loads, and suddenly every cycle saved counts...

    We've all heard (and probably given) rants on how VB et al will be the death of the program, but it's good to have Knuth around to remind us that programming can still be an art form.
    I hope to soon have the time and money to buy his books and start reading them. They sound like a worthwhile read even for a not-really-a-programmer guy like me. I'm looking forward to the experience.
  • by Anonymous Coward on Thursday September 16, 1999 @01:29AM (#1678651)
    I notice that Knuth appears to be deeply religious (for example his book 3:16 illuminated or his music based on Revelation). I find that interesting: around Slashdot, the assumption most often expressed is that anyone religious must be an irrational fool. Yet, when you look at people like Knuth, or Larry Wall, they are obviously not foolish in at least some technical areas.

    You have to wonder what the difference is... Why so many geeks are non-religious, but it seems that some at the very top are deeply religious (Knuth's only real competition is Linus in my book).

    Maybe the assumption that religious = irrational is at fault?



  • by RNG ( 35225 ) on Thursday September 16, 1999 @01:31AM (#1678652)
    While "The Art of Computer Programming" is anything but easy reading, it's time well spent, even if you don't read the entire book. I bought the first 2 volumes a few years ago and the only way I could read through them was to:

    1) Read a few pages
    2) Think about it
    3) Re-read the pages from step 1
    4) Let it sink in for a week or so
    5) Re-read the pages from step 1 again.

    Usually, once I had reached step 5, I felt that I actually understood what Knuth was talking about. If anybody ever tells you that languages like C and C++ are the past and Visual Whatever is the future, ask them a few questions about different sorting and searching algorithms ... in most cases the response is a blank stare that sometimes makes me feel as if I were speaking Klingon. I'm very glad that, during my college years (10+ years ago), they shoved all sorts of algorithms down my throat ... it seems that these days a lot of people graduating with IS (and to a lesser degree CS) degrees wouldn't recognize an elegant algorithm if it came up and bit them in the ass ...

    --> Robert
  • I know the feeling.
    Mommy what's a binary tree?
    The scary thing is it's been so long since I've used anything like sorting or searching that now I have to look at a book to refresh my memory. Or, much more fun, figure it out without the book, then check that I'm right. I find it really disappointing that as a programmer I'm rarely asked to do anything interestingly worthwhile. Most of the time it's just silly stuff: "uhhh, report this data out in this format"
    -cpd
  • I'd rather say that the entire approach of assuming something (e.g., irrationality) about someone based on one known fact (e.g., him/her being religious) is at fault. Still, it's damn easy to do, and even feels kind of comfortable (at least to me). It's a bug in the brain, I guess. ;^)
  • I understand the feelings here. Some time ago I read a book on C algorithms and wished I had some reason to write binary tree code. Because many of us are Linux users, we enjoy low-level detail: how can we make our software work better, how fast can we *really* run our processors, what fancy code can we create to save the extra millisecond. Meanwhile the rest of the first world curses the need to do such things. In the second world you would be lucky to see a computer, never mind the third world. Could these other worlds even understand the issues we debate here - the mainstream citizens, that is, and not the techies that I'm sure exist on some level in every society?

    Yet we look to the future (sci-fi such as Star Trek, Star Wars, etc.) and no one there debates this technology vs. that, except to the extent that shields may be compromised. And even on this point the solution is timing or raw power and not necessarily a particular type of shield, i.e. "raise shields... NOW" and not "raise the Microsoft Shields TM... NOW" or the Oracle Shields, etc. Technology is simply expected to work. If I use a tricorder I expect a connection to be established over perhaps thousands of miles. Granted, the companies don't exist at this point for the users to debate one implementation vs. another. I guess a good opposing example of this would be the original RoboCop movie, which has cutting-edge technology developed by two companies battling it out for the metropolitan dollar. After the battle is won, though, we get RoboCop II and RoboCop III (bad movies IMHO) where the technology simply works and RoboCop does its job fighting crime. RoboCop no longer has to justify its worth over another implementation.

    But back to the main point: if we can say to our computer, "Computer... get me a chocolate chip milkshake" a la Star Trek, why wouldn't we also, as programmers, someday be able to tell the computer to build its own software to create someone a chocolate chip milkshake? Through iterative development over time (meaning generations and not years), similar to the industrial revolution, we may be programmers hundreds of years in the future wondering how we could have ever lived doing the low-level coding that we do now. Just a thought.

    Andy
  • by mvw ( 2916 ) on Thursday September 16, 1999 @01:43AM (#1678656) Journal
    Knuth is now updating MIX to MMIX, a reduced instruction-set computing machine that more closely mimics computers in use today.

    And you can help!

    DEK has released updated docs and supporting software. The initial conversion of the programs is done by volunteers.
    While the bits from Vol. 1 have been passed out to volunteers already, the MIX stuff from Vol. 2 is getting assigned for conversion these days.

  • Perhaps not. I think the problem isn't that people think religious = irrational (which, from an unbeliever's standpoint, is a perfectly reasonable statement). The problem is the unspoken false corollary, religious = stupid. It's quite possible to be irrational in some ways, and still be intelligent, capable, and even rational in others.
  • There's a link on the salon page to an article closely related to this, IMO a quite decent piece (and actually a quite good summary of the reasons why I personally stick to vi, makefiles and all that good stuff).
    Check it out at http://www.salon1999.com/21st/feature/1998/05/cov_12feature2.html [salon1999.com].
  • What's worse, the person who "irrationally" believes in an unseen God, or someone who "blindly" bases everything off of a theory (a.k.a. not a fact) just because it's easier to believe in the tactile?
  • Have you actually read Knuth's book? One doesn't have to be religious to do Bible commentary. Heck, the very non-religious Isaac Asimov was probably this century's best Bible scholar. The Bible is an important book in Western Civ, for better or worse, just as Confucian and Daoist texts are important in Chinese Civ.
  • by geophile ( 16995 ) <jao@NOspAM.geophile.com> on Thursday September 16, 1999 @02:09AM (#1678664) Homepage
    Saving 1 msec. is not what algorithm analysis is about. It's about taking 1 msec. to do something vs. 10 seconds or more. In 1 msec. you can go through the loop of a binary search, oh I don't know, 100 times, which means that you can search a list of 2^100 items. If you did a linear search, it would take you, on the average, 1/2 * 2^100 * (let's say 1 nanosecond) which is 2^69 seconds, which is a LONG time. This is the difference between O(log n) and O(n) for large values of n.

    If you know n is small, (e.g. the number of lines in /etc/hosts), it won't matter what you do. If n is big, (e.g. a 100 meg log file), your choice of algorithm matters. If you're counting nanoseconds in a tight loop, then you're presumably already using a fast algorithm, but the exact variant is important. For example, heapsort is guaranteed O(n log n), but quicksort is often preferred because, although O(n^2) worst case, it's O(n log n) on the average, and the constant is lower than for heapsort. A lot of Knuth is about deriving these constants. Knuth goes even farther than this, e.g. showing how to derive the constants for different ways of choosing the pivot in a quicksort.
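    The gap is easy to see empirically. A minimal sketch of my own (counting comparisons rather than wall-clock time, since the exact constants depend on the machine):

```python
def binary_search_steps(sorted_items, target):
    """Count loop iterations of a classic binary search."""
    lo, hi, steps = 0, len(sorted_items) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            break
        elif sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

def linear_search_steps(items, target):
    """Count comparisons of a straight linear scan."""
    for steps, item in enumerate(items, start=1):
        if item == target:
            return steps
    return len(items)

data = list(range(1_000_000))
print(binary_search_steps(data, 999_999))  # at most 20 steps: log2 of a million
print(linear_search_steps(data, 999_999))  # 1,000,000 steps
```

    Same list, same target; the only difference is the algorithm.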

    It's exhausting, non-productive and insane to apply this sort of analysis to everything. But once you've focused on the performance-critical parts of your program, Knuth's detailed analysis is essential.

  • by ainvy ( 36615 ) on Thursday September 16, 1999 @03:25AM (#1678666)
    As somebody who teaches algorithms and complexity, I should say that a good fraction of students seem disinterested in a good algorithms course. They are more interested in the syntax of C++ or Java than in developing their algorithmic skills. They somehow do not realize that any professional programmer should be able to pick up a programming language in three to four days. (At least a good working knowledge.) What therefore distinguishes two programmers is how algorithmic one is.

    I have also met many programmers in the industry who do not know what a Turing machine is and are surprised to hear that there are such things as unsolvable problems! After all, can't a computer solve everything!? And of course, NP-completeness is some near-mystical theory with no relevance.

    This is the second article on Knuth in a week on /. It will probably enthuse more youngsters into taking algorithmics seriously.
  • Of course, if programming is accepted as an art form, I wonder where that puts Windows?

    Stuck to a refrigerator with a magnet like the hideous, only-a-parent-could-love-it first grade art project it is, would be my vote.
  • Hmm... I don't know if I (as a fairly serious student of the Bible) would label Asimov as the Century's best Bible scholar.

    Bluntly, even if you accept the scholarship of "higher critics" like Asimov (which I have a major problem with -- they don't study the Bible they revise it) he is hardly the best or best known.

    I own and have read his Bible commentary -- I found most of his conclusions to be banal. He is really a popularizer of a specific school of Bible study; that is, one that concentrates more on the text than on the meaning of the text. That doesn't make him a Bible scholar any more than his books on physics make him a great physicist. (Yes, I know his Dr. was in Chemistry -- that still doesn't make him a great physicist.)
  • by Anonymous Coward

    And where are the breakthru languages?

    You just aren't looking.

  • I also found that peculiar. While there are certainly religious and rational people, the great majority of religious people I encounter are actually NOT rational about it. As if among everything else in the world, that was one thing that could not be questioned. Perhaps he was just brought up religious and likes to stay that way for social reasons. Maybe he doesn't care to think about it...time is not something he has to waste. There are plenty of intelligent and religious people who have never had a reason to question their belief, were never challenged about it.
  • But seriously, folks... I was reading O'Reilly's new tome on Perl Algorithms and I was struck by a point mentioned in this article; no-one cares about efficiency any more. Just let your PIII do the work.
    I'll mostly agree with you. A lot of, shall we say, less-experienced programmers tend to ignore speed considerations. In many cases, speed is not necessarily a critical attribute, especially as processors get faster and faster. OTOH, a "good" programmer will at least consider speed as an important characteristic. The days of tuning your code to save a couple of milliseconds in the middle of a tight loop are probably largely behind us. The better programmers do consider algorithm complexity, though. If you can use an O(n log n) algorithm and you pick an O(n!) one instead, you just might be surprised when you try to sort that 10,000 record database! A good programmer ought to spend more time developing a solid algorithm than performance-tuning the program once it's written:

    The rule of program optimization:
    Don't do it!

    The rule of program optimization (for experts only):
    Don't do it yet!
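    To hang some rough numbers on that O(n log n) vs. O(n!) comparison, a quick back-of-the-envelope sketch of my own:

```python
import math

# Rough operation counts for each complexity class as n grows.
for n in (10, 100, 10_000):
    print(f"n={n:>6}:  n log n ~ {n * math.log2(n):>12,.0f}    n^2 = {n * n:>12,}")

# Factorial growth is in another universe entirely: long before you reach
# a 10,000-record database, an O(n!) algorithm has already lost.
print(f"13! = {math.factorial(13):,}")  # over six billion operations at n = 13
```

    At n = 10,000 the n log n algorithm does on the order of 10^5 operations; the O(n!) one would not finish before the heat death of the universe.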


    --Phil (Still working through my copy of Volume 1.)
    Does anyone know if MMIX will only run on the 2009, or will the 1009 be able to execute it?

    Sinan
  • That's what religion is all about, isn't it? Faith? At a certain point there is a leap that must be taken. If you require an answer to the question of "Why," the only answer will be faith-based (whatever religion you may be talking about: science, Christianity, Buddhism, etc.). For some of us it is an absurd question (and we are in an absurd position because we are capable of asking it). There are plenty of intelligent people whose faith and belief have been deeply challenged and who, having confronted these issues (like Job), have come back to their belief in God. I think as scientists, engineers, and human beings we need to respect others' attempts to understand a world that is essentially alien to us (whether that explanation is your calculus textbook, the Bible, the Enuma Elish, the Bhagavad-Gita/Upanishads, the Koran, etc.). gid-foo
  • I suspect they *sound* a lot alike. :} In some ways, Fripp has done to music what Knuth has done to algorithms... subjected both to deep technical and aesthetic scrutiny.

    "Discipline is never an end in itself, only a means to an end". -Fripp

    ---
  • I find that interesting: around Slashdot, the assumption most often expressed is that anyone religious must be an irrational fool. ... Maybe the assumption that religious = irrational is at fault?

    Religion is irrational. Faith is irrational. These are given. A religious person needs no proof of the religion, and this is irrational. But that's not the problem.

    The problem lies in the fact that many people assume that a person is irrational in all they do, all the time. In other words, people think irrational==fool, which is clearly not true.

    Personally, I have no faith in a god of any sort. I do have faith in other things, and much that I do is irrational, even downright stupid. But, in certain areas, I do things very rationally indeed. I do not condemn a person for believing in a god, since what they believe is their own business. I do condemn a person for believing in a god and then condemning me because I do not. Usually I will not associate with these people.

    Once I met a guy who discovered I was an atheist (not something I talk about normally, but he brought it up), then he got his whole damn family trying to convince me otherwise. I think I did fairly well in the debates with them, considering I know the Bible a LOT better than they did (So few believers ever really read the thing cover to cover...), but it never seemed to do any good. Eventually I had to threaten him that I'd kill his cat if he didn't drop it. (Believe me, the situation was out of control. I love cats, really. Dogs, on the other hand...) They backpedaled quickly.

    Anyway, the point is that few non-believers get upset with believers, it's usually the other way around. Then the non-believer gets upset because the believer won't drop it. We atheists have (usually) put a hell of a lot of thought into the subject, and are quite satisfied with our beliefs or lack thereof.


    ---
  • Knuth, now 61, hopes to finish the book around 2003 -- though "that's probably slipped by a year or two," he admits. It could be a decade or two into the next millennium before he completes the set.

    Given the average lifespan of most famous computer scientists, I'd say our beloved Mr. Knuth will be dead before he finishes Volume 4, let alone Volume 5 (will there be a 5?)

  • Does anyone know where I can get a poster or t-shirt with Knuth on it? I would also like to get one with rms and various other geek superstars. gid-foo
  • Not a bad algorithm, but you left out the *real* time sink. Doing those nasty problems at the end of each chapter :). Especially those muthas rated over 40!

    I came at programming rather back-asswards. I didn't start reading Knuth until years after I was programming professionally. To be truthful, for the kind of assembly programming I do, Knuth is rarely that useful. But it's wonderful exercise for the mind, and recommended reading in any case.
  • It's not that he's religious per se, he just likes reading about himself.

  • by Lucius Lucanius ( 61758 ) on Thursday September 16, 1999 @06:21AM (#1678682)
    Being extremely brilliant in science or math does not mean that a person cannot be irrational or religious. OTOH, being completely rational doesn't make one a brilliant mind either. Many accomplished geniuses were extremely good in their field, and yet irrational in other respects.

    * Godel came up with one of the most elegant and logical proofs in math, yet in his later years was paranoid about being infected by microorganisms in his food and ended up starving to death. A notably illogical way to die.

    * Ramanujan, who was self taught and considered one of the most brilliant mathematicians of the century, was devoutly religious and was eccentric enough that he carried a doctor's certificate stating he was sane.

    * Emanuel Swedenborg was a deeply religious and profound scholar with an allegedly stupendous IQ.

    * John Forbes Nash, who won the Nobel for a thesis he wrote at 21, became schizophrenic later. There's an unforgettable exchange in his biography, "A Beautiful Mind", by Sylvia Nasar (an excellent book).

    Mackey: "How could you, a mathematician... believe that extraterrestrials are sending you messages?"

    Nash: "Because, the ideas I had about supernatural beings came to me the same way that my mathematical ideas did. So I took them seriously."

    Of course, this does not mean that Knuth or any genius is bound to be eccentric or worse. The point is - the history of math and science is replete with people of enormous logic, who were quite irrational and non-scientific outside their core field of accomplishment.

    In any case, religion has more to do with faith and finding personal meaning, not with exhibiting scientifically rigorous proof (which would be quite irrational indeed, not to mention a career-limiting move).

    L.

    PS - found this thoughtful remark from Nash describing his delusions

    http://www.nobel.se/laureates/economy-1994-2-autobio.html

    Thus further time passed. Then gradually I began to intellectually reject some of the delusionally influenced lines of thinking which had been characteristic of my orientation. This began, most recognizably, with the rejection of politically-oriented thinking as essentially a hopeless waste of intellectual effort.

    So at the present time I seem to be thinking rationally again in the style that is characteristic of scientists. However this is not entirely a matter of joy as if someone returned from physical disability to good physical health. One aspect of this is that rationality of thought imposes a limit on a person's concept of his relation to the cosmos. For
    example, a non-Zoroastrian could think of Zarathustra as simply a madman who led millions of naive followers to adopt a cult of ritual fire worship. But without his "madness" Zarathustra would necessarily have been only another of the millions or billions of human individuals who have lived and then been forgotten.

    Statistically, it would seem improbable that any mathematician or scientist, at the age of 66, would be able through continued research efforts, to add much to his or her previous achievements. However I am still making the effort and it is conceivable that with the gap period of about 25 years of partially deluded thinking providing a sort of vacation my situation may be atypical. Thus I have hopes of being able to achieve something of value through my current studies or with any new ideas that come in the future.
  • Religion is irrational. Faith is irrational. These are given.

    I know of no widely accepted standard that has proven this statement. If such is "given," then I ask, by whom?

    Have you ever felt genuine faith before? If not, you have no place to say it is irrational. Feelings, after all, are very rational.

    Describing faith to someone is difficult to do with words. I imagine it would be something like explaining taste to someone who has no tongue, or color to a person with no eyes to see.

    And so the prospect of faith, a belief in something with no physical evidence, can only be thought as irrational to those who have never felt it before. Or to those whose faith has let them down and have become disillusioned with their beliefs.

    Anyway, the point is that few non-believers get upset with believers, it's usually the other way around.

    I'm a believer, but I will admit that your statement here holds true generally, at least from my observations. It is unfortunate how few of us actually pause to consider our religion, and how ready we are to accept tradition with so little thought.

    This I would not call faith at all. Faith is more akin to a desire, or hope for something to be true. It often influences one's actions, and sometimes the consequences positively affirm the faith.

    Many theists, though, just like to argue.

    We atheists have (usually) put a hell of a lot of thought into the subject

    This is not untrue of some theists as well. There are those of us who know why we are religious. Granted, usually it doesn't come down to the tangible evidence, rather the faith that we have at last found.

    Some hold the opinion that, to find truth, one must doubt all until tangible evidences present themselves.

    I say, to have a truly open mind, one must not doubt, but seek. It is not bad to believe; in believing we discover. Only when we are unable to change our beliefs is when we have truly closed our minds.
  • I also found that peculiar. While there are certainly religious and rational people, the great majority of religious people I encounter are actually NOT rational about it.

    I don't think it's all that peculiar, and I doubt the Professor finds it all that peculiar either. The great majority of NON-religious people are also not rational about it. Most folks find it profoundly disturbing to think about the syllogism that, if there is a God, he might have an opinion about how we should live.

    Most agnostics are not really agnostics, they're antignostics. It's not that they don't know, it's that they don't want to know.

    By and large, I find that the percentage of irrational geeks matches up with the percentage of irrational people. The irrationality just shows up in different ways in different groups. What's really, really irrational in my book is to judge the goodness of something by whether smart people or stupid people are associated with it. You can't judge anything by whether bozos are associated with it, because bozos are associated with everything.

    Larry Wall

  • Stuck to a refrigerator with a magnet like the hideous, only-a-parent-could-love-it first grade art project it is, would be my vote.

    As the parent of a young child, I'd like to point out to you that you are comparing the best effort of a 6 year old to the what-can-we-get-away-with laziness of some supposedly educated adults. If I were the 6 year old, I would be insulted ;-)
  • Neural Nets have actually been around since before there was such a thing as Computer Science.

    Fuzzy Logic is used in conjunction with Neural Nets and is very good at some types of things. For example, the automatic transmission controllers in some newer cars are based on Fuzzy Logic, as well as the engine control systems.


    Computer Science is still very much alive and kicking. Check out some of the work into Artificial Life that has been done recently and you will see.
  • I bought the first 2 volumes a few years ago and the only way I could read through them, was to: ...

    I suppose every programmer used to go through this process 20 or 25 years ago. It's sad that today many soi-disant "programmers" barely know how to click on things in Visual Basic (pardon my French here). But for any application which needs reasonable response time, knowledge of algorithms is essential... without, of course, neglecting other important things like graphic and user interface design. Real artful software is fast where it needs to be, easy to use but unobtrusive.

    Now, if you wish to go through the 5-step process in a much tighter loop - read a single sentence and think a week about it, then repeat - try Buckminster Fuller's "Synergetics" (sadly, it seems to be out of print). Whew.

  • Yep,
    Knuth is a grand master in algo.

    My first lecturer in college was a disciple of his religion. I find doing programming and learning other languages so much faster because of it.

    Many, when given a problem, would not even spend a moment thinking about how to do it. Instead, they start coding in their favorite language (Java/VB/etc.) and later on complain that the program runs too slowly.

    I do agree about the ignorance of algorithmics among graduates nowadays. They can't be bothered, and think that a fast CPU (or a cluster, or, as some smart ass will even mention, a Beowulf!) can solve everything. Their response when I mention their (lack of) algorithm design is "who cares?!?"


    Watcher
  • Start at this [stanford.edu] link to his home page.
  • Oh, I understand how you feel. My lecturer (who I respect greatly) had this problem. He taught students from the algorithmics angle, insisting that students think thru the problem and solve it in pseudo-code. Another lecturer taught Pascal (that was the language used in Programming 101). Guess who got the most popular award?

    Most students don't appreciate algo. These students think that it is irrelevant and that any problem can be solved, by just putting more time into coding. OTOH, I went thru the whole shebang, with Algo 101, CS201 (Programming and Data Structure), Analysis (which was algorithm and notational analysis) and Computation model (I can still remember the look on my buddy's face when my lecturer mentioned that the proof that all computers are Turing equivalent could appear in the exams :).

    It made me appreciate algo and computation a lot more. When my lecturer mentioned that 2-D matrix multiplication had its order reduced to O(n^2.xx), I was impressed...=) However, the industry tries to brainwash everyone into thinking that the only important thing is their product (all of them, incl Sun, Oracle, MS, etc.); nothing else matters.

    As for NP-completeness, the idea that computers cannot do everything is heretical to their beliefs. They ignore it totally, and throw more computational power at a program when it runs too slowly...


    Watcher
  • Of course he focuses on the text. If you start focusing on what you believe to be the meaning of the scripture, it becomes, in fact, completely meaningless, since you can interpret it any way you want. To get any sort of useful analysis you have to look at what's actually written, not what your political agenda, religion, or hopes want you to believe is meant.
    Have you ever felt genuine faith before? If not, you have no place to say it is irrational. Feelings, after all, are very rational.

    Feelings are indeed quite irrational. Feelings are abstract emotions, not reasoned responses. The term "rational" refers to reasoning, not emotional responses. Faith is an emotional response, and hence not any more rational than nostalgia, lust, or annoyance are.


    And so the prospect of faith, a belief in something with no physical evidence, can only be thought as irrational to those who have never felt it before. Or to those whose faith has let them down and have become disillusioned with their beliefs.


    The concept makes perfect sense - it's the belief in something without evidence that it is correct. This belief is usually based on either a hope or desire for something to be true, some sort of "gut feeling" (some prefer to think of this "gut feeling" as a message from God) that it is true, or both. Either way, it's not rational - i.e. a reasoned response.

    This I would not call faith at all. Faith is more akin to a desire, or hope for something to be true. It often influences one's actions, and sometimes the consequences positively affirm the faith.

    Exactly. It's a hope for something to be true, rather than any sort of evidence that the thing is indeed true. One can hope for whatever one wants, but this doesn't increase the chances of it actually happening.
  • > As for NP-completeness, it is heretical to their belief that computers cannot do everything. They ignore it totally, and throw
    > more computational power when it program runs too slowly...

    Of course computers can't do everything.

    When it comes to NP problems and N gets bigger, you may never find the best solution, but you can still try to get a sufficiently good one (by optimizations and other techniques). And the more computer power and time you spend on the problem, the better the solution gets.
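    As a concrete illustration (my example, not the poster's): a greedy nearest-neighbour heuristic for the NP-hard travelling salesman problem finds a decent tour quickly, just not a provably optimal one:

```python
import math
import random

def tour_length(points, order):
    """Total length of the closed tour visiting points in the given order."""
    return sum(
        math.dist(points[order[i]], points[order[(i + 1) % len(order)]])
        for i in range(len(order))
    )

def nearest_neighbour_tour(points):
    """Greedy O(n^2) heuristic: always hop to the closest unvisited city."""
    unvisited = set(range(1, len(points)))
    tour = [0]
    while unvisited:
        last = points[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(last, points[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

random.seed(42)
cities = [(random.random(), random.random()) for _ in range(50)]
greedy = nearest_neighbour_tour(cities)
shuffled = list(range(50))
random.shuffle(shuffled)
# The greedy tour is far shorter than a random one, though it is
# almost certainly not the true optimum.
print(tour_length(cities, greedy), tour_length(cities, shuffled))
```

    Finding the truly optimal tour for even 50 cities is already infeasible by brute force; the heuristic trades optimality for speed.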

  • Heh. I know what you mean. I went through a course last year called Data Structures and Algorithms, using C++. The prof was visiting, so we got lucky (he was a *GREAT* prof! Great class, Dr. Astrachan, if you read /.). Spent lots of effort drilling in the various algorithms and data structures (to the point of exhaustion). Lots of complaints (lab too long, too hard, etc), but really useful class.

    Makes me wonder how many programs out there use *BUBBLE SORT* (the dog of all sorts) as their sorting algorithms. Great for low n, but doesn't compete with other O(n^2) algorithms.

    Lots of time spent trying to make elegant (and readable) algorithms, and armed myself with lots of algorithms. I may not remember them, but I certainly will keep the notes and files (hey. I coded some, others were given as experimental data. Code reuse == good).

    BTW, if you were speaking Klingon (like I do from time to time), people would step back first to avoid the spittle }}:-)
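  • The bubble-sort point above is easy to demonstrate: even among O(n^2) sorts, insertion sort usually wins in practice because it moves elements with single assignments and stops scanning as soon as the key is placed, while bubble sort pays a full three-assignment swap per inversion. Both versions below are illustrative toy implementations (my own sketch, not from any particular textbook or library).

```python
import random

def bubble_sort(a):
    """Bubble sort with the early-exit optimization. Still O(n^2)."""
    a = list(a)
    n = len(a)
    for i in range(n):
        swapped = False
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]  # one swap per inversion
                swapped = True
        if not swapped:
            break  # already sorted
    return a

def insertion_sort(a):
    """Insertion sort: shifts (single assignments) instead of swaps."""
    a = list(a)
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]  # shift, don't swap
            j -= 1
        a[j + 1] = key
    return a

data = [random.randint(0, 1000) for _ in range(500)]
assert bubble_sort(data) == insertion_sort(data) == sorted(data)
```

    Same asymptotic class, same results, but time them on a large-ish list and bubble sort is the one that embarrasses itself.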
  • It's not that he's religious per se, he just likes reading about himself.


    I think this was in rec.humor.funny (or somewhere else):


    Richard M. Stallman, Linus Torvalds, and Donald E. Knuth engage in a discussion on whose impact on the computerized world was the greatest.

    Stallman: "God told me I have programmed the best editor in the world!"

    Torvalds: "Well, God told *me* that I have programmed the best operating system in the world!"

    Knuth: "Wait, wait - I never said that."

    :-)
  • Actual working programmers have relatively rare opportunities to apply such knowledge. Most of the hard but tidy problems are Somebody Else's problems, and neatly tucked away behind some interface which has to be learned and used.

    The problems most real programmers face are trivial, but extremely numerous. They glue this bit to that, make an interface that the customer can live with, add this little feature, find that 3-year-old typo that ruins the app, etc. The problems are mostly in understanding what has to be done and learning the interfaces to the parts it affects.

    As for unsolvable problems, my experience is that these are academic hair-splitting. Like that old saw: "It is impossible to create a program which detects infinite loops." This is only mathematically true, not practically relevant. One could trivially write a program which expresses the conditions under which each loop will be infinite, and non-trivially sort these into always true, sometimes true, maybe true, and never true. Furthermore, it could deduce more abstract conditions for the "sometimes" and "maybe" cases and sometimes reduce them to "always" or "never". Ultimately, it could warn the programmer that the program will never terminate no matter what the inputs (a clear and obvious bug), or in many cases it could give conditions which the input must meet for the program to terminate (and, of course, point out those areas of the program which it can't understand as potential trouble spots, and hand this information over to the debugger). It would not be theoretically perfect, but it could be extremely useful, and it might even be possible to write every useful program in such a way that the loop-checker could verify it.

    There's a big difference between a problem being mathematically impossible to correctly answer in all conceivable situations with a yes or no, and a problem being impossible to answer correctly in most practical cases and to tell a human to deal with the rest.

    In other words, special cases are the stuff of the real world, with the need for general solutions few and generally encapsulated into libraries.

    Most working programmers work with small data sets, and when they don't, they are usually accessing a database. They don't care about "for sufficient values of N" because for sufficient values of N their programs give up and print an error message (or just crash).

    Don't get me wrong, I have the available volumes of TAoCP in the latest editions and have read them (not that I remember a great deal of detailed information from them, I still look through them and other books when I want a hashing function or a pseudo-random number generator). Being a games programmer, I am very concerned with efficiency so I have a great many opportunities to apply my knowledge of algorithms and data structures; often to intractable problems (fortunately, for games you can often get away with just making it look like you've solved the problem ;) ). It's just that most programmers I've met and worked with seldom have any need to do anything themselves that is more complicated than looping through a list.

    Most programmers can't pick up the skills and habits of writing effectively and efficiently in a new language in a few days either. More importantly, they can't pick up a new library in a few days. What distinguishes two programmers in terms of usefulness is rarely "how algorithmic one is," but one's ability to learn new interfaces quickly, to write thousands of lines of clean, readable code and to spot common errors quickly. Even more important is that intangible ability to just make it work, to dive into that horrible tangle of a million lines of poorly written code and come back up with a new feature or a fixed bug without bringing the system down. That is the real complexity working programmers deal with. Skill with algorithms is trivial next to this ability to cope.

    In other words, software engineers don't need to be computer scientists (though it can help).
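  • A toy version of the loop checker described above is easy to sketch. This one (my own illustrative example, using Python's ast module) flags only the most blatant case, a `while True:` whose body contains no break or return, and stays silent about everything it can't understand; exactly the "useful but theoretically imperfect" behavior the parent post describes. It is deliberately conservative: it misses `while 1:`, and a break belonging to a nested inner loop will wrongly reassure it.

```python
import ast

def flag_obvious_infinite_loops(source):
    """Return line numbers of `while True:` loops with no apparent exit.
    A sound-but-incomplete check: anything it can't prove, it ignores."""
    warnings = []
    tree = ast.parse(source)
    for node in ast.walk(tree):
        if isinstance(node, ast.While):
            cond = node.test
            # Only the literal `while True:` is flagged; `while 1:` and
            # computed conditions fall into the "can't understand" bucket.
            if isinstance(cond, ast.Constant) and cond.value is True:
                has_exit = any(
                    isinstance(n, (ast.Break, ast.Return))
                    for n in ast.walk(node)
                )
                if not has_exit:
                    warnings.append(node.lineno)
    return warnings

assert flag_obvious_infinite_loops("while True:\n    x = 1\n") == [1]
assert flag_obvious_infinite_loops("while True:\n    break\n") == []
```

    Mathematically it proves nothing about the halting problem; practically, it would still have caught a few bugs I've shipped.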
  • I know of no widely accepted standard that has proven this statement. If such is "given," then I ask, by whom?

    Pretty much by any person using his brain. :-)

    Have you ever felt genuine faith before? If not, you have no place to say it is irrational. Feelings, after all, are very rational.

    Wrong again. Feelings are irrational. Get a dictionary. Rational describes that which is reasoned. Do you reason how you feel? If so, I feel sorry for you.

    Or maybe you'd like to talk about bio-chemistry. The feelings are situated in the thalamus (sp?). The center of reason is in the cortex. These are two entirely separate structures.

    Describing faith to someone is difficult to do with words. I imagine it would be something like explaining taste to someone who has no tongue, or color to a person with no eyes to see.

    I perfectly well understand what faith is. Faith is a belief, in the face of lack of evidence, or in the face of evidence to the contrary, that something is true. And there is nothing wrong with that. I'm fairly sure that alien intelligences exist, even though I have no proof of that. One could say I have faith this is true. It is an irrational assumption to make, but there it is.

    And so the prospect of faith, a belief in something with no physical evidence, can only be thought of as irrational by those who have never felt it before. Or by those whose faith has let them down and who have become disillusioned with their beliefs.

    Any thinking person, with or without faith, must recognize that faith itself is irrational. Don't define irrational as "a bad thing" until you understand what it means. Irrationality is a good thing. A very good thing. It's what makes humans human, instead of vulcans. Didn't you ever watch Star Trek? :-)

    Many theists, though, just like to argue.

    So do many atheists. :-)

    Some hold the opinion that, to find truth, one must doubt all until tangible evidences present themselves. I say, to have a truly open mind, one must not doubt, but seek. It is not bad to believe; in believing we discover. Only when we are unable to change our beliefs is when we have truly closed our minds.

    True. I do not hold that the existence of a god is impossible. There is damn little that is impossible. But I do hold that belief in the existence of a god is, and has always been, a bad thing for all of humanity. This is because belief, in general, closes minds. A mind that holds a belief against all reason to the contrary is closed, by definition. And that's how most beliefs humanity holds work.


    ---
  • It's an action.
  • Feelings are indeed quite irrational. Feelings are abstract emotions, not reasoned responses. The term "rational" refers to reasoning, not emotional responses. Faith is an emotional response, and hence not any more rational than nostalgia, lust, or annoyance are.

    What I mean by "feelings are very rational" is that feelings are the product of some physical need. For example, I could explain why I feel faith, or why I like it, etc. There is a physical reason or need that suggests the feeling.

    My opinion is that, to be truly religious (not fanatical), one must know why they believe.

    Many atheists compare faith to delusion. And in most cases, I would tend to agree with them (not that I think it's necessarily a bad thing). However, if a person realizes what his faith is, and understands that it is not based on solid, (perhaps scientific) evidence, then this person is not deluded. Delusion consists, critically, of the person being unable to separate the belief from reality. Not that the belief is necessarily untrue, but rather that there is no "real" evidence to support it.

    One can hope for whatever one wants, but this doesn't increase the chances of it actually happening.

    I can't argue with that. Still, many humans require hope to live happily. For some of us, faith gives us hope, and that is reasonable to me.

    I think faith is powerful because it can help one see possibilities, how things might be, or how they could be. Imagine how much stuff is out there that will never be proven in our lifetime, yet is actual fact. I think it is ok to believe in something without proof, but I think one must admit that the belief is based upon faith.

    Not the bastardized faith that makes the theist want to pile religion upon the unwilling.
  • According to the FAQ at his web page [stanford.edu] Knuth plans to finish Volume 4 (Combinatorial Algorithms) in 2004, and Volume 5 (Syntactic Algorithms) in 2009. Then he will revise Vol 1-3 yet again, followed by a one-volume "Readers Digest" version of 1-5. THEN come Volume 6 (Context-Free Languages) and 7 (Compiler Techniques), "but only if the things I want to say about those topics are still relevant and still haven't been said." In other words, he has his life planned out until about age 90.

    PS - I went to Case with Don, and well remember the IBM 650 mentioned in the dedication to Volume 1, along with the SOAP assembler. I don't intimidate easily now, let alone when I was a teenager, but.... Hell of a note when you're 60 years old and a highlight is that you once sat at the next keypunch to a genius.

  • Your responses seem to indicate that your thinking is oddly similar to mine, despite the fact that we come to nearly opposite conclusions.

    One thing that I'm not clear on is that, if you recognize that faith is something that one believes without solid evidence, how does one decide what to believe in? Why become Muslim, or Christian, or Hindu, or Jewish, or Wiccan? More importantly, why become any of these? What's wrong with making up your own hope/belief and having faith in that?
  • Your responses seem to indicate that your thinking is oddly similar to mine, despite the fact that we come to nearly opposite conclusions.

    Yeah, I consider myself a devout Christian. But to be honest, if I had to argue mainstream Christianity, I'd rather take the atheist's side. :) I honestly do share many views about modern Christianity that a lot of atheists point out.

    One thing that I'm not clear on is that, if you recognize that faith is something that one believes without solid evidence, how does one decide what to believe in? Why become Muslim, or Christian, or Hindu, or Jewish, or Wiccan? More importantly, why become any of these? What's wrong with making up your own hope/belief and having faith in that?

    Most of the people that I consider "honestly" religious don't always attach themselves to any particular denomination. As you can imagine, there are those who come up with their own crazy ideas. And you'd probably be right if you suggested the motive was, oh, probably lucre rather than faith.

    There are so many reasons humans pick one church over another. Tradition. Culture. Violence. Oppression. Want to fit in. Strokes the ego. Seems logical at the time. Make new friends. Socialize. Etc.

    As for my own religion, I must say that I grew up with it. For most of my life, it was a matter of being too young to know anything else. After a while it became part of my culture and tradition. Not the best reasons to be religious. Then came a time when I started to think on my own. I discovered other religions and was profoundly embarrassed by them. :) Then I discovered all of the awful things people say about my church. That was a hard time because everything I was taught to believe in was being challenged, and it didn't feel good. Looking back, I'm glad I saw the other side because it made me have to choose. I don't look at my church anymore as "that's how it's always been, so it must be right." Instead, I actually try to find things out for myself.

    I chose to keep my faith because it has served me well in the past. I hope someday I'll have more than just faith to affirm my beliefs. I think in the end (whether it's personal annihilation or a continuation of life after death) I will be my own reward. If my faith can help me conceive something more noble, or motivate me to live a little better, or just feel happier about myself, well I suppose that is some form of reason, if you like.
  • What is peculiar is that Knuth's explanation apparently comes from BOTH the text book, AND the Bible.
  • "The great majority of NON-religious people are also not rational about it. Most folks find it profoundly disturbing to think about the syllogism that, if there is a God, he might have an opinion about how we should live."

    Well, I don't know what NON-religious people you know who are irrational about it...but I think most non-theists are non-theists /BECAUSE/ they are rational. The rationality leads to their belief, instead of the word of a 2000 yr old book.

    http://www.infidels.org/news/atheism/overview.html

    They seem very rational. If there is a God, which I am not denying, and he DOES have an opinion about how I live, I will certainly rely on deduction from natural phenomenon to prove this to myself, not speculation, conjecture, and hearsay.

    "Most agnostics are not really agnostics, they're antignostics. It's not that they don't know, it's that they don't want to know."

    I find that peculiar. How can you be antignostic? It's not as if there really is a truth to BE known. That statement sort of presupposes a truth that they choose to deny. By definition (a thorough explanation of which you can find at the link above), an agnostic does not profess knowledge.

    "What's really, really irrational in my book is to judge the goodness of something by whether smart people or stupid people are associated with it."

    Well, I don't judge the "good"ness of things by whether stupid or smart people are associated with them; I judge the smartness or validity of such things. Sure, there are plenty of irrational intelligent people (Nazi scientists, anybody?), but I'd wager there are more irrational stupid people than irrational intelligent people.

"The one charm of marriage is that it makes a life of deception a neccessity." - Oscar Wilde
