
 




An interview with Donald Knuth 69

shem gave us the hook-up to an interview with Donald Knuth [?]. He talks about retiring early from teaching to concentrate on his writing, as well as his ongoing project of writing The Art of Computer Programming.
This discussion has been archived. No new comments can be posted.

  • Ok, so I might be poorly informed about the current trends in computer programming :-)

    Is there anyone *really* using "The CWEB System of Structured Documentation" (http://www-cs-faculty.stanford.edu/~knuth/cweb.html [stanford.edu]) to "Write programs of superior quality"?

    I've never seen any programs written in CWEB (nor in FWEB or any other WEB) apart from those on Knuth's site and those that are distributed together with the CWEB distribution.

  • I have used TeX, WordPerfect, and Word, and I have been using FrameMaker for about a year.

    I don't agree with you that WYSIWYG is useless. I do agree that the way some word processors implement it is highly annoying; that is actually what caused me to switch from Word to FrameMaker. I abandoned TeX long before that. TeX is a nice set of typographic algorithms (arguably unrivaled by other word processors), but I never understood why it has to be compiled. If it could be interpreted instead, TeX could be used in a WYSIWYG program and we wouldn't be having this discussion. Sure, a large LaTeX document takes some time to compile, but most of that seems to be I/O time (reading files). If I'm wrong, please point out where the error is.
    The main reason I and many other people don't like to program our text is that human beings are much better at processing graphical information than textual information. If there's a layout error in a printed or displayed page of text, most people will spot it in a flash. Finding the same error in a LaTeX file is a lot more troublesome.
    Basically, when I was writing LaTeX documents I found myself executing a compile/xdvi/debug cycle (write some LaTeX code, see if it does what I expect, fix the errors), thus emulating WYSIWYG.

    As I said, I'm a reasonably happy FrameMaker user now. I absolutely hate the user interface, but at least it's better than having to edit the file directly, and it manages to do things more consistently than Word does. It combines the structure you have in LaTeX with the graphical feedback of a WYSIWYG system. Also, the frame metaphor is a really nice way of modeling a document.

    In case you are wondering, I'm a Ph.D. student and I write scientific articles (i.e., highly structured documents with very tight layout requirements). Since I also have to submit those articles, I usually have to conform to different layout guidelines. LaTeX is a nightmare when it comes to changing the layout, unless you know how to work with TeX at a really low level. In FrameMaker this is really easy.

    Also, importing/exporting in FrameMaker is really easy. Unfortunately TeX is not supported, but then what commercial word processor bothers to support TeX anyway?

    I don't think TeX will be around much longer. When it comes to structure, there are XML and SGML; TeX can't compete in this area. When it comes to layout, TeX is still competitive, but not by much. When it comes to interoperability, it sucks: basically the only useful output is PostScript. It's not really suitable for online publishing (i.e., HTML), and I have not seen any usable input/output filters for word processors.

    FrameMaker certainly does a nice job at all of these things, and it's not the only DTP system out there that does so. I think TeX will probably be used for some time yet by mathematical people, since they seem to be a little conservative when it comes to changing software. But the rest of the world has already moved on.

    BTW, I don't hate TeX. I really appreciate the quality of its output. I just think the concept of having to edit/compile/debug is not very suitable for word processing.
  • TeXnically, TeX is more a page layout language than a markup language -- I would say it is closer to PostScript than to HTML in the grand scheme of things. LaTeX, on the other hand, is a markup language (with some formatting options). I think everyone would agree that markup in its purest form is more productive. A title is a title, after all, not 14pt bold text, centered horizontally, with 28pt of whitespace above and below.
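    A minimal LaTeX sketch of the difference (the physical version is hypothetical hand-formatting, shown only for contrast):

    ```latex
    % Markup: say what it is; the document class decides how a title looks.
    \section{Conclusions}

    % The same title as pure physical formatting -- everything the author
    % would otherwise have to repeat, by hand, at every section:
    \begin{center}
      \vspace{28pt}
      {\fontsize{14}{17}\selectfont\bfseries Conclusions}
      \vspace{28pt}
    \end{center}
    ```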
  • All WYSIWYG is, is a realtime preview. Realtime feedback in any application has been shown in user studies to increase productivity.

    It's the same as compilation vs. interpretation: the shorter the compile-edit-debug cycle, the better.

    Well, that's sort of true, but on the other hand, having to render the preview in realtime means that a WYSIWYG system can't do much computationally expensive processing. That's one reason output from [La]TeX looks so much better than output from Word. For a lot of basic things, like tightly setting blocks of text and hyphenation, even high-end DTP packages like PageMaker don't do as well as TeX.

    Also it's relatively easy to "repurpose" documents in any kind of markup, be it TeX, SGML, whatever, by changing the processing you do on them.


    --
    The scalloped tatters of the King in Yellow must cover
    Yhtill forever. (R. W. Chambers, the King in Yellow)
  • Well, I hate to tell you, but the experiences I am reporting here are from using LaTeX, not plain TeX. Yes, LaTeX is in many ways easier to use than TeX, but it can still be difficult to get something right -- especially if you're writing a technical text and have to get the formatting of formulae, tables, equations, and an index just right.

    To repeat, I greatly respect LaTeX and I think that Don Knuth is near unto God. May he live forever. But I cannot agree that WYSIWYG should be wholly discarded for markup languages.
  • Literate programming is a brilliant idea, but CWEB is not, IMHO, a winning implementation of it. When I experimented with a similar system for writing a literate C program, I found it difficult to keep two separate orderings accurate. One ordering is the sequence of the text. The other is the order of the parts of a C file: #includes first, then some function prototypes, then the actual functions. It was tedious to get the generated code to have the pieces in the right order to avoid compiler warnings, and it required uninteresting sections containing only references to subsections. IMHO, you'd need a smarter tangler (the program that goes through your input and generates the compilable code) that could recognize #includes and put them up top, automatically generate prototypes from your function definitions, and handle such bookkeeping for you.

    Literate programming is also best suited for explaining code that requires complex algorithmic or logical reasoning; I can't imagine writing a literate CGI script, for example, because most of the text would be uninteresting, "here we check if the 'run' form variable is set...". LP would be great for compilers and optimizers, regex engines, numerical software, and other programs where the logic of the code is complex and involves external information such as proofs.
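    The "smarter tangler" idea can be sketched in a few lines; this is a hypothetical illustration (the `tangle` function and its very naive prototype-matching regex are invented here, not part of CWEB):

    ```python
    import re

    def tangle(chunks):
        """Naive 'smarter tangler' sketch: assemble code sections written in
        any order, hoist #include lines to the top, and emit a prototype for
        each function definition so section order stops mattering."""
        source = "\n".join(chunks)
        includes, body = [], []
        for line in source.splitlines():
            (includes if line.startswith("#include") else body).append(line)
        # Extremely naive prototype extraction: "type name(args) {" at line start.
        protos = [m.group(1) + ";"
                  for m in re.finditer(r"^(\w[\w \*]*\([^)]*\))\s*\{", source, re.M)]
        return "\n".join(includes + protos + body)
    ```

    Feeding it sections in "narrative" order would still yield a compilable ordering: includes first, then prototypes, then the definitions.
    
    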

  • All WYSIWYG is, is a realtime preview. Realtime feedback in any application has been shown in user studies to increase productivity.

    Fair enough, as long as the user knows that it's just a preview of how the document may finally appear. Unfortunately, most users view it as being the physical document itself. For example, if a paragraph describing a figure doesn't appear on the same page as the figure, the WYSIWYG user will fiddle with the document until the paragraph and the figure happen to appear on the same page. That works fine until next week, when you add another paragraph, which may mess up the formatting. Someone working with the document at the markup level will explicitly tell the computer, "I want these things to be on the same page." One user is working with the logical structure, and the other is trying to manipulate the physical appearance. For most text (especially "living" text that gets edited later), the logical structure is what counts.

    Another place where I've seen WYSIWYG make people do more work is in GUI application development. A "Visual ____" programmer lays out a form on the screen. To make it look good, he lines up a bunch of fields so that they are in a column. But instead of explicitly telling the computer that the fields are supposed to be in a column (like you would with a GUI toolkit like, for example, ClassAct or MUI (sorry, I'm most familiar with Amiga stuff)), he's just putting them all at horizontal pixel #241. Later, when he changes something on the form (even changing a font is enough to mess things up!), all of the WYSIWYG layout has to be redone by the programmer, instead of the computer. Why? Because the visual programmer isn't treating it as just a preview. To him, the preview is the end product.

    Are you going to sit here and tell us that PBM + scripts is better than GIMP?

    No, of course not. With an image processing program, assuming your display is good enough to accurately show the image, the WYSIWYG display is the final product itself. It's just pixels.


    ---
    Have a Sloppy day!
  • Knuth's contributions to computer science are awesome. If all he had done was TeX, he would still be among the greatest. Ditto even if he hadn't written TeX! Admittedly, Knuth probably knows more about combinatorics or "concrete mathematics" than most mathematicians, and more about algorithms than any computer scientist.

    I couldn't see a post mentioning another of Knuth's lasting contributions: parsing. The Yaccs and Bisons owe much to Knuth.
  • FPGAs have been around for a while. It doesn't make any sense to rely on the limited resources and clock speed of an FPGA to implement a CPU to compete with state-of-the-art processors on the market. Sure, it is fun to implement drop-in replacements for 6502 or Z80 in a relatively small Xilinx FPGA; but try coming up with something to compete with Alpha.

    The Freedom CPU project was discussed a while ago in EDN magazine, and almost all professionals agreed it was completely unrealistic. If the Linux hacker community is going to start hardware design projects, then instead of overly ambitious projects like Freedom CPU, it should concentrate on simpler, more realistic, and more useful ones. An open-source design for a PDA would be one example: think of something like the Itsy running Linux, based on a commodity processor, with GPL'd design files freely available on the Net. As with Linux distributors, I am sure there would be a lot of enterprising people who would produce low-cost kits and finished products based on the design. MIT had a small embedded computer board design based on the Motorola 6811 that they used in their robotics classes, and the design was (perhaps it still is) freely available over the Net. It was immensely popular among hobbyists a couple of years ago. I have been thinking about posting this to Slashdot for discussion for a while now.

    This would have been great, but Freedom CPU, IMHO, is totally unrealistic. Of course that's my opinion, and time will tell.
  • Again, from the article:

    Before he can rest in the promised land, Knuth faces one last mountain. He must redesign the generalized computer used in his book for programming examples and exercises from a 50-year-old von Neumann-style machine with inefficient commands to a more modern RISC (reduced instruction set computer) system permitting faster operation. (Intel processors in most PCs are of the older variety; PowerPC chips in recent Macintosh models are RISC.)

    "I'm trying to design it so it's 10 years ahead of its time," says Knuth. "I've studied all the machines we have now and tried to take their nicest features and put them all together." This super RISC machine, which he calls MMIX, is essentially a teaching concept. But he says he "would love to see it built.

    I'm spending a lot of time documenting it so someone could build it. The design will be in the public domain." In the midst of his "Computer Musings" series of introductory talks on MMIX, Knuth is mere months away from completing this phase of his work.
  • I think JavaDoc was pretty clearly inspired by Knuth's _Literate Programming_. Although it isn't quite as thorough as WEB, I think Knuth probably deserves the credit for the concept of embedding documentation (as opposed to comments) within source code, and compiling documentation as a combination of code analysis and editorial comments.

    ---
  • "Come again? The MMIX machine had a rather completely specified instruction set, and 64-bit registers wasn't in it. "

    From the MMIX link in the original post:

    "MMIX is a machine that operates primarily on 64-bit words"

    As I understood from the interview, Knuth is totally redoing the MIX architecture to produce a new one called MMIX. Sounds good to me.
  • > If it comes to interoperability it sucks.
    > Basically the only useful output is postscript.
    > It's not really suitable for online publishing
    > (i.e. HTML) and I have not seem any usable
    > input/output filters for wordprocessors.

    There is a LaTeX-to-HTML converter, which works reasonably well, and there is also a version of TeX that generates output in PDF format... which seems to be the standard for web publishing.


    > I just think the concept of having to
    > edit/compile/debug is not very suitable for
    > wordprocessing.

    It all depends on the kind of word processing. It is ideal for scientific word processing.

    The edit/compile cycle allows one to generate TeX documents with other programs. For instance, you can have the results of simulations automatically transformed into tables and graphs. Make some changes to the simulation model and everything can be automatically generated again; it just takes one compilation run. Doing tables and graphs by hand is a drag...
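    That pipeline is easy to sketch. This is a hypothetical example -- the data and the `to_latex_table` helper are invented for illustration, not any standard tool:

    ```python
    def to_latex_table(rows):
        """Render (name, value) pairs as a LaTeX tabular, so re-running a
        simulation regenerates the document's tables automatically."""
        lines = [r"\begin{tabular}{lr}", r"Run & Score \\", r"\hline"]
        for name, score in rows:
            lines.append(rf"{name} & {score:.2f} \\")
        lines.append(r"\end{tabular}")
        return "\n".join(lines)

    # Hypothetical simulation output; in practice this would be read from
    # the simulator's result files.
    results = [("run-1", 0.923), ("run-2", 0.871)]
    table = to_latex_table(results)
    ```

    Writing `table` to a file and pulling it in with `\input` keeps the document and the data in sync with one compilation run.
    
    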
  • by Slothrup ( 73029 ) <curt AT hagenlocher DOT org> on Friday September 10, 1999 @12:19AM (#1691438)
    I know this is completely off-topic, but I have a $2.56 check from Knuth and it's one of my only prized possessions. His attempts to bring varying types of aesthetics to programming are sorely needed. Read _Literate_Programming_.
  • by Suydam ( 881 )
    Knuth refers to TeX as "open-system software". Is this a new term for Open Source? :)

  • Combining Donald Knuth's MMIX [stanford.edu], the Freedom CPU Project [tux.org] and Linux [linux.com] might transfer us into true Cyberspace.
  • by Anonymous Coward
    "Premature optimization is the root of all evil." - Knuth
  • "It took incredible discipline, and several months, for me to read it. I studied 20 pages,
    put it away for a week, and came back for another 20 pages."

    Let's see... Vol. 1: 700-odd pages. At 20 pages a week, that's just under 3 years -- not really characterisable as "several months"...

    K.
    -
  • Let's not go into that Freedom CPU thing again. It is a highly dubious project, I remember it being discussed here at some time. If you are waiting for Freedom CPU, good luck.
  • I doubt that Knuth will ever consider ACP to be finished, but I hope he completes the first edition of each remaining volume.


    Chris Wareham
  • by Sloppy ( 14984 ) on Friday September 10, 1999 @12:46AM (#1691446) Homepage Journal

    On the downside, TeX is limited in its appeal because it's not WYSIWYG, Fuchs admits, employing the acronym for "what you see is what you get," the standard term for text processing software that displays formatting on screen as it will appear on the printed page.

    This seems as good a time as any to get this off my chest: not only do I consider WYSIWYG an unimportant feature, I think it has caused a lot of harm.

    Maybe it's just because I grew up using text formatters like "runoff", but one of the things I liked about markup-based formatting was that I always knew the document would come out right, even after I made changes. When I first encountered a word processor (WordPerfect) in the late 80s -- even though it wasn't really WYSIWYG, it tried to be -- it infuriated me that whenever I added a paragraph to a document, I had to scroll down and "fix" all the formatting problems it might cause further down the document. Eventually I discovered that WordPerfect's "reveal codes" mode made it about as good as a traditional text formatter.

    And things haven't changed since then. Now the people in the office are using MS Word, and I swear: they spend just as much time manually formatting documents as they spend typing text. It's ridiculous! WYSIWYG makes the user do formatting work that should be done by the computer.

    And then there are the web "masters" (*cough*) who use WYSIWYG HTML-authoring tools. This trains them to think that WYSIWYG is even possible on the web, and they make web pages that look all screwed up if you don't have your window the exact same size as theirs, for example.

    WYSIWYG sucks! Markup forever! Long live runoff, nroff, troff, TeX, HTML, etc!


    ---
    Have a Sloppy day!
  • Seems to be more like a primer on that typesetting system ;-) I haven't written that many documents with it, but I'm very grateful to everyone who contributed, as it makes formulas so easy. Figures get assigned numbers in the right order! No, this is not going to be flamebait about how bad Word is -- just a "thank you" to Donald Knuth for his great works, including books such as Concrete Mathematics...
  • Riiiight. Are you maybe taking the Gates quote just a leeetle bit literally there?

    And, if you are going to be literal about things, it is perfectly accurate to characterise 3 years as 'several months'...

    The fact that Gates has read Knuth at all should come as a pleasant surprise - it is by all accounts more than most /. readers have done, IMO.
  • by Mark Gordon ( 14545 ) on Friday September 10, 1999 @12:47AM (#1691449) Homepage
    Mathematics belongs to God, and Knuth is our prophet.
  • and so is the series. Even George Lucas [slashdot.org] would have to admit that Knuth understands the big picture and how to fill it in. I cannot imagine what the world would be like if he had not written this material. Does it mean I'm an old fart if I read and enjoy owning Knuth's books? Then you young buttheads should check it out too!
  • by jabber ( 13196 ) on Friday September 10, 1999 @12:51AM (#1691451) Homepage
    From the interview:


    Knuth says he realized then that TeX wasn't just a digression, it was itself part of the vision. "I saw that this fulfilled a need in the world and so I better do it right."


    This is the crux.
  • Ha! These youngsters refer to "open-source software"! Is this a new term for "open-system software"?

    Knuth was around long before open-source...I think I'll trust him ;)
  • I've found that markup-based languages like TeX are good for collaborative projects here at work. Since the source is plain ASCII text, it's easy for our source-control system (CVS) to correctly merge versions of a document modified by different authors. We don't have a tool to merge documents stored in binary formats like Word, WordPerfect, etc.

    - Tim
  • Not necessarily dubious, I think. With commodity FPGAs on the horizon, I can pretty well imagine Linux hackers sometimes diving into the CPU's architecture to make a few things fit better here and there... I agree that CPU design needs huge resources -- just like OS design ;)
  • I must be way too young... Bill Gates had a column?!
  • Backward compatibility accounts for a share too, IMHO.
  • the guy was wrong about the poet too: Piet Hein is definitely not Danish but Dutch

    No, he was Danish. See this [netpower.no] quick biography, for example.

    Incidentally, my sig is also paraphrased from and a tribute to Knuth...


  • I too was lucky enough to win a copy of any book I wanted in a semester-long programming contest (1 hard problem per month). Unfortunately, I only got to choose one book, so I kept winning it in succeeding semesters until I received all three! It saved me money that I surely would have spent on these fine volumes.

    On another note, one of Knuth's 50-point problems was solved within the last few years. Check out this site [upenn.edu] to find out more about the problem, and its elegant solution. (The solvers' book, A=B, is available in PDF format until April 2000, I believe.) Knuth was so impressed by their work that he wrote the foreword for their book. Definitely cool stuff.

    -jason

    "If you're not part of the solution, you're part of the precipitate."

  • There is a Latex to HTML converter, which works resonable, and there is also a tex version that generates output in pdf format.... which seems the standard for web publishing.

    - latex2html is not exactly what I mean when I say web publishing. I must admit it has been some time since I worked with it, but the last time I saw it, it sucked.
    - PDF is nice for distributing printable documents (as an alternative to PostScript).

    The edit/compile cycle alows one to generate tex documents with other programs.

    Sure you can. You could also generate HTML or XML or any other fileformat.

    I'm not saying TeX isn't useful. It's very useful, and it has always gotten the job done for me. But let's face it, web publishing from TeX is not something you want. It's nice to have these tools if you have a pile of TeX files, but otherwise stay away from it.

    What I am saying is that in most cases there are better alternatives -- unless, of course, you happen to like hacking away at your text.

    All depends on the kind of workprocessing. It is ideal for scientific word processing.

    There's certainly something to be said for that. BibTeX, for instance, is a nice thing, and if you produce a lot of math, TeX is probably the best alternative. For the rest of the world, it isn't.


  • I think there may be a problem with this analysis, in that a WYSIWYG system has to have some way of interpreting "what you see" and turning it into the coding that you get.

    This amounts to scanning an image and converting it into a logical structure, which is, as they say, non-trivial.

    Look at what happens if you feed a typical HTML file into a WYSIWYG editor, edit it a little, and then try to look at it in a text editor again. Often the WYSIWYG editing generates so much additional, spurious coding that the HTML file has become unreadable to a human being.

    So you can't just call WYSIWYG a "realtime preview", because it's a one-way trip through that lens. You have to give up control over the logical structure and hope that the "what you see" will turn out to be "what you want".


  • "About once a month during the academic year, Knuth comes down from the heights to a basement lecture room in the Gates Computer Science Building at Stanford to deliver one of his "Computer Musings" lectures, usually about some aspect of his current work on The Art of Computer Programming. These talks draw computer science students, visiting professors, software engineers from nearby companies and an occasional CEO. On a balmy day earlier this year, the topic and listeners are different. "

    Has anyone been to one of these lecture series? Is it open to the general public?
  • Yeah, it was a long time ago (~5 years ago?). It didn't last very long because it tended to be more self-serving than promote anything productive.
  • But I don't trust the guy who wrote the article... "open-system software" sounds a bit strange to me.
  • And, if you are going to be literal about things, it is perfectly accurate to characterise 3 years as 'several months'...

    By that logic, it's no less accurate to
    characterise the age of the universe as several
    seconds. And I was only talking about volume 1.

    K.
    -
  • Can't say that I agree. I respect TeX a great deal, and I've used it for publication-quality technical papers. There is certainly no better tool for that task.

    But the experience of learning TeX and getting a paper done on deadline with it can be intensely frustrating. Writing a paper with TeX is like programming -- you spend a great deal of your time debugging, chasing down errors that at first are mystifying and enraging. Finding out what the "invalid sequence" is (usually many lines above the place where TeX identifies the error) can be enormously time-consuming and demoralizing.

    I like to program, and I like to write texts. But when I want to write texts, I just want to write a text. I don't want to be forced to write a program in order to get it done. The reason I have a computer is to let it solve most of those problems for me.
  • Come again? The MMIX machine had a rather completely specified instruction set, and 64-bit registers wasn't in it.

    I can see an emulator for MIX written in any of several portable languages (pick one that has a really good optimizer). This would allow the algorithms to be directly transcribed, etc. I think it already exists, but it would seem to have rather limited uses.

    What might be very interesting would be a series of optimizing translators that would translate from a particular CPU's machine code into MIX, and then back to another machine. The only question I would have about this is: why is MIX better than either the JVM or whatever Python calls its virtual machine (PVM?)? MIX was designed when hardware was a very expensive resource, but the economics have changed, and it no longer optimizes for the right things. (Although you MIGHT be able to design a MIX emulator that would run entirely from within the CPU cache. That could be an interesting, if rather machine-specific, option.)
  • I had helped someone solve a nasty problem, and since it was being used in a commercial venture, they wanted to pay me. Company policy didn't permit this, since I was a 17-year-old kid and not an official contractor or employee, so a gift of any book in print was offered.

    Without blinking, I asked for (and received) "The Art of Computer Programming" (all three volumes). It still sits on my shelf and I use it as reference material, particularly to point novice programmers where I work to some algorithm they need explained.

    Sometimes they complain, "But it's 25 years old!" My reply is usually something like, "Well, then you've got 25 years to catch up on... this is the best start I can think of."
  • still has, or until recently had, a column.


    According to a Feb. 3rd, 1999 article [play.com] at play.com, his 'New York Times' column is syndicated throughout the United States as well as internationally. It appears in over 130 papers, including the Chicago Tribune and the Houston Chronicle.

  • the guy was wrong about the poet too: Piet Hein is definitely not Danish but Dutch

    Anyway, this is a minor detail; it only annoys me because I'm Dutch.
  • I'm glad to see this interview posted on Slashdot. Knuth is certainly one of the great pioneers of the free software movement, among many other accomplishments. In the "Digital Typography" book, his description of why he didn't want Xerox PARC to own the copyright on the fonts he was creating ranks up there with any modern-day philosophizing on the benefits of free software.

    Unlike the people who get a lot of publicity for being free software or Open Source gurus, Knuth hasn't sought a lot of publicity for his contributions. But they are there for all of us to benefit from.

    Much of my work on the printing subsystem of Gnome consists of adapting the beautiful work that Knuth and his students did with TeX and Metafont. The more I study the original sources for this work, the more impressed I am by their depth and overall "rightness." I do believe that it is time to adapt the best parts of the work to a more modern environment with interactive editing and so on, but this in no way detracts from the magnitude of the original work. And, because it's free software, I can!

    So, thank you, Don Knuth, for being a pioneer both in computer science and in free software.

  • I personally use LyX for all my non-business word processing. It always does The Right Thing, lets you get a sense of the structure and organization of your document (which I've always had trouble doing with plain LaTeX or SGML), and discourages/prevents ad hoc formatting. It is essentially a visual tool for writing LaTeX (or DocBook, or literate programs): its paradigm is WYSIWYM -- what you see is what you mean. www.lyx.org [lyx.org].
  • Style sheets in Word and similar programs let you design your formatting in ways remarkably similar to the old TeX/Scribe/Runoff/etc programs. My favourite formatter was Scribe, which worked a lot like style sheets - you could define your styles in a file, and use that file for every document you created. This made it easy to define @section(xxx) and so on. So the actual document with markup was very easy to read.

    I'd like to still use a solution like that today, but TeX is too complex, and as far as I know Scribe is gone (although I think LaTeX is similar in some ways).

    To tell the truth, the main reason this is a dead issue for me is that I write all my documents in HTML, and I print less than 1% of what I write. If I were still producing long, complex documents, I'd definitely be looking for something less crash-prone than Word. TeX might even be the answer, be it ever so complex.

    But, in all seriousness, check out style sheets before discarding what you see might be what you get software.

    D

    ----
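    The Scribe approach of keeping styles in one shared file maps directly onto a LaTeX package. A minimal sketch -- the package name `housestyle` and its contents are hypothetical, and `titlesec` is one common package for restyling headings:

    ```latex
    % housestyle.sty -- shared formatting, loaded by every document.
    \ProvidesPackage{housestyle}
    \RequirePackage{titlesec}
    \titleformat{\section}{\Large\bfseries}{\thesection.}{0.5em}{}

    % Each document then just says:
    %   \usepackage{housestyle}
    %   \section{Results}
    % and the markup stays about as readable as @section(Results) was in Scribe.
    ```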
  • Basically, CSS takes away the guarantee of consistent rendering (one of TeX's more amazing features). Thus, if you fiddle with the style sheets to get the document looking exactly the way you want it, your columns all aligned, figures moved to really nice logical places, etc., then as soon as you bring the document to another machine, it totally reformats your document. You're back to a rendering which is ok (it does correspond to the style markup), but certainly not excellent.

    TeX, on the other hand, implements consistent rendering. It accomplishes this basically through mind-bogglingly careful craftsmanship. As anyone who's been through the TeX sources knows, it makes no use of floating point. Rather, all computation (including trigonometry and other complex formulae) is done with hand-coded fixed-point math. That way, Knuth guarantees that the document will be pixel-for-pixel identical on any TeX implementation.
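    To make the fixed-point discipline concrete: TeX stores dimensions as integer "scaled points" (1 pt = 65536 sp) and applies ratios with exact integer routines such as xn_over_d. A simplified Python sketch of that routine (the real one also returns the remainder and flags overflow as an arithmetic error):

    ```python
    UNITY = 65536  # 1 pt = 2**16 "scaled points", TeX's integer dimension unit

    def xn_over_d(x, n, d):
        """Compute x * n / d in pure integer arithmetic, splitting x at 2**15
        as TeX does so intermediate products fit in 32-bit integers."""
        positive = x >= 0
        if not positive:
            x = -x
        t = (x % 32768) * n
        u = (x // 32768) * n + t // 32768
        v = (u % d) * 32768 + t % 32768
        result = (u // d) * 32768 + v // d
        return result if positive else -result

    # Scaling 1 pt by the ratio 3/2 gives exactly 1.5 pt on every machine:
    half_again = xn_over_d(UNITY, 3, 2)  # 98304 sp == 1.5 pt
    ```

    Because every implementation performs the identical integer operations, the computed dimensions -- and hence the rendered pages -- agree everywhere.
    
    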

    Ironically, TeX is in this way much more WYSIWYG than, say, an HTML editor. What you see (in the xdvi preview or whatever) is exactly what you get when you transport the document to other machines. In an HTML editor, yes you get to see that your font kinda resembles Times, and you get to see the colors, but every browser is going to render the page slightly differently. In my opinion, calling this "WYSIWYG" is stretching the original meaning of the acronym somewhat.

    I think the holy grail is a good interactive editor, with the typographic sophistication of TeX, a nice stylesheet mechanism, and the consistent rendering property. Having the editing interactive means that you have much shorter cycles towards getting the document to look like what you wanted.

    I personally am working toward this goal with the Gnome-Text subsystem in Gnome. I have a good head start because I am using ideas, algorithms, and file formats from TeX. Consequently, there's code in Gnome CVS (under the libhnj module) that can read TeX hyphenation data, then typeset using TeX-like whole paragraph optimization. My current implementation is about 12 milliseconds per page on a 400MHz Celeron. This is fast enough for interactive display.

    So if you really care, you can help me with the coding. Of course, this is /., so I'm not expecting too much :)

  • (The amount is an inside joke: 256 equals 2 to the 8th power, the number of values a byte can represent.)
    This would be a nice analogy nowadays, but when Knuth started writing TAOCP, the equation `eight bits = one byte' was far from universally true. For example, a byte of the MIX computer must be able to hold at least 64 and at most 100 distinct values.
  • It's the sexiest math book ever written. Really! The two characters, Alice and Bill, get it on like crazed weasels every other chapter! But what can you expect of a man whose canonical photograph shows off his enormous organ?
  • I wonder if anyone will read this. I'm responding so late in the game... Here goes!

    I read about a study done years ago by AT&T. They gave one documentation group a WYSIWYG document-generation system while another went on using a markup system.

    The markup system users were way more productive and the result was judged better.

    The reasons? The WYSIWYG users were always touching up their formatting, a little change here, a little change there. They were wasting their time and producing inconsistent results.

    If anybody reads this and is interested in the reference, I could try and dig it up. I'm pretty sure I still have it.

  • I started off using roff and barb, so tag based editors are where I started, but...
    I rather like WYSIWYG, although it sure is nice to be able to edit the tags directly. There once existed a Mac version of WordPerfect that let one split the screen so that part of it showed the text in WYSIWYG form, and part showed the tags. One could edit in either half. That was a nice feature. I think they dropped it in later versions... too bad. The reason I stopped using it was crashes, not the split screen; the split-screen editing was one of its strong points.
  • You said "But when I want to write texts, I just want to write a text. I don't want to be forced to write a program in order to get it done."

    Well, yes. Plain TeX is/was like that. That's one of the reasons why LaTeX appeared. I really don't feel like I'm "programming" when I write a document using LaTeX. Well... maybe when I write the header stuff (the \usepackage[foo]{bar} and \author{baz} stuff), but certainly not when I write the main body of the document.

    LaTeX "encourages authors not to worry too much about the appearance of their documents, but to concentrate on getting the right content." (that last bit was taken from http://www.latex-project.org/intro.html [latex-project.org]).
  • by Anonymous Coward

    All WYSIWYG is, is a realtime preview. Realtime feedback in any application has been shown in user studies to increase productivity.

    It's the same as compilation vs. interpretation. The shorter the compile-edit-debug cycle, the better.

    TeX is like compilation; WYSIWYG is like interpretation: you just get to see the results faster. There is no reason that markup and WYSIWYG should be diametrically opposed. One is a data format; the other is an interface.

    Take editors like Word2000/Frontpage2000 (edits XML/HTML), or the word processors/page layout editors which use TeX as the underlying format.

    It is a thousand times better for a human being to select colors from a color wheel or fonts from a font dialog and see a realtime preview, rather than trying FONT COLOR=#123456, loading the page, and retrying.

    A GUI done right shouldn't lead to pages that are unreadable at different resolutions.


    Command line zealots (the anti-GUI type) typically don't understand that there is a 1-to-1 mapping between people clicking through menus searching for how to do something (which they deride) and command line users typing "foo --help" and trying different options. Or the idea that somehow you can script the command line but can't script GUI apps (patently false: properly written, menus are just triggers for program operations and can be invoked from scripts, a la GIMP).

    Are you going to sit here and tell us that PBM + scripts is better than GIMP?




  • How many Donald Knuths are there, really? This must be a conspiracy, because a single guy cannot have done all he did in one lifetime!


    You could maybe do it if you're using two spare lives and you don't sleep much, but in my opinion there must be several of them out there:
    • Donald Knuth
    • Donald Jr. Knuth
    • Dollie Knuth
    • Mickey and Pluto Knuth
    • and a few Gizmo Knuths (but please don't feed them after midnight)


    Kroll.
  • WYSIWYG == AYSIAYG (All You See Is All You Get)
  • I find it funny to see Knuth popping up here. Last week I got hold of The Art of Computer Programming Vol. 1 to look through the lists and trees stuff again. It gets better every time I look at it. I was wondering, however, whether anyone has a source of similar quality that takes more account of the effects of SMP machines, locks, caches, NUMA, etc.

Algebraic symbols are used when you do not know what you are talking about. -- Philippe Schnoebelen

Working...