Open Source Acid Test Revisited

Kragen Sitaker has written a brutal reply to the story Sengan posted yesterday on The Open Source Acid Test. It goes through the piece point by point, showing the factual errors and FUD in it. It pleases me greatly to post this feature. Check it out.
The following was written by Slashdot Reader Kragen Sitaker

I read Ted Lewis's article, _The Open Source Acid Test_, on your web pages.

I was appalled that an organ of a prestigious international society like the IEEE would publish such error-riddled, poorly-researched, deliberately deceptive nonsense. It's as if the _New England Journal of Medicine_ had published a case study of a zombie animated by voodoo!

The author did not cite sources for any of his dubious statistics, and they are therefore hard to disprove. Given the remarkable lack of factual accuracy in the article, I doubt that they have any basis in fact.

To begin with the most obvious errors:

  1. Linus Torvalds's name is not Linus Torvold.
  2. Applix, Tower Technology, and NewMonics do not sell open-source software.
  3. There is no such company as "Walnut Creek Stackware". www.cdrom.com belongs to Walnut Creek CDROM. There is no such company as "Tower Tech JVM". www.twr.com belongs to Tower Technology, which sells a (non-open-source) JVM. There is no such web site as www.debian.com.
  4. www.python.org is operated by the Python Software Association, not CNRI, although it is currently hosted on CNRI's network.
  5. Several of the "commercial enterprises" listed in Table 1 are not commercial enterprises at all. www.hungry.com, www.python.org, and www.debian.org are all operated by nonprofit organizations. The Corporation for National Research Initiatives, which was incorrectly listed as operating www.python.org, is actually a not-for-profit research organization.
  6. It is absurd to say that Unix was the foundation for Hewlett-Packard and IBM, as Lewis does in his introductory paragraph. Both companies had been established for more than thirty years when the first line of Unix was written.
  7. On page 126, Lewis claims that the open-source community admits that its organizational structure is weak. The evidence he adduces is a quote from a document published on www.opensource.org. What he doesn't tell you is that the document is *a leaked internal Microsoft memo*. Unless Lewis missed the 115 references to Microsoft in this document and also failed to read the introductory paragraphs, the only reasonable conclusion is that he is being deliberately deceptive.
  8. On page 125, Lewis claims that "Currently, Linux's installed base numbers 7.5 million". As usual, he cites no source. However, the most widely cited source for such figures is Robert Young's paper, Sizing the Linux Market, which uses eight different data sources to obtain an estimate of between five and ten million Linux users. That paper, however, is dated March 1998. If Linux's installed base continued to double yearly in 1998, as it did from roughly 1993 to 1998, the number of Linux users would now be between ten and twenty million.
  9. On page 128, Lewis says, "Windows NT market share smothers all Unix dialects combined". According to International Data Corporation's Server Operating Environment report, Unix and Linux together had 34.6% of the server market in 1998, while Windows NT had 36%. See more information. The actual number of server Linux shipments IDC tallied in 1998 was only three-quarters of a million; that suggests that if you include people installing multiple servers from the same CD and installing from Internet downloads, you would find that Linux's server market share is much greater than Windows NT's.
  10. Lewis remarks, "With few exceptions, open source software has never crossed the chasm into the mainstream without first becoming a commercial product sold by a commercial enterprise." Does he think that Linux is not a commercial product sold by commercial enterprises? If not, there are literally dozens of "exceptions" to this statement -- Perl, Apache, sendmail, BIND, Linux, Tcl/Tk, Berkeley DB, Samba, the X Window system, FORTH, GNU Emacs, and trn, for example. Many of these became popular before they were commercially sold at all.
  11. Lewis misstates the business case for Linux and "its open source software cousins". According to Eric Raymond -- whom Lewis quotes extensively elsewhere in this article -- a much more compelling business case is founded on the better quality of the software, choice of suppliers, choice of support and maintenance, freedom from legal exposure and license tracking. More details are available at opensource.org/for-buyers.html.

These minor factual errors, so far, merely indicate that the author knows very little about the topic he writes about and is deliberately trying to mislead his readers; they do not directly undermine his conclusions. However, as I shall show, each of his supporting arguments consists of incorrect facts and leads to faulty conclusions.

One of the author's major contentions is that as Open Source software adds more features and becomes more comparable to proprietary software, it will lose many of its advantages. He cites as examples Linux's supposed lack of video card support, wireless LAN support, and "a good selection of productivity software"; he claims that Unix contains 10 million lines of code, while Linux contains only 1.5 million. On page 126, he says, "Maintenance and support grow more complex, and costs increase due to a scarcity of talented programmers. Success leads to features, and feature creep leads to bloated software."

With regard to video card support, it is true that the Linux kernel does not have video card support in it. That facility is provided by video drivers in other software; nearly all graphical software available for Linux uses X11 for access to those video drivers. Open-source X11 drivers for most video cards are available from www.xfree86.org; the list of supported cards there currently includes 555 different kinds of video cards, many of which cover numerous individual models.

For those few cards for which XFree86 support is not available, proprietary X11 drivers are available from Xi Graphics and Metro-Link.

With XFree86, Linux's video card support is better than that of either Windows 98 or Windows NT, and considerably more extensive than that of any Unix that does not use XFree86.

To claim that Linux lacks video card support is merely laughable.

With regard to wireless LAN support, it is true that many of the recent wireless LAN products do not currently have support in Linux. However, Linux has had support for packet-radio wireless networking and several kinds of LANs for years, and has supported several wireless LAN products since at least late 1997, including most of the popular ones:

Lucent Wavelan
DEC RoamAbout DS
Lucent Wavelan IEEE
Netwave Airsurfer
Xircom Netwave
Proxim RangeLan2
Proxim Symphony
DEC RoamAbout FH
Aironet ARLAN
Raytheon Raylink
BreezeCom BreezeNet

This information is readily available on the Web in the Linux Wireless LAN Howto.

With regard to productivity software, there are several office suites available for Linux, and there have been for several years. ApplixWare and StarOffice are the two most common.

With regard to the size of Linux: first, among the utilities tested in the failure-rate study (the latest report on which is entitled "Fuzz Revisited: A Re-examination of the Reliability of Unix Utilities and Services"; the quote used on page 125 appears to be from the original paper, which I cannot find on the Web) are the standard set of Unix utilities: awk, grep, wc, and so forth. These utilities have a standard set of functionality common across all Unix systems, except that the GNU utilities tend to have a great deal of extra functionality included. If the GNU utilities really are only one-sixth the size of the corresponding utilities on a Unix system, yet provide much more functionality, and still have one-third to one-sixth of the failure rate, that is not an indictment of the defect rate of free software, but rather a vindication of it -- which is why this study is linked to from the Free Software Foundation's Web pages. The study is unfairly biased in favor of less-featureful proprietary software, and that software still came out way behind.
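
For readers unfamiliar with the methodology: the fuzz tests simply feed each utility a long stream of random bytes and record which ones crash or hang. Below is a minimal sketch, in Python, of that kind of test; the utility names, input size, and timeout are illustrative assumptions, not the study's actual configuration.

    import os
    import subprocess

    # Illustrative targets only: simple filters that read standard input
    # with no arguments.  The actual study tested roughly 80 utilities,
    # each with an appropriate command line.
    UTILITIES = ['cat', 'sort', 'wc']

    def fuzz_once(utility, nbytes=100000, timeout=10):
        junk = os.urandom(nbytes)                 # a random input stream
        try:
            result = subprocess.run([utility], input=junk,
                                    capture_output=True, timeout=timeout)
        except subprocess.TimeoutExpired:
            return 'hang'
        # On Unix, a negative return code means the process was killed by
        # a signal -- that is, it crashed (segfault, core dump, and so on).
        return 'crash' if result.returncode < 0 else 'ok'

    for util in UTILITIES:
        print(util, fuzz_once(util))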

(From my own experience, I know that frequently, the best workaround for a bug in a Unix utility is to install the GNU version.)

Lewis's claim that this represents "a single-point estimate of defect rate" is incorrect. The paper includes detailed results of the tests on 82 different utilities, along with aggregate statistics by operating system. 63 of these utilities were available either from GNU or from Linux, and were tested in this study.

With regard to the lines-of-code figure: it is not easy to measure the number of lines of code that constitute "Linux", because it is not easy to define what constitutes "Linux" -- or, for that matter, "Unix" either.

If we mean just the kernel, this site has some figures for the sizes of several OS kernels in 1994. SunOS 5.2's kernel is listed as containing 680,000 lines of code, while SunOS 5.0's kernel is listed as containing 560,000 lines of code. If the rate of increase per version remained constant (doubtful, because 5.0 and 5.1 weren't really finished products) then the latest SunOS (the one that's the kernel of just-released Solaris 7) would contain 1,280,000 lines of code.

By comparison, the source code of the 2.2.1 Linux kernel totals 1,676,155 lines of code, including comments and blank lines, counting only .c, .h, and .S (assembly) files.
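
That figure is easy to check for yourself; here is a minimal sketch of the count (the source path is only an example -- point it at wherever your kernel tree lives):

    import os

    # Count lines in .c, .h, and .S files under a kernel source tree,
    # including comments and blank lines, as in the figure above.
    EXTENSIONS = ('.c', '.h', '.S')

    def count_lines(tree):
        total = 0
        for root, _dirs, files in os.walk(tree):
            for name in files:
                if name.endswith(EXTENSIONS):
                    with open(os.path.join(root, name), 'rb') as f:
                        total += sum(1 for _ in f)
        return total

    print(count_lines('/usr/src/linux'))   # example path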

The Linux project's source code has already reached a level where we would "expect Linux defect densities to get worse". They haven't.

On page 125, Lewis cites Apache as an example of support diminishing when "the hype wears off", saying "it is currently supported by fewer than 20 core members" -- implying that the "cast of thousands" is a thing of the past. The truth is that the core Apache team has never been larger than 20 people, and they *still* receive contributions from many people outside the group. He also says that "Apache is losing the performance battle against Microsoft's IIS." But Apache has never been intended to be the fastest HTTP server around -- it's already more than fast enough to saturate a T1 when running on a puny machine, so its developers have been concentrating on things like adding more features and making it more reliable.
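
To put "fast enough to saturate a T1" in perspective, here is a back-of-the-envelope calculation; the average response size is my assumption, not a measured figure.

    # A T1 line carries 1.544 Mbit/s.  Assuming an average response of
    # roughly 10 KB, the request rate needed to fill the line is tiny.
    T1_BITS_PER_SEC = 1544000
    AVG_RESPONSE_BYTES = 10000             # assumed average response size

    requests_per_sec = T1_BITS_PER_SEC / (AVG_RESPONSE_BYTES * 8)
    print(round(requests_per_sec, 1), 'requests/second')    # about 19
    print(round(requests_per_sec * 86400), 'requests/day')  # about 1.7 million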

On page 128, Lewis says, "The concept of free software is a frequently practiced strategy of the weak". While free-as-in-price giveaways are common -- Microsoft's Internet Explorer strategy is a perfect example -- they are not related to open-source software, and their patterns of success and failure have little relevance for us here.

Comments Filter:
  • I don't think that quoting this reply is too inappropriate...
    Date: Fri, 12 Feb 1999 10:25:45 -0800
    To: David Woodhouse <David.Woodhouse@mvhi.com>,
    ombudsman@computer.org, software@computer.org
    From: Ted Lewis <tedglewis@friction-free-economy.com>
    Subject: Re: Complaint re: 'Computer' magazine.
    Cc: aburgess@computer.org

    David,

    The April issue of Computer will reply to many of the issues raised by the open source community.
    Grrr. When I hit 'preview', /. removed all the bloody &gt; escapes, and replaced them with the real characters, which then didn't bloody work next time.
  • >Both companies had been established for more than thirty years...

    That's somewhat of an understatement.

    IBM was established in 1888, and incorporated in its present form in 1911. Hewlett-Packard was established in 1938.

    In other words, IBM was founded long before the authors of Unix -- and probably even their parents -- were born.
  • The usual `cypherpunks' user and password is valid.

    -Brett.
  • The 212% growth for Linux cited in various places refers, I believe, to unit sales, not unit installations. Actual unit growth appears to be about 100% (7.5 million to 15 million), according to the Red Hat survey numbers. See this article [news.com] for more details.

    This is the problem when looking at growth rates: you have to keep track of what it is that's growing (and what the percentage basis is).

  • by drwiii ( 434 )
    I wouldn't mind seeing more of this guy's stuff..
  • Nice writeup, I enjoyed reading it.
  • I must say, after the very low quality of most slashdot "editorials" these days, this one was very refreshing. Very high quality.


    The only thing I should mention is that the New England Journal of Medicine is a purveyor of all too much junk science. Check out one of the best sites on the web http://www.junkscience.com [junkscience.com] to find out more.

  • This is easily the best editorial published by Slashdot in recent memory... Rob, get this guy to be a regular contributor... Congratulations Kragen for your logical and well thought out rebuttal.
  • Literally! That is exactly what I said yesterday, though you stated it a bit more eloquently. :)

    Bottom line: If you don't evaluate and investigate your subjects both objectively and thoroughly, you don't have anything worth saying.

    The author of the original article should take that bottom line to heart.
  • Posted by neuralfraud:

    That was an excellent summary of that article, I believe. Only an absolute idiot would write something like that, or else he is (as you suggest) deliberately trying to deceive the OSS community,
    in which case he has not done a good job of it!


  • I really enjoyed reading this. I'm glad there are some people who can retaliate without having to flame.

    Nice piece of work!


    - Jesper Juhl -
  • but I'm standing on my chair applauding.

    I'm getting funny looks from my coworkers.
  • I think it was RMS who suggested (or rather
    demanded) that what the press commonly refers
    to as "Linux" is really "a common collection
    of GNU utilities and the Linux kernel" thus
    "GNU/Linux", so comparing the Linux kernel
    to the entire UNIX OS is apples and oranges
    and also discredits all the thousands who have
    ported GNU utilities to Linux while the press
    assumes that Linus commands the kernel and
    the common utilities. In fact we should really
    annoy the press and insist that what they think is
    Linux is really "XFree + GNU + Linux"

    Also I am highly skeptical of any claim that
    IIS outperforms Apache. Can I see some benchmarks
    to that effect anyone? Perhaps on a Windows
    box Apache is not so great compared to IIS but
    I am confident that Apache + UNIX kicks the crap
    out of IIS + NT.
  • An excellent rebuttal, but you're only preaching to the converted if you don't find a way to get this read. Submit this to Computer suggesting that Ted Lewis needs to defend his previous statements, else Computer magazine itself might be held liable (at least in its readership's eyes) for the inaccuracies, deceptions, and borderline slander in an article they published.
  • This is what I love about /. If BS exists, Rob, Hemos, Sengan, etc. find it, and /.ers tear it to shreds.

    God, think what we could do if we were experts on the law, or public policy. 'Course, if we were, we wouldn't be cool enough to visit /.

    Oh well, keeping MS in check is enough for now.

    Don Negro
  • I ignored Mr. Lewis' article at first; I figured it was just more FUD for the heap. Reading Kragen's reply caused me to read the "article" in question.

    Mr. Lewis, it seems, is not so much interested in researching facts as he is in promoting his book. His inattention to detail is obvious. Misdirection, assumption, and outright misrepresentation are no way to present an analysis of anything. Mr. Lewis has just destroyed whatever credibility he may have had.
  • I was really happy when I saw the original article. It was full of crap and deliberately misleading ("MS is written by hard-working 60hr a week professionals and Linux is written by part time hackers....etc etc") but I was happy to see it.

    Why was I happy - simple: I'm looking to buy one year Put Options on MS and as people start to believe Linux really is a threat, this diminishes the amount of money I can make. The more FUD I see, the happier I am. It won't help them in the long run.

    MS is powerful, smart, unscrupulous and rich, but they are just one company. How can they compete against the rest of the world? Why do people use MS at all - because most people care more about the applications than the OS. Yes, there are far more applications for MS at the moment, but that will change. Running a small ISV, which do I prefer: a sensible, reliable OS with free development tools where if I succeed I can keep my market - or a crappy OS where success will just result in being squished like a bug when MS incorporates a poor clone of my product into the OS or into Office?

  • This is what we need to do when we see FUD, folks: refute it as completely, as accurately, and (most importantly) as unemotionally as possible. Going off the deep end every time an author makes a boo-boo will make us look like a bunch of raving loonies. On the other hand, I think articles like this one present the case for Open Source in the best possible way -- by presenting the *FACTS*.

    --Troy
  • click here [friction-f...conomy.com].

    Looks like Teddy got his Ph. D. back in the days when all you had to do was spell the word COMPUTER correctly. :)
  • The Linux kernel has video support for framebuffer devices, and Apache is miles faster than IIS. From what I understand, IBM have even joined in to make it even faster -and- more feature-complete. (The two are not contradictions, if done properly.)
  • Sorry if this posts twice, I waited about 5 minutes for it to load in the static pages, but didn't see it, so I thought I would try again

    For lack of a better word, I think "modular" covers it. As "Linux*" grows bigger, its modularity also grows, which in fact improves support.

    Commercial OSes grow bigger and rely on "in house" support. The more features they add to the code base, the more support they have to provide.

    Whereas development "teams" in the Linux community actually provide specific niche code, and support for that code, completely independently of each other... well, at least more independently than commercial shops do.

    XFree86 and Linux are only one of thousands of examples. Both could stand independently: Linux with commercial X support, XFree86 on other OSes. Both have their own development team, their own specific tasks, and their own support base and style. The same goes for almost every Linux application down to the basics... vim, bash, tcsh, etc. -- these aren't written and supported by kernel coders. So the support base branches out as rapidly as the code base.

    Whereas shops like Microsoft use in-house "support centers" and in-house "development centers." This leads to the "support center" having to deal with Windows, Word, WordPad, DOS, on and on and on, each with more features every release. But all just the same support center.

    So the case that "more Linux* features" creates bloat and failing maintenance and support doesn't hold, because Linux doesn't follow the commercial model it would fail in.

    This is how GPL/GNU evolved: little separate teams, on little separate projects, functioning independently, but knowing what their place in the "big picture" was, because they are carving that place out for themselves. Thus the "modularity" of Linux*, and the reason to keep that modularity, and the reason support will scale up efficiently with the features in GNU/GPL.

    Linux* -- the * denotes Linux in a generic sense, because to me Linux is the kernel, but what we're really talking about here is the whole ball of wax, from the kernel up through the highest-level applications.

  • That comeback was very good and to the point. It's amazing how many stupid people there are in the world... If people would just do research before they release something, this kind of article would never be published in the first place.

    It's just like the PC vs Mac thing... PC users are always dissing the Mac users, but Mac users never really yell back at the PC users..

    O well.. my $cents = 2;
    ChiefArcher
  • This is not in any way an attempt to disagree about the accuracy of the excellent rebuttal article. Anyway, according to this Sun white paper [sun.com], Solaris 7 has 400,000 lines of code in the kernel, and 12 million for 'Solaris'.

    Look for "lines of code" in the text - it'll be the 2nd match. The white paper also suggests NT might have some trouble with its 40 million lines of code, most of which is very recent.

  • I wrote up a several-page treatise on the flaws and errors in his paper, but it appears Kragen beat me to the punch.

    (Good job, BTW: you caught some I missed.)

    For the sake of completeness, here are some he missed:

    On page one, Linux has already failed the acid test of crossing from early adoption to mainstream acceptance, but on page two, Linux is still considered to be in the "free-growth" stage. Which is it?

    He compares the entire Windows NT development group and the code they manage to the Linux kernel team only, attributing to it the task of managing all of Linux. By extrapolating his numbers to all the projects represented by a typical Linux distro, you get as many as 160,000 developers represented.

    Did you know Linux was losing market share? "Linux is still Unix, and Unix is losing market share, therefore Linux is losing market share, and Microsoft would never support a shrinking market." That gem is on page 4. Someone needs to go tell IDC.

    If anyone's interested, I can post the paper somewhere.
  • Did someone make sure to send a link to the author of the original inflammatory article?
    --
  • (sorry...quoting Monty Python)

    I have been debating whether it was worth renewing my membership with the IEEE. If they keep allowing poorly researched, glaringly inaccurate articles such as that, then I will not renew...ever. I used to have a lot of respect for their opinions and facts, but that article sucked aol startup disks. I want to see that chap in an IRC room where we can have a little discussion about his work and to see if he will stand behind his paper.

    Personally I dinnae think he has the balls.
  • I find it interesting that in blasting this article (which I agree is full of poor arguments and _very_ poor statistics), /.'ers have chosen to totally ignore Netware. In another context that might be considered FUD, don't you think?

    Here is my totally unscientific guess on server market share:

    Boxes installed
    1 - Netware
    2 - *nix (including Linux)
    2a - Sun
    2b - Linux
    3 - NT
    4 - AS/400
    5 - IBM mainframe

    Bits served - internal networks
    1 - IBM mainframe
    2 - Netware
    3 - AS/400
    4 - *nix (Sun probably first)
    3 vs. 4 could certainly be argued
    5 - NT

    Bits served - external networks (Internet)
    1 - *nix
    1a - Sun (by far)
    1b - other *nix including Linux
    2 - unknown? Could be NT, IBM mainframe, or other.

    Server market share
    This one is tricky. By boxes, by bits served, by $$ value of systems sold? If I take out 5 Netware servers and replace them with 20 NT boxes, what does that do to the market share of each? If I replace the 20 NT boxes with 2 Linux machines, again what does that do to market share?

    By installed base or growth rate? I would guess that Novell has by far the largest installed base, but the lowest current growth rate. Linux - high growth rate, but that doesn't mean much if your initial installed base is small.

    Finally, 1997, 1998, or 1999 base/growth rate? Novell lost market share and installed base in 1997, bleeding like a stuck pig. I don't think that process continued at the same rate in 1998, or will continue in 1999 based on discussions with peers.

    No answers here, just some observations and questions. But don't forget other platforms, most notably Netware and AS/400.

    sPh
  • Given that the media/drones/masses are so remarkably inept at establishing the difference between a kernel and an OS, how about if we start calling the Linux/GNU/X combination..

    TuX

    ?
  • Since neither I nor anyone else really reads the past postings/replies, did someone make a copy of the original article that this rebuts?

    If so, location?

    Thanks...

  • When I read something like that original article the first thing that comes to my mind is "Hmm, I wonder how much actual, personal experience this person has with using open-source software?"


    It is often the case that a question so obvious as that isn't even posed - people just accept a person's statements and credentials without examination. As an example, I've heard people
    extolling the superiority of NT's scheduling
    vs. Linux's, but these same people nearly inevitably respond with a sheepish "no" when
    I ask: "Ok, since you're so obviously an expert in computer science - could you sit down right now and code even a bubble sort in any programming language?".


    Too many people are just parrots - repeating some spiffy-sounding fact-byte they've been fed without having anywhere near the knowledge to objectively analyze the statement for themselves.


    Anyone can sound technically competent in the broad sense - I've interviewed probably 100+ developers in the last 4 years and it is
    astounding how wide the gap sometimes is between the level of experience/knowledge professed on their resume and their actual competency when it comes down to specific questions.


    I suppose (for the author of the original article) that's the downside of spewing out a rant like he did that's likely to be read by
    the programming community - we don't tend to be easily impressed
    by hand waving, just boring old facts for us, thanks.

  • by hany ( 3601 )
    thanks to Kragen Sitaker
  • "Maintenance and support grow more complex, and costs increase due to a scarcity of talented programmers. Success leads to features, and feature creep leads to bloated software."


    Kragen is right in his analysis; in the original article this was the sentence (quoted above) that made me think Ted was right, and this sentence still worries me... sure, Linux becomes big, but "too" big?
    --
  • The facts are in. Talk to your CS prof, and get the momentum to see this tripe rebutted in the pages of the original journal.

  • by Puff ( 3954 )
    Preach it, Brother!
  • by law ( 5166 )
    This is the reason I keep coming back to Slashdot.

  • He really has no right to demand anything. All he is concerned about is his piece of the spotlight.
  • I'd love to read more articles from Kragen.

    -gabriel
  • The normal cypherpunks/cypherpunks user/passwd works on this site, like everywhere else. I've seen lots of people saying 'I couldn't read the original article, but ...' so I thought I'd post this reminder!
  • I've had this edition of Computer magazine for a week now and have been seriously planning to write the editors regarding this article. Now I've got someplace to point them in addition to my own gripes.

    Way to go.

    -Chris
  • Excellent work. I was suspicious of what kind of a rebuttal there was going to be, but this was remarkably well done and well thought out.
  • Excellent response. We should make sure that
    the 'Computer' editors read this and make
    sure Mr. Lewis's writings are ignored in the
    future.
  • Thank you
  • IIS on NT can be faster than Apache on *nix under certain unrealistic circumstances, but not under anything you'd call "realistic conditions".

    With a fairly high-powered box (dual P-Pro and up) that is not heavily loaded and that only serves static files -- never CGIs, database lookups, SSIs or any other dynamic content -- then yes, IIS is a bit faster. But as Apache Performance Tuning [apache.org] says,

    Most sites have less than 10Mbits of outgoing bandwidth, which Apache can fill using only a low end Pentium-based webserver. In practice sites with more bandwidth require more than one machine to fill the bandwidth due to other constraints (such as CGI or database transaction overhead).

    If you add anything like CGI, mod_perl, PHP, SSI or whatever, Apache quickly takes the lead. I can offer a rough performance comparison between Apache + PHP + MySQL on FreeBSD and IIS + Cold Fusion + Access on WinNT. The former was maxing out a K6/2-350 with 256Mb RAM at 200,000 page views per day. The latter was maxing out identical hardware at under 5,000 page views per day. Admittedly they weren't identical sites (actually the Apache site had much more complex db work). More importantly, to compare IIS with Apache accurately we should have had them both connecting to a database on a separate machine (and not use Access!), but nonetheless it makes clear that the IIS on NT option is technically less preferable.

  • Sometime ago (I think it was last September) I read that IEEE had selected Microsoft as the Most Innovative Company of the Year(**). Since I couldn't think of anything MS had innovated in the preceding year, and since this was the same year when huge strides were being made elsewhere (Linux, Java, Real Media, cable modems, you name it), I concluded that MS had managed to stack the board of IEEE, and that the award was a show put on for the DOJ.

    It appears that IEEE is no longer objective, which makes them irrelevant as a standards body.

    ** - Hearsay Warning: Although I read this in a major publication (one of the major weekly PC-industry papers), I was never able to corroborate it on IEEE's website.
  • Great rebuttal.
    I have a hard time believing that IIS is supposedly faster. Didn't ZDNet run some tests that showed Linux 2x faster? I'm too lazy to look up the article.

    chad
  • *This* is what editorials should be like, not the meandering, semi-amateurish prose we get from JK.

    It's a pleasure to read something clearly laid out and thought out. Good job Kragen.

    JB
  • by Pym ( 8890 )
    It goes to show that if anyone says the *INX people are fighting some religious war against the infidels, they need to be pointed to articles like the load of FUD blasted in this excellent reply. In the workplace, I'm fairly constantly dealing with arguments of this type, when my questions expect technical answers about the viability of NT server as a reliable enterprise solution. I'm treated as a zealot hyping some hacker kiddie OS, yet...the examples against the NT server 'solution' are many and factual, and the arguments in favor of NT are often FUD. I work both sides, like 'em, but this is -not- a 'juvenile and doomed religious' war; this is a war for technical truth and clear facts. This is a war against implementation based on ignorance, informed or uninformed. Given this, We Will Always Win as long as we have people like Kragen and countless others to stand up and tell it like it is, -even if- we do not agree. Just the Facts.

    Question the dominant paradigm.

    Pym
  • >A few hours later I received this reply:

    > Craig,

    > Thanks for the kind words. I try to get my readers to think out of the box once in
    > awhile. Sometimes it isn't possible.

    > --ted

    >Yeah, I found it unbelievable, too.

    Maybe what he meant by the phrase ``to think out of the box" was ``I just took some LSD for the first time & it was so kewl that I think you ought to do it too."

    Geoff
    A Computer Society member
  • Linux trounces NT [zdnet.com] as a server platform. "Apache is losing the performance battle"? hmmm ...
  • Mr. Lewis is more full of himself than the average "expert", and neglected to check the facts. I found it odd that he failed to chart the rate of increase of Windows LOC as he did with Linux. Also odd was that he extrapolated Linux LOC THREE WHOLE YEARS into the future from the last "measured" data point. The claim that Linux LOC will continue to grow exponentially (and at the rate he has selected...ever wonder if he chose the most pessimistic curve that could fit the data?) is a bit silly. Just think...in another decade we could be running operating systems that boast over 10 billion LOC. It also looks as if he is claiming that Linux will be bigger than Windows 2000 when it comes out in 2002 (*chuckle*).

    I also absolutely disagree with his issue about support. Usenet support is far superior to MS's online (or telephone...who bothers to try THAT these days?) support, and far more responsive. Additionally, online HOWTOs and included man pages are much better than anything Win98 has to offer. Just the other day, I swapped out my modem for a USR internal (non-Win) modem so that I could maybe connect up to my ISP after rebooting in Linux. After tiring of this, I removed it and reinstalled my original modem. Ohmigod...endless headaches. There's nothing in any MS docs or help files that addresses anything that I ran into. (PortDriver busy or missing on COM4? Nothing is even USING COM4!) My lesson out of all this? Don't even think about fscking with your modem. Plug'n'Play...doesn't. His assertion that Linux advocates are inherent microsoft-bashers is partially true...to the extent that people are sick and tired of crap like I just described, and want software that is written for them, not dumped on them.
  • Well said that man!!!
    An excellent piece of debating.
    Skillful demolition of the spurious bluff from the original author.
    In the true spirit of debate, I'd love to have the original author come back for a 'summing up', and a chance to re-state his point of view in light of the researched figures, then allow one more posting in rebuttal..
    That, I think, would be entertainment.. :)

    Malk.
  • Maybe I'll read the original article now that I have some perspective.
  • It's silly comparing the number of lines of code in Linux to the number of lines of code in Windows or any other single-platform operating system. Linux runs on x86, PPC, ARM, Sparc, MIPS, 68000, Alpha, and many other processors. I've never seen Windows (NT) running on anything but x86 and Alpha. We're talking apples and oranges.
  • This response was an excellent counter (to the FUD that should never have been published by IEEE) -- well written and very informative (for me anyway).
  • by afniv ( 10789 )
    I wish I'd seen the quote in a Micro$oft ad as well. 'Til I see it, I don't believe it.

    ~afniv
    "Man könnte froh sein, wenn die Luft so rein wäre wie das Bier"
    "We could be happy if the air was as pure as the beer"
  • What I have noticed is that, given the underlying drive of OSS -- that of "scratching an itch" -- and the availability of time and resources versus the deadlines and budgets of commercial software, when software bloats, that too becomes an itch.

    For example, the previous poster's comment about modular drivers vs. drivers in the kernel comes to mind. Another is the CVS trees that appeared as it became harder to write the kernel and utilities.

    Protocol and interface standards are another weapon against the complexity of projects.

    As OSS software evolves and becomes more complex, solutions to complexity evolve along with it.

    With OSS it is viable to say, "OK, we will have to rethink this and derive a better, more scalable approach." With commercial software it is a lot harder to say, "Damn, we need to scrap this approach and rethink it; BTW, this adds another 6 months to the deadline." What percentage of project managers are going to have the force of will to push through such a change at the sacrifice of a deadline? Not many. There is also the pressure from those higher in the hierarchy, who never seem to like a predictor of doom.

    That is why I feel that OSS will prevail over commercial software in the long term.

    IceTiger
    PS Yeah I can code a bubble sort :)
  • (see subject)

    Methinks this eloquence is wasted on the choir...
  • Feature creep is not likely to become a problem with free software. Feature creep is mostly a function of the commercial software business model.

    At a certain point, given enough time and support, most software reaches a point where it is fairly polished, and does a good job at what it is intended to do.

    In the free software development model, at this point, the software developers, having done their work and satisfied themselves, move on to new projects, and except for bug fixes and polishing, the software becomes stable and complete.

    In the commercial software development model, there is no time to be wasted ... more features MUST be added in order to produce a new version, because the way a commercial software company makes money is by bringing version n+1 to market, and charging its users for the upgrade.

    I could name a number of software products that were much better two or more versions ago. None of them are free or open source software.

    - John

  • I've got the whole Internet on a CD, AOL sent it to me for free.. J. Dvorak said so, so it must be true.

    The original article, I'm afraid, is another in a continuing trend. A little knowledge is dangerous, and tight deadlines, coupled with a desire to grandstand and 'incentives' from corporate 'sponsors', ultimately lead to irresponsible reporting.

    CNN, and the grand-daddy of FUD ZDNET, have always thrown buzzwords and fuel on the fire, without really checking their facts and sources. Their (ZDNET) authors and editors probably get a beta and free support for the latest M$ product - and so journalistic ethics get swept under the rug.

    What is shameful, is that the original article appeared in an off-shoot of the IEEE. (!?!?!) If we can't trust the IEEE to lead by example, maybe journalism should also become a grass-roots, open-source effort.

    As for the reply to the drivel.. Well done! Way to burn down the 'straw man'. But it'd be even better to cram it down the author's throat by submitting it to a trade journal.

  • Excellent response. I am trying to decide which email addresses at IEEE to send my complaints about the original article to. I am a member and have been for 10 years.

    I wonder if this kind of shoddy journalism would be tolerated if the victim was a company that would sue? I hope they get a lot of responses from members.

    Chuck
  • The "Slackware" distribution of Linux (http://www.slackware.org [slackware.org]) is distributed by Walnut Creek, I believe. The Slackware CD does mention Walnut Creek, and Walnut Creek is also mentioned on Slackware.org.

    Ain't no such thing as Stackware, though.

  • Wasn't it PCWeek that just published benchmarks on identical machines, showing Apache outserving IIS by 50%?

    By the way, that doesn't even include a 2.2 kernel.
  • COMPUTER is not peer-reviewed, so this is a matter of the editor choosing to print an article by someone who is well known and perhaps a friend in old-time military computing circles. That world, like the corporate glass house mainframe one, works quite differently than ours. For example, most people don't know that Ada is still a key language in military computing. (Actually, Ada is pretty cool in a lot of ways.)

    But the spec-driven project development and management perspective of the military is drilled deep down into the ethos there, as well as a certain type of "analysis" that Lewis' piece typifies, which manages to miss both the forest and the trees.

    The outrage here is not misplaced, but I see no need to flame the guy. He's just lording it over us in the good old fashioned way.

    --------
  • I got a similar reply. Let me quote Phil Lesh and Robert Hunter, whose words zero in on the heart of the issue in cases like this:

    "You ain't gonna learn what you don't wanna know."

    --------

  • I would have thought that Computer magazine and the IEEE would be interested to know, if they don't already, about the factual inaccuracies in this article - how much pressure would need to be exerted to get them to print a correction or six?

    If someone big in the GNU/Linux/Open Source world, or the author of this well argued piece, were to make official representations to Computer, we might well get a result :-)

    Linus, are you listening?

    (Er... no ;-)

    Gerv
  • Actually, according to ZDNET, noted Microsoft apologist, their WebBench tests show Apache running under the three most popular commercial Linux distributions to significantly outperform IIS 4.0 running under NT 4.0 with SP 4. In fact they "ate NT's lunch", performing between 16 and 50 percent faster. So the original article is just plain wrong about Apache losing the performance battle.
  • Try www.unix-vs-nt.org
  • ...Kragen. This is exactly the way to handle FUD: With the facts, and just the facts, Jack.
  • I'd counter that sentence with a little anecdotal evidence. (That sentence bothered me too.)

    Once, if I wanted to install a new device+driver, I had to compile it into the kernel (=bloat). Now all I have to do is load the driver module (=counter-bloat), as is done in RH.

    I can still compile the features into the kernel if I want to, but I don't have to. The driver can be modified, recompiled and integrated in future developments; the hardware hasn't been rendered unusable.

    Sure, the modular version is going to be "bloated" because of the interface overhead, but at least it's not static.

    I guess what I'm getting at is, there is a BIG difference between closed-source (static) bloat and open-source (dynamic) bloat. Change.
    Also, cost rises exponentially if you can't understand the code; opensource encourages readability. Scarce talent today != scarce talent tomorrow.

    And last, but certainly not least, Linux+GNU has a secret weapon. You know all those old/used computers that have been and are being shipped to developing nations? Which software+tools do you think they'll learn, use, and develop? :-)
  • This is one of the best anti-FUD articles I have ever read. CmdrTaco -- Sign this guy up!!
  • Thanks to someone who is quicker than I am with HTML, the original article is available.
    Well, I am probably rather biased, but the article had a rather sarcastic flavor to it. I guess reading /. will do that. It is almost a joke!
    I have gathered from the other replies I have read that the author of the original article, one Ted Lewis, has in the past been considered respectable. I agree that the article is a piece of garbage that is not even worth being used as toilet paper, even if printed on something nice and soft!
    But I see only one valid point: the Linux kernel is getting rather large, and the inflow of patches and code does make it difficult for one person to organize everything.
    But if you read patch summaries, easily found through http://www.linuxhq.com/ [linuxhq.com], you will note that Linus isn't without help. Collecting of patches and source is done by other people, one being Alan Cox, who passes this on to Linus.
    I believe that Linux will thrive -- probably not take over, but thrive. The Internet is becoming almost an ecosystem in itself; the fit will survive, and those with open source will evolve stronger than beasts that release only binaries.
    But this is only my opinion.
    PsychoFreak
  • I have no words for the praise I have for this poster. He rebutted the article clearly, concisely, and with more facts and analysis than you can shake a stick at. More users should take this example and respond to such articles with this sort of rebuttal. Even more, such rebuttals should be emailed to the appropriate news agencies to get their appetites whetted. We all know that many of the agencies lick the boots of those in power, but something like this, which refutes all of the basic premises the article was founded on, cannot be challenged in any substantive way. He is right. The article was wrong. Mail it away to someone who might publish it elsewhere.

    We need to raise our voices high when something like this happens. 8-)

    ---
  • One more point about the Apache performance claim: Remember that article a few weeks ago by noted MS partisan Ziff-Davis that showed that Apache kicked NT's butt for web throughput, and for Samba, NT's native language? Guess this guy couldn't be bothered to actually check to see which product was better before spouting off about it.
  • I'm certainly going to be sending email. And snail mail. I thought Ted was off base about java, but I didn't say anything about it. This latest round is just too much.
  • "I try to get my readers to think out of the box once in awhile"... I read that to mean that he knew his article was whacked and incorrect, but he decided to troll with it. Great, what a service to humanity.
