Open Source Acid Test Revisited
I read Ted Lewis's article, _The Open Source Acid Test_, on your web pages.
I was appalled that an organ of a prestigious international society like the IEEE would publish such error-riddled, poorly-researched, deliberately deceptive nonsense. It's as if the _New England Journal of Medicine_ had published a case study of a zombie animated by voodoo!
The author did not cite sources for any of his dubious statistics, and they are therefore hard to disprove. Given the remarkable lack of factual accuracy in the article, I doubt that they have any basis in fact.
To begin with the most obvious errors:
- Linus Torvalds's name is not Linus Torvold.
- Applix, Tower Technology, and NewMonics do not sell open-source software.
- There is no such company as "Walnut Creek Stackware". www.cdrom.com belongs to Walnut Creek CDROM. There is no such company as "Tower Tech JVM". www.twr.com belongs to Tower Technology, which sells a (non-open-source) JVM. There is no such web site as www.debian.com.
- www.python.org is operated by the Python Software Activity, not CNRI, although it is currently hosted on CNRI's network.
- Several of the "commercial enterprises" listed in Table 1 are not commercial enterprises at all. www.hungry.com, www.python.org, and www.debian.org are all operated by nonprofit organizations. The Corporation for National Research Initiatives, which was incorrectly listed as operating www.python.org, is actually a not-for-profit research organization.
- It is absurd to say that Unix was the foundation for Hewlett-Packard and IBM, as Lewis does in his introductory paragraph. Both companies had been established for more than thirty years when the first line of Unix was written.
- On page 126, Lewis claims that the open-source community admits that its organizational structure is weak. The evidence he adduces is a quote from a document published on www.opensource.org. What he doesn't tell you is that the document is *a leaked internal Microsoft memo*. Unless Lewis missed the 115 references to Microsoft in this document and also failed to read the introductory paragraphs, the only reasonable conclusion is that he is being deliberately deceptive.
- On page 125, Lewis claims that "Currently, Linux's installed base numbers 7.5 million". As usual, he cites no source. The most widely-cited source for such figures is Robert Young's paper, _Sizing the Linux Market_, which draws on eight different data sources to arrive at an estimate of between five and ten million Linux users. That paper, however, is dated March 1998. If Linux's user base continued to double yearly after that, as it did from roughly 1993 to 1998, the number of Linux users would now be between ten and twenty million.
- On page 128, Lewis says, "Windows NT market share smothers all Unix dialects combined". According to International Data Corporation's Server Operating Environment report, Unix and Linux together had 34.6% of the server market in 1998, while Windows NT had 36%. Moreover, IDC tallied only three-quarters of a million Linux server shipments in 1998; once you count people installing multiple servers from the same CD and installing from Internet downloads, you would find that Linux's server market share is much greater than Windows NT's.
- Lewis remarks, "With few exceptions, open source software has never crossed the chasm into the mainstream without first becoming a commercial product sold by a commercial enterprise." Does he think that Linux is not a commercial product sold by commercial enterprises? If not, there are literally dozens of "exceptions" to this statement -- Perl, Apache, sendmail, BIND, Linux, Tcl/Tk, Berkeley DB, Samba, the X Window system, FORTH, GNU Emacs, and trn, for example. Many of these became popular before they were commercially sold at all.
- Lewis misstates the business case for Linux and "its open source software cousins". According to Eric Raymond -- whom Lewis quotes extensively elsewhere in this article -- a much more compelling business case is founded on the better quality of the software, choice of suppliers, choice of support and maintenance, and freedom from legal exposure and license tracking. More details are available at opensource.org/for-buyers.html.
These minor factual errors, so far, merely indicate that the author knows very little about the topic he is writing about and is deliberately trying to mislead his readers; they do not directly undermine his conclusions. However, as I shall show, each of his supporting arguments rests on incorrect facts and leads to faulty conclusions.
One of the author's major contentions is that as open-source software adds more features and becomes more comparable to proprietary software, it will lose many of its advantages. He cites as examples Linux's supposed lack of video card support, wireless LAN support, and "a good selection of productivity software"; he claims that Unix contains 10 million lines of code, while Linux contains only 1.5 million. On page 126, he says, "Maintenance and support grow more complex, and costs increase due to a scarcity of talented programmers. Success leads to features, and feature creep leads to bloated software."
With regard to video card support, it is true that the Linux kernel itself contains no video card support. That facility is provided by video drivers in other software; nearly all graphical software available for Linux uses X11 to get at those drivers. Open-source X11 drivers for most video cards are available from www.xfree86.org; the list of supported cards there currently includes 555 different kinds of video cards, many of which cover numerous individual models.
For those few cards for which XFree86 support is not available, proprietary X11 drivers are available from Xi Graphics and Metro-Link.
With XFree86, Linux's video card support is better than that of either Windows 98 or Windows NT, and considerably more extensive than that of any Unix that does not use XFree86.
To claim that Linux lacks video card support is merely laughable.
With regard to wireless LAN support, it is true that many of the recent wireless LAN products do not currently have support in Linux. However, Linux has had support for packet-radio wireless networking and several kinds of LANs for years, and has supported several wireless LAN products since at least late 1997, including most of the most popular ones:
- Lucent Wavelan
- DEC RoamAbout DS
- Lucent Wavelan IEEE
- Netwave Airsurfer
- Xircom Netwave
- Proxim RangeLan2
- Proxim Symphony
- DEC RoamAbout FH
- Aironet ARLAN
- Raytheon Raylink
- BreezeCom BreezeNet
This information is readily available on the Web in the Linux Wireless LAN Howto.
With regard to productivity software, there are several office suites available for Linux, and there have been for several years. ApplixWare and StarOffice are the two most common.
With regard to the size of Linux: first, among the utilities tested in the failure-rate study (the latest report on which is entitled "Fuzz Revisited: A Re-examination of the Reliability of Unix Utilities and Services"; the quote used on page 125 appears to be from the original paper, which I cannot find on the Web) are the standard Unix utilities: awk, grep, wc, and so forth. These utilities have a standard set of functionality common across all Unix systems, except that the GNU utilities tend to include a great deal of extra functionality. If the GNU utilities really are only one-sixth the size of the corresponding utilities on a Unix system, yet provide much more functionality, and still have one-third to one-sixth of the failure rate, that is not an indictment of the defect rate of free software but a vindication of it -- which is why this study is linked to from the Free Software Foundation's Web pages. The study is unfairly biased in favor of less-featureful proprietary software, and that software still came out way behind.
(From my own experience, I know that frequently, the best workaround for a bug in a Unix utility is to install the GNU version.)
Lewis's claim that this represents "a single-point estimate of defect rate" is incorrect. The paper includes detailed results of the tests on 82 different utilities, along with aggregate statistics by operating system. 63 of these utilities were available either from GNU or from Linux, and were tested in this study.
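(For anyone curious about the methodology, the kind of test the study describes is simple to sketch: feed a utility random bytes and see whether it crashes. The snippet below is a minimal illustration in Python, not the study's actual harness; the choice of grep, the input size, and the run count are arbitrary assumptions of mine.)

    import random
    import subprocess

    def fuzz_once(command, length=10000):
        # Build a buffer of random bytes and feed it to the utility on stdin.
        data = bytes(random.randrange(256) for _ in range(length))
        proc = subprocess.run(command, input=data,
                              stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
        # On POSIX, a negative return code means the process died on a signal
        # (e.g. SIGSEGV) -- the kind of failure the study counts.
        return proc.returncode < 0

    crashes = sum(fuzz_once(["grep", "foo"]) for _ in range(100))
    print(crashes, "crashes out of 100 runs")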
With regard to the lines-of-code figure: it is not easy to measure the number of lines of code that constitute "Linux", because it is not easy to define what constitutes "Linux" -- or, for that matter, "Unix" either.
If we mean just the kernel, this site has some figures for the sizes of several OS kernels in 1994. SunOS 5.2's kernel is listed as containing 680,000 lines of code, while SunOS 5.0's kernel is listed as containing 560,000 lines of code. If the rate of increase per version remained constant (doubtful, because 5.0 and 5.1 weren't really finished products) then the latest SunOS (the one that's the kernel of just-released Solaris 7) would contain 1,280,000 lines of code.
By comparison, the source code of the 2.2.1 Linux kernel totals 1,676,155 lines of code, including comments and blank lines, counting only .c, .h, and .S (assembly) files.
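(A count like that is easy to reproduce with a short script. Here is a minimal sketch in Python; it assumes the 2.2.1 tree is unpacked under ./linux-2.2.1, and the exact total will vary a little with how your tree was patched.)

    import os

    SUFFIXES = (".c", ".h", ".S")   # C sources, headers, and assembly files

    total = 0
    for root, _dirs, files in os.walk("linux-2.2.1"):
        for name in files:
            if name.endswith(SUFFIXES):
                with open(os.path.join(root, name), "rb") as f:
                    # Count every line, comments and blank lines included.
                    total += sum(1 for _ in f)

    print(total)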
The Linux project's source code has already reached a level where we would "expect Linux defect densities to get worse". They haven't.
On page 125, Lewis cites Apache as an example of support diminishing when "the hype wears off", saying "it is currently supported by fewer than 20 core members" -- implying that the "cast of thousands" is a thing of the past. The truth is that the core Apache team has never been larger than 20 people, and they *still* receive contributions from many people outside the group. He also says that "Apache is losing the performance battle against Microsoft's IIS." But Apache has never been intended to be the fastest HTTP server around -- it's already more than fast enough to saturate a T1 when running on a puny machine, so its developers have been concentrating on things like adding more features and making it more reliable.
On page 128, Lewis says, "The concept of free software is a frequently practiced strategy of the weak". While free-as-in-price giveaways are common -- Microsoft's Internet Explorer strategy is a perfect example -- they are not related to open-source software, and their patterns of success and failure have little relevance for us here.
A response from Ted Lewis (Score:1)
Only thirty years? (Score:1)
That's somewhat of an understatement.
IBM was established in 1888, and incorporated in its present form in 1911. Hewlett-Packard was established in 1938.
In other words, IBM was founded long before the authors of Unix -- and probably even their parents -- were born.
How to read the original article. (Score:1)
-Brett.
Shipments, not units, grew 212% (Score:1)
This is the problem when looking at growth rates: you have to keep track of what it is that's growing (and what the percentage basis is).
i agree (Score:1)
No Subject Given (Score:1)
Very Good Post (Score:1)
The only thing I should mention is that the New England Journal of Medicine is a purveyor of all too much junk science. Check out one of the best sites on the web http://www.junkscience.com [junkscience.com] to find out more.
Anti-FUD Bomb (Score:1)
You took the words right outta my mouth... (Score:1)
Bottom line: If you don't evaluate and investigate your subjects both objectively and thoroughly, you don't have anything worth saying.
The author of the original article should take that bottom line to heart.
Ted Lewis is a moron (Score:1)
That was an excellent summary of that article, I believe. Only an absolute idiot would write something like that, or he is (as you suggest) deliberately trying to deceive the OSS community -- in which case, he has not done a good job of it!
Simply beautiful!!!!! (Score:1)
I really enjoyed reading this. I'm glad there are some people who can retaliate without having to flame.
Nice piece of work!
- Jesper Juhl -
You can't see me, (Score:1)
I'm getting funny looks from my coworkers.
The press still doesn't know what Linux is (Score:1)
RMS has insisted (some would say demanded) that what the press commonly refers to as "Linux" is really "a common collection of GNU utilities and the Linux kernel", thus "GNU/Linux". So comparing the Linux kernel to the entire UNIX OS is apples and oranges, and it also discredits the thousands who have ported GNU utilities to Linux, while the press assumes that Linus commands both the kernel and the common utilities. In fact, we should really annoy the press and insist that what they think is Linux is really "XFree + GNU + Linux".
Also, I am highly skeptical of any claim that IIS outperforms Apache. Can anyone show me some benchmarks to that effect? Perhaps on a Windows box Apache is not so great compared to IIS, but I am confident that Apache + UNIX kicks the crap out of IIS + NT.
Get This Out (Score:1)
Amen Brother (Score:1)
God, think what we could do if we were experts on the law, or public policy. 'Course, if we were, we wouldn't be cool enough to visit Slashdot.
Oh well, keeping MS in check is enough for now.
Don Negro
"then they fight you" (Score:1)
Mr. Lewis, it seems, is not so much interested in researching facts as he is in promoting his book. His inattention to detail is obvious. Misdirection, assumption, and outright misrepresentation are no way to present an analysis of anything. Mr. Lewis has just destroyed whatever credibility he may have had.
Excellent article, damn it all. (Score:1)
Why was I happy? Simple: I'm looking to buy one-year put options on MS, and as people start to believe Linux really is a threat, this diminishes the amount of money I can make. The more FUD I see, the happier I am. It won't help them in the long run.
MS is powerful, smart, unscrupulous and rich, but they are just one company. How can they compete against the rest of the world? Why do people use MS at all? Because most people care more about the applications than the OS. Yes, there are far more applications for MS at the moment, but that will change. Running a small ISV, which do I prefer: a sensible, reliable OS with free development tools where, if I succeed, I can keep my market -- or a crappy OS where success will just result in being squished like a bug when MS incorporates a poor clone of my product into the OS or into Office?
Ouch... Nice article. (Score:1)
--Troy
Here is a useful Ted Lewis link (Score:1)
Looks like Teddy got his Ph. D. back in the days when all you had to do was spell the word COMPUTER correctly.
Just a couple of points... (Score:1)
Modular (Score:1)
Sorry if this posts twice; I waited about five minutes for it to show up in the static pages but didn't see it, so I thought I would try again.
For lack of a better word, I think "modular" covers it. As "Linux*" grows bigger, its modularity also grows, which in fact improves support.
Commercial OS's grow bigger and rely on "in house" support. The more features they add to the code base, the more support they have to provide.
Whereas development "teams" in the Linux community actually provide specific niche code, and support for that code, completely independently of each other... well, at least more independently than commercial shops.
XFree86 and Linux is only one of thousands of examples. Both could stand independently: Linux with commercial X support, XFree86 on other OS's. Each has its own development team, its own specific tasks, and its own support base and style. The same goes for almost every Linux application down to the basics... vim, bash, tcsh, etc. These aren't written and supported by kernel coders, so the support base branches out as rapidly as the code base.
Whereas shops like Microsoft use in-house "support centers" and in-house "development centers." This leads to the support center having to deal with Windows, Word, WordPad, DOS, on and on and on, each with more features every release -- but all through the same support center.
So the claim that "more Linux* features" will create bloat and failing maintenance and support doesn't hold: it assumes the commercial model under which it would indeed fail.
This is how GPL/GNU evolved: little separate teams, on little separate projects, functioning independently, but knowing what their place in the "big picture" was, because they are carving that place out for themselves. Thus the "modularity" of Linux*, the reason to keep that modularity, and the reason support will scale up efficiently with the features in GNU/GPL.
Linux* -- the * denotes the generic sense: Linux to me is the kernel, but what we're really talking about here is the whole ball of wax, from the kernel up through the highest-level applications.
Wow (Score:1)
It's just like the PC vs. Mac thing... PC users are always dissing the Mac users, but Mac users never really yell back at the PC users.
Oh well... my $cents = 2;
ChiefArcher
Solaris lines of code (Score:1)
Look for "lines of code" in the text - it'll be the 2nd match. the white paper also suggests NT might have some trouble with it's 40 million lines of code, most of which is very recent.
Damn! Beat me to it! (Score:1)
(Good job, BTW: you caught some I missed.)
For the sake of completeness, here are some he missed:
On page one, Linux has already failed the acid test of crossing from early adoption to mainstream acceptance, but on page two, Linux is still considered to be in the "free-growth" stage. Which is it?
He compares the entire Windows NT development group and the code they manage to the Linux kernel team only, attributing to it the task of managing all of Linux. By extrapolating his numbers to all the projects represented by a typical Linux distro, you get as many as 160,000 developers represented.
Did you know Linux was losing market share? "Linux is still Unix, and Unix is losing market share, therefore Linux is losing market share, and Microsoft would never support a shrinking market." That gem is on page 4. Someone needs to go tell IDC.
If anyone's interested, I can post the paper somewhere.
send a link (Score:1)
--
Hear, Hear, Well Spoken Bruce! (Score:1)
I have been debating whether it is worth renewing my membership with the IEEE. If they keep allowing poorly researched, glaringly inaccurate articles such as that, then I will not renew... ever. I used to have a lot of respect for their opinions and facts, but that article sucked AOL startup disks. I want to see that chap in an IRC room where we can have a little discussion about his work and see if he will stand behind his paper.
Personally I dinnae think he has the balls.
My take on server market share (Score:1)
Here is my totally unscientific guess on server market share:
Boxes installed
1 - Netware
2 - *nix (including Linux)
2a - Sun
2b - Linux
3 - NT
4 - AS/400
5 - IBM mainframe
Bits served - internal networks
1 - IBM mainframe
2 - Netware
3 - AS/400
4 - *nix (Sun probably first; 3 vs. 4 could certainly be argued)
5 - NT
Bits served - external networks (Internet)
1 - *nix
1a - Sun (by far)
1b - other *nix including Linux
2 - unknown? Could be NT, IBM mainframe, or other.
Server market share
This one is tricky. By boxes, by bits served, by $$ value of systems sold? If I take out 5 Netware servers and replace them with 20 NT boxes, what does that do to the market share of each? If I replace the 20 NT boxes with 2 Linux machines, again what does that do to market share?
By installed base or growth rate? I would guess that Novell has by far the largest installed base, but the lowest current growth rate. Linux - high growth rate, but that doesn't mean much if your initial installed base is small.
Finally, 1997, 1998, or 1999 base/growth rate? Novell lost market share and installed base in 1997, bleeding like a stuck pig. I don't think that process continued at the same rate in 1998, or will continue in 1999 based on discussions with peers.
No answers here, just some observations and questions. But don't forget other platforms, most notably Netware and AS/400.
sPh
Let's call it TuX (Score:1)
TuX
?
Original article please? (Score:1)
Is the original article available somewhere on the Web? If so, location?
Thanks...
Indeed! (Score:1)
It is often the case that a question so obvious as that isn't even posed -- people just accept a person's statements and credentials without examination. As an example, I've heard people extolling the superiority of NT's scheduling vs. Linux's, but these same people nearly inevitably respond with a sheepish "no" when I ask: "OK, since you're so obviously an expert in computer science -- could you sit down right now and code even a bubble sort in any programming language?"
Too many people are just parrots -- repeating some spiffy-sounding fact-byte they've been fed without having anywhere near the knowledge to objectively analyze the statement for themselves.
Anyone can sound technically competent in the broad sense. I've interviewed probably 100+ developers in the last 4 years, and it is astounding how wide the gap sometimes is between the level of experience/knowledge professed on their resume and their actual competency when it comes down to specific questions.
I suppose (for the author of the original article) that's the downside of spewing out a rant like he did that's likely to be read by the programming community -- we don't tend to be easily impressed by hand waving. Just boring old facts for us, thanks.
thanks (Score:1)
ok but (Score:1)
Kragen is right in his analysis; in the original article, this (above) was the sentence that made me think Ted was right, and this sentence still worries me... sure, Linux is becoming big, but "too" big?
--
Excellent--Expand and Publish (Score:1)
Yeah! (Score:1)
wow! (Score:1)
RMS has no right to "demand" anything... (Score:1)
He really has no right to demand anything. All he is concerned about is his piece of the spotlight.
I second that (Score:1)
-gabriel
How to read the original article (Score:1)
Yahoo! (don't let em sue me :-) (Score:1)
Way to go.
-Chris
Congratulations (Score:1)
Well Written! (Score:1)
I hope the 'Computer' editors read this and make sure Mr. Lewis's writings are ignored in the future.
Excellent article (Score:1)
Benchmarking Apache (Score:1)
IIS on NT can be faster than Apache on *nix under certain unrealistic circumstances, but not under anything you'd call "realistic conditions".
With a fairly high-powered box (dual P-Pro and up) that is not heavily loaded, that serves only static files and never CGIs, database lookups, SSIs, or any other dynamic content, then yes, IIS is a bit faster. But as the Apache Performance Tuning notes [apache.org] point out, raw speed is not Apache's first priority.
If you add anything like CGI, mod_perl, PHP, SSI or whatever, Apache quickly takes the lead. I can offer a rough performance comparison between Apache + PHP + MySQL on FreeBSD and IIS + Cold Fusion + Access on WinNT. The former was maxing out a K6/2-350 with 256Mb RAM at 200,000 page views per day. The latter was maxing out identical hardware at under 5,000 page views per day. Admittedly they weren't identical sites (actually the Apache site had much more complex db work). More importantly, to compare IIS with Apache accurately we should have had them both connecting to a database on a separate machine (and not use Access!), but nonetheless it makes clear that the IIS on NT option is technically less preferable.
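(If you want to run a rough comparison of your own, the crude version is just to hammer one URL from a client box and measure pages per second against each server; real tests use many concurrent clients, but even this shows the difference once dynamic content is involved. A minimal sketch in Python follows -- the URL and request count are placeholders, not anything from the tests described above.)

    import time
    import urllib.request

    def pages_per_second(url, requests=200):
        # Fetch the same page repeatedly and report throughput from this one client.
        start = time.time()
        for _ in range(requests):
            with urllib.request.urlopen(url) as resp:
                resp.read()
        return requests / (time.time() - start)

    print(pages_per_second("http://test-server.example/page.php"))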
IEEE's been infiltrated by the MIB (Score:1)
It appears that IEEE is no longer objective, which makes them irrelevant as a standards body.
** - Hearsay Warning: Although I read this in a major publication (one of the major weekly PC-industry papers), I was never able to corroborate it on IEEE's website.
Good show! (Score:1)
I have a hard time believing that IIS is supposedly faster. Didn't ZDNet run some tests that showed Linux 2x faster? I'm too lazy to look up the article.
chad
I hope Katz read this article and the comments (Score:1)
It's a pleasure to read something clearly laid out and thought out. Good job Kragen.
JB
AMEN! (Score:1)
Question the dominant paradigm.
Pym
Ted's amazing response .... (Score:1)
> Craig,
> Thanks for the kind words. I try to get my readers to think out of the box once in
> awhile. Sometimes it isn't possible.
> --ted
>Yeah, I found it unbelievable, too.
Maybe what he meant by the phrase "to think out of the box" was "I just took some LSD for the first time & it was so kewl that I think you ought to do it too."
Geoff
A Computer Society member
Linux beats the pants off IIS (Score:1)
Excellent job. (Score:1)
I also absolutely disagree with his issue about support. Usenet support is far superior to MS's online (or telephone... who bothers to try THAT these days?) support, and far more responsive. Additionally, online HOWTOs and included man pages are much better than anything Win98 has to offer. Just the other day, I swapped out my modem for a USR internal (non-Win) modem so that I could maybe connect up to my ISP after rebooting in Linux. After tiring of this, I removed it and reinstalled my original modem. Ohmigod... endless headaches. There's nothing in any MS docs or help files that addresses anything that I ran into. (PortDriver busy or missing on COM4? Nothing is even USING COM4!) My lesson out of all this? Don't even think about fscking with your modem. Plug'n'Play... doesn't. His assertion that Linux advocates are inherent Microsoft-bashers is partially true... to the extent that people are sick and tired of crap like I just described, and want software that is written for them, not dumped on them.
Well said! (Score:1)
An excellent piece of debating.
Skillful demolition of the spurious bluff from the original author.
In the true spirit of debate, I'd love to have the original author come back for a 'summing up', and a chance to re-state his point of view in light of the researched figures, then allow one more posting in rebuttal..
That, I think, would be entertainment..
Malk.
Good Rebuttal (Score:1)
Lines of code (Score:1)
Thanks for the anti-FUD (Score:1)
Yeah! (Score:1)
~afniv
"Man könnte froh sein, wenn die Luft so rein wäre wie das Bier"
"We could be happy if the air was as pure as the beer"
OSS Evolves counters to bloat automatically (Score:1)
For example, the previous poster's comment about modular drivers vs. drivers compiled into the kernel comes to mind. Another is the move to CVS trees as it became harder to manage work on the kernel and utilities.
Protocol and interface standards are another weapon against the complexity of projects.
As OSS Software evolves and becomes more complex, solutions to complexity evolve along with it.
With OSS it is viable to say, "OK, we will have to rethink this and derive a better, more scalable approach." With commercial software it is a lot harder to say, "Damn, we need to scrap this approach and rethink it; by the way, this adds another six months to the deadline." What percentage of project managers are going to have the force of will to push through such a change at the sacrifice of a deadline? Not many. There is also the pressure from those higher in the hierarchy, who never seem to like a predictor of doom.
That is why I feel that OSS will prevail over commercial software in the long term.
IceTiger
PS Yeah I can code a bubble sort
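(In case anyone doubts it, a bubble sort really is only a few lines; here's a throwaway sketch in Python:)

    def bubble_sort(items):
        # Repeatedly swap adjacent out-of-order pairs until the list is sorted.
        n = len(items)
        for i in range(n):
            for j in range(n - 1 - i):
                if items[j] > items[j + 1]:
                    items[j], items[j + 1] = items[j + 1], items[j]
        return items

    print(bubble_sort([5, 3, 8, 1, 2]))   # prints [1, 2, 3, 5, 8]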
Was this sent to the author? (Score:1)
Methinks this eloquence is wasted on the choir...
Feature creep (Score:1)
At a certain point, given enough time and support, most software reaches a point where it is fairly polished, and does a good job at what it is intended to do.
In the free software development model, at this point, the software developers, having done their work and satisfied themselves, move on to new projects, and except for bug fixes and polishing, the software becomes stable and complete.
In the commercial software development model, there is no time to be wasted: new features must keep being piled on so there is something to sell in the next version, whether the product needs them or not.
I could name a number of software products that were much better two or more versions ago. None of them are free or open source software.
- John
The 'net on a silver platter. (Score:1)
The original article, I'm afraid, is another in a continuing trend. A little knowledge is dangerous, and tight deadlines, coupled with a desire to grandstand and 'incentives' from corporate 'sponsors', ultimately lead to irresponsible reporting.
CNN, and the granddaddy of FUD, ZDNet, have always thrown buzzwords and fuel on the fire without really checking their facts and sources. Their (ZDNet's) authors and editors probably get a beta and free support for the latest M$ product, and so journalistic ethics get swept under the rug.
What is shameful is that the original article appeared in an offshoot of the IEEE (!?!?!). If we can't trust the IEEE to lead by example, maybe journalism should also become a grass-roots, open-source effort.
As for the reply to the drivel... well done! Way to burn down the 'straw-man'. But it'd be even better to cram it down the author's throat by submitting it to a trade journal.
I will contact IEEE (Score:1)
I wonder if this kind of shoddy journalism would be tolerated if the victim were a company that would sue. I hope they get a lot of responses from members.
Chuck
Walnut Creek (Score:1)
Ain't no such thing as Stackware, though.
zdnet & Re: Apache performance (Score:1)
By the way, that doesn't even include a 2.2 kernel.
Excellent work, Kragen (Score:1)
But the spec-driven project development and management perspective of the military is drilled deep down into the ethos there, as well as a certain type of "analysis" that Lewis' piece typifies, which manages to miss both the forest and the trees.
The outrage here is not misplaced, but I see no need to flame the guy. He's just lording it over us in the good old fashioned way.
--------
Ted's amazing response .... (Score:1)
"You ain't gonna learn what you don't wanna know."
--------
Let's get a response from Ted Lewis (Score:1)
If someone big in the GNU/Linux/Open Source world, or the author of this well-argued piece, were to make official representations to Computer, we might well get a result.
Linus, are you listening?
(Er... no
Gerv
Apache winning performance battle against IIS (Score:1)
Bravo (Score:1)
Well said... (Score:1)
ok but (Score:1)
Once, if I wanted to install a new device+driver, I had to compile it into the kernel (=bloat). Now all I have to do is load the driver module (=counter-bloat), as is done in RH.
I can still compile the features into the kernel if I want to, but I don't have to. The driver can be modified, recompiled and integrated in future developments; the hardware hasn't been rendered unusable.
Sure, the modular version is going to be "bloated" because of the interface overhead, but at least it's not static.
I guess what I'm getting at is, there is a BIG difference between closed-source (static) bloat and open-source (dynamic) bloat. Change.
Also, cost rises exponentially if you can't understand the code; open source encourages readability. Scarce talent today != scarce talent tomorrow.
And last, but certainly not least, Linux+GNU has a secret weapon. You know all those old/used computers that have been and are being shipped to developing nations? Which software+tools do you think they'll learn, use, and develop?
Bravo!! (Score:1)
Well, I finally got the original article (Score:1)
Well, I am probably rather biased, but the article had a rather sarcastic flavor to it. I guess reading
I have gathered from the other replies I have read that the author of the original article, one Ted Lewis, has in the past been considered respectable. I agree that that article is a piece of garbage that is not even worth being used as toilet paper, even if printed on something nice and soft!
But I see only one valid point: the Linux kernel is getting rather large, and the inflow of patches and code does make it difficult for one person to organize everything.
But if you read patch summaries, easily found through http://www.linuxhq.com/ [linuxhq.com], you will note that Linus isn't without help. The collecting of patches and source is done by other people, one being Alan Cox, who passes this on to Linus.
I believe that Linux will thrive -- probably not take over, but thrive. The Internet is becoming almost an ecosystem in itself; the fit will survive, and those with open source will evolve stronger than beasts that release only binaries.
But this is only my opinion.
PsychoFreak
Vigilance (Score:1)
We need to raise our voices high when something like this happens. 8-)
---
Apache performance (Score:1)
Remember that article a few weeks ago by noted MS partisan Ziff-Davis that showed Apache kicking NT's butt for web throughput, and Samba doing the same at file serving -- NT's native language?
Guess this guy couldn't be bothered to actually check to see which product was better before spouting off about it.
Excellent idea. (Score:1)