Free Software and the Innovator's Dilemma

John R. Zedlewski has contributed an excellent feature entitled 'Free Software and the Innovator's Dilemma'. It talks about how industries tend to shift, and what happens when a new low-end, low-cost technology wrecks the margins. It's worth a read. Check it out.
The following was written by Slashdot reader John R. Zedlewski.

Free Software and the Innovator's Dilemma

If you wanted to assemble a "must read" list for any businessperson looking at the Linux/Free Software industry, what would you include? Certainly "Open Sources" from O'Reilly is the most obvious answer, probably followed by Bob Young's upcoming "Under the Radar," which details the story of Red Hat's rise. But I would argue that a third book belongs in the top tier of that list as well: "The Innovator's Dilemma," written by Clayton Christensen and published by Harvard Business School Press.

"The Innovator's Dilemma" traces the histories of various industries, from disk drives and microprocessors to steamships and automobiles, in which established market leaders have been beaten out by smaller, more nimble competitors. This idea, that startup firms have noticeable advantages over their larger rivals, has been one of the cornerstones of the internet era, not to mention Microsoft's antitrust defense, but Christensen is one of the few authors to actually address the specifics of these show-downs. In almost every case, he claims, his example established industry leaders were managed well, by conventional standards. They listened to their customer base and constantly sought to increase their penetration in high-margin, high-end markets. These seemingly-innocuous strategies become disastrous, however, when a "disruptive technology" enters the low end of the market. The established firms shy away from these new technologies to avoid undercutting their core, profitable businesses, but this ultimately leaves the market open for a new player to implement the disruptive technology, then slowly march up the food chain, overthrowing the old market leader. Minicomputer manufacturers in the 1980s, for instance, diligently followed their customers' demands to invest only in faster minis, while ignoring the PC market, which held little interest for the companies' established base of scientific and business customers. How many of those companies are still alive today?

This isn't, of course, a book review. Instead, I guess you could call it my attempt at a wake-up call to those software companies (you know who you are) who still think they can make a living on "business as usual" in the next millennium. Specifically, I want to focus on Linux, which might be the best example of a truly disruptive technology that we've seen since the advent of the internet. In fact, this theory gives us a guide to understand how established software firms risk missing the boat with respect to Linux, just as brick-and-mortar retailers were overtaken on the net by smaller, more daring startups. Consider three statements that a software vendor looking at Linux in 1999 might make:

  • "That sounds like an interesting idea, but we asked our customers and they don't seem interested."
  • "That sounds like an interesting idea, but the profit margin sounds too slim."
  • "That sounds like an interesting idea, but it would eat into our more profitable core business."

Now take those same three statements and imagine them coming from an executive at an established retail vendor considering e-commerce in 1995. It's not much of a stretch, is it? In this case, though, we're hampered by our hindsight. We fail to appreciate that the executives who turned down a chance to take, say, Barnes & Noble to the web in 1995 were in fact making a very reasonable decision based on the traditional financial bottom line. They would have spent millions to set up shop; they would have made it easier for customers to compare prices (and see that, in fact, the Barnes & Noble of 1995 was a fairly expensive bookstore); the customers who did buy online would need at least some discount to offset the cost of shipping; and the site, like Amazon, would have been a spectacular way to drive the parent company into the red. By not embracing the net, however, they opened the door to B&N's greatest threat in decades, and they ultimately had to spend even more money to play catch-up against the $20 billion internet rival that wouldn't even have been created if the existing booksellers hadn't dropped the ball.

How have today's successful technology firms, then, fallen into the "innovator's dilemma" with respect to Linux? By focusing on the short run and the high end. When companies like Sun, SCO, and Microsoft dismiss the OS, they talk about its lack of scalability, its relative newness, and its lack of a journalling file system. These features, however, are irrelevant at the workstation and workgroup server level, and, more importantly, they're all being developed at an unbelievable pace to help the OS scale to enterprise levels. As commodity hardware becomes more and more powerful, while Linux and Windows NT continue to scale up to new heights, the traditional Unix vendors will find themselves increasingly marginalized to the very highest-end of the computing spectrum, falling into what I call the "Silicon Graphics trap."

Silicon Graphics (now SGI) was notorious for its focus on high-end, sexy technologies: Cray supercomputers, 128-processor Origin servers, $10,000+ workstations, etc. While each unit sale at this level seems highly profitable, in reality the R&D costs involved in pushing the envelope at the microprocessor, OS, server, and applications levels were simply impossible to sustain, and the company spiraled deep into unprofitability as companies like Intergraph ate away at its core graphics business with low-cost graphics workstations. Now SGI has become the first traditional Unix company to truly embrace Linux, making it their platform of choice for the IA-64 architecture. Perhaps they were lucky to fall into the trap while they still had time to ride the first wave of the Linux movement.

But that leaves a question for the rest of the OS, hardware, and computer services worlds: are you willing to concede the entire low-to-mid-range server market to smaller, faster companies that position themselves as Linux early adopters? Do you want your future customers to think of you as a Linux pioneer that can accurately evaluate and deploy the OS, or as a second wave Johnny-come-lately that doesn't really understand the phenomenon at all?

Clayton Christensen's prognosis for these established players is unequivocal: unless they form new business units with the autonomy to embrace such disruptive technologies, "the probability that they will survive as industry leaders is zero" (Bloomberg Personal Finance, October 1999). In my opinion, some of these market leaders need even more fundamental changes in their structures to address this disruption; they need thorough, forward-looking reorganizations, such as SGI is undergoing with Linux and open source, as Microsoft restructured its product lines to face the internet, and as I advocated SCO should do in my last article.


Comments Filter:
  • Will this make hardware the business of choice? Or tech support, etc.? A bloke at my work thinks it will.
  • I seem to recall that B&N developed a plan for an internet bookstore and deliberately didn't implement it. They didn't think it would be profitable. But more importantly they thought that if someone did come in and start an online bookstore that was successful, B&N could simply implement their plan and crush the rival with their superior brand name and financial resources. Obviously they were wrong about that. Though the fact that their site is simply a lame ripoff of Amazon surely doesn't help.

    This story may be apocryphal and I don't remember where I heard it.
  • by euroderf ( 47 ) <a@b.c> on Tuesday October 12, 1999 @05:12AM (#1621075) Journal
    The new consensus seems to be that you have to actively try to eat away at your old product lines, by developing new and improved product lines.

    3M Corp always stood out; IIRC every division is expected to garner 35% of its income from products that did not even exist three years previously. Obviously this kind of organisational dynamic requires some funky organisational structures, and these structures might provide cover for those "intrapreneurs" who would eat away at the edges of the large organisation's older, more established, still-highly-profitable core businesses. Case in point: for the IBM PC, they sent everyone to Boca Raton and shielded them from the rest of IBM.

    I'd like to hear some "war stories" from people in the "skunk works" of large outfits, about battles fought to keep alive ideas that scared the hell out of other parts of the company.

  • The simple fact:
    Larger corporations tend to move more slowly and carefully; they are unable (due to commitments to their shareholders and employees) to take on risks where they might suffer a bad fall.
    They tend to have a large overhead as well as a large profit. If MS were to drop win98 from $99 to $98, they would lose millions of dollars a month, which they can't afford to lose.
    They tend to have employees which demand high salaries which they can't shove off easily.

    Small companies can make large risks.
    Small companies have little overhead.
    Small companies can get dirt-cheap employees that are really good because they can bill themselves as 'the next big thing' which will vault said employee up to positions where they can demand high salaries and force the company to move more slowly.

    -Adam

    To minimize loss and damage in a quake, try not to own things.
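The claim above about Microsoft's sensitivity to even a $1 price cut can be sanity-checked with a quick back-of-the-envelope calculation. The unit volume below is a purely illustrative assumption (not a real Microsoft sales figure), chosen only to show what "millions of dollars a month" implies about copies sold:

```python
# Back-of-the-envelope check of the comment's claim that dropping the
# Windows 98 price from $99 to $98 would cost Microsoft "millions" a month.
# copies_per_month is an illustrative assumption, not real sales data.
copies_per_month = 2_000_000
price_cut = 99 - 98  # $1 lost per copy sold

monthly_loss = copies_per_month * price_cut
print(f"${monthly_loss:,} lost per month")  # → $2,000,000 lost per month
```

In other words, the claim only holds if unit volume runs in the millions of copies per month; at lower volumes the loss is proportionally smaller.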
  • by Anonymous Coward on Tuesday October 12, 1999 @05:28AM (#1621077)
    The article seemed interesting until the author let loose with the typical linux apologist lines:

    When companies like Sun, SCO, and Microsoft dismiss the OS, they talk about its lack of scalability, its relative newness, and its lack of a journalling file system. These features, however, are irrelevant at the workstation and workgroup server level

    What does this have to do with the "Innovator's Dilemma"?

    Once again, an opportunity to do some useful writing is wasted by ridiculous, tired, pro-linux jabs.

    Just for once I'd like to see some interesting technology writing that doesn't mention linux. You guys are really flogging the dead horse beyond all reasonable limits.

  • Speaking as "the bloke from work".

    Free Software is at its strongest when it takes an existing idea, say a C compiler, or a Unix-like OS, makes a lifesize copy, and then keeps adding features until it is as good as, or better than, the commercial competition. I'd hazard a guess that very, very few Free Software projects are doing anything but copying, cloning, or porting existing products.

    So if you're looking to throw a few million into research and development, in order to turn your Cool New Idea for the Next Killer App into a reality, do you really want to have to be asking yourself "So... how long until a few bearded hippies come up with a free (speech/beer)knock-off?"

    There's a lot of talk about how venture capitalists are wary of high-tech startups, because they fear that if the product is too successful, Microsoft will just clone it and give it away with Windows. The Open Source factor is just as dangerous. Any product, useful or not, highly profitable or just scraping by, is fair game if a few hackers decide they want to clone it.

    That said, Microsoft's "Defend the right to innovate" line is crap. I mean, since when have they innovated anything?

    Charles Miller
  • by sphealey ( 2855 ) on Tuesday October 12, 1999 @05:36AM (#1621079)
    "Larger corporations tend to move more slowly and carefully, they are unable (due to commitments to their shareholders and employees) to set up risks where they might take a bad fall."

    The customer's needs and demands play a part in this as well. An ongoing enterprise needs some assurance of longevity and stability from its vendors. If I am building a worldwide information system, at a cost of (say) 5% of corporate revenues, which is a huge investment, that system has to last x years. x can vary but certainly not less than 5 for a Fortune 500 company. Therefore, I cannot afford to base this system on untried technology from a small startup. No matter how good they are, I can't take the risk.

    This will lead me to use large, established, conservative vendors, and pay them big bucks. Digital, (the old) IBM, Control Data, Sperry, etc. But of course, these large and profitable sales are exactly what holds the vendor back from innovating in the long run (note that none of the vendors I list above really exists today).

    Crucially, this also affects the type of people a vendor has to hire and promote to fulfill these contracts. If you want a stable, secure, high quality system you have to employ knowledgeable, experienced, conservative, belt+suspenders+extra-string-inside-the-pants guys, who are probably somewhere around 40-60 years old. It makes me shudder to hear that Microsoft doesn't hire anyone older than 22 (per a Bill Gates interview in Newsweek 2 years ago) - these are the people I will trust with my mission critical systems?

    But at the same time the conservative guys by their nature will resist innovation (having seen numerous failed innovations in the past, no wonder), and eventually prevent the vendor from advancing with the "next wave".

    No real answers here - just some observations. But I think my observation does go a long way toward explaining why we have seen a substantial decrease in the quality and reliability of software and computer systems in general over the last 10 years: reliability and stability are in direct conflict with speed and innovation.

    sPh
  • Doubt it. There is some software the OS model is great at, and some software I don't think the OS model can cover well: embedded software, enterprise-focused software (think network management -> http://www.dmtf.org, groupware, probably others).

    Certain software, you just need the organization of a large company.

    Let's see VA Linux and RedHat spend millions of dollars on developing enterprise management software to compete against Dell's hardware and HP's OpenView, and then GPL the software. If they are able to do that, AND survive, then I'll be convinced it will work.

    Heck, I talked to representatives from VA, RedHat and Caldera during LinuxWorld, and none of them seemed to have any plans in the network management area. They didn't even know it existed. So while Linux is trying to get a proper SNMP agent working (and someone has to register a LINUX MIB branch, which again costs $$. Is RedHat willing to register it and spend money supervising it, and let Caldera, SUSE, TurboLinux use it?), the industry is looking at CIM and WBEM.

    Those of you who are interested, go to any major player (Intel, Microsoft, IBM, HP, even the hardware people like DELL, COMPAQ), and look at what they are doing with Network Management.
  • by acfoo ( 98832 ) on Tuesday October 12, 1999 @05:45AM (#1621081)
    The consensus that you cite is less widespread than you might imagine. The Christensen book that is described has some additional fascinating "rules" for why traditional, and well-managed, companies can fail when faced with a disruptive technology.

    Rule One: Although managers think that they control the allocation of resources within a firm, the firm's customers actually control the allocation of resources. This is because well-managed firms ask their customers what they want in a product, then focus the resources of the firm (including the best engineering talent) on putting those new features in the product.

    Rule Two: New or emerging markets cannot satisfy the growth needs of an established firm. This rule is important and clearly illuminates the problem that managers have in confronting disruptive technologies. The initial return on the investment in new technology will always fall below the ROI for a sustaining technology in an established product area. This will create a drag on the apparent current profitability of the firm, and the stock price (and the value of a manager's options) will fall as a result.

    The prescription for remaining nimble in the face of disruptive technology change, offered by Christensen himself, is to "spin off" the new technology into a nearly autonomous unit whose size is small enough for its initial growth needs to be met in a new market.

    HP did this with its InkJet division, which is completely separate from its established LaserJet division. HP's move positioned it to benefit from the winning technology for the consumer market, whether it came from the InkJet or LaserJet division. This is why HP's Laser and InkJet products often seem confusing -- they really are targeted at the same market segments.

  • This sounds a lot like the history of the animal kingdom on earth...who is the "best designed" animal? The Big, Bad and Mean T-Rex or the small, fast and adaptable rat?

    We can see that in technology, as happened in biology, the winner usually is the rat, able to stay hungry for long periods of time, eating whatever insects it may catch, adapting fast to circumstances.

    I firmly believe small companies (even when they are small "companies" within big companies, like the PC division in IBM mentioned in a previous post) will always outmaneuver the bigger monsters...yes, MS may have all the money in the world, they may have the big market right now, they may have even proven that they can turn on a dime when needed...but they are still not fast enough, when compared to a company like RedHat or a community like the one driving Debian, and when the next big breakthrough comes, MS will go off the track trying to take the curve, while RH, Debian and others will continue in the race.

    I think most of the people here will agree with me when I say that I'd rather work for a small company (until it grows) than for a big company (till it sinks?).

    Better be a rat alive than a dead T-Rex :)

    Vox

  • by LL ( 20038 ) on Tuesday October 12, 1999 @05:47AM (#1621083)
    Part of the trick is understanding the difference between your job and your business. An example is the collapse of the railway tycoons, who forgot their purpose was transportation, not constructing superfast trains. Companies that understand what fundamental role they play and can stick with the "last mile" to the consumer will have a decent chance of surviving disruptions. Thus companies like Coca-Cola have shifted from dominating the carbonated drinks segment, to being the market gorilla of the liquid refreshment sector, to the mindshare game (associating their taste with pleasant memories such as rock concerts). It is usually the less complicated systems that are the most robust to disruptions (something to keep in mind when designing software). Any complex manufactured capital good can usually be undercut by a lower cost substitute, and any fad (whether toys/games/culture) will be difficult to sustain.

    One can apply this trick to existing OpenSource vendors to ask what their role is. The clearest example I think of is:
    Mandrake - bundling/packaging
    Macmillan - distribution/catalog
    LinuxCare - support/reassurance

    Thus Mandrake could probably be more profitable customising/bundling specific packages for various predefined sectors (business, education, small business, etc.), Macmillan on finding alternative distribution channels besides books (e.g. sponsor contests to find who can convert the largest number of boxes to Linux), and LinuxCare on demonstrating key metrics such as cost of repair/replacement, as well as alternative mechanisms of support (e.g. user groups to weed out basic computer problems unrelated to Linux). So long as each company understands its core role and which businesses are bolt-ons, it can adapt even if new technology comes along.

    The disk manufacturers are in a bit of a bind since they compete with anything that can store bits ranging from punch-card to next-gen holographic crystals. At least the file system experts like Veritas are less vulnerable.

    Technology is like the Red Queen's race in Alice's Looking-Glass world: you have to run as fast as you can just not to go backwards. Stick with Coke if you don't like the guessing game.

    LL
  • by Anonymous Coward
    This is just linux FUD. Making vague allusions to the past, and presenting a certain set of limited viewpoints intended to convey a sense of historical inevitability. Those are some nice quotes, but how about the ones you didn't mention:

    "Linux is nice, but there is no real desktop environment." (GNOME is a piece of shit and doesn't count. KDE is the only thing that even comes close.)

    "Where is linux professional video editing software" (Nowhere.)

    "Where is GNU mathematica"(Nowhere, never ever.)

    "Where is a good compiler for Linux?" (Not GCC!)

    Now if I'm confusing the issue of open source and linux, it's because the original post seemed to, and lots of people are talking about the end of the software business. Open source is great for things like a web server, or a kernel, or a compiler. But bigger stuff? Like antenna simulation software? Things like that? Forget it. No one is going to write something like that and just give it away.

    Well in fairness, there are a lot of reasons for companies to go to linux. I personally wish more of them did. But Linux has a lot of problems, GCC being one of the more prevalent ones, as well as the heavy reliance on free software, most of which _sucks ass_. I couldn't stop laughing at the commercial thing the other day. You think commercial software is bad? How about my xplaycd that crashes every time I move a track? I can think of 5 or 6 open source tidbits that come standard on redhat that are pathetically broken. Just earlier today, GIMP went into spastic mode and opened about 200 windows with an error message, before KDE crashed from the load. And GNU software is usually the best of the bunch...

    Rather than just dismiss the companies that aren't going to linux, consider the many valid reasons that they don't.
  • This is why economists have been discussing the "service economy" for decades now.

    If you don't offer a service that distinguishes you from a competitor, you will find it hard to compete on the basis of commodity products alone.

    For linux vendors, this means you have to offer more than just a nice CD and install script. You have to provide a unique service.

    The best way to avoid this is to not get involved with the production or distribution of commodity products at all. Take the Yahoo approach and develop a super-strong brand name built purely on services.

  • by morzeke ( 100541 ) on Tuesday October 12, 1999 @05:55AM (#1621086) Homepage

    The phenomenon of small nimble companies growing under the radar of a large corporation market leader is not exclusive to IT industries.

    FedEx grew in the late 70s despite UPS's dominance of the package delivery market because its hub-and-spoke system could get more packages to the right places faster and cheaper. It was ignored for a long time by UPS as a small-time, regional carrier. Eventually UPS got screwed (though they still have the lead in overall number of packages delivered annually) by its willful ignorance of potential competition.

    On a larger scale, Japanese manufacturers were able to slip under the radar of their American counterparts, manufacturers of everything from TVs to steamshovels, by coming out with a cheaper product, being ignored by the market leader, and improving until their quality met or bested that of the former market leaders. (BTW: It is a process currently underway with Korean manufacturers, of everything from TVs to steamshovels, who are undercutting both their American and Japanese counterparts).

    This even happens with countries. France, the market leader in wool production in the early 18th century, ignored Britain's increasing productivity due to early industrialization and lost its lead, which contributed in part to the economic stagnation that precipitated the French Revolution. Western Europe has consistently underestimated Russia, and was caught by surprise when, under Leninism, it exploded economically and fully became a world power. The Arab world ignored Zionism when it was a small and powerless movement and did not realize what was happening until control (or market dominance of the political sphere, if you want to think of it like that) had shifted, at least for part of Palestine.

    Now these last two examples bring up an important point: sometimes the overturning of the market leader by a previously ignored underdog is a good thing and sometimes it is not. As applied to the current situation, I would hazard that an overwhelming majority of /. readers think that a potential for Linux to overturn the current market of OSes would be a good thing, but, looking at other examples of similar phenomena, we ought to withhold judgment on the beneficence of all such overturns until the facts are in.

  • Depends on the software, I think. The consumer software, probably, but what about specialized software within fields that most consumers are not in. Movie editing, enterprise management, groupware, fingerprint recognition software, etc etc.

    If a software type doesn't really make an individual software hacker want to scratch, would the software be developed by OS?

    And how user-friendly is OS software? The attitude I've read here on Slashdot seems to indicate people should get a license (or at least be able to hack the kernel) before they are allowed to use a computer. Seems to me, that's another market that OS is ill-equipped to handle.

    Maybe I'm wrong....
  • While the inertia of established companies does not allow them to move as quickly into new tech solutions, most companies can roll with the punch and minimize the damage by downsizing or diversifying or, like B&N, attempting to make up the lost ground even if it is too little too late.

    Another problem occurs when new technology undermines a company's product. Anyone hold any stock in slide rule or buggy whip manufacturers?

  • I think the fact that conservative businesses (those that shy away from innovation) will eventually fail is almost tautological. The alternative (only slightly extreme) seems to be that there will be no more substantial innovation ever. I suppose it's also possible that big companies will suck up all useful innovation with their financial muscle, but things haven't gotten quite that bad yet.

    Not to beat a dead horse, but it does seem pretty intuitive why conservative strategies won't be successful against a lot of competition. Shakespeare would be justly nervous against a sufficiently large fleet of monkeys with typewriters. This is not necessarily good news for each individual competitor. It does suggest that Shakespeare would do well to experiment a bit more and rest on his laurels a bit less, more so the more monkeys are out there.

    I think if you had to plot the probability of long-term survival for big companies and small innovative companies vs time, the guess would have to be that big companies would start off with a huge advantage, and end up with a small disadvantage. By embracing innovation, one would imagine the initial downwards slope for the big companies would increase, but that they shouldn't cross over the line. I guess this just suggests a graphical representation of shortsightedness, but also I think suggests why shortsightedness is not such a bad idea sometimes.

    dan
  • in theory at least.
    You get a phenomenon known as "punctuated equilibrium" - periods of "relative" calm, then a disruptive event, then a period of rapid evolutionary change.

    There are definite parallels here between evolution and business (especially if you view a business as behaving like an organism).

    Big successful businesses concentrate on core markets, and get specialised. Then the disruptive event comes (e.g. meteor from space, Linux from Finland ;) ). Specialised beasties can't adapt, and the smaller more nimble ones end up filling the void left by the death of the specialists..

    In the Cretaceous case - small furry mammals (or I wouldn't be at the keyboard), and in this case - Who knows?
    Maybe Linux services companies? Or just any company focusing on quality? Or maybe something new entirely? Big business definitely finds it harder than a small one to adapt to changes in the business environment. I know, I've lived the nightmare politics (and I think many others who post here have too).

    Oh well, back to grindStone(nose);

    Jari
  • by SpinyNorman ( 33776 ) on Tuesday October 12, 1999 @06:04AM (#1621091)
    The best companies realize that it is better to make their products obsolete themselves, rather than waiting for their competitors to do it. There's a good example of this in the current issue of Wired - Schwab aggressively moving into on-line trading despite the fact that it would cut into the more lucrative traditional market. Sure the stock price took a short term hit, but the resultant growth more than made up for it.
  • If you're not innovating by small increments in a fast-moving market, you're going nowhere. And even standing still in a growing market means you're going out of business.

    If you've got the capital to invest in big development projects, sure, you can go for a huge return -- but with an equally huge risk. But small, rapid investments yield more efficient returns; it's pretty simple to demonstrate that the innovation investment against return graph is an inverted quadratic curve.

    So maybe our Bazaar, with its many and rapid small 'investments' could end up a better business proposition than the monolithic Cathedral...
  • by jflynn ( 61543 ) on Tuesday October 12, 1999 @06:13AM (#1621093)
    Speaking as "a bearded hippie." :)

    "Free Software is at its strongest when it takes an existing idea, say a C compiler, or a Unix-like OS, makes a lifesize copy, and then keeps adding features until it is as good as, or better than, the commercial competition. I'd hazard a guess that very, very few Free Software projects are doing anything but copying, cloning, or porting existing products."

    Open source isn't that good at overall design and architecture. It works best when it forms around an existing kernel of code. This allows religious flamewars to be quelled with the cry: "Show me the source." This is why most projects are working on something definite, most usually an existing body of code, or an application being cloned.

    "So if you're looking to throw a few million into research and development, in order to turn your Cool New Idea for the Next Killer App into a reality, do you really want to have to be asking yourself "So... how long until a few bearded hippies come up with a free (speech/beer)knock-off?"

    Figuring a modest $40/hr per developer, and 100 developers on an open source project, a few million amounts to roughly 750 hours, or five months of full-time work from each developer. Figuring half-time instead, call it a year -- not that long. This suggests that new ideas won't fail to happen just because the proprietary model is no longer profitable.

    Open source software is indeed a threat to software that is packaged and sold on shelves. Fortunately the vast majority of software is not of this type. Nearly all of it is built to purpose to solve some company's need, or to control some hardware. These jobs are not threatened.

    It is correct that what is being contemplated is a major change in the software industry and painful displacements will no doubt result. It is not a question of eliminating software as a profession however, merely software as a product. The question is whether you think the goal is worth the strife.
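The back-of-the-envelope arithmetic in the comment above can be checked directly. The $3M figure standing in for "a few million" is an assumption; the $40/hr rate and 100-developer headcount come from the comment:

```python
# Checking the comment's arithmetic: how long does "a few million" dollars
# of development last with 100 open source developers at $40/hr?

budget = 3_000_000      # dollars -- assumed value for "a few million"
rate = 40               # dollars per developer-hour (from the comment)
developers = 100        # project headcount (from the comment)

hours_each = budget / (rate * developers)   # hours of work per developer
months_full_time = hours_each / 160         # assuming ~160 work hours/month

print(hours_each, round(months_full_time, 1))  # 750.0 hours, ~4.7 months
```

That matches the comment's "roughly 750 hours, or five months of full-time work"; at half-time, it doubles to roughly a year.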
  • So create some interesting non-Linux technology. I always think of the oh-so-famous line about 'If you build it they will come.' Stop whining about our so-called dead horse when you don't even have a horse. Nobody made you read it did they? No, you decided on your own to do so and it cost you nothing more than your time. Besides, any cool thing that is built will most likely be assimilated by Linux. Look at all the fun Linux stuff for Lego Mindstorms! :)
  • You may not be able to get Mathematica for Linux, but there are other, similar programs.

    There are plenty of good compilers for Linux. I have alphas and I use Compaq's compilers. For Intel machines, I own good compilers from the Portland Group (PGI).

    Your criticism would be better if you had more concrete points which were correct. Otherwise you just sound like someone who is criticizing something he doesn't understand.
  • This sounds a lot like the history of the animal kingdom on earth... who is the "best designed" animal? The Big, Bad and Mean T-Rex or the small, fast and adaptable rat?


    Door number 3. The crocodile. It ate sauropods, it eats rats, and it'll be around long after the rat breathes its last.

    If you're perfect, why adapt?

    K.
    -

  • Big established companies aren't always perceived to be better. There have been several stories in the news lately about where large companies are going to get help on setting up e-commerce sites. Large companies are choosing small dotcom e-commerce consulting shops over big consulting firms like Andersen Consulting because they believe the small shops are more knowledgeable about new e-commerce technology. -Keith
  • Linux being representative of a new tech *is* the point of this article. The new tech is free source software, exemplified by Linux:

    How have today's successful technology firms, then, fallen into the "innovator's dilemma" with respect to Linux?

    --
  • So if you're looking to throw a few million into research and development, in order to turn your Cool New Idea for the Next Killer App into a reality, do you really want to have to be asking yourself "So... how long until a few bearded hippies come up with a free (speech/beer)knock-off?"

    1st: "bearded hippies" is not an fair way to describe the free software (open source) developper community. not that I have anything against bona fide, real life bearded hippies (other than the fact that it's been a while since I've seen any).

    2nd: if you're looking to throw a few million into R&D in order to turn your Cool New Idea for the Next Killer App into reality, you don't have a choice between asking yourself "how long until someone comes up with a decent free software clone" or not. if your app is interesting to the mainstream *and* actually useful, it'll happen. live with it, and be happy that the unenlightened masses will still prefer yours if you make it flashier (which is typically the one area where the free version won't quite match the proprietary one).

  • ...and someone has to register a LINUX mib branch...

    I agree, we really need some Men In Black of our own to compete with Microsoft and unfriendly aliens :-)

    Slow afternoon...

  • ...is The Tom Peters Seminar: Crazy Times call for Crazy Organizations. He advocates not just "re-engineering" businesses (an overused term if there ever was one), but the wholesale dismantling of businesses, processes, etc., and putting things back together into small, nimble units. I'm only part way through it, but he seems to be heading in the same general direction as Christensen's book.
  • Actually, you can get Mathematica for Linux [wolfram.com].

    But there's no free (GNU) equivalent, as the original message pointed out...

  • Iomega strikes me as an interesting example of a company that fits on both sides of this fence -- they tend to create new and innovative products every so often to keep customers intrigued; they keep backward compatibility in some of their lines but don't worry about inter-compatibility (R&D would suck to make Clik! work with Jaz!) ... but at the same time they never bring out low price-point hardware ...

    ... just a rant :)
  • I've never had any of the problems you mention with GIMP. I just installed RH 6.1 with GNOME/E on a P75 Toshiba 400CS laptop and it actually works most of the time--much better than I would have guessed it would on a P75, even with some of the E bells and whistles turned on. It's never crashed and locked up the system (cough, Windows?) but the session management did blow up on me once when I tried to save a session. I still think the GNOME folks need to make their stuff easier to install and maintain and clean up some of their code, but it'll do for now.

    KDE's alright, I guess, but I'm not much of a DE guy anyway--I prefer Window Maker on my primary PC.

    Linux--problems? Maybe, but it works, it doesn't crash, and I get my work done with it. Sounds like maybe you have a few configuration problems to work out--just my guess.

  • by bhurt ( 1081 ) on Tuesday October 12, 1999 @06:49AM (#1621109) Homepage
    The thing you have to remember is "what is your core market?" A disruption in your core market is devastating, but a disruption in an ancillary market isn't -- in fact, it's often an opportunity.

    Two examples: Sun and Linux, and SGI and PCs.

    What killed SGI/Cray was _not_ Linux -- it was cheap PCs. SGI's (and Cray's) core market was selling hardware to solve large linear algebra problems (almost all uses for supercomputers -- from scientific simulations to weather prediction to Hollywood special effects -- are at heart simply large matrix problems). These problems parallelize very easily, meaning it doesn't matter whether the MFLOPS come in one big box or a whole bunch of little boxes; what matters is the total cost for the necessary MFLOPS.

    As such, it wasn't Linux that killed (is killing) SGI/Cray; it's the explosion of cheap, powerful PCs and low-end workstations, which make it possible to buy a few hundred PCs and wire together the same number of MFLOPS as a major supercomputer, at a much lower cost.

    Now, let's look at Sun. Like SGI/Cray, Sun's core business is selling hardware. The main purpose Solaris has in life is to give customers a good OS to run on the hardware Sun is selling. But, unlike SGI/Cray, Sun concentrated on selling database machines. Unlike large linear algebra problems, databases don't parallelize so well -- the various CPUs end up needing to communicate a lot more. It's easy to replace a Cray doing weather prediction with a stack of PCs, but extremely difficult if not impossible to replace an E10K running Oracle with the same stack of PCs.

    If Sun can convince Linux users to run Linux on Sparcs, then Linux is no threat to Sun's core business (hardware). In fact, it's an opportunity to cut development costs of Solaris (allowing Sun to target Solaris at the high-end, where Linux isn't a good fit at the moment). If Linux were to grow to the point where it could replace Solaris at all levels, then Sun could drop the overhead of supporting Solaris altogether. While I don't think Solaris is a loss for Sun, it certainly isn't a profit center. In either case, Sun doesn't lose in its core market.

    Now, a company whose core market _is_ expensive PC-based operating systems has a lot to lose to a disruption in that space. So it's important to look at what the core market is for these companies before panicking about a possible disruption at the low end.
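The cost-per-MFLOPS argument above can be illustrated with a quick calculation. All prices and performance ratings here are made-up numbers for illustration, not real hardware specs:

```python
# For an embarrassingly parallel workload, only total cost per MFLOPS
# matters, not how many boxes deliver it. Figures below are hypothetical.

super_cost, super_mflops = 10_000_000, 50_000  # one big supercomputer
pc_cost, pc_mflops = 2_000, 200                # one commodity PC

pcs_needed = super_mflops // pc_mflops         # PCs to match the MFLOPS
cluster_cost = pcs_needed * pc_cost            # total price of the cluster

print(pcs_needed, cluster_cost)  # a few hundred PCs, a fraction of the cost
```

Under these assumed numbers the commodity cluster matches the supercomputer's aggregate MFLOPS at a small fraction of the price, which is exactly the squeeze the comment describes on SGI/Cray's core market.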
  • SGI won't be "saved" by Linux. It is adopting Linux in a last-ditch effort to keep its head above water in the low end of the market. Smarter, savvier companies like Penguin Computing and VA Research will eat them -alive- in the x86 Linux sweepstakes. SGI's problem with the low end stems from a reluctance to make and -push- low-buck RISC/Unix workstations. The Indy was wildly popular, because it was (relatively) cheap, fast, and feature-laden. SGI made it faster and better and -more expensive-, so new customers eventually lost interest. Their competitors took the hint: the Sun Ultra line of workstations are the best-selling computers Sun has -ever- made, and rescued them from the "Wintel Menace". On the other hand, the O2 was a bit too expensive, and not pushed anywhere near as hard as it should have been in the face of "Windows everywhere". Instead SGI decided to go the commodity hardware route: Intel processors and commodity operating systems. These were -monumental- flops, and SGI had eviscerated its R&D core to come up with their Visual Workstation failures. Meanwhile, Intergraph, the first company to turn its back on RISC/Unix, is now out of the computer business. The HP Kayaks aren't making any more headway into the workstation market. Commodity, mass-market crap does -not- a decent workstation make.

    Digital (now Compaq) is a better illustration of your point. The Alpha was designed to power the next generation of VAX systems running VMS. Digital, however, has always liked to cover all of its bases, and ported OSF/1 Unix, and then its own homegrown Unix (now Tru64). They also hired Linus to port his little kernel to low-end Alpha workstations. The result? The Alpha is the second most popular Linux platform, supported by Red Hat, Debian, TurboLinux, and even FreeBSD. The Alpha has moved so far beyond what Digital had hoped for in the low-end market that it is now a cornerstone of Compaq's IT strategy. By adopting new operating systems, allowing clones, and adopting commodity PC manufacturing techniques, Compaq has positioned the Alpha for ongoing success in an otherwise Wintel world. Sun has aped the Alpha business model, with OEM system boards, licensed clones, funded Linux ports, etc., etc. As I mentioned before, the Ultra series workstations and WGS's are doing -very- well for Sun...

    Adopting Linux is not a golden ticket. Something unforeseen could come along to disrupt the little penguin's steamroller momentum, be it a new open source project, or a sudden, global realization of how -good- OpenBSD is. The key to success is to remain flexible, to see new opportunities and seize them, and to structure your product strategy around constant and unpredictable change. It works for Compaq, Sun, and IBM, and it will work for your company. Just because it is new and popular doesn't mean that it will make you money if you're not smart. SGI and Intergraph, who shackled themselves with rigid "vision statements" and had a slavish devotion to "industry standards", got themselves in deep, deep trouble. Intergraph is out of the game for good, and SGI close on its heels...

    SoupIsGood Food
  • by alienmole ( 15522 ) on Tuesday October 12, 1999 @06:55AM (#1621111)
    You're right that the author lost focus in some places, but his basic point is valid: that Linux, and more generally, open source, has the potential to be, or already is, a disruptive technology.

    At the moment, there are few cases of established companies that have been seriously hurt by open source (SCO?), but the signs on the ground are there that it's going to start happening more and more, specifically in connection with Microsoft.

    As one small example, I've talked to IT managers at smallish companies (50-200 employees) that have been through the cycle of migrating from file servers running Novell to file servers running NT. Some of them are now wondering if they should migrate again, to Linux servers (with their existing Windows workstations).

    Why would they do this? Obviously cost is an issue: why pay expensive MS per-user or per-server license fees for something as basic as file services, which can now be had from any of a number of reliable, free operating systems? Also, customers don't like Microsoft's upgrade-pushing style: one company I know recently was told by MS support to install IE5 on their main file server, in order to fix a problem they were having. This raised some eyebrows, and reminded the customer that with MS, they don't necessarily get to control what software they install on their own boxes. Besides, this says one of two things about Microsoft: either they're deliberately forcing customers to upgrade to newer versions, even of software the customer shouldn't need; or they're not sufficiently competent, either at the support or development level, to maintain separation between their products. Either way, the customer isn't in control, and these are good reasons for customers to consider alternatives.

    The same goes for web & proxy servers, etc. - I'm seeing Microsoft shops starting to experiment with these kinds of services running on Linux.

    The Linux enterprise server and/or workstation revolution may still be some ways away, but Microsoft is going to start feeling price pressure from Linux on its server licensing policy sometime soon, if it hasn't already.

    Articles like this one [computerworld.com] about General Motors considering Linux for 7,500 dealerships are, at the very least, forcing Microsoft to cut deals with particular customers, or lose business.

    The Linux FUD page which Microsoft recently posted on its web site is proof of its concern.

    In fact, Linux and open source are particularly interesting disruptive technologies, because they aren't controlled by for-profit companies acting in their own interests, as previous disruptive technologies have been. There really does seem to be the potential here for a fundamental shift in the economics of intellectual property, that may eventually extend beyond the software world (into media, for example). That's not to say that proprietary software will necessarily die, just that it will have to change in some difficult-to-foresee ways, to make room for an unusual competitor.

    In many ways, it isn't so hard to understand: companies and individuals are learning the value of sharing on a global scale.

  • I keep seeing misspelled words in the titles of the postings, and within them. If people are to take this site seriously, you guys either need to learn how to spell correctly, or get a copywriter.

    "Dilemma" is not spelled "Dilema"

    Look it up... I'm sick of this.

    -Justin
  • by Anonymous Coward
    This is not flamebait. It seems that the strength of Linux is that it has taken existing ideas and re-implemented them well. When I look around at the projects going on I haven't seen anything on Linux that I would call innovative. Even in the KDE/GNOME desktops, there is no pushing of the boundaries.

    And no, I haven't done anything innovative either.
  • Well, have Korean cars (no offense intended...) killed off the luxury car market? No. There will always be echelons of price/quality. Why should software be any different from this product cycle?

    Even in high-end stuff, this cycle exists. Although ERP systems like SAP R3, Lawson Software (no relationship...), etc., are expensive, the price probably looks reasonable to CIO/CEO types compared to building a custom ERP system from the ground up. So if the few consultants/programming houses who could pull off a custom ERP system for a large company are pissed about being squeezed into becoming "just" SAP R3 consultants because their God-given specialized market was killed by the broader undercutter bastards at SAP, does that make this pattern a bad thing? It's inevitable. Companies ride easy street as long as they can, knowing that commoditization is going to squeeze the bottom line eventually (thus the drive to keep coming out with new stuff with higher margins).
  • The prescription for remaining nimble in the face of disruptive technology change, offered by Christensen himself, is to "spin off" the new technology into a nearly autonomous unit whose size is small enough for its initial growth needs to be met in a new market.


    This further illustrates open source's lack of formal organization as a strategic advantage: every new project has as much support from the "organization" as it can attract on its own merits, as judged by individual "customers" (users/developers), regardless of the potential to undermine an established product. Formal organizations can only hope to approximate that ideal.
  • A major problem faced by big established businesses is that it's not easy to determine what innovations will be successful. Most new ideas do not result in a successful product, most new businesses go bust. Even in this business climate, most startups will never reach the IPO stage, and most stock options will eventually be worthless.

    When innovation is an integral part of the corporate culture, as it is at 3M, the expertise to turn an idea into a product is there. But I don't know of any company other than 3M that has learned how to do this.

    Occasionally something comes along, like the Internet or Linux, that is clearly a winner; and then it's easy to decide to jump on the bandwagon. Today, any company that is not at least seriously checking Linux out is asleep at the switch. But just because two such phenomena happened so closely together does not mean that we have entered a period of highly predictable innovation. Yet another obvious winner may arise this year or next, but the most likely future is one in which the technology winners emerge from a confusing market battle. Microsoft is a good example of this. Until they had achieved dominance, it was not clear that they actually would.

    Small companies bet the farm on their ideas. The individuals are willing to put in long hours for little pay in exchange for a chance to get very rich. Big companies, and the people who run them, are already very rich, and don't have the same motivation to push themselves. More than that, no single company can afford to back every possible new idea; and picking the winners is a very difficult job.

    Even with something like Open Source development, it's tough to figure out a winning strategy. While it's clear that Open Source development is a winner, it is not clear for any given product whether it would be best to create it using the cathedral or the bazaar. ESR has presented some guidelines in this area, but only time will tell whether they will stand up to real use.

    All in all, I don't see any way for a big company to avoid the innovation trap.
  • I have the book on my desk right now. When I read it I instantly saw how Linux fit in. The point was that there are these products that are "good enough" for some jobs. They are nowhere near as good as the industry leaders at most jobs. Pick a random job that the industry leader is good at and the new guy can't do it. But it can do some jobs "good enough".

    So the new guy charges less and learns how to produce his product for less in order to survive. Then he starts to improve his product. (If he doesn't charge less or improve his product he remains a niche player or disappears, so we don't care about him.) Since he has an inferior product he has more room for improvement. He delivers the improvement at the cheaper price because he is used to working cheaper, and also he can cut corners because he just has to be "good enough", not "the best in the industry". The cheap product actually opens up a bigger market because it is cheaper (think sub-$600 PCs, for instance.)

    Meanwhile, the industry leader is making improvements but at a slower rate. He is at the top of the line and is the best in the business. It is HARD for him to improve. The new competitor may never get to the point that the top guy is at now but he can get to a point just below the top guy and do it at a cheaper price. Soon everyone but the very top customers is buying the new guy's stuff and the leader can't afford to compete.

    Right now Linux is "good enough" to be a web server or a file server or a mail server. It will probably become the OS of choice on really cheap PC's, dedicated 1 function servers and embedded devices because it is "good enough" and cheaper than Solaris or some NT variant. Whether it takes the desktop or really high end server market or not is irrelevant in the near future. The desktop will become a small percentage in the overall market. Your Fridge and Microwave will probably run something like Linux. Not because it has a journaling file system or supports 5 Terabyte individual files. It will because it is good enough and cheap.

    It will improve because anyone who wants to will improve it for various reasons. Individual fanatics. Computer science students. Big companies trying to run Bill Gates out of business. Hardware companies wanting to sell hardware etc. These improvements will be cheap because the improvers aren't out to make a buck on the software. They make money elsewhere. Solaris and NT may always be better than Linux but Linux will improve, be "good enough" for more and more things AND be cheaper.

    If they come out with a new edition of the book, Linux should be used as a case study. They have hard drives and the Intel Celeron chip in there as examples. Linux should be a good one too.

    That's what the book says to me. It's small and has applications to almost any area of business. I recommend it highly.
  • "Where is the OS/390 desktop environment?"
    "Where is the SCO video editing software?"
    "Is there a decent Mathematica-clone for Netware?"
    "Call Fujitsu and get theirs, or wait for Borland's (already ported) or Metrowerks' (coming very soon) compilers."
    None of these is a good reason not to use these operating systems in a server environment.
    I would also never argue that commercial software, or Windows software in particular, is inherently bad, or that open source is a magical solution to all our problems.
    Instead, I argued that Linux is a key, low-end server platform that will be (and already has been) disruptive in the operating system industry.
    I agree that most Linux distributors have very, very poor QA for the software they distribute. That's part of the opportunity that exists for established vendors (hardware, OS, applications, whatever) with very high QA standards to enter the market before Red Hat or Caldera really takes a solid hold.
    --JRZ
  • A great deal of academic/government research ends up as open source. Take Beowulf, for example, or some interesting ideas like the Reiserfs filesystem. Part of the problem here is that the academic CS community and the OSS community aren't one and the same (any more). So good ideas from one side don't always flow to the other. I know of several really great research projects in everything from compilers to OS that have been released under the GPL from my school, but I don't exactly hear about them on Freshmeat.
    --JRZ
  • Heh, I even went to m-w.com to make sure :)
  • troll me if i'm repetitious, but biz that wants to survive accelerating churn [slashdot.org] doesn't need to re-organize, re-assemble or re-engineer.. it needs to *re-conceive*, chaorganize [slashdot.org], distribute ownership equitably among all participants, including customers. keywords: ingenuity, (innovation), loyalty, (repeat bizness).
  • Actually, you can get Mathematica for Linux.
    But there's no free (GNU) equivalent, as the original message pointed out..

    Personally, I can't see the problem with this - If there are Commercial versions of the software for Linux, as well as Commercial versions for Mac, Wintel and the commercial unixen, why is this a problem?
    While free, open copies of software are nice, as far as I can tell, Mathematica is a proprietary language owned by Wolfram Research, Inc, for which they hold the trademark - I am not too sure to what extent an Open Source compiler for this language would be possible, as I don't know how much and/or which parts of the language are copyrighted - but I was unable to find ANY other implementation of this language but the "official" one by Wolfram in a websearch....
    --
  • Many failed CEOs use "common business sense" as excuse for failing to adapt at a crucial time, but I wonder where they'd say GE fits into the picture.

    GE has managed to grow and survive more nimble competitors despite its being the archetypical large, slow-moving ship. True, it was in _serious_ trouble before Jack Welch came on board, but by most analyses, Welch is only part of their magic, although the Temple of Superstar CEOs has raised him to demi-god status.

    I only know what's been written about GE in general business journals, so I'm sure someone more knowledgeable can come up with a better example (or counter-example) of GE's survival instincts, but remember when EDI first came about, and it was shaking established manufacturing practices to the core? Newcomers such as Sterling and Harbinger were the nimble new entrants, and yet GE managed to largely beat them off, turning GEIS into one of its most profitable divisions and charging straight into EDI. Now that Internet EDI is knocking at the door of the traditional, _very_ profitable EDI Value Added Network business (a sort of EDI ISP), it is actually Sterling and Harbinger that have been slow to embrace the future. All VANs are slowly accepting the reality, but GEIS has adapted with more alacrity.

    Now I'm not playing praise-singer for GE: I still think they are terribly over-extended, and it's amazing that they haven't gone the way of so many other mega-conglomerates in the new economy. Nevertheless, if anyone is teaching that it's good business sense to avoid embracing the future rather than cannibalize your current business, they are certainly using the wrong companies as examples.

    --Uche

  • I think you may need to look a little harder. I would guess that most of the projects that are easily visible do not push the limits. The only thing I can think of off the top of my head that is pushing the limits is the Extreme Linux work. While other companies and OS are doing large scale distributed computing it is still the bleeding edge.

    I also don't believe it will ever be easy to find "innovative" work in Linux. The main reason is not that Linux tends to reimplement existing ideas better, but that Linus and AC are more conservative than people think. Read the kernel archives sometime, and pay attention to what they say. Nothing gets into the main kernel tree without a damn good reason and without being fairly stable and tested.

    This seems to contradict the idea that Linux develops fast, but it really doesn't. New features get developed, tested, and become good enough to enter the kernel tree very fast, so being conservative in Linux isn't nearly as slow as it may sound.

    Also, I really don't believe Linux should be that innovative. It should be a rock solid platform for others to innovate on top of. And that is happening; I would consider the uSIMM a good example of people innovating on top of Linux.

    Finally, I wish someone would actually define innovative. MS says that the government is stifling innovation, but they have never defined it. Everyone talks about innovation and nods their heads like they understand, but I have never heard anyone who can really describe it. Personally, I consider making something better, faster, and cheaper to be innovation. New software ideas generally don't come from companies or OS communities. Generally, they just implement and distribute to the masses. The original ideas come from individuals, whether inside a company's R&D organization or, more often, in academia. And most of those ideas are probably unfeasible, stupid, or unworkable. The trick is finding the good ones, and taking advantage of them.

    Dastardly
  • While free, open copies of software are nice, as far as I can tell, Mathematica is a proprietary language owned by Wolfram Research, Inc

    And indeed, the marginal benefit from opening the source of Mathematica would be pretty minimal, as it is already subjected to genuine peer review of frightening intensity by its testers, who are all high-end mathematicians. Open source is good news compared to "quality through obscurity", but doesn't really compare to real, focused peer review.

    For some applications, any hacker is as good as any other. And indeed, there might be a guy out there who could improve the hell out of Mathematica, because he knows his stuff better than Wolfram. But, I tend to think, this is one case where the Capablanca theorem kicks in (José Raúl Capablanca to an anonymous chess challenger: "If you could beat me, I'd know you").

    There may be other such cases, and a set of criteria for deciding which kinds of software are best served by "limited peer review" would be good news. But don't look in "The Cathedral and the Bazaar" for such a set of criteria, because they ain't there.

    jsm
  • Despite the fact that this is probably a troll (witness the "as well as the heavy reliance on free software, most of which _sucks ass_" line), I'll address some of your points.

    I haven't used either KDE or GNOME really heavily, so I can't comment on your first complaint. From what I've heard, though, KDE sounds quite good. I also don't know of any video editing software. It's not an area in which I've done a lot of work.

    A GNU Mathematica equivalent? Try Octave [wisc.edu]. I probably haven't used the thing to its fullest capabilities (I'm only an undergraduate), but it's been able to handle everything I've thrown at it.

    Why do you think that GCC is not a good compiler? It certainly seems to be working well for me.

    "Now if I'm confusing the issue of open source and linux, it's because the original post seemed to, and lots of people are talking about the end of the software business. Open source is great for things like a web server, or a kernel, or a compiler. But bigger stuff? Like antenna simulation software? Things like that? Forget it. No one is going to write something like that and just give it away."
    I'm not so sure. "No one is going to write something like that and just give it away"? People said the same thing about operating systems. We currently have at least Linux and the free *BSDs. People said that about various applications. "Nobody'll write a compiler and give it away. Nobody would think it interesting enough." "Nobody would write a free graphics program anywhere near the same class as Photoshop. It's too esoteric a market." Both GCC and the GIMP have now been around for a while and have proven themselves. "Nobody would write a free office suite. It's not a sexy enough project." Both the KDE people and the GNOME people are currently working on office suites. On the GNOME side (not familiar with the KDE), Gnumeric and AbiWord are very useable, if not yet feature-for-feature comparable with Office or WordPerfect Suite.

    So, it may not look like anyone will write a free version of software program <Q>, but you just might be surprised at what the free software world can do.

    Finally, you had some complaints about specific programs. If you have problems, please take the time to file a bug report. The whole point of free software is that it gives everyone the freedom to rewrite the program. Even if you, personally, can't fix the problem, at least let others know that you had a problem so they can go looking for it. (I'll also note that there are many, many CD players out there. If one doesn't work for you, try another.)


    --Phil (Non-rabid (hopefully) free software advocate.)
  • You guys crack me up. You can't even see the validity in the post, since it attacks linux (heaven forbid).
    Spot on - we can't see the validity of the post, as it attacks Linux - it doesn't make any points, throw up questions or answers, or indeed consist of anything but "It talked about Linux, I gave up at that point because I don't want to hear it"
    The author would like to see a discussion of open source software that doesn't mention Linux. No doubt there would be people who would like a discussion of web servers that doesn't mention Apache, or of operating systems that doesn't mention Sun, Microsoft, or HP -- but they would be trolling as well.
    Yes, there is open source software out there that not only isn't Linux, but without which Linux itself couldn't have been created -- GCC, for example. But it has to be accepted that Linux, particularly in the eyes of the media, is the flagship of the open source community, and that a piece without any mention of it at all is likely to be flamed for that omission and its otherwise valid points overlooked.
    --
  • by Allen Akin ( 31718 ) on Tuesday October 12, 1999 @08:21AM (#1621136)
    I also recommend Christensen's book; Linux has many of the characteristics of the disruptive technologies he discusses, and his insights certainly have prompted me to look at the industry in a new light. However, I don't believe the Innovator's Dilemma was the primary problem that led to SGI's failure.

    Christensen does a good job of characterizing the "flight upmarket" that established firms do in the face of disruptive innovations. Although SGI maintained a strong presence in the high end (the high-end graphics group was still showing significant profit, last I heard), for years it was pushing actively downmarket into the "disruptive" areas -- with NT-based systems, and more importantly, with projects like Nintendo 64.

    I suspect the fundamental problem with SGI was not that it failed to handle disruptive innovation, but that it failed to remain competitive within its core markets. Sun, in particular, avoided this problem. Arguably so did HP. But SGI was late with new machines in its traditional strongholds of CAD and content creation, and when the new machines shipped they were underwhelming in terms of price or price/performance. Like most SGI veterans, I have opinions as to why this happened despite the fact that everyone from senior management to the lab technicians saw the market shifts occurring. But I'll skip that for the moment.

    The Innovator's Dilemma is an excellent work, and well worth reading, especially in the context of Linux. But be careful when applying it to any particular corporate failure; it's not the only reason technology companies stumble!

    Allen
  • Everyone is suddenly making a big deal about it. I looked it up on the web, and it says it's kind of a metadata log on all the files that have been opened and their states. Is this log good for data recovery? Is this why NT takes so long to shut down? It seems to be some sort of marketing-checklist thing that no one has explained WHY it is useful.
  • So if you're looking to throw a few million into research and development, in order to turn your Cool New Idea for the Next Killer App into a reality, do you really want to have to be asking yourself "So... how long until a few bearded hippies come up with a free (speech/beer)knock-off?"

    If not a "bearded hippie", then Microsoft will (e.g. the web browser). So if you really have a good idea and want to implement it, you might as well GPL it, to keep M$'s hands off it.

    You'll just have to figure out another way to make money from your work.

    ...richie

  • Try "Corporate Level Strategy." I can't remember the author's name. It's about multi-business corporations and GE is a major case study.
    They do a lot of compartmentalization of their different businesses, allowing them to grow at their own pace.
    It's not a bad book about corporations that have grown a lot by acquisition, but it's not targeted at the brightest bulbs in the box, if you know what I mean. Aimed at big-company CEOs.
    --JRZ
  • If a server is mysteriously shut down (due to a crash, hardware fault, or power failure), an OS with a good JFS won't lose any data or have to fsck the way Linux does after a bad shutdown. If every byte is mission-critical, or if you have a large database that would take hours to fsck, this is a must-have.
    --JRZ
  • If anything, history has shown us that the software industry, while it parallels many other industries, cannot be compared accurately to them. The fast, easily adjusted nature of software development means that while large companies like Microsoft may easily be toppled by a relative newcomer, those same large companies can make small changes very fast and return the favor. People talk about what a behemoth Microsoft is, and they are quite large, but that size does not compare to, say, a large steel company. Adjusting a steel company means developing new technology and constructing new facilities after producing the same thing for many years. In software, developing new technology is an ongoing process. New facilities simply aren't needed, just new computers. And if the exact same thing is produced for many years, something is incredibly wrong with the world around us.

    It's unfair to Microsoft to assume that they can't quickly change to address the oncoming threats of smaller companies. I think they've already shown with the Netscape fiasco that they can, and ruthlessly, and they've shown that they're on the ball regarding the Linux threat, at least in terms of recognizing and addressing it. Barnes & Noble got shafted because they had to set up all the back-end infrastructure to implement their plans, and that takes time and money, but there's very little back-end work in software development. If something's amiss, you just put a programmer on a new task and set him to it. If anything, many open-source projects are followers, not innovators (notable exceptions being Apache and MySQL), as they simply react to other people's innovations. Yes, their product won't cost anything, but something has to become established before someone decides that a free program would be a better implementation of it.

    This has gotten a little off-topic, but basically, Microsoft won't go the way of Barnes & Noble because a) they can adjust much more quickly as a result of their product and b) they recognize the threat.
  • Does NT support this by default? I should be able to just yank the power cord on one of my NT boxes and have it recover just fine, right? Well, it doesn't, as any NT tech can tell you :) Mention this test and the face turns pale. So I don't understand why Microsoft is making such a big deal about it.
  • Something else is starting to happen with increasing frequency (in the technology sectors, at least). Startups, funded by venture capital, are becoming sort of contract R&D labs. If they come up with a good idea that seems to work, some big company buys the startup outright. Cisco and Microsoft, among others, are known for doing this. Keeps the pesky little mammals from getting at the dinosaur eggs. :)


    I can tell you the meaning of life,
  • The problems you cite have little to do with matrix algebra.
  • A decent article; the old railroad companies were a good example. However, there are some differences when comparing this to Linux/open source vs. Microsoft and the proprietary software companies.

    Small, competitive companies with new ideas that are superior and cheaper aren't always able to topple the giant (actually, I'd say it's uncommon). The giant has several ways to defeat them. One is FUD tactics, a marketing technique understood by everyone who reads Slashdot.

    The other techniques, however, Linux seems to be immune to.

    Cost of entry is minimal in the software industry.
    If I come up with a car that runs on water, I can't simply build them in my garage and make a dent in the car companies' bottom lines. I need big, big cashola, more than any venture capitalist could give me. Car companies know this, and will never develop this product either; why waste money on R&D when you can make just as much money off the public with the cars you sell now? Oil companies might make sure your operation never gets off the ground, too: they might buy you out and run the company into the ground.

    It's different for software, however. You don't need a big manufacturing plant or fancy test equipment to write good code. You just need a computer for coding and an internet connection for distribution. Market exposure can be a problem, but word of mouth can be just as powerful. This is similar to what is happening to the record industry with MP3s, and will soon happen to the movie and publishing industries (to varying degrees; I won't get into that). Code is basically just a form of information, like movies and music. With the internet, information can be reproduced and transmitted at an almost negligible price per copy. The means of production are taken away from the big companies and put into the hands of everyone.

    So maybe I'll just sell my idea to the car companies and make millions off it, or get bought out once I have enough money to build my manufacturing plant and make a dent in the car companies' bottom lines. This is something Microsoft does, as we all know.

    We also know that Linux can't be bought. This is point number two. I suppose Microsoft could buy Red Hat, but Red Hat doesn't own the rights to Linux. It will always be available for free (or at least much cheaper than Windows for some sort of "user friendly" version) because of this. Even if Linus sold the rights to the kernel to Microsoft (does he actually own them? I'm not sure about that), Linux is much more than the kernel; it's the loose network of people around the world who contribute to it. These people can't and won't be formed into a capitalist corporation.

    These are the reasons I consider Linux, open source software, and the internet such a huge threat to software companies, or any company that deals strictly with "information".
  • The NTFS filesystem journals metadata (information about files). This ensures that the filesystem is consistent if the system crashes and allows the system to be rebooted quickly. It does not journal file data. That means that the contents of files can be corrupted by a system crash. My guess is that the designers of NTFS were primarily interested in avoiding the common UNIX problem of lengthy fscks after system crashes with large filesystems.
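    The metadata journaling described above can be sketched in miniature. This is a toy illustration of the general write-ahead-logging technique, not how NTFS actually implements it; all names here are invented. The key property is that a transaction's records hit the log, followed by a commit record, before the main structures are touched, so recovery after a crash just replays committed transactions instead of scanning the whole disk.

```python
# Toy sketch of metadata-only write-ahead journaling (illustrative only;
# the record format and class names are invented, not NTFS's).
class JournaledMetadata:
    def __init__(self):
        self.journal = []    # append-only log: ("set", key, value) or ("commit",)
        self.metadata = {}   # the "on-disk" metadata structures

    def write(self, updates):
        # 1. Log every update, then a commit record, BEFORE touching metadata.
        for key, value in updates.items():
            self.journal.append(("set", key, value))
        self.journal.append(("commit",))
        # 2. Only now update the main structures.
        self.metadata.update(updates)

    def recover(self):
        # Replay the log, applying records only up to each commit;
        # a partially-logged transaction (no commit) is discarded.
        recovered, pending = {}, {}
        for record in self.journal:
            if record[0] == "set":
                pending[record[1]] = record[2]
            elif record[0] == "commit":
                recovered.update(pending)
                pending = {}
        self.metadata = recovered   # uncommitted updates are dropped


fs = JournaledMetadata()
fs.write({"/etc/passwd": "inode 12"})
# Simulate a crash mid-transaction: a record was logged, but no commit.
fs.journal.append(("set", "/tmp/junk", "inode 99"))
fs.recover()
print(fs.metadata)   # only the committed update survives
```

    Note this protects the consistency of the metadata structures, not the file contents themselves, which matches the "journals metadata, not file data" distinction drawn above.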
  • aren't ya taking that a tad bit far there, buddy? ;)
  • To distill this all down to one pithy saying, it seems that the key to corporate survival is to always work to put yourself out of business before someone else does the job for you.

    (A trickster type of lesson.)

    T. Coyote
  • ...if you want another example of a company which "suddenly emerged". Wal-Mart's market presence was first really noted in the 1980s. The firm had been around since the 1950s (possibly earlier), and was almost completely missed in a competitive analysis done by Sears in the 1970s, except for a brief mention.

    It's the lily-pad (or bacteria) growth phenomenon -- a long period of slow but steady exponential growth takes you from too small to notice, to "bit player", to "how'd we miss that!" -- much faster than the old school cares for. It's also why momentum is important.

  • Call me a fool, but why is Fry's (in San Diego) selling Metrowerks CodeWarrior for Linux on the Linux shelves?

    At least there's a decent S (statistics package) clone called 'R'...

    OS/390 desktop environment. x3270/tn3270 good (bad) enough?

    mathematica? Try Maple (http://www.maplesoft.com).

    Hmm... To say that there are NO alternatives for the above is, for the most part, futile, is it not? So why do you continue to do it?

  • The larger mass of deadness should be the more successful species. It's been around long enough to accumulate such a mass of deadness. That being the case, I bet the rats would win this one hands down.

  • Even file/web/print servers need tweaking once in a while, even if they don't seem to be such maintenance hogs. The majority of IT projects, however, seem to be considerably more labour-intensive. I don't see Linux breaking through in day-to-day business practices unless IT people start picking up Linux skills at a much faster rate than is the case today. And since services account for the bulk of IT business, I guess that leaves Linux as promising, but nothing more than that.
  • Most software written is custom software written in-house, or contracted out but written specifically for the company's needs. This stuff will continue no matter what.
  • The same arguments could be said about other technologies - those that haven't done so well.

    Circa 1994

    "With this network computer, also known as an 'NC', we will revolutionize the world. With your funding, we'll be in every household in 5 years!"

    "Great, I'm sold! To whom do I write the check...?"

    ...

    "With betamax, your videos will be much cleaner and clearer. This will blow VHS away!"

    ...

    Etc. etc.


    - Darchmare
    - Axis Mutatis, http://www.axismutatis.net
  • Western Europe has consistently underestimated Russia, and was caught by surprise when, under Leninism, it exploded economically and fully became a world power.

    You have got to be kidding. The real truth about the former Soviet Union is that it was an economic disaster. There was no real question about this before World War II; the surprising result of the war against fascism was that such a strong defense could be mounted despite such obvious economic weakness. And being on the winning side of that war led not to the underestimation of the Soviet economy, but to its consistent and enormous overestimation by most Western powers, most notably the CIA.

    There is no dispute that the Soviet Union was a world military power, but that pre-eminence was attainable because while central planning was very poor at creating wealth, the apparatus was very good at sustaining patriotic fervor and directing what resources there were into military hardware.

    Indeed, the ironic thing about the use of the former USSR in this discussion is that the whole regime crumbled when faced with the challenge of a truly piddling amount of free market capitalism. The fact that Russia is currently an economic moonscape is not primarily because the old system was dismantled (it just got new owners) but because it produced virtually nothing that had value in a free market economy, and turning that around will take a lot more time and effort than most Western governments would have expected.

    King Babar

  • Well, one example of a technology that was NOT very quickly replaced by a superior one (superior in some aspects, anyway) was AM radio. In the '20s and '30s, a very large patent-holding company named RCA had a large vested interest, in both capital and experience, in AM. One radio pioneer named Maj. Armstrong [erols.com] sought to find a solution to the problem of 'static' and invented FM. He even invented mux FM (multi-channel, stereo), but had very great difficulty getting it out, particularly given RCA's opposition. Eventually he just walked out a 13th-floor window -- and FM didn't 'catch on' until the '60s.

    Chuck
  • A number of comments have either questioned whether Linux is superior to the existing alternatives or pointed out that the superior tech does not always win in marketing battles. Both statements have validity, but neither really gets to the core of the dilemma.
    Linux is not disruptive because it's a radical, new, great technology (it's not). It's disruptive because (a) it works well for a reasonably-sized market (small servers) and (b) it's unbelievably cheap and easy to advance (due to the GPL) compared to existing technologies.
    Put 'em together, and you have necessary and sufficient conditions for a disruptive technology, as they say.
    --JRZ
  • I'm surprised no one has mentioned this. The Osborne effect is when you (you being a successful technology company) screw yourself with your own innovation. EDO RAM and the Pentium MMX chip are some obvious examples. I'm sure many readers here are more learned on this effect than I, but it seems like something we should keep in mind when discussing the benefits of innovation to established companies.
  • Dammit, there's not a free Mathematica for
    any other platform. Why do we have to take
    abuse over everything not being free?
  • You can certainly gripe at their pricing, but bear in mind that Schwab are really trying to position themselves in between the full-service brokers like Merrill and the deep discounters like Datek ($9.99/trade) or Brown & Company ($5/trade for market orders). It's a segmented market.

    My real point was that Schwab were willing to eat into their $80/trade offline business, because if they didn't others would. You can argue about their positioning, but it's hard to argue that they'd be doing much worse if they hadn't made this move.

    In a similar vein, you can expect Microsoft to reduce Windows licence prices to compete with Linux, but don't expect them to have to reduce them to zero to compete. Even a Microsoft-branded Linux wouldn't necessarily have to underprice other distributions if they market it correctly. The big mistake, of course, would be to stand pat, do nothing, and watch Linux gain unstoppable momentum.

  • The best explanation I've found was in LWN: http://lwn.net/1999/0923/kernel.phtml [lwn.net]

    Look a little ways down the page, at the section starting with "Ext3 support is getting closer".

  • Was HPFS a JFS? We all know MS took what it knew from OS/2 and made NT, with NTFS supposedly very similar to HPFS... If HPFS was a JFS, then it would be IBM's and MS's designers trying to avoid that problem.
  • HPFS is not a journaling filesystem. It was originally written by Microsoft (Gordon Letwin) for OS/2 1.X. Some of the design features seem to have been taken from the BSD FFS while others were new.

    NTFS is a new design. There is a book on it but I haven't read it. Its primary flaw is its tendency to fragment files.
