How will software be sold?
In your discussions with the various entities of the computing industry, how do you expect to see software distributed in 5-10 years' time? Should we expect to see a greater take-up of free speech || open source || free beer || restrictive licensing on low- and high-level (drivers and word processors) and low- and high-end (MS Paint and Adobe Photoshop) software? Do the current players believe that they should all be looking long-term into securing their positions through licensing agreements, or that they should be selling a service? In particular, have you heard any noises from hardware companies who are looking into OpenSourcing all their drivers (i.e. Windows) so as to achieve the maximum penetration of their products?
This question has changed so much over the years. At one time it was retail versus shareware. Then added to this argument was floppy versus CD and later downloading versus shrinkwrap. I think downloading is the long-term winner because everyone will soon be networked and the cost structure works well for buyer and seller alike.
Something else that has changed a lot is how software is written. OOP has paid off more than we even know, so there are a lot of chances to make businesses out of selling cogs that fit into other people's machines. Your driver question, for example, wouldn't have even made sense a decade ago.
But the real answer to your question is "yes." I'm not trying to be a smart-ass here, but there simply won't be a single winning method of selling (or being compensated for) software. Open source is nice, but it isn't a way to make a living unless you are getting a reward in some indirect path. That's why college professors write books (to get tenure) and open source programmers write code (to get laid). Just kidding. But look at the reward structure, because there has to be a reward or it won't work.
My gut tells me that system software will be mainly OEM'd and that applications will go the service route. But what's still missing is a payment structure for these services. That's the challenge still not being either faced or overcome. So I'd appreciate it if you would do something about it.
gender and technology
Robert, In a study that was announced a day or two ago, it was shown that the number of women who are pursuing degrees in computer science related fields is dropping substantially. I'm wondering what you think can be done to improve the appeal of careers in computer science to women, and how the domination of the field by males affects the cultures and product directions of the companies in the field.
What bothers me about this is that there is absolutely no evidence to suggest that men are inherently any better at this computer stuff than are women. But on the flipside, is there any compelling argument why there ought to be more women in the industry? Is it discrimination that is denying these women the chance to work 100 hour weeks?
There is some free will here, you know. Yes, I agree that we probably don't do enough to encourage women students, but I will bet that if we could somehow control for all those other outside variables, more men than women would still choose to walk the digital trail. This, I believe, is because it is such a crappy lifestyle. Sure there is money (eventually) and success (sometimes), but at what cost? I don't blame the girls for choosing another path. But if we want to help encourage women to enter this field, I think we have to do it through the simple acceptance that coeducation is at fault. I remember years ago writing a story about Mills College in Oakland, an all-women institution. The thing that blew me away at the time was that the Mills computer installation was entirely home built. The women built their own PCs, they built and wired the network, they even built their own routers, mainly to save money. There is no doubt they can do it.
Men are pigs (I know I am) so the expedient answer is single-sex education for any women who want it.
by Anonymous Coward
According to this article:
The host of the three-hour documentary, "Triumph of the Nerds," is really Mark C. Stephens, one of several authors of a popular gossip column in InfoWorld magazine written under the Cringely pseudonym. Mr. Stephens, 43 years old, penned the column between 1987 and last December, when InfoWorld cut him loose. But in a case with enough twists to give anybody an identity crisis, the magazine and its parent, International Data Group Inc., sued Mr. Stephens in March for trademark infringement to block his continued use of the Cringely name.
So, Robert, are you still Mr. Stephens, or are you someone else now?
"Cut him loose?" That's an interesting way to put it. InfoWorld fired me. I was by far their top-rated columnist and had been for eight years. Why would a publication fire their top draw? It certainly makes no business sense. The best I can figure AND THIS IS ONLY MY OPINION is that I was fired to please the ego of Stewart Alsop. My column was always more popular than his -- a LOT more popular. Of course it had to be because of my placement on the back page, so Stewart (then editor-in-chief) had his column moved to the back page, too. But his survey numbers didn't change. At this time the price of newsprint was skyrocketing, so InfoWorld several times changed its trim size -- the actual size of the page. As the page got smaller and smaller, Stewart's column remained the same size and mine dropped from over 1000 words to around 600 words over two years. Still, Stewart's survey numbers didn't change. Several times as many people were reading my column as his even though both were on the same page. Having to face the prospect that maybe mine was a better column than his, it was easier on Stewart's ego IN MY OPINION to fire me than to accept reality. So they fired me, sued me, lost, paid me off, and here we are today. What happened to Stewart? They cut him loose.
I think we are both better off this way, Stewart and I. I know my life is better post-InfoWorld and he is now a successful VC. I wish Stewart well.
Software and Computers
I'm a developer and I am curious as to how you think software will change in the future.
I know from looking at many contracted software packages that quality is something usually forgotten in the Windows world. Badly written, hard to use, and usually very buggy. Do you feel that at some point companies will finally stand up for themselves and demand good software?
As for hardware, with the standards being modified so quickly, will we end up back at a proprietary level again? I ask because of the split between AMD and Intel on the type of interface on the motherboard for the processor (not to mention the memory style variations happening). Will programmers end up writing towards a proprietary box/CPU, do you think?
I used to test software and the first thing I would do is bang on the keyboard with my fists. "Don't do that!" the developer would yell as the system crashed. "Why did you do that? No user would ever do that."
Ever had a three year-old user? They do that.
Windows software quality sucks for a lot of reasons, but then so does the quality of most software, including packages you think are great. That's because you are so good at working around the bugs you've forgotten they are there.
Part of this is because it is not in the interest of many companies to demand better software. That's because the very person who would be demanding can trace his/her power in the organization directly back to the bugginess of the software. IT managers want bigger budgets and more people and that comes from either using crappy software or pushing their company into using immature software.
This is not going to change.
Now to hardware. Remember how Gary Kildall came up with the ROM BIOS? He got tired of porting CP/M to every new hardware platform, so he wrote a middleware layer so the OS could be standardized. Then the hardware manufacturer could be made responsible for writing the drivers to that middleware -- the ROM BIOS. This was back when 10,000 computers was a big production run. So AMD and Intel are diverging a bit. Well, now we are talking about production runs in the millions. Who cares if you have to write two versions? Picky, picky, picky.
Given prior history, who do you think will win?
Given that we've had umpteen OS wars, like unto the crusades in both their bloodiness and the invective used, can you discern any patterns in what determines the survivors of such conflicts?
For example, is it really the games that determine the winner, the "killer app", the ease of use, the cost, the marketing, or is it the media attention? If it is one of these, what are the most important elements, IYO, in determining the winner?
And, given the /. bias, what would you change in how Linux and BSD are progressing to maximize their survivability? Or is this all 20th Century thinking, and is the OS truly becoming irrelevant?
To my Mom, the OS is already irrelevant, which says a lot about how we look at the market. To most of us, the OS is probably more relevant than it deserves to be. FreeBSD-versus-Linux feels exactly like Ford-versus-Chevy when I was in high school.
Who will win? That depends on your definition of "win." Microsoft defines winning as getting all the money, so over time they will bend their product offerings toward wherever the money seems to be. Linux doesn't work that way, since it doesn't really cost money. Apple is a software company that sells its products inside $1800 boxes, so its motivation is different again. I see market segmentation going like this: Enterprise backend -- small to medium servers -- business desktops -- professional desktops -- gamers -- home computers -- thin clients. Microsoft wants to dominate each of these and will fail in most. Linux targets only 2-3 of these niches and so can't hope to win overall. Same for Apple. But there is room for many winners here. Microsoft makes an average of $200 PROFIT from every Macintosh sold, so Apple's success is also Microsoft's. Linux may hurt Microsoft a bit, but not as much as it inspires Microsoft to be better. So Linux is good for Microsoft. Eventually, though, the market will zig when Microsoft zags and a new leader will emerge. Here's what I can tell you about that new leader: it hasn't yet been founded.
Being in and around Silicon Valley, and also having seen so much change over the face of the computing industry in the last 20 years, what mistakes do you see that are causing so many dotcoms to fail? What steps could they take/could have taken to prevent this from happening? Conversely, what do you think separates the ones that have made it from the ones that are floating belly up?
In the early 1980s, following the amazing success of Seagate, more than a hundred hard disk companies were founded AND FUNDED, each one saying in their business plan that two years out they would have 15 percent market share. Why didn't the VCs see that? Well, VCs aren't very original and they also aren't very smart.
Now the same thing has happened with dot-coms and the VCs aren't any smarter than they used to be. But what you have to remember is that they EXPECT a 95% mortality rate and still make a 40% compounded return at that. And failed companies are the ore from which new companies are refined.
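That pairing of a 95% mortality rate with a 40% compounded return is easy to sanity-check. Here is a minimal Python sketch with hypothetical numbers (the equal-weighted fund, five-year horizon, and exact rates are my illustrative assumptions, not figures from the answer above):

```python
# Hypothetical illustration: how a venture fund in which 95% of
# companies fail can still compound at 40% per year overall.

def required_winner_multiple(survival_rate, annual_return, years):
    """Return the multiple each surviving company must deliver for an
    equal-weighted fund to hit the target compounded annual return,
    assuming the failed companies return nothing at all."""
    fund_multiple = (1 + annual_return) ** years  # growth of the whole fund
    # The few survivors must carry the losers' stakes as well as their own.
    return fund_multiple / survival_rate

# With a 5% survival rate and a 40%/yr target over five years, each
# winner has to return roughly 108x the money invested in it.
multiple = required_winner_multiple(0.05, 0.40, 5)
print(round(multiple, 1))  # prints 107.6
```

The arithmetic makes the point of the answer: the rare winner has to return on the order of 100x, which is why failed companies are tolerated as the cost of finding it.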
Now to the rules for success. I started out to do five of these and ended up with eight. They apply not just to Internet companies but to any high tech startup. They are simple: 1) Fill a need that actually exists, not one you wish existed; 2) Don't count on customers to tell you what that need is (they don't know what they need until you invent it and they see it); 3) Don't push the technological edge because you'll nearly always starve to death; 4) Be very quick to recognize the greatness of others and copy it (in other words, let someone else be responsible for rule 2, above) because the second entrant wins more often than does the originator; 5) Success comes from selling things, so hire a better head of sales and marketing than you think you can afford and hire him/her earlier than you think you should; 6) Every startup has a change of course, a moment when it becomes clear that the original idea just won't work, so be willing to change course when you have to; 7) Know when to call it a day -- most startups fail and nearly every successful startup is run and staffed by people who have already failed; and 8) Hire a mix of old and young including some people near the top who have already tasted startup success.
Do you feel that the computer industry is less innovative today than when you started out? More specifically, do you feel anticompetitive practices by certain companies actively restrict new technologies, or are these current titans just one great idea away from becoming also-rans?
There is always a tendency to glamorize the Good Old Days. I have been in this computer industry for 23 years and four months and as far as I can tell THESE are the Good Old Days. When I started we were inventing an industry and serving a customer base of a few thousand hobbyists. Every application was a horizontal application because the market had no vertical component. Well today the market is enormous and is so vertical you can get a nose bleed, which means that if I have an idea for solid state accelerometers, there is a customer waiting for my product. That is good.
Microsoft is a bully, sure, but understand this: the step after ubiquity is invisibility. Microsoft is too big to economically enter any but the largest new markets, which means there is that much more opportunity for the rest of us. In fact, Microsoft NEEDS the rest of us to show it where to steal. I think we should stop complaining, enjoy the cheap hardware, and get rich.
Commercialization of the net
by Dan Hayes
What do you think that the increasing commercialisation of the net is going to lead to? In particular do you think that the work the various standards bodies do is becoming increasingly ignored when it comes to what actually gets used on the net?
Increasingly ignored? I think it has always been ignored, with the only exception being the IETF. Back in the '80s everyone anticipated the rise of the International Standards Organization. Everything then was TCP/IP and SNMP and SMTP, and we knew it couldn't last. The Europeans and their committees were going to come through and kick our asses with X.25 and CMIP and CMOT (remember those acronyms?). But it didn't happen. So too with Token Ring and even Asynchronous Transfer Mode. What a load of crap is ATM! It guarantees Quality of Service by crushing packets that under gigabit Ethernet would have gone right through. Does that make sense? No, it doesn't, and that's the point.
The beauty of the Internet and the IETF lies in a simple idea -- that the only standards under consideration are those already in use on the Net. Ready, fire, aim! It looks sloppy and it is, but with this system change is accelerated, crap is revealed as crap that much quicker, and we end up with systems that actually have a hope of both operating and interoperating. Now I know your question had to do with commercialization, but commercialization is good and committees are bad. Windows, for all I complain about it, has put a computer on 200 million desks. There is no Windows committee. For that matter, there really isn't a Linux committee, either. Thank God.
From your privileged position, what technologies do you think should-have-made-it but didn't? What technologies do you think were ahead-of-their-time but might resurface? Finally, what companies surprised you by not making a go of it when they seemed like sure things?
This is a great question hampered by my aging brain. I have written about so many companies and technologies over so many years that I'm sure I'll miss the really important points, but here goes.
It's not so much about technologies and companies as it is about timing and markets. Why did bubble memory fail and flash memory succeed? It's the price, stupid. Same for the Lisa, a great computer five years (and $5000) ahead of its time.
What if the Amiga had been bought by some company other than Commodore? Now THERE was an opportunity lost.
Had Apple been better managed by Sculley, would it today have market dominance?
Pen computing was an obvious non-starter, but everyone had watched the success of Windows and wanted, through hope alone, to make the next wave come that much quicker.
And here's the lesson we learn over and over again: we overestimate change in the short term and underestimate it in the long term. That's why the first entrant into a new category almost always loses. Bought any Altairs lately? Even Apple was probably the 30th little PC company to be started in Silicon Valley, giving it a shot at success.
The technology that keeps being reborn is Unix. The first PC Unix I used was Cromix -- Unix for Cromemco computers running 4 MHz Z-80A processors. Now we're all hot for Linux, but what is it but Cromix reborn and supported by a bunch of enthusiasts? And the secret to a particular platform's success always comes down to the killer application. For Linux I see as yet only Apache and Sendmail as killer apps, which means it won't penetrate the desktop much further no matter how much we want it to.
Want to help Linux? Write apps!
Tell us about the early days
The early days are shrouded in confusion, myth, lies, half-truths, and blazing egos. For years nothing was very clear about the origins of RXC.
We'd like to know about the early days when R.X. Cringely was used as a pseudonym for a gaggle of writers. Were you involved with the 'nym from the beginning, or did you join later? Who else wrote parts of those articles? Where did the source material come from? Any fun anecdotes?
Could you tell us about the early days without putting the 'nym spin on the facts? I would love to hear a single side to this story once and for all, and I consider you to be the only one who can give us the truth.
Cringely came to be as a guy on the masthead who could be blamed for fuck-ups. The idea was he'd be fired from time to time then reinstated when the advertiser (it was always an advertiser) had cooled down. He could never come to the phone because he was the Field Editor -- always out in the field.
The Cringely column Notes From the Field came into existence when John Dvorak quit. Dvorak was the gossip columnist and then suddenly he wasn't. Editorial management suddenly realized that all the effort they thought they had put into promoting the column had gone out the door with Dvorak. So they decided to replace him with a generic gossip columnist under the Cringely name. That way the value would remain even if the writer left. At least that was the idea.
The biggest myth is that there was a "gaggle of writers." The first Cringely was Rory O'Connor, who wrote the column for about nine months starting in 1986. The second Cringely was Laurie Flynn, who wrote the column for about another nine months ending in August, 1987. I started writing the column in the first week of September, 1987, and wrote every column until the second week of December, 1995 when they fired me.
That was eight years and about 420 columns, which hardly makes it a gaggle of writers. To be clear, items for the column were submitted by reporters. Or, more correctly, items were dragged from the clutches of reporters. But no reporters "wrote" for the column and typically 75% of the material had to be generated by Cringely him (or her) self. Often weeks would go by without any outside material.
I can't say what's happened since. Maybe they do use a gaggle of writers. I don't read that column.
As for anecdotes, two come to mind. I once received a spreadsheet containing Apple's detailed product plans for the coming two years! I got a lot of mileage out of that. And I found out about the Apple/IBM partnerships (Taligent and Kaleida) within hours of their happening, but had to wait months before writing about them to protect a source. What was wonderfully satisfying about that is that when I finally did write about it, IBM went ballistic, beating up Apple for leaking the story. My source was from IBM.
Has not having a PhD affected your work?
by Anonymous Coward
Back in 1998 you falsely claimed that you had a PhD and were a professor of journalism at Stanford. Of course the truth came out. How has the truth affected you and your work? Have you suffered any consequences from your lie? And why did you lie in the first place?
Of course this is a long story, but the compressed version is that I did every bit of my PhD including the paper and the defense. Coming out of the defense, my committee, chaired by Nobel laureate Kenneth Arrow, asked for some changes to the paper. All I had to do was make those changes and I'd be finished! Well it was a busy time in my life. I was writing my first book, soon to be followed by a job or two and, before I knew it, I had missed the five-year deadline. I was stupid, of course, not only for wasting all that time but especially for not asking for an official leave-of-absence, which would have frozen the clock. How the lie got started was that first book called me a PhD on the jacket. Of course we all expected the jacket to be correct given how little extra work was required. And that jacket copy followed me everywhere. So frankly it was a lot easier to just accept what Random House had decreed than to go to the trouble of explaining all this. Sure, I blew it, but let me make this point: I have all the qualifications to get a university teaching job TODAY. I turn down at least one offer per year. Like Popeye said, "I yam what I yam."