No More Suits; IT Worker Shortage Will End Soon

A lot of people (even Jon Katz) have been telling me I should write a Slashdot feature myself now and then. Fine. I'm in a strange mood today, and a lot of strange thoughts have been buzzing through my head this week, so here goes. My first "observation of the week" is that the word "suits" is no longer viable to describe managers in tech companies. We need a more accurate term, and I have one for you. (More below.)

I was up at Andover Corporate HQ Tuesday. It's 400 miles from my home office, so I don't get there often. This time, for no particular reason, I happened to notice that while Andover has plenty of administrative and marketing and other suit-type people floating around the office doing whatever those people do, none of them wear suits to work any more!

But there was still a clothing division between the execs and the workers: ironing. The programmers, artists, writers, and hardware wranglers wore basic, simple, unpressed t-shirts and jeans or other working-type pants, while the biggies over in admin-land all looked like they spent significant time and energy getting their casual outfits to look "just right" before they came to work.

After I realized what was happening at Andover, fashion-wise, I called some friends who work in other new media and tech companies and asked them if the same thing was going on in their offices. To a man and woman, they said it was. Nowadays, there are no suits in tech companies unless network TV cameras are there and rolling, and often not even then.

From now on, in the interests of journalistic accuracy and linguistic precision, I am going to refer to the executives formerly known as suits as "Its," an acronym for "Ironed T-Shirts."

"Yeah, I had a great idea but the Its were too clueless to figure it out!" is an example of how you might use Its in a normal workday sentence. Feel free to do so. I have not copyrighted the word. It's now yours as much as mine to mess up, misspell, or whatever else you like to do to words in your spare time.

The IT Worker "Shortage" Will End. Soon.
Once upon a time, back when the world was young and "engineer" was a word used to describe hairy-eared men who designed real, physical things and programmers were looked down upon as glorified typists, the U.S. had an "engineering shortage." All through the late 60s and early 70s publications like the Wall Street Journal ran article after article about how America's potential economic growth was being stifled by a shortage of engineers and technicians. Business-owned politicians loosened visa restrictions for engineers and technicians from other countries because of this supposed shortage, engineering salaries shot up, and suits (which is what Its were called back then) constantly whined about the impossibility of managing their arrogant techies, all of whom knew they could find other jobs in seconds and, therefore, demanded all kinds of perqs, up to and including free coffee and sodas, in-house gyms, flextime hours, and so on.

You could take any of those 60s or 70s WSJ stories about the "engineering shortage," change a few words in them, and run them today as panic pieces about how it's impossible to find competent programmers and sysadmins at reasonable salaries, and how when you do scare up a few of these rare beasts, they won't hew to the corporate line and respect corporate authority and salute their MBA bosses like good little workers. Indeed, the WSJ may actually be changing words in those old stories and rerunning them. Who would know?

But those of you beyond a certain age will recall that, one day, all those formerly high-rolling engineers were suddenly seeking exciting new careers in convenience stores, service stations, and fast food outlets that didn't pay enough to cover the mortgages on their nice suburban houses, which suddenly became hard to sell because there weren't enough other engineers with good jobs available to buy them. The economies in places like the Boston suburbs and Silicon Valley and other "high-tech capitals" tanked. Life was rough, and a lot of people (including me) got burned hard and ended up with scars that they/we carry to this day.

All good things come to an end. Right now, yes, it's good to be the king (or at least the Network Administrator). But remember what happened to Louis XVI when the rabble got fed up with paying for his high living and decided to take him down a peg.

And does anyone here remember the oil crisis of 1973? I sure do. The U.S. seemed to be spending all of its money importing Arab oil, which climbed to nearly $50 per barrel at one point when OPEC [the Organization of Petroleum-Exporting Countries] got especially feisty. If this trend went on, economic pundits said, the Arabs would own America (and most of Europe) outright within a decade or two. By extrapolating then-current trends and drawing them as lines on colorful charts, this thesis was easy to display on TV shows, on newspaper front pages and in slideshows at business conferences so that everyone could get nice and worried about it.

But last I looked, OPEC was just about dead and oil was selling in the $10 - $20 per barrel range. The danger of predictions made through extrapolations is that something always seems to come along that messes them up. In the case of oil, it was a major change in consumption patterns. Oil got too expensive, so we (the oil-importing countries) simply stopped using so much of it. The most visible example of this change: what we call a "full-sized American car" today wouldn't be a pimple on the bumper of, say, a 1970 Buick Electra.

Believe me, somewhere in a secret cavern beneath the Wharton School of Business (which is to finance as Stanford is to Computer Science) or someplace similar, teams of fiery-eyed MBA candidates are plotting to take down today's computer professionals as hard as OPEC, engineers, and Louis XVI all got slammed in their respective days.

So enjoy the ride while it lasts. It's great fun. But don't take out a 30-year mortgage based on it. Something will eventually throw a lot of high-tech workers out in the street. It could be genetic algorithms or some other new, less labor-intensive programming methodology, or it could be an overall economic downturn that ripples through the high-tech industries and brings Internet growth to a halt, the same way the construction-driven economic boom in Austin, TX in the early 80s collapsed in on itself when a comparatively small number of construction workers lost their jobs and couldn't afford to buy houses, which led to even less housing demand, and so on all the way down. I have no more idea than anyone else of what the proximate cause of the next tech-industry recession will be, but I guarantee that it will come. One always does.

Indeed, if this thoughtful article from Linux Journal has any truth to it, today's shortage of computer professionals may be as false as many people thought the 70s oil shortage was, so it may already be time for IT workers to start doing a little financial hunkering-down, especially if they're over 30 and unwilling to work slave-length workweeks.

Is Slashdot a Magazine?
I have always considered Slashdot an online magazine. And I have always respected the American Society of Magazine Editors [ASME] and believe their stringent code of ethics should apply as much to online publications as to those printed on paper. So I decided to join. $225 a year, and Andover'll pay for it anyway, so why not?

But guess what? This august body still only accepts members from print magazines. As a purely online editor, I'm apparently not worthy. Which means, by extension, that you, as an online reader, are not as worthy as a print magazine reader. No big deal. I find it more amusing than alarming - for you and me, at least. But this is sad for the ASME; it is freezing out the most vital, highest-growth part of the periodical news business when, instead, traditional publishers' and editors' organizations should be courting us online people in order to assure their own future survival.

Here is the last paragraph of the e-mail response to the turndown that I sent to arhodes@MAGAZINE.ORG:

Depending on your reckoning, the 21st century starts in either ~3 or ~15 months. If ASME decides to enter it at some point, please let me know. I'll be there, waiting for you to catch up. ;)
- Robin "roblimo" Miller
Elkridge, Maryland, USA
10 October 1999, noon EDT
This discussion has been archived. No new comments can be posted.

Comments:
  • by Anonymous Coward
    Guess it makes morons harder to spot though.

    But it is not surprising that people dress down to geek level now that they have inherited the world.

    End of IT-worker shortage? I'm not a sysadmin, but it seems a lot of their time is spent helping clueless users. As long as people are as stupid as they are today when it comes to computers, IT workers will be in high demand.
  • by Anonymous Coward
    Rob, usually love your stuff, and your "chronological advantage" over a lot of the less experienced element out there.

    But not today. The "Internet Economy" go down the toilet? It /is/ possible, but frankly, your rationale is like someone in the 30's saying that if the economy slowed down, they'd be able to get rid of the newfangled "telephone" they'd just got in their office - return to the good old days, etc. Not very likely.

    The Net's changing everything, as you kind of admit with your comments on ASME. Sure, the /profile/ of non-suit employment can change [hopefully MCSE's get real jobs] but this net thing's only just started. Don't write it off yet.
  • by Anonymous Coward
    When I dropped out of college in 1982, my aunt told me that without a degree:
    • I would have to wear a suit and tie every day.
    • Work 9 to 5 (the ones during the day).
    • Get paid minimum wage.

    In the late 80s, people said that programmers would be put out of work by all the new application generators. Why pay a programmer to write a program when a user can just paint one?

    Now computers will be programming themselves???

    Companies are claiming that there is such a programmer shortage that the visa limits must be increased. Many employers will say, if you don't have X number of years of XXX, then we can't even look at you. In 1988 I was called by a headhunter looking for someone with more than 10 years of DOS programming experience.

    It's always going to change and people will keep saying this sort of thing all the time.

    Injured worker wins against Mattel! [sorehands.com]

  • by Anonymous Coward
    The 'fear factor' plainly shows. There is another reason to go to college, you know. I have met a lot more geeks in college than I do, say, working in IT departments. We want a guarantee. Yeah, I've been coding assembly since I was 15. Yeah, I've been doing production-level stuff for years now. So what? I think every good *engineer* must have gone to school, and a good one to boot. I doubt that many of the 'well known' programmers have done half of what the geeks from good colleges have done. But you're right about one thing -- going to college won't make you good. Me and the fellow geeks notice this all the time. So we breeze through each course, bored to tears, because we have already applied bits, bytes, and boolean math to the real world. But what going to (a GOOD) college WILL do is make a good GEEK a whole lot BETTER. What job makes you build assemblers, linkers, compilers, documentation, and every data construct ever known to man within 2 years? Sounds more valuable than doing SQL and Perl all day.
  • by Anonymous Coward
    I live and work in NYC. I am both the system administrator and MIS director for a fast-growing company. I have observed that there is not actually a labour shortage. Certainly finding qualified workers in computer industries is much more difficult than in other fields, but not because of a lack of people; the problem is the wrong philosophy.

    Most companies in NYC fall into two categories. There are the companies who think only the young are qualified for computer work, and there are the companies so entrenched in corporate doctrine that they believe those with several years of experience must get pay raises in scale with their initial, abnormally high salaries, so it is cheaper to fire or retire them and hire younger, cheaper workers (regardless of qualification).

    The reason for the complaining to Congress about a lack of qualified workers is to create an influx of younger workers so big business can make all the old people retire. All of the retired computer industry workers I have been exposed to are very intelligent and versatile geeks, yet they are living off of social security because no one will hire them.

    If business made an attempt at hiring and retaining qualified employees instead of cheap/young employees, there would be no labour shortage, nor would we have the widespread unemployment of the older generation. With the increased job security, wages would actually decrease, and thanks to a focus on skilled employees, company productivity and efficiency would increase.

    [I am a 23-year-old hacker and management in one package. Take it for what it is worth.]

  • by Anonymous Coward
    Let me point a few things out before you make even bigger fools of yourselves.

    First, just because someone has a different accent, even one that's hard to understand, does not mean he or she doesn't know English. India was ruled by England for hundreds of years, and there is a long tradition of English literacy, particularly among the middle and upper castes. English is practically the mother tongue for tens of millions of Indians, and many more are educated in English from their earliest years in school. Assuming that Indians are illiterate in English, in many cases, is akin to thinking the same of someone from Mississippi or Ireland or the Scottish Highlands.

    (NB I say this as one who grew up in Arizona and acquired a country-boy drawl that is sometimes mistaken for Texan. My Ivy League medical school classmates teased me for being a hick and a cowboy, but I got better grades than most of them.)

    Second, there is a cream-skimming effect that means the Indian who is after your job may be substantially brighter than you. In my field (medicine), we see a lot of applicants for American residencies from people who went to medical school in India. A doctor from India has to successfully compete with a billion other people for a medical school slot, then jump through an incredible number of hoops (including a difficult exam) in order to qualify for an American residency.

    They are rewarded for their efforts by being shunted into the worst residencies, caring for the gunshot victims and TB patients in the mangiest public charity hospitals, and then battling for jobs in lower-paying practices in undesirable locations. I suspect we'd lose half of our American graduates if they had to compete under the same circumstances.

    Now, sometimes in medicine we laugh at our Indian colleagues for their accents, and no doubt there are some incompetent ones, but in my experience an Indian who successfully surmounts all these obstacles is every bit as good as an average American doctor and in many cases better. (The best research physician I ever met was an Indian.) We are lucky to have them and we should not close our doors to the best and brightest who want to immigrate. I am sure the same is true for Indian computer programmers.

    Third, like most of Asia, Indian culture highly values education and hard work. American culture admires success, it is true, but doesn't seem to care how it is achieved. Some mediocre talents have lucked out by being at this time and place in history, in a line of work they chose so they could keep wearing the nose ring and filthy t-shirt. If not for computers they might be flipping burgers or parking cars. I have worked with some of these dolts and know more about how to get a computer to help me in my work than they do, by a long shot.

    The Indian in the next cubicle undoubtedly had a more rigorous grade school and high school education than you did. His college education was impaired by a lack of the resources one finds at an American junior college, to say nothing of Stanford and CMU. However, he is better at math and basic algorithms, and given the speed at which knowledge becomes useless, may adapt to new languages and technologies better than you will. And don't forget that he may have an entire village back in India depending on his earnings. That has a way of concentrating one's talents and energies.

    Businesses focus like a laser beam on the bottom line. They would not use Indian programmers if they were producing jerry-built systems that had to be rewritten from scratch by the almighty American programmer with the understandable Midwestern accent. This griping reminds me of Detroit in the 70's when the big auto makers were thrashed by cheap, reliable Japanese cars. Some of it is pure xenophobia and even racism. This is the dead ideology of the past and those who hold to it will end up on the ash heap of history. You have to work smarter and harder than the Indian or he will take your job, and he will deserve to have it. If you try to interfere with market forces and reasonable immigration policies, then the jobs will be shipped to Bangalore. In this circumstance it will not be just the American programmer who suffers for his incompetence and greed, but the suits and secretaries and janitors who depend on him too.

    Chris May
    Anonymous Coward of convenience
    ccmay@gateway.net

  • by Anonymous Coward
    This is amazing. Now the bad science and math skills of most American UG's (and it's true... I have been a TA for quite some time now) are blamed on the foreign TA's. What about the crappy American high school system, which (to throw out a random example) requires students to buy expensive graphical calculators to do basic math, while many students can't even use a calculator to, say, add one fraction to another? I know many Americans are frustrated that so many of the high-paying technical jobs have been taken by Indians. But don't forget that Indians are right at the center of the whole booming tech/net economy too. Check out a roster of, say, the top 100 new tech companies started last year and look up the list of founding employees. About forty percent of them were started by Indians (at least that was the statistic two years ago). Then go through their stock prices. This is what Indians have contributed to the general prosperity of the US in the last year alone.

    So stop this borderline racist/xenophobic talk right now and get down to improving your high school system so that it actually teaches something to the students (the non-Asian-origin students, that is... the Asian kids can extract the maximum out of even this crappy system).
  • by Anonymous Coward
    Rob, please show me one print magazine whose articles are riddled with spelling mistakes and grammatical errors. Slashdot is not a magazine, and you are not an editor. Which is not to say that there isn't some interesting reading on Slashdot. But personally, as an IT pro with a liberal arts background, I am embarrassed by the level of writing I see here. Why not clean it up?
  • by Anonymous Coward
    Argh, mind just went blank and erased what I was going to say.. sorta like yahoo mail likes to do... :) oh yes, I remember.

    First, the car thing. It's a kinda weird hierarchy. Modern American car is to 70's Buick as Modern European Car is to Modern American Car. A pimple on the bumper. :D After the oil crisis faded, the cars from the States ballooned again (slowly) whilst "we" decided to protect against possible future repeats by keeping our vehicle sizes about the same. And I think, somehow, that We The Europeans have the better cars for it... You, sir. Do you think that the Lincoln Town Car is really suited to the town? Or the highway for that matter? In simulations I've tried featuring that model, it's got 4.8 litres but can only make 73mph - what a fucking waste. And I mean that profanity most sincerely. Plus it looks like a box. Even Volvo and Fiat (long-time fans of the straight edge and right angle) have learned by now that boxiness sucks ass.

    Maybe such automobilic difference is another follow-the-sheep example in relation to the other things you were talking about! Now we've just got to work out who's the sheep and who's the mountain goat..

    [The preceding text unashamedly inspired by Quentin Wilson's full-page, four-column rant in "Top Gear" magazine]


    Well, now that I've filtered out the casual browsers and those who read every article religiously to the end, I'd like to say something about the IT worker stuff as well. Like some other poster mentioned (I think s/he was the one who originally mentioned the Sheep Effect as well), most of my careers advisors at school, and indeed my parents for a while, were actively encouraging me to get into computers and the associated businesses because "that's where the money is, that's where there's a shortage of workers, that's the thing you really enjoy". Too bad that many programmers I've heard about get a fairly average wage, and I really only tinker about with PCs because they break down so much and I'm the only person in the family who reads the manual... oops, digressing. I meant -- too bad that these guys first spotted the trend two or three years ago. I'm going to university now. It'll be three and a half years from now minimum, probably four and a half, before I'm out the other end and looking for a job, starting low of course. Working my way up would take a while as well. Do they think that there are going to be anywhere near as many jobs left in four or five years' time (that's seven to eight after the trend emerged)? By then a large number of the positions will be filled by A) people who noticed the trend at the same time as my parents/advisors but were actually preparing for uni at the time, and B) other people who went on a quick 6- or 12-month course and went straight for the jobs.

    I know the score, I'm not dumb... I'm aiming for a Biology or a broader-based Natural Science degree, as I enjoy the subject, don't know anything about the state of the market at the moment (hopefully a steady growth, especially in the pharmaceutical departments), and by then everyone will have flocked to I.T... (damn, forgot my important third point and had to stick weak point number four in instead) :D Anyone care to follow me? For I am the shepherd.... bwahahahahaa!

    Whoops, it's getting late. Better stop this once-weekly net session before I hallucinate too many words.

    Tahrey@Yahoo.com
  • by Anonymous Coward
    Computing and the Internet will eventually become a commodity. In the early part of this century electrical appliances were big technology. If you could design or build or fix that newfangled gee-whiz technology when it was new, you were maybe accorded a little more respect by some levels of society. And maybe a little more money than the rest of the drones.

    Someday somebody's gonna invent a router or wide-area switch that you just plug your little desktop appliance into, and it configures itself instantly with no intervention on the part of anybody. Whether it's FR, ATM, Ethernet or anything else, it will just sense it and configure itself accordingly. All the technology is built in at the factory. No need for hordes of wireheads and CCIEs to maintain the technology in the field. Appliances, just appliances.

    The information carried on media like television and radio is still important, but the underlying technology is trivial to most of us.

    Engineers at TV and radio stations don't make a hell of a lot of money and I bet their numbers decrease every year due to further advances in automation technology.

    Computer and network geeks are headed down the same path eventually, it's inevitable.

    Bret Aguilar baa@relaypoint.net

  • by Anonymous Coward
    I think you're right on the money.

    There are _way_ too many inexperienced & incapable developers in this industry, often pulled in due to the incredible demand for _any_ developers. There is also a lack of experienced engineering management, w/ marketing or product management people taking their place. Too many of us have to fight to get responsible engineering practices put into play in our workplaces. How many times have you had to explain to engineering management the need for version control systems (e.g. CVS), bug tracking systems, source code commenting, and proper requirements specifications?

    The funny part here is that Java, which has been an enabling force for software development (especially OOD), has also made it easier for those w/o any software engineering background to get jobs as developers. Too many of the above-mentioned battles take place because of Java-only, non-engineering types who don't know about their shortcomings.
  • by Anonymous Coward on Sunday October 10, 1999 @08:51AM (#1625168)

    Working in the ecommerce industry, focused on Java/Corba/Unix, I can say competent developers are indeed hard to find. You can't hire them, you can't find them and they do charge a high hourly rate.

    I charge a high hourly rate and I still get 3 contacts a week wanting me to fly to Kansas City, Seattle or Montreal to do Java/Corba development.

    The problem I find is that we high-paid consultants stir up some sort of jealousy in the corporate converted crew. These converts are COBOL/Mainframe or VB/VC++ guys 'retrained' in the ways of distributed computing. They still think procedurally and structure their OO applications as such. They have a hard time understanding multi-threaded issues and are still scared to move forward into new technologies (such as XML or using COS services like Naming, Trading or Properties).

    Or there is the young crowd with no direct experience developing large-scale systems, who make snap judgements and write a new service to handle some new functionality. Forget logical partitioning of the application or requirements. Forget real knowledge even; they can talk the talk, so they must be bright and know what they are talking about. Bullshit! They are inexperienced, so get their noses out of your asses, management!

    This, coupled with incompetent recruiters, makes things even more complicated for us competent consultants. I get a call: 'Hey, you know Java, right?' Me: Yes I do. Recruiter: 'Well, I have this great JavaScript position...' Me: click.

    It is generally thought that Indians, or other foreigners, are brighter than their American counterparts. Obviously this is flawed. Just because one is an Indian doesn't mean they can walk the walk. They are like the rest of us: some idiots, some extremely competent. The real difference is that I'll charge you for overtime, because it is the law that I get paid for what I work, while they, in general, don't. (I'm not singling out Indians in this case, and, yes, I have Indian friends - I also know a couple of idiot Indians.)

    Another point is that management enforces incompetence. The Its allow underperformers to continue underperforming. 'Hell, we are a big corporation, he/she is a nice person, let them ride the system.' Forget that they can't tell you what requirements are met by their code; they can't tell you a damn thing about their code except that it works (sometimes), and they sit on their asses soaking up the $.

    Once the corporation (pick one) gets a hold of me, I tend to work my ass off (60 hour weeks) and get more and more responsibilities because I'm the only one in the whole freaking group that can do it. Finally I get burnt out (generally 6 months) and terminate my contract.

    You better pay me what I'm worth or I just won't work for you. I don't give a damn if I make more than your CEO; I'm making you money, giving you what you want, not sitting on my ass surfing the net every 30 seconds, or talking in the hall, or in the cube next door. I'm producing. I am the critical path for all of your assignments because your people can't do the work. Your success depends on me. Pay me what I'm worth or I'll go to your competition.

  • Oops! I knew that. Sometimes when you're trying to type at the speed of thought you screw up. Thanks for noting the mistake. Correction made.

    We can discuss the exact excesses of French monarchs later; Lou the 16th may have been a great guy, but last I heard he didn't turn Versailles into public housing, or cut off funds to his horde of freeloading nobles, or cut taxes on the workers, or anything like that; he was just *nicer* about ripping off the peasants than his predecessors had been.

    Besides, as you all probably figured out, I was thinking about the Mel Brooks pastiche, not real French kings. ;-)

    - Robin "it's good to be the writer" Miller

  • 1. Hey! I did my research! I called people at other New Media and Tech companies and they told me their suits were now Its (pronounced like the contraction "it's" in my mind, but feel free to do it your own way) too.

    2. The "no current shortage of tech workers" thought was in the Linux Journal article. Hit the link. My personal feeling is that the tech industries are in the midst of a boom similar to one driven by construction, and those always end hard! "Those who don't study history are doomed to repeat it" & all that. But I'm no smarter than any other so-called pundit and I do not claim inspiration from [your favorite deity here]. I just toss out debating points, and try not to take myself too seriously.

    3. I don't care much whether or not the ASME decides online editors should be allowed to join. It doesn't affect me (or you) one way or the other. I just feel a little sad for *them* is all. There are plenty of fine associations for online journalists and editors where I'd be more at home anyway.

    As far as feeling ornery today, you're not alone. So am I - as you probably noticed. ;)

    - Robin "roblimo" Miller

  • I don't think things will come to a grinding halt; even during the "horrible" 1930s depression the U.S. economy only shrank by what? 30%?

    What ends "construction booms" isn't usually a stop to the economic activity that created them, but rather a flattening of the growth curve that turns previous extrapolations sour.

    Note that I do not claim to know *what* phenomenon, either economic, social or technological, will end the current computer & Internet boom, just that something or other will.

    All booms end. And shortly before they do, all the people doing well during them come up with many reasons why this boom is different from all previous ones, and will continue on, unlike any that have gone before. And the people who make the "this will go on forever" predictions are always wrong. Every time.

    Any number of unpredictable changes, from the sudden emergence of a new religion or attractive but dangerous social philosophy to a climatic disaster, could cause the computer and Internet industries to stop growing or even to contract.

    By definition, "unpredictable" events can't be predicted. The only prognostication you can sanely make about them is that there will always be one sooner or later.

    Or, don't get complacent. This universe is not a kind and gentle place. It has a strong tendency to deliver major upside-the-head smacks to anyone in it who displays too much hubris.

    - Robin

  • Having just finished my BSBA and planning to get an MBA once I get enough work experience to get into a good program, I recommend against it. Only a small percentage of MBAs manage techs. But a LOT of MBAs are providing jobs for sysadmins and other techs. In the company where I work (non-tech), we have a fairly large workforce of sysadmins who maintain the network of computers used by MBAs - get rid of those MBAs and the techs lose their jobs.
    Remember: not all MBAs are created equal ;-)
  • I'll agree with you, for almost the exact same situation. I mentioned big-O a couple of times and I think there are only 2 or 3 people out of 30 who know what I'm talking about. But you know what, most of the time you don't need to know these things.
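
    Since big-O came up: here's a quick toy sketch in Python (my own made-up example, nothing from the poster's shop) of why it occasionally matters whether a lookup is O(n) or O(1):

```python
# Toy demonstration of big-O in practice: membership tests on a
# list (O(n) linear scan) versus a set (O(1) average hash lookup).
# The sizes and names here are invented for illustration.
import timeit

n = 100_000
data_list = list(range(n))   # a list must be scanned element by element
data_set = set(data_list)    # a set hashes straight to the answer

# Repeatedly look for the worst-case element (the last one).
slow = timeit.timeit(lambda: (n - 1) in data_list, number=100)
fast = timeit.timeit(lambda: (n - 1) in data_set, number=100)

print(f"list scan:  {slow:.4f} s")
print(f"set lookup: {fast:.4f} s")
assert fast < slow  # the O(1) structure wins by orders of magnitude
```

    A component-slapper can go a long way without this, right up until a naive inner loop meets real-world data sizes.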

    What is one way that things are changing? VB, JavaBeans, and other RAD tools. I've talked to a lot of people and they aren't looking for many highly trained programmers; they want a lot of people who can slap some components together with Visual Basic or Visual Cafe. Sure, they need a few smart ones for architecture and a few other tasks, but they want to turn out a product quickly, not make the best product they can make.

    I think that attitude will bite some companies that want to be around for a long time. If their RAD tool disappears, that'll hurt. Maintenance may become a problem too. If the system architect gets hit by a bus, you're screwed.

    Anyway there will probably always be a place for the good programmers and sysadmins, even if the shortage goes away. I don't foresee the shortage ending really soon now, but I also don't expect it to last forever. I know my old college CS department doubled in size with one freshman class when I was a senior. That was only a couple years ago.

    --

  • A similar thing happened in the 80s when "Desktop Publishing" came into being. Suddenly, with a Mac Plus and a copy of Pagemaker, anyone could be a typographer and layout designer. Or could they?

    Once the dust settled and the "desktop publishing" industry matured, those of us who were duffers learned that documents with 75 different fonts really didn't look all that good, and companies whose output really mattered (magazines, advertisers, etc.) still employed real layout designers.

    I expect the same thing to happen with the web. Sure, just about anybody can make a web page these days. How many people are making really good ones?
  • An anonymous coward said:

    The "Internet Economy" go down the toilet? It /is/ possible, but frankly, your rationale is like someone in the 30's saying that if the economy slowed down, they'd be able to get rid of the new fangled "telephone" they'd just got in their office - return to the good old days, etc.

    How many people do you think are making big bucks to make your telephone work? Probably not very many. How many jobs did it take in the 30s to make telephones work? How about today? Shucks, I'm old enough to remember when the phone was owned by the phone company, and there were repair men who came to your house to fix it. Today, if the phone breaks, throw it away and go buy another one for $20 at Wal-Mart.

    GrenDel Fuego (gboyce@herot.rakis.net.boing!) said:

    He's not saying computer programmers won't be needed. He's saying there would be a huge jump in the number of programmers, which would drop the value of the individuals. Basic supply and demand.

    Actually, he did say that the time may come when the demand goes away. Rob's words:

    Something - it could be genetic algorithms or some other new, less labor-intensive programming methodology or it could be an overall economic downturn that ripples through the high-tech industries and brings Internet growth to a halt ... will throw a lot of high-tech workers out in the street.

    It's still supply and demand, but I agree with Rob that the demand is more likely to change (and will have a bigger impact than the supply when it does).

    I've been there, done that. I was an engineer in the 80s, and I'm a sysadmin in the 90s. I have no idea what I'll be in the 00s.
  • Rob put out a good article for us to consider. He didn't say that he *knew* what the next big thing was, just that his instincts told him there would be one. We'll see if he's right. Genetic algorithms aren't widely talked about or understood, which makes them a good candidate for someone to think they might be The Next Big Thing. Sort of like AI was. (I'm still waiting for Cyc to be finished... :)

    And yes, sometimes terms evolve. But sometimes people want to use more relevant terms. And I thought 'Its' was pretty accurate and clever, and not a bad introduction, either.

    Also, I think Rob was pretty correct about the engineers, too. My girlfriend wants to be one, and it'll probably be harder for her to find work than it will be for me (as a programmer), so I'd be willing to get a job near wherever she can. If this ever changes, I'm sure we can take care of ourselves, but it's a good thing to consider. Putting aside some money for a rainy day isn't a bad practice.

    The problem with Jon Katz was he never even figured out how to talk to most of the population. Even his choice of diction was generally inappropriate for this forum. Now, if he wanted to write for Wired... :)

    And although we *have* picky specialists here at slashdot, it's really for the computer enthusiast, like Byte was originally, or more generally anyone interested in "News For Nerds". It'd be nice having a specialist answering questions, or chiming in on "Ask Slashdot", but for something like this, I'd rather have a journalist speculate on a social phenomenon, or a historical trend. Also, I don't trust economists, because I know enough about statistics not to trust predictions of the future. Unless you find an important trend that people have apparently overlooked, you probably won't get any more insight, either.
  • DOW Dress-Oblivious Workers. Us.

    Remember the proverb: The DOW that can be ironed is not the DOW.
  • Absolute baloney. I'm a sysadmin: I couldn't code my way out of a paper bag, and I only know scripting languages. (Shell script, apple script, perl...you get the gist.)

    I am surrounded on all sides by genius programmers who eat, drink, sleep and breathe C++. They make jokes entirely in algorithms. They also wouldn't know a logical volume manager from a poisonous snake.

    True story: an entire company full of programmers, young and hungry GenX'ers, old and savvy baby boomers... twenty experienced and smart people. And they could not, for the life of them, figure out how to get a Sun workstation, with the latest version of Solaris, running NIS. They spent the better part of a month arguing with Sun, trying to understand what the people on the newsgroups told them, and they eventually just gave it up as a bad job.

    My first day, I walk into the computer room, and walk out fifteen minutes later with a properly configured workstation. I even added a spare SCSI disk they had lying around to it.

    I can only conclude that there are two orders of geeks: those who make the toys, and those who get to play with them. One cannot exist without the other, and neither has any clue how the other side works its black magick.

    SoupIsGood Food
  • Ahhh... but therein lies the challenge. Sun offers a -lot- of documentation, almost all of it poorly indexed, impossible to read, and chock-full-o-errors, on a CD. You could, in theory, hack away at all of the necessary files in /etc based on what you read on the documentation CD, or in O'Reilly's book on NIS/NFS, or the TCP/IP book... or the USAH book... or perhaps the Armadillo book. Who knows? You might get lucky and cover all of the bases you need to. They weren't lucky.

    I wander in, run /usr/sbin/sys-unconfig, et voila, five minutes and another reboot later, and everything is covered.

    Now! Where is sys-unconfig (a Solaris-only tool) covered in the O'Reilly manuals on network configuration? It isn't. It's in the Sun documentation, but good luck finding it. I picked up that trick from someone who knew more than I did at the time, and I traded him what I knew about resetting TPT-2 connections.
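
    For the curious, the whole trick boils down to something like this (the domain name is invented, and this is a from-memory sketch, not a transcript of what I typed):

```shell
# Solaris NIS-client setup via sys-unconfig -- a rough outline.
# WARNING: sys-unconfig wipes the machine's network identity and
# reboots it, so never run this on a box anyone is depending on.

/usr/sbin/sys-unconfig        # confirm the prompt; the box reboots

# On the way back up, the install scripts ask for hostname, IP
# address, and name service.  Pick NIS, give it the domain (say,
# "eng.example.com"), and the tool writes /etc/defaultdomain and
# a sane /etc/nsswitch.conf for you -- no hand-hacking required.

# Afterwards, sanity-check the binding:
domainname                    # should print the NIS domain
ypwhich                       # should print the bound NIS server
ypcat passwd | head -3        # proves the maps are actually served
```

    Five minutes of reboot beats a month of arguing with the documentation CD.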

    On an AIX system, NIS isn't even installed as part of the base operating system. You have to install it by hand, or know that installing the "client" or "server" software bundles -after- you slap on the BOS will get you where you need to go.

    Where is this documented? Deep, deep within some forgotten file on that two-CD set IBM ships with AIX, I guess. Damned if I could find it on their on-line libraries.

    So how did I figure it out? Experience! As a sysadmin, my mind travels the well-worn paths laid down by vendors the world over. They -all- have dirty little tricks and gotchas, and just as a good programmer knows when to use a b-tree hierarchy, I know that I probably needed to install supplemental software. It's detective work, and you have to -know- how this stuff all fits together, because it's -not- all in the documentation. O'Reilly's Armadillo book does -not- cover how to set up an LVM so it actually works, and IBM's documentation is worse, and its man pages on the topic are incomprehensible. But they all have clues I can use, and I know how to fill in the blanks by now.

    I'm a sysadmin: It's what I -do-.

    SoupIsGood Food
  • We had some doubts there, but at least this suckup piece reaffirms Slashdot's ownership by Andover. See, everyone back east wears a suit and short-circuits when productivity takes over from formality. God forbid the grey and blue of formalwear get replaced by the green and yellow of power computing. There are too many companies who aren't Andover to do away with the word "suit." Instead of renaming them "Its," let's create a new category for just Andover.

    Yes, in 1995 anyone could be a programmer. Today your BS must be in an engineering field to even get looked at. The best coders at today's database firms are answering phones by day and hacking the software of tomorrow by night. The surplus of genius isn't in the future. It's right now.
  • Whatever. Maybe there's a shortage of suits because it's so damn hot, but you'd have better luck getting oil out of a waterspout than a programming job in Fl*rida. No entrepreneurs. No venture capital. Just a lot of unemployment.
  • I found your article quite interesting... even if Katz missed the point again ;)
    However I'd like to add some comments:

    Suits, Its, schmits, who cares?
    Are you trying to coin a new term? Not worth it, really. Suits are far from extinction. Good for you and Andover that you have Its. But Andover and your friends aren't the whole of Corporate America, are they?
    While you can see some dress codes more relaxed than a year ago, some companies (both clients and service providers) still ask even developers to keep to a dress code.
    It's not that I agree with that: I feel that any company that can ask you to wear 'Armani-like' dress can also ask you to wear 'Mao-like' dress anytime, and I find it outrageous... however, that's still the way it works.

    Using "Its" will also be confusing alongside a lot of more meaningful words and acronyms... nobody will use it. I encourage you to try a different word, original, non-acronym, meaning 'suit escapees' or something you like.

    On IT shortage:
    We live in a closed system called Earth, like it or not. As far as I know, we still can't hire Martian developers. Yet.
    However, like most people, you look at this in an American-centric way, being as guilty as ASME of overlooking a World Wide Web... or a worldwide economy.
    Maybe the IT shortage will end in the *United States*; however, it has triggered a boom in foreign IT salaries trying to compete with American offers. That's good, of course, because it encourages a world market and somehow levels out the humongous salary differences around the world.

    When I tried to recruit some friends (yes, we have a 'hire your friend' bonus), some of them accepted, but a lot of them also got significant pay raises at home. Having half of the world's developers in the US doesn't mean that corporations around the world don't have to compete, so don't miss the point. Global Economy is the word.

    So the guys in the cavern will also have to account for the world economy's impact... maybe you got cheaper IT people, but pay increases abroad made your cup of coffee slightly more expensive, because Colombian companies had to pay a little more to their developers to keep them at home. I certainly hope the Wharton School of Business takes that into account, since you seem to have forgotten it.
    And, please: ideas and software development aren't oil. You can't compare the two concepts (a natural resource and a human resource) in such an easy way.
    Sooner or later there won't be any oil. And if at some point in time we humans are out of ideas, then it will be time to call a Vogon fleet to clean this place.

    And last, but not least, has anyone called your attention to being rude in your e-mail? The Jesux case didn't make you look good at all, and your last paragraph also made me doubt... maybe it's time to go back to engineering or refine your style a little.

    I hope this won't be moderated to flamebait... I'm really trying to call your attention to some points and keep /. content quality high.

  • Don't worry. We have long been developing safeguards that will ensure the security of our respective jobs. I'm thinking of inventions like C, C++ and Perl. ;)

    Jokes aside, here are my two cents on the ``IT people shortage''. The problem is the availability of useful statistics that can be used to understand the problem, if it exists.

    There is much diversity in jobs related to computing. You could be creating a web page, debugging an embedded OS, fixing some COBOL code on a mainframe, or gluing some SQL queries to a graphical interface and in all cases be called an IT worker. To lump these kinds of people into one category and declare a shortage of them is a big mistake.

    What if the alleged shortage exists because so many small to mid-size businesses need eight-dollar-an-hour VB monkeys and HTML jockeys? It could be that the shortage can be attributed to the dramatic surge in the use of inexpensive PCs. These platforms come with an inadequate OS and inappropriate applications that require endless tweaking, tailoring, and propping up with wooden sticks. Sure, you can get the core middleware from a vendor, but making it all work for the business requires labor. Thousands of little companies are reinventing the same thing in-house. Also, the dramatic rise of the Internet has created opportunities, but many of these are menial work that is looked down upon by a real software developer---for example, the mindless work of cranking out HTML and web-related scripts in order to create cheesy web sites. From what I hear, these jobs are dull, often stressful, and underpaid. People who know how to do them don't want to do them; hence the shortage.

    The question is, how relevant is a labor shortage in a particular area of computing to someone who has no interest in that area?

    The way I see it, the jobs at greatest risk are those that involve a very narrowly defined set of skills. For example, if all you know is how to customize some particular proprietary software package, your job will disappear along with that software package. One year there may be a shortage of people whose resumes bear acronyms relevant to that package; the next year, those who don't retrain are gone. But the change that brought that about was ultimately due to changes brought about by real developers.
  • It was not my intention to knock RIT in general; I have several friends there majoring in Computer Engineering and Computer Science. I am just saying that the IT program there is characteristic of IT programs elsewhere, and there are a lot of people with these degrees who just learn NT and TCP/IP and not Unix or much of the real meat behind computing and technology. You have to hand it to Microsoft: teach the kids NT and they will stick with it even if it is buggy and unstable in the workplace.
  • I live in Buffalo, NY and I am a freshman at SUNY Buffalo majoring in Computer Engineering. Last year I was horrified when I visited what is supposed to be one of the best technology schools in the region, RIT (Rochester Institute of Technology), and saw them teaching a course in IT. They were just teaching them Oracle and NT. Five years at a top-notch college to be an NT drone. I see people with less knowledge than I, who have such degrees, making top-notch salaries as system administrators. I think that you are correct: a shortage will come to an end, and it will be the shortage of such people. Those who survive will be those with Unix and Linux expertise, who will be able to make the transition from NT to Unix when Windows 2000 isn't what everybody thinks it will be. I think that Unix people, real programmers who know their stuff, web designers, and engineers taking computers and finding new and provocative ways of using them to make the work of industry and commerce easier will still be in short supply as long as colleges keep churning out glorified MCSEs. Then again, that is just my opinion and I could be wrong.
  • First of all: Most of the people I know -- and I'm a junior in an excellent engineering-oriented university -- really don't know what they want to do. So when you're pressured to declare a major at the end of sophomore year, and you have no clue, and everyone tells you that you need to make money -- engineering seems like a surer bet than most anything else. Other safety-net fields are prelaw and premed.

    My point exactly. You don't know what you want to do so you'll spend (I'm guessing here) close to $10,000.00 a year to find out? I'm not directly attacking you, but rather the mentality that has become ingrained into youth these days: Go to university or you'll never be successful.

    I've convinced myself to slog through a major in Managerial Studies so that I have some sort of marketable job skills after I graduate.

    Managerial studies can still be applied in the literary field, can it not? Perhaps it is not 'pure' but you could take that and hone it closer towards what you'd call a wonderful career? Same with business studies on the whole... I don't believe anyone goes into any kind of business field without some other kind of passion to apply those newfound business skills to.

    I may not like it as well as something purely literature-oriented, but when it means the difference between 20K a year and 50K a year, preference becomes pretty irrelevant -- at least, it does when you're my age (20).

    Are you expecting the $50k/yr as soon as you graduate? Unless you really know what you're doing, I can't see anyone getting that, at least not for more than a single year before being 'found out'. I started at around $20k but have reached $50k in 3 years by doing what comes naturally. And for some of the knowledge I possess, I'm still underpaid but the atmosphere where I work is worth the monetary 'hit'.

    I thought when I was a kid that I could have my dream job and be well compensated for it. But unfortunately, society does not believe that contemporary art and literature are worth money.

    At least not while the author/artist is living. :-)

    It sucks, but maturity, I have found, involves accepting that life sucks, dreams usually don't come true, and The Man will win in the end.

    Hmmm... perhaps I'm not exactly mature then... I do have a kickass career doing what I love and I still do believe that the world is what you make of it. And I did it (actually still am doing it) without a university education.

    It's sad, however, to think that accepting that the struggle is futile is how our youth are living these days. Then again, I have been accused of being stubborn, pig-headed and insensitive in the past. :-)

    ... and I keep saying "today's youth"... I'm 23 but I feel so much older. :-)
  • by tzanger ( 1575 ) on Sunday October 10, 1999 @08:22AM (#1625187) Homepage
    Great first feature. It's exactly the kind of thinking that you point out that gets people in trouble, and exactly why I never listened to the guidance counsellors in school.

    "Get thee into computers! Programming! We nEeD you there!" they practically screamed at me.

    I headed into electronics instead. Not just digital electronics, which is what they're screaming everyone wants, but rather the hard stuff. Analog. RF. Magnetics. I still program, but it's certainly not applications programming. A little web stuff here and there to make things available, but that's it. Embedded systems design is quite different from app design. Bugs (hardware especially) are a lot costlier, and as such people aren't as ready to jump into the field unless they really know what they're doing. Why worry about doing as perfect a job as you can in the time allotted when you can do as good as necessary and release a patch later on?

    Actually, I take that back... it's not because they could do a better job, but rather because they probably can't. You need to have a knack, a passion, for programming rather than just being trained in it to do a good job, and most programmers, IT professionals, web designers, etc. just don't have that 'edge'.

    So back to what causes this. What happens when there's a shortage of position 'x'? All the sheep run for it, and then by the time they've got their training there's a surplus, and all those skills they spent years in school for (and for most, gathering tidy loans for) are now not so much in demand as they once were. The bills are coming in and the high-paying job isn't there, so what do the sheep do? Listen to the guidance counsellors again, and when they say there's a shortage of position 'y', blindly repeat the cycle.

    Education isn't bad. I'm not saying that. But most kids these days are taking the words of a fortune teller and basing their future on it instead of finding out for themselves and/or pursuing what would be right for them. Worse than that, the fortune teller has an incentive to get as many students into university as possible to make the school's stats look better. Who cares if they don't need to go. College is thought of as "for people too dumb to get into university" in Canada, which is sad. Get in, get skilled, get out. Get a job and make some money, so that if you want to really get into the field, you can then go to university and learn all you possibly can, if you have to go to university at all.

    Sorry about that, but whenever I think of school that's all that comes to mind. People going into computers and IT and web design because everyone is telling them to. They aren't particularly skilled or have a knack for it, but by God they're gonna make some serious cake doing it. Until they find out everyone was thinking the same thing and they've surplussed themselves and the fancy cars and big houses don't have anything supporting them anymore.

  • It only takes about 15 seconds to tell someone: "Side with the label goes on top, metal part goes in first." THERE. That wasn't so hard, now was it? Now he/she knows.

    Nope. This is the wrong way to explain it, because they might come across a PC with the fdd on its side. "Angled corner to the eject button" is the correct way :-)

    dylan


    --

  • Heck no. Aside from the fact that IT workers tend to be raging individualists, unions work against the very principles that make IT jobs fun in the first place.

    -Mars
  • He's not saying computer programmers won't be needed. He's saying there would be a huge jump in the number of programmers, which would drop the value of the individuals. Basic supply and demand.
  • Considering what programming is - describing how to do things - it's not plausible that an automatic way of doing it will ever be developed, at least not for anything interesting. If it can be done automatically, it's not really programming.

    To put it another way, programming is inherently less automatable than the tasks being programmed. As long as there are activities that can't be done automatically, there will be a superset of programs to do those things that can't be written automatically.
  • I wondered about you when you came aboard /.

    But the History of the World, Part I references, plus the extremely cool session at AWE, have convinced me you're a cool guy ;)
  • by aheitner ( 3273 ) on Sunday October 10, 1999 @07:12AM (#1625193)
    But that was Louis XVI.

    Louis XIV was the Sun King. He was an absolute monarch who finally managed to get the nobles under control (which Louis XIII, of Musketeer and Cardinal Richelieu fame, had never managed to do).

    Louis XV was an even more extreme successor. His famous quote was, "After me, the deluge", and he lived like he believed it.

    Louis XVI was a genuinely nice guy who was not totally unamenable to reforms towards a more constitutional monarchy. But the rabble cut his head off anyway. Honestly, I'm a fluent French speaker, and I don't understand the French either :)
  • To be a top notch software developer, you need two things: motivation and talent. Motivation is a function of the individual, but there really isn't much you can do to increase your level of talent - it's much like playing the violin, you either have the ear for it or you do not.

    Now consider the fact that due to the increased salaries many people who otherwise wouldn't have chosen computer science are going into that field. For the sake of argument, let's call those people 'opportunists'. Now there are two possibilities:

    • The opportunists, in general, do in fact possess the talent required to be good programmers

    • The opportunists, in general, do not in fact possess the talent required to be good programmers

    One way to analyze this problem is to consider what these opportunists would have done if computing salaries weren't sufficient to draw them into that field. Let us suppose that the opportunists would largely have chosen engineering, math, physics, etc. if not for the increased salaries. I know quite a few outstanding programmers who came from those fields, so if this is indeed the case we might well be on our way to seeing an end to the shortage of developers. On the other hand, suppose the opportunists would have largely chosen management, finance, etc. if not for the increased salaries. While I admit the evidence is largely anecdotal, I personally don't know anyone from those backgrounds that I would consider a top notch developer, and I know a lot of developers. Certainly, I believe we would all agree that in general, a person with a math/science background is more likely to be a great programmer than a person with a business background (though of course, there are always exceptions to the general rule).

    So the question is, which set of people is more likely to answer "money" when asked: "What is your primary motivation when choosing a profession?"

    • The math/science folks

    • The business folks

    I believe it is far more likely that the business folks would list money as their primary motivation. What people often do not realize is how rare the talent for programming is, in general. Throwing more bodies at the problem (particularly bodies that come from a group not known for its history of producing quality programmers) won't change that.

    The only thing that could ultimately change the situation is economics - if the demand for new software falls, the shortage of skilled developers would diminish. I think this would only occur via an economic downturn, as computers are an integral part of our lives and businesses, and that integration will only increase.

    Anyway, that's my take - flame away.

  • by Trick ( 3648 )
    1. Where I work (and in my position before my current one), you see a lot of suits actually wearing suits. They're not gone yet, by a long shot. I suspect Andover's not your typical tech company.

    2. Was the reason there's not really a shortage of tech workers in that article somewhere? If there was, I missed it.

    3. If they want to be a journal for print magazine editors, that's their business. I've got my doubts that hurling insults at them is going to change their minds.

    Guess I'm just feeling ornery today.


    ---
    Consult, v. t. To seek another's approval of a course already decided on.
  • The software industry is different. S/w engineers never write s/w without bugs. Try to find out how many people today are still fixing bugs in, and enhancing, software from the 1970s and '80s. You will be surprised. The problem with this industry is that there are no good, proven quality standards being followed. More than 50% of jobs today are maintenance-related.

    If you want to survive in this industry you have to be on the bleeding edge of technology. Good engineers just cannot afford to do a 9-5 job and go home; they have to do extra reading to keep up with the new technology. People who don't keep up with technology are the ones who will be left behind. I usually classify technology into two branches:

    - Hard. This one takes a lot of time to learn, and you've got to have a solid grasp of concepts to be working in this area. Examples: writing protocol software, OSes, etc.

    - Soft. This one is easy to learn. You don't really need a BS degree; a six-month course is all you need. This technology can get outdated easily... but it is also easy to learn the similar new soft tech existing at that point in time. Examples: HTML, Perl, etc.

    I haven't got time to organize my thoughts... but a last thought is: if we really predict that the world is going to see major technological advances next century - and I am sure there will be many related to new Internet technologies (access, interfaces) - I don't really see a point where the glut of s/w engineering work will end.

    CP

  • On the whole "Its" thing, it's not going to work because the meanings for its letter triple, i-t-s, are way too confusing already.
    I say stick with the old term. As far as I'm concerned, they're still suits even if they don't wear 'em. Just like main memory is still "core" even though it's no longer made up of little donuts.
  • But last I looked, OPEC was just about dead and oil was selling in the $10 - $20 per
    barrel range. (...) Oil got too expensive, so we (the oil-importing countries) simply stopped using so much of it.



    The end of the oil crisis had nothing to do with us using less oil. Oil/energy consumption continues to grow. High oil prices made it worthwhile for oil-importing countries to start offshore oil production and buy oil from more diverse (i.e. not all Middle Eastern) sources, which reduced the control OPEC countries had over the oil price and caused prices to drop.


    Although I don't have any evidence for this, I am convinced that the death (or hibernation?) of OPEC is Saudi Arabia's secret payment to the US for fixing the Saddam Hussein problem.

  • If every group of humanoids with a nickname for another group of humanoids were infantile, we would need a lot more diapers and bottles.

    Who are you, Anonymous Coward, that you had not realized they were being called suits for quite some time? Why would you be so hypocritical as to play the name-game on those who you say play the name-game?

    Strange world, Stranger people.

    --
    Marques Johansson
    displague@linuxfan.com
  • You did a fine job on your feature, Roblimo.

    I thought your piece was very thought-provoking and "to the point", which is difficult to find in today's modern media. While I don't necessarily agree with you that the current tech shortage is a fallacy, I do agree with you on some of the other points you make.

    I also found your discussion of a number of topics particularly interesting. I am all about getting to the meat of the matter and cutting out as much verbiage as possible. It was refreshing to see so much content in so little space.

    For what it's worth, I think Slashdot (and many other fine online publications out there (although Wired is certainly questionable these days)) has every right to be called a magazine. Good luck on your attempts to gain the respect and peer acceptance Slashdot rightfully deserves.

    Please grace us with another feature soon!

    - Froggy
  • There's a major, major factor that you're missing. Until the early/mid eighties, there were very few tools for engineers to use. Everything was done on paper, the only real 'tools' were calculators, lookup tables, and (in the real early days) slide-rules. Because of this, companies employed *scores* of engineers. My dad has told me stories from one of the engineering departments he worked in. There was row after row of engineers with their desks/drafting boards. My dad worked in a group that designed in-house test equipment... that group alone had around 20 engineers. Contrast that with the situation today: he has one other engineer working for him (admittedly, they're horribly short-staffed). I design engine computers for one of the Big Three (I'm actually employed by Motorola), and we have about 5 engineers on two different product teams.

    Ironically, what changed much of that is what (indirectly) created the current IT shortage: the computer. Instead of manually changing a paper schematic, it's now done electronically. Design tools are much better, you can simulate your design without building it, etc., etc.

    Another major factor in the reduction of engineering force is what today's technology is capable of. My dad designed a 6502 based embedded system in the mid 80s... he put over a man year into it. I've designed an engine computer that has approximately the same functionality (though, a lot more horsepower obviously) using a PowerPC derivative, and I've got maybe 2 man months into the actual design (if even that much). Most of this is because most of what I need is already in the micro I'm using... my dad had to use discrete circuits all over the place.

    So, how does this tie into IT and programming?

    Ask yourself this: how good are the tools you're using? Are they really all that good, compared to what they potentially could be? Of course it's difficult to answer such a forward-looking question, but I think the tools will continue to get better... *much* better. Technology will probably keep improving too. Five years ago, the idea of doing a time-critical embedded system in C was revolutionary, if not unheard of. But this is exactly what our customer will be using on this PowerPC engine computer. Of course, there are countless other examples.

    BTW, I find it interesting that after the 'layoff culture' of the 80s that it's now very difficult to find engineers. It will be interesting to see how any IT trends affect engineering trends.
  • Comment removed based on user account deletion
  • Comment removed based on user account deletion
  • See this article in National Review [nationalreview.com] for the non-WSJ right-wing point of view, from clear back in June '98. There is a lot of overlap with Bryan Pfaffenberger's article, with the exception of the professional organization nonsense.

    Re unemployed hardware engineers: this happened mostly because the Defense Department procurement budget was gutted by the Bush/Clinton administrations and the pre-11/94 Democrat congress. So far the post-11/94 Republican congress has failed to correct the damage, courtesy of Clinton's veto pen. Decimate the biggest market for those engineers and yes, salaries and employability go all to hell. Throw in the technophobic attitudes towards nuclear power for good measure. (Had an interesting conversation with a disgruntled nuke-tech a while back...)

    Re don't spend the future: ditto. I'm keeping my debt level in check, even though the U.S. federal tax code is rigged to encourage massive mortgage debt (best tax deduction on the books). All these folks with their heavily mortgaged McMansions and nice debt-fueled stock portfolios are going to look real stupid when/if the market tanks and their debt level doesn't. (This is what Greenspan is really worried about when he talks about "irrational exuberance", but he hasn't found the right words. Nuking the deductions in favor of a dramatically lower tax rate, as the Flat Tax proposed by Steve Forbes [forbes2000.com] would do, would correct this serious economic instability.) I'm not saying don't have a little fun, and certainly not saying don't buy stocks, just watch that debt!

  • At the moment I guess it depends where you are. Here in the Midwest it's almost impossible to even get your resume looked at with less than five years' experience.

    You ever tried to get five years experience as a System Administrator when everyone wants five years system administration experience before they will even hire you?

    Maybe you can't see the end yet but around here it is fairly obvious.
  • The story was Player Piano by Kurt Vonnegut. Probably many more with the same theme, but that was Vonnegut's breakthrough novel. It's also the one that really got him labeled as a SF writer because it was rather ahead of the times in 1952 or so. But as many in Detroit could tell you in the late '70's or '80's, it was fairly accurate.

  • Well, I attend school at RIT. I'm in the Computer Engineering program, but I think I can speak to a possible reason for what you saw.

    About five years ago RIT started their IT program. Before that it did not even exist. Enrollment for that program was something like 50 students, I think. In two years' time (when I first started attending) enrollment was up to on the order of 150 freshmen for 1996, I think. The next year the incoming freshman class doubled. This forced the school to shuffle the IT classrooms around and to build an entirely new building to house the next year's incoming IT class.

    The point here is that the IT department was started with 50 students and a few professors as an experiment of sorts. The industry and demand for IT grads then exploded (very few actually saw this coming, especially those in control of programs at universities, etc.), leaving RIT and, I suspect, many other schools scrambling to put together course plans, hire professors, find rooms for classes to be taught in, etc. Even last year and now, the department heads are still trying to come up with a solid IT curriculum. And to be fair, IT is a very agile and quick-moving target. It has been for the past few years and it probably will continue to be for another few. Creating a solid and industry-applicable curriculum is a hard thing to do, especially in these circumstances.

    As an aside, RIT excels in their CE, EE, SE, and CS programs. Their imaging sciences related programs are top notch as well. Don't knock the entire school simply because it has a young and developing IT program of study.
  • by Uche ( 6766 ) on Sunday October 10, 1999 @10:20AM (#1625208) Homepage
    But I found almost nothing with which I could agree in either the "Suits" or the "Labor Shortage" article.

    First of all, there is nothing any less "precise" in the term "suits" than in the word "tawdry". Not too many people buy their baubles at fetes of St. Audrey any more, but the word remains, partly because aberrations like this give human language its vibrancy. In several contexts "suits" has ceased to have any relevance to the dress of the referent. This is a good thing, even if you're not an etymologist. The phenomenon goes by several names, but in honor of the educational board of Kansas, let's call it "evolved terminology". When we try to stop the natural flow of language, we come up with abominations such as Roblimo's "Its". I wonder whether Roblimo would junk the term "Its" if the referents stopped wearing T-shirts.

    On to more serious matters: it is pretty odd to paint engineers as a bunch of whimpering, over-privileged prima donnas (look, another evolved term!), especially when you're putting MBAs on the other side of the fence. The immense bonuses and stock options that are thrust at even mediocre businessmen in this economy are far more of a distortion than the perks of engineers. But, at least if you take the viewpoint of that august journal _The Economist_, both trends will continue as long as America continues along its asset-price inflation. When this bubble bursts, it will affect everyone: some over-privileged engineers and managers alike will find themselves pumping gas, and many unfortunate members of society will find themselves yanked under the poverty line.

    This is pretty basic stuff, and it has nothing to do with the real or supposed shortage of engineers as much as it does with the distortions of the present economy: a lens that affects workers at all levels.

    Of course, The Economist might be wrong and there might be a true miracle in productivity afoot. But in this case, Roblimo would _still_ be wrong (and more so because of his comparison to the mid-century recessions).

    And as for the very idea of genetic algorithms replacing programmers: please park this futuristic nonsense. GAs are simply a way of harnessing the implicit parallelism inherent in certain problem spaces. They are not some form of dark voodoo. There is nothing in them that will make them sudden arbitrary problem-solvers.

    I really don't like being so harsh, but it's the same issue that came up when Jon Katz started writing features. On a forum such as /., with so many picky specialists, it's probably better to encourage features from specialists rather than journalists. I don't consider /. to be a mainstream medium. Yes, I know it might be a bit hypocritical of me to say so when I haven't submitted a feature myself. I might just find the time to correct this.

    --
    Uche
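
    Uche's point about GAs can be made concrete: a genetic algorithm is nothing but a loop of selection, crossover, and mutation over a population, so it only ever searches the problem space you encode for it. Here is a minimal sketch in Python against a toy "OneMax" problem (maximize the count of 1-bits in a bit string); all names, parameters, and the fitness function are illustrative, not from anyone's actual code.

```python
import random

random.seed(0)  # fixed seed so the example run is repeatable

BITS = 20      # toy "OneMax" problem: maximize the number of 1-bits
POP_SIZE = 30

def fitness(genome):
    return sum(genome)

def mutate(genome, rate=0.05):
    # flip each bit with small probability
    return [bit ^ (random.random() < rate) for bit in genome]

def crossover(a, b):
    # single-point crossover between two parents
    cut = random.randrange(1, BITS)
    return a[:cut] + b[cut:]

pop = [[random.randint(0, 1) for _ in range(BITS)] for _ in range(POP_SIZE)]
for generation in range(100):
    pop.sort(key=fitness, reverse=True)
    if fitness(pop[0]) == BITS:
        break
    parents = pop[:10]  # truncation selection: breed only the fittest third
    pop = [mutate(crossover(random.choice(parents), random.choice(parents)))
           for _ in range(POP_SIZE)]

best = max(pop, key=fitness)
print(fitness(best))  # climbs toward 20: guided search, not magic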

  • It's important to distinguish between supervision and maintenance, on the one hand, and research and development, on the other.

    Sure, system administration requires a high level of competency, but it is something which many people can learn to do with training and adequate experience. Let's be honest, this is glorified janitorial work most of the time, something that needs to be done, but doesn't require a whole lot of genius. There are exceptions, of course, but generally it requires only the understanding and use of tools which have been provided.

    On the other hand, developing a new product or tool requires significantly more skill. And it is a well-known fact that top programmers are at least ten times as productive as the rest. This comes down to natural capability, something which cannot be taught or even learned through experience; you either have it or you don't.

    For those in the first category, high salaries will not last forever, as the market will certainly supply an ever-increasing number of people willing to learn the skills needed to board the gravy train. It's actually somewhat astonishing how little clue most people today have about administering even their own desktop. I think more and more people are becoming clueful, but this is counterposed with the rapid influx of new users who have even less of a clue than the prior set. This trend is likely to continue for a while, so there is some job security in the medium term, but eventually there will be some equilibrium and salaries will trend downward.

    On the other hand, those who have skills which are innate, whose abilities cannot be reproduced by formal methods, will continue to remain highly prized and well compensated forever.
  • I am in complete agreement that the current booming market for people to develop new applications won't last forever. At some point IT will cease being a place where a skilled person has a non-zero chance of becoming a millionaire. There is one thing, however, that will keep the more skilled programmers paying the rent long after the businesses have their e-commerce in place. As poorly as this bodes for what I'll be doing with the rest of my life, I'm guessing that even after the gold rush is over we'll all be doing bug-fixes and maintenance for the rest of our careers.

    People will become dependent upon the applications we're writing now. Unfortunately, for most applications there is a certain development life-cycle, at the end of which is a decline in quality. The original programmers are gone, the new people doing the bug-fixes don't have as complete a grasp of the big picture as the original programmers did, new bug-fixes are not quite as consistent with the original design as they could be, code quality degenerates, maintainability degrades rapidly, and a new code base is required.

    The only way I'm going to be humping a gas-station job 25 years from now is if somewhere along the line people figure out how to write software that becomes perfect and then stays so. I don't see it happening. Don't get me wrong... I'm investing and saving my money, as there's never any guarantee that I won't need it tomorrow... I'm just not going to try looking for that next killer industry any time soon.
  • by tob ( 7310 )
    Last I looked oil was doing $22-$23 per barrel. Not $50, but definitely more than the mentioned range.

    Tob
  • Just because RIT teaches a lot of NT-related stuff in the IT curriculum does not mean we are Unix-free. RIT (especially the C.S. and C.E. people) is very into Unix. There is a very strong group of Linux users here on campus, many of whom are IT majors. In fact, it doesn't surprise me at all to walk to class and see 5 Slashdot shirts, 6 GNU shirts, and a dozen other nerd shirts (FYI, there are at least a dozen IT people I know here with "NT sucks" shirts).

    (BTW, SUNY Buffalo just kicked RIT's arse at rugby, but that's OK, cuz i'll just root the RFC union's computer and change the scores :)

    -Gextyr
  • By saying "I am the state" he was merely expressing the attitude of every monarch of the time. There is nothing unusually egotistical about it.

    Quite literally, the sovereign of a monarchy is the state.

  • I have a 2 year degree and while I can't claim to be a "great programmer", I can say that I know a hell of a lot more than some people I've met with 4 year degrees. And unlike many here on Slashdot, I won't claim to have learned assembly when I was 3 years old either. :^)

    I had an Apple ][ when I was a kid and messed around with it. I learned some BASIC and Pascal and even assembler, and I was pretty good at it. But I was a lot more into skateboards, punk-rock, chicks and beer :^) So I forgot about computers for about 12 years.

    But then I got married and settled down and decided to go back to school and get a real job (being a vagabond is fun but there's no security in it :^). I remembered how much fun I used to have with that Apple and how I was one of the better programmers in my classes. So I went to the local community college and took the standard CS track which included VisualBasic but also Pascal, C, C++, algorithm analysis, systems analysis, unix, win32 with MFC, and x86 assembler.

    I aced every class and graduated with a 4.0; they handed me a diploma and said great job! Summa Cum Laude! You ought to be proud.

    And I've been working as a sysop / helpdesk operator ever since because, thanks to all the mouse-clicking, M$-worshipping ditto-heads that come out of "IT" programs nowadays, nobody in their right mind will hire a programmer with a two-year degree and no experience.

    So if people are so desperate for IT workers, why am I making $8/hr and answering the goddamn phone all day while all my applications get rejected by hiring managers who want someone with 9 years of professional C++ experience who will work 60-hour weeks for $36 grand a year?

    I work with a "programmer" who doesn't know the difference between ftp and telnet and thinks COBOL is really cool. And a network manager who couldn't write a DOS batch file to save her own life (I kid you not, I tried to debug one of her login scripts and she was MAKING THE SHIT UP AS SHE WENT ALONG! Really! I mean making up syntax and keywords out of thin air! The mind boggles...) And of course they both get paid a lot more than I do.

    I wrote some robust, self-documenting Perl scripts to make the lives of us operators easier, and management is like, "Oh, that's nice."

    I'm going back to school for a real degree but after seeing this culture of money grabbing scam artists and liars who pretend to be computer experts getting the good gigs while I waste away here, I'm having second thoughts about computers as a career, as much as I love to code.

    Hmmm, maybe I'll be a plumber.

  • I live in the Dallas area which is supposedly good for computer jobs... not for me so far.
  • Funny... you just agreed with everything I said, but yet you're disagreeing with me?!

    I just got done explaining that people who are only in it for the money will wind up competing with themselves - not people like you and I!!!

    --

  • If you want me to 'grade' you, I'd say you got BIOS, squeaked by on EIDE, missed SMB (it's Windows file sharing), and guesstimated on toothpick - I *really did* mean the kind you use to clean your teeth.

    I have a friend that is more of a SW than HW guy, so no biggie.. but if you really want to be a geek, you gotta know both. And about your ignorance - hrmph. Most people can't fake it more than a few minutes at most.. and if there's ever any doubt, I wing a couple random acronyms by them and see what happens. Since you have a sw background, I'd probably throw GUI, API, OLE and FAT at you. :)

    On the excuses part, formal education probably won't do you much good until college -- it's just a given. Your best bets are to find semi-current books (less than 3 years old) and track down a few local gurus to help you out on the grey areas. QBASIC isn't programming *g*, and it's quite obvious you're a high school student... I'd guess frosh/sophomore. You're in the ballpark as far as your computer knowledge goes; I wouldn't fault yourself too much.. it sounds like you live somewhere that's still in the technological dark ages...

    As for me having a touch of 'elitism', you're probably right. I have a strong aversion to stupid people - esp. ones that ask you the same questions over and over again instead of asking you once, clearing up any ambiguities they may have about the answer, and moving on. I don't like people who waste my time - I only got 24 hours each day to do something productive, and spending a couple hours of it helping some retard figure out which side up to put the floppy into the drive doesn't fit that definition.

    --

  • by Signal 11 ( 7608 )
    Eh, let's all start a campaign to print slashdot out on our laserjets, dot matrix printers, and post it all over the office (just like those UF comics I know you have hanging in your cube). "not in print" my arse. It's just that the news happens so quickly there's no point to printing it - it's not that we can't... :)

    Anyway, back to the issue of the IT shortage - yeah.. right. I don't know about you, but I spent a lot (and I mean *a lot*) of time doing tech support. The vast majority of people may eventually become computer-literate enough to send e-mail and browse the internet without having to call us someday.. but I can guarantee you there will always be the same percentage of people who actually enjoy testing the limits of computers. And 'geek posers' are very easy to see through.. just ask them what BIOS, EIDE, or PLL means. If you really want to be mean, go into 'raw Data mode' and start throwing out random tech terms like so:

    Well, after I reconfigure SMB on a VAX to use my 8.4 Gb HDD instead of my old SCSI, I'm going to restart all the daemons, and then check for network connectivity using vi and a toothpick. :) If networkweek was still running bofh weekly's, I'd suggest the excuse-of-the-day as well.

    Anyway, my point is that while lots of people will flock to the money to get into computers, it really won't affect those of us who *really know the technology, and enjoy working with it*. The wannabes will compete with themselves...

    --

  • I started out in computers way back with an Atari 800, and my modern computing started with Windoze, but I quickly learned NT shortcomings for myself and when I was introduced to UNIX, Xwindows, and Unix-like systems, let's just say it was love at first sight. My main PC at home is a Linux SMP machine, my fav WM is XFCE.

    I live in the NYC metro area (No. NJ), and I just laugh when I hear LANOP commercials on 770 WABC AM radio about how MCSE's are making so much money and about the shortages. I LOVE IT! I love it because as long as the general public knows so little about UNIX, UNIX people will be scarce and my salary will keep going up!

    The avg salary I am seeing in this area for good UNIX admins is anywhere from the upper US$50k's to US$140k if you know UNIX and Oracle/SAP well. There are even junior UNIX admin jobs from US$65k-US$80k/yr. I see many ads in the US$80k-US$100k range for intermediate to advanced UNIX admins. Then again, just to live in this area comfortably you need to make close to $100k, but that's another story. I live in Bergen County, NJ, which is one of the top 10 most expensive areas in the US for real estate. Sucks.

    Keep getting those MCSE's and keep me scarce is my motto.


    ==============================
    Windows NT has crashed,
    I am the Blue Screen of Death,
  • One probable reason Suits are disappearing from the workplace is because companies are starting to see the value of making their employees comfortable. (Yeah, I know the corporation as a whole doesn't care about Joe Employee as a person, but they've figured out that comfortable employees are productive employees.)

    Take the 'Power-Nap' craze. Employers are actually setting up rooms for their employees to take 10, 15, 20 minute "power" naps. It refreshes them, and boosts their energy and productivity. Likewise, a lot of employers have figured out that techies, and programmers, and others are just NOT comfortable wearing suits to work. So? So that means they're busy concentrating on their discomfort, therefore they're not as productive. Ties feel restrictive, and suits are heavy and bulky. Get rid of that crap, and employees are happy.

    -- Give him Head? Be a Beacon?

  • Yes, you always hear the argument about throwing a bigger processor (or more hardware) at something in lieu of efficient programming. That kind of thinking can work if the character of the job stays the same, but it doesn't. The jobs themselves are getting more complex, particularly with broadband and real-time media delivery.

    Just look at our desktop systems now. Applied to the kind of work being done five years ago they'd burn through the floor. So why does a 450 MHz machine still feel slow sometimes? (I must be getting old, because I still can't believe a machine rated in 100s of MHz could ever be slow.) We aren't doing the same things we were five years ago.

    Straight-line extrapolations of processor speed will only go so far; as the job gets more complex, the need for efficient programming can only increase.
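
    To put a number on that point: a faster CPU buys a constant-factor speedup, while a worse algorithm loses by a factor that grows with the input size, so no hardware upgrade closes the gap for long. A quick, illustrative Python timing (the O(n^2) bubble sort versus the library's O(n log n) sort; the exact ratio depends on your machine and is not from anyone's benchmark):

```python
import timeit

data = list(range(5000, 0, -1))  # reverse-sorted input, a bad case for bubble sort

def bubble_sort(items):
    a = list(items)
    for i in range(len(a)):
        for j in range(len(a) - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

# Time one run of each on identical data.
slow = timeit.timeit(lambda: bubble_sort(data), number=1)
fast = timeit.timeit(lambda: sorted(data), number=1)
print(f"bubble/builtin ratio: {slow / fast:.0f}x")  # the ratio widens as n grows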
  • by mrsam ( 12205 ) on Sunday October 10, 1999 @07:23AM (#1625243) Homepage

    I do not believe that the shortage of highly-qualified, high-skilled, programmers will end anytime soon.

    Sure, there'll be plenty of people whose eyes glaze over while reading the classifieds and seeing the salaries and rates of computer programmers. They'll turn around, and say to their drinking buddy: "Hey, Zeke! Look how much computer programmers are being paid these days (burp)! I think I'm going to go and become a computer programmer!!! (hic)".

    So, they'll go to some diploma mill, go through the motions, and, presto! New! From Spishak! It's "The Programmer In A Box!!!!!!" Instant ASP! Instant Perl! Instant C++!!

    So, the great unwashed will be hired en masse by clueless companies who think they'll save a bundle by hiring these new programmers at entry-level salaries or rates. They'll tinker around for a little while, and things will seem fine for some time. Then everything starts to crash and burn, the environment at work gets a bit tense because of all the problems, so the new programmers will split, and the companies will be left holding the bag.

    Who do you think the companies will turn to, now?

    Yup, meanwhile, the Programmer In A Box[tm] is busy running the scam at another clueless company.

    I did not go to college and sign up for the comp-sci major because I wanted to make big bucks. In fact, when I was in school, programmers didn't really make that much money. They made a good buck, or two, but not that much. I became a programmer because that's what I really wanted to do.

    Predictions that IT worker shortage will end soon are generally based upon the alleged scores of students signing up for computer science majors or computer schools, nationwide. My opinion is that the main attraction for most of these people is only the high salaries and rates that are being paid to highly qualified and skilled programmers.

    Except that just the desire to earn big bucks will not make you a good programmer. There's a very good reason why good programmers make good money. Computer programming is a very mentally intensive job. To be a good computer programmer, you need more than knowledge of the computer language of choice. It also requires a certain mental discipline; I'd even say it requires a certain way of thinking. Just knowing how to write printf("Hello world.\n"); is not going to help you much when you've been given a core dump, the source code, no way to reproduce the problem, and been told to figure out what happened and fix it.

    I believe that very few of these people, who are looking to cash in on the supposed IT worker shortage, are really prepared for the job. And I wish them luck. I really do. The more they screw up, the more money the rest of us will make, cleaning up their mess.

    And even if I'm completely wrong, the bottom line is that we'll always have 10, 15, or more years of experience on them. That cannot possibly ever change: no matter how many bodies you throw into a computer science major, the supply of people who already have decades of experience will never change.


    ... Man... /. needs a good spell checker and grammar checker.
    --

  • Ok, I agree with a lot of the posts I've read so far on the "shortage" of IT professionals. What is the term money people use with regards to a market... artificially inflated?

    That's how I view the high-tech job market right now. Last semester I was browsing through the CS/IS books at UVSC, where I go to school. What was the text being used for CS 425? Sams' "Teach Yourself C++ in 21 Days." At UVSC, Discrete Mathematics is covered in one semester. And we only looked at about 5% of the textbook, which I paid a hefty $50 for.

    I'm an IT manager for my company, and I've interviewed a lot of kids with 2 year associates degrees from various colleges. What do they learn? Crash course in Microsoft VB, ASP, Office, etc.

    It seems that even our scholastic training (from many institutions) is superficial in this area, designed only to "artificially inflate" a job market.

    Although there may be plenty of people out there pretending to be programmers and administrators, when was the last time you ran across a real programmer -- someone with real experience who actually understands the "whys" of programming?

    True programmers transcend any specific language or platform. They don't break down in tears when they are asked to write something outside of their native Visual Basic.
  • by Sensor ( 15246 ) on Sunday October 10, 1999 @09:21AM (#1625254)
    I really don't believe that post is worth a 5 - but what's worrying is that lots of this readership seems to. This is the same sort of self-aggrandizement as we get with any Linux story - look at this objectively, guys; if you don't, you run very serious financial risks.

    What Rob actually said wasn't that there was going to be a huge increase in the supply of programmers but that there was almost certain to be a structural shock to the industry... he specifically said that he could not predict the source of this shock.

    This could be that:

    a) More problems are solved by pre-packaged solutions - hence less need for custom solutions - or, from a sysadmin point of view, maybe vast leaps forward are going to be made in reliability.

    b) A new language comes out which lower-quality programmers can use to achieve equivalent results.

    c) Large amounts of new labour become available - look at all the companies which have experimented with outsourcing their projects to third-world techies... these guys are just as bright, work just as hard (or harder), and cost a fraction of what a Western worker does.

    The likelihood remains that if this industry (IT et al.) were to remain structurally the same, then salaries for techies would level off and then fall in the medium term due to increased supplies of new graduates (of which I am one).

    But look at the longer term, and the chances are pretty good that some random innovation will reduce the number of us who are required, or (much more likely) change the TYPE of techie that is required.

    We are still an infantile industry - demand is high, prices are volatile - but can it really last indefinitely?

    I doubt it, and I'm certainly not going to bet my future/pension and mortgage on it.

    just thoughts

    Tom
  • I agree with your point that it seems a lot of "CIS" students are really MCSEs in training. I have met several younger geeks lately, and their curriculums are like: designing networks, NT administration, and, oh yes, a semester of C/C++ thrown in for good measure.

    I think to myself sometimes (while glancing at the Altair 8800 callous on my right index finger), Doesn't anybody learn what a computer does right after the power goes on any more?

    On another note, in the interest of fairness, I would like to throw out this info about RIT. I stumbled across the Computer Science House at RIT Page when checking out IMP [horde.org]. (Link is at the bottom of the page) Checking out their "house projects" page indicates, to me at least, that they are learning a tad bit more than just configuring NT boxes. A couple of highlights from that list: Porting NetBSD to DECStation 5000 and writing an FTP server for BeOS...Sounds a little beyond textbook, IMO.

    -- Have you hugged your assembler lately?

  • ...that makes me glad I'm graduating with a CS degree (BS or BA? I hate math... Opinions welcome) and a dual minor in education and history. Why, you say? What good does this do this discussion?

    Simple: one of the biggest problems with our country at large is that we are simply not teaching what needs to be taught. I'm not a Linux guru by far (I can get a box up and running, stock install, in 5 minutes... including finding the CD and coercing a recalcitrant eth0 into working, but that's another topic), but I firmly believe the computer education we are providing to students is a joke. We teach them how to use PowerPoint and MS Windows, and nothing about how the PC actually works. My Comp. Lit. class in 8th grade (a long time ago) focused on using IBM's Linkway to make cheesy presentations! Oh, how I'd hoped we'd be past this by now, but we've since evolved into using the Office suite for everything.

    This is great for breeding kids who like a point-and-click OS (don't get me wrong, I like it as well, and have to use Windows for work, but I still piddle with Linux daily) and who don't want to think about what the computer can actually do, but where's the challenge? Why not teach them how the innards work with each other, how the hardware interacts with the software, even if it's only on a rudimentary basis? A little knowledge is better than no knowledge at all.

  • by LL ( 20038 ) on Sunday October 10, 1999 @07:34AM (#1625272)
    The problem with technology is that it comes in waves, and therefore everybody starts paddling furiously at the same time. Naturally this leads to a severe shortage of the fad-to-be, whether programming language, app or digital-whatsits. The only real shortage is that of management talent, as companies without a clue are losing people left, right and centre to those firms which do appreciate and treat their people well, instead of trying to hire the lowest-cost fresh-out-of-college app-builder. Given a choice between 120-hour weeks paying $250K or 60 hours paying $100K, what would people choose? Burning out your engineers through stock-option pyramid scams in order to cash out on an IPO is not a sustainable practice. As in any industry, booms and busts occur, but IT is a portable skillset, and if you're willing to travel, there will always be more relaxed opportunities elsewhere.

    Part of the problem IMHO is the relentless hyping of certain technologies. Sure, the internet will change things, but it will still be around (and cheaper) 5 or 10 years later. It's absurd to think new businesses won't still be created n years in the future. From what I understand, part of this is market priming in order to adopt the most expensive components now (and thus preserve fat profit margins) before they become a commodity.

    Let's look carefully: people are paid (roughly) according to the value the market places on their labor, skills and talent. Society has deemed that a surgeon with megayears of specialist training is worth more than a janitor. Thus skills which are not easily acquired or substituted (e.g. high manual dexterity, intensive knowledge, or natural leadership) tend to be more highly rewarded. Oh, and pick a field which is likely to be in long-term demand, has pricing inelasticities, and has natural barriers to entry. Plastic surgery sounds like a nice area :-). I recall an SF story (the name escapes me at the moment) about a device that could immortalise a worker's assembly skills; once they'd captured the performance of their best worker, they fired him. With increasing IP going into software, I wonder who else will be next on the firing block.

    LL
  • Your elitism is showing, friend. I also fear that I am one of your 'geek posers'. Even though I have never considered myself a hacker by any means, I am still trying to learn...

    Okay, let's see if I pass your test:

    BIOS: check, firmware that is required for your machine to boot the operating system or boot manager along with other functions.

    EIDE: kind of check, some manipulation of an IDE hard drive?

    PLL: nope, never heard of it except I think I recognize the acronym.

    SMB: oh gee, I should know this one... symmetric multi-tasking? No, that can't be it...

    VAX: from the context, it is probably a computer or perhaps an Operating System (some brand of Unix?)

    HDD: check, hard drive, duh.

    SCSI: check, a type of hard drive with somewhat better performance

    vi: check, VIsual editor on all Unix systems

    toothpick: nope, never heard of this one before, maybe a brand of monitor? :)


    Well, it looks like I failed your test. Unlike a lot of people, I am rather open about my ignorance.

    Now for my excuses:
    • I am a software person, more interested in higher-level things like graphics, AI, user interfaces, etc. Hardware I don't take a lot of time to experiment with.
    • No formal education in these matters. No money for books. So I'm basically left scrounging for books at our limited library (many books about AppleSoft BASIC there) and looking for tutorials on the internet (which is quite limited; if you have done it, you know what I mean).
    • I have done quite a bit of QBASIC programming, so if you consider that programming, I've done it.
    • Still a lowly high school student. No real computer courses other than CAD and business technology (yep, you guessed it, MS Office. Mr. Paperclip is kind of cool until he pops up to help you.)


    Well, there you go. I am naked before the slashdot audience. And I don't think I am alone here.

    Let me see, what stereotype does that make me? I used to think I was a geek or nerd, but now that I am a 'geek poser', I must reconsider. How about 'power user'? There we go, much better.

    --

  • If you want me to 'grade' you, I'd say you got BIOS, squeaked by on EIDE, missed SMB (it's Windows filesharing), and guesstimated on toothpick - I *really did* mean the kind you use to clean your teeth with.

    SMB, I know I heard that acronym before. Samba, right? As for toothpick, it was a bad-humored joke. That was what the smiley was for.

    I have a friend that is more of a SW than HW guy, so no biggie... but if you really want to be a geek, you gotta know both. And about your ignorance - hrmph. Most people can't fake it for more than a few minutes at most... and if there's ever any doubt, I wing a couple of random acronyms by them and see what happens. Since you have a SW background, I'd probably throw GUI, API, OLE and FAT at you. :)

    I like to think that my ignorance is due to inexperience. I use the family computer, and my dad tells me not to download anything, let alone experiment much (not that I don't deserve it; I have switched mouse drivers, replaced io.sys, and upped the resolution past what the video card could handle, all by accident... this was before I discovered GNU/Linux).

    GUI - graphical user interface
    API - application programming interface
    OLE - object linking and embedding
    FAT - file allocation table

    These acronyms are kind of easy though. I'll add these to prove that I am worthy:

    Posix - common system calls on various OSs
    OpenGL - 2D/3D graphics API
    Corba - object-oriented networkable API
    Recursion - function that eventually calls itself
    Vector - two numbers that define a line from an origin
    Perspective Projection - Method for 3D rendering

    Of course if you ask me what a telnet port is, I'll just give you a blank stare.
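
    To make that "Recursion" entry concrete, here's a minimal sketch of a function that calls itself (Python, and the factorial example is just an illustration of my own, not from any book):

    ```python
    # Recursion: the function keeps calling itself with a smaller input
    # until it hits a base case, then the results unwind back up.
    def factorial(n):
        if n <= 1:                        # base case: stop recursing
            return 1
        return n * factorial(n - 1)       # recursive case

    print(factorial(5))  # 5 * 4 * 3 * 2 * 1 = 120
    ```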

    On your excuses part, formal education probably won't do you much good until college -- it's just a given. Best bets are to find semi-current books (less than 3 years old) and track down a few local gurus to help you out on the grey areas. QBASIC isn't programming *g*, and it's quite obvious you're a high school student... I'd guess frosh/sophomore. You're in the ballpark as far as your computer knowledge goes; don't fault yourself too much... it sounds like you live somewhere that's still in the technological dark ages...

    Well, I taught myself C from a book made in '94. QBASIC is as close as I am going to get to programming experience. And that is mostly in games and my awkward attempts at 3D programming (from scratch). Obviously I am in high school, eh? Hmmm. I am actually a junior, though.

    As for me having a touch of 'elitism', you're probably right. I have a strong aversion to stupid people - esp. ones that ask you the same questions over and over again instead of asking once, clearing up any ambiguities they may have about the answer, and moving on. I don't like people who waste my time - I've only got 24 hours each day to do something productive, and spending a couple of hours of it helping someone figure out which side up to put the floppy into the drive doesn't fit that definition.

    That's all right, I think. It seems to be natural; I've done it too, even though I don't know much to be elitist about ;) But I probably need to be more patient than most; I've got to help my mom (who still treats Works like a typewriter).

    And then a friend of mine starts his own web page using some web-based interface. And I get all upset because he puts up a web page without learning anything. He says he wants to learn HTML for the 'advanced features', but I don't think he has enough patience to learn the simple things first.

    Oh, and I think that high schools should have a *required* computer literacy course that actually goes into file management, system maintenance, and some basic vocab to know what the System Requirements mean!

    Oh well.

    --

  • I agree. One other thing: the majority of people reading Slashdot are in the USA. Currently the US controls the internet... many countries are locating their servers here because it's cheaper and on a faster route to the rest of the net. This creates more jobs here. But this is surely not going to last very long. As voice over data networks becomes more prevalent, these same countries will be forced to upgrade their interconnecting infrastructure... making the US less of a central hub for the world.

    Try to get a job in China or India making 1/4th of what you can make in the US. Even Germany, with its vibrant economy, can't offer salaries near Silicon Valley levels.

    Certainly not all software jobs are tied to networking. But many areas of programming have been taken over by general-purpose solutions. Just as programs can make other jobs obsolete, so too can they make many programmers obsolete. Need a database? Few people write their own anymore, and GUIs are making setup and configuration easier and easier. System administrators could easily be replaced by thin clients and a central phone-company-like administration. Kids growing up today make their own web pages for fun... what's going to happen to "web designers" when everyone knows how to do it?

    I think there will always be a good market for good programmers, just as there is still a market for good aerospace engineers. But some people getting into the field because it looks like easy money could run into trouble in a few years.

    In the game industry there is already a glut of entry-level programmers. The average starting salary is less than $30k there.

  • Code first, optimize later. Time = money. Big O notation doesn't matter for a huge number of problems. Optimization is an art, but knowing when to optimize is just as important. A good consultant will save you money by *not* optimizing things that don't matter. It's the ones who have to get everything perfect you need to watch out for.
  • Quality programmers are hard to find exactly because our leading programmers keep raising expectations. Ten years ago, even a superstar programmer couldn't have set up a national online bookstore, because the web infrastructure just wasn't available yet. It's somewhat similar to athletic inflation in the Olympics. Today's runners just run faster than runners 50 years ago. We know more about how to train (& have better drugs :-). And computers give even more leverage, because reuse of talent (good code) is relatively easy.

    So can we keep getting smarter forever without a paradigm shift? We have been through the industrial revolution and now we're in the information revolution. What's next? Or more fundamentally, what is intelligence?

    I think what people call intelligence has a lot to do with the ability to concentrate. Are chronically distracted people intelligent? I've spoken to people who can't hold a conversation for 10 seconds! In other words, I don't think it's really important what you choose to concentrate on; if you can concentrate, you'll probably be considered intelligent. After all, I think that people who concentrate a lot literally perceive more clearly.

    Hypothetically, let's say you are a superstar programmer, but then you see that the competition is getting too strong. You're no longer going to be able to bring in the big bucks. Let's say that another field opens up. If you can apply your skill in concentration to the new field, then you'll be able to pick it up faster than anyone else. So I don't think fundamentally smart people need to worry about making money. They'll always be on top, because they can perceive quicker and more clearly where the top is. However, let's imagine life speeds up a lot more and that paradigm shifts that used to take a generation now happen more often; Internet time reduced by another order of magnitude.

    Might competition itself be made an object of concentration? Can the behavior of an ultimate competitor be boiled down into a simple description or diagram? If so, what would that mean?

    Maybe I'm an idealist, but I think that everyone could have more wealth (be more satisfied) if folks were meticulous about avoiding the destruction of wealth. What really bugs me is when I'm having a good time and someone else barges in and does something stupid. "Gee, you've written such a nice piece of software. How about if I sell it and give you .05%? I want to build a new castle so I'll need my fair share (50%)." What's Gates going to do with all his money?

    To sum it up:

    • Concentration is like abstract intelligence.
    • People should try to learn how to compete more optimally, not just within their chosen field but in general.
  • Of course, there are those of us who do prefer suits for work and leisure. Granted, they take a little getting used to. But so does wearing clothes in the first place. There are several advantages to traditional clothing:
    • Layered--you can take off layers as the heat goes up and put them back on again as it cools down
    • Better protection from the elements--shorts & t-shirt lead to all sorts of nasty sunburns and wind chapping. As an old hiking hand, I would rather hike in slacks than short pants any day.
    • More professional--not for the obvious, culturally-based reason (i.e. 'That's what professionals wear'), but because a tasteful outfit is infinitely better than any 'Eat at Joe's,' 'Linux Rox,' 'M$ Sux' shirt could ever be
    • More comfortable--you laugh, but wear a pair of slacks then a pair of jeans and you'll see

    I was greatly disappointed to see that at my job with IBM we switched last year from shirt and tie to 'business casual.' At least here at college I can wear a coat & tie every day of the week if I like. Which I do. I haven't worn a t-shirt more than twice in 3 1/4 years and don't plan on it any time soon.

    Nothing against other people wearing their underwear on the outside (that's what it looks like) if that's their thing. But a little bit of understanding for those of us on the other side of the equation would be nice.

    But man are t-shirts an ugly form of pseudo-clothing. Blecch.

  • One of my professors got his undergrad degree in aerospace engineering -- he worked on the Apollo program and helped send people to the moon. In his domain, he was great -- but one day we stopped sending people to the moon, and he had to find a new job.

    In January of 1996, I was programming at an aerospace firm, using C, Oracle, and Pro*C. Two months later, I was working at a credit bureau, for a significantly higher salary -- using C, Oracle, and Pro*C.

    If aerospace goes downhill, we can pack our bags and switch over to finance. If finance takes a downturn, it's off to the energy sector. (At least here in Houston it is.)

    Of all the professions I can think of, only programming and accounting have such portable skill sets; and when a company goes out of business, the accountants are the last ones to be let go.

    We have reason to worry about a general economic downturn, but no more so than anyone else. Petroleum engineers, aerospace engineers, medical technicians, and others have to worry about downturns in specific sectors; but programmers abide, because Windows and Unix are everywhere.

    Just keep your tools sharp and your resume up-to-date.

  • On the whole "Its" thing, it's not going to work because the meanings for its letter triple, i-t-s, are way too confusing already. (Take the previous sentence, for example.) If choosing between its and it's is a brain-numbing challenge for most of us today, why make the problem worse by throwing another meaning into the mix?

    May I suggest another option:

    • DAW: Dress-Aware Workers. Our bosses.
    • DOW: Dress-Oblivious Workers. Us.

    Some notes:

    • That DAW, when pronounced, sounds much like an enthusiastic "Duh!" with a Texas drawl is not a coincidence and is highly suggestive of the deep truth hidden within.
    • That DOW is sometimes used to refer to the Dow-Jones Industrial Average is also not a coincidence. When the DOW climbs, we know whose work made it possible!

    So please consider DAW and DOW.

    Cheers,
    Tom

  • Kids growing up today make their own web pages for fun... what's going to happen to "web designers" when everyone knows how to do it?

    I hear they're teaching kids how to use watercolors and colored pencils in kindergarten and grade school. Suppose all those graphic artists know about this? Bet they're counting THEIR days...

  • Perhaps you could offer some quotes from these economists before wrapping yourself up in the American flag?

    Honestly, this subthread reads like flamebait. The thread didn't say "foreigners work harder than Americans." If anything, people pointed out that Americans weren't the only IT workers to be had on the globe. To steer this conversation in a "who works harder" direction is just asking for nationalism and bigotry. There's enough of that noise to be had elsewhere. I don't see it contributing anything here.

  • It's interesting you would mention the CNN story [cnn.com]. This was discussed earlier [slashdot.org] on /. The general consensus seems to be that the study lacks any real meaning due to an inability to properly define "productivity" in a meaningful way (i.e., more lines of code != more productivity).

    As a side note, the BBC article [bbc.co.uk] you mentioned states:

    But while US workers still lead the world in terms of productivity, European workers are closing the gap, despite working fewer hours.
    According to the article, US workers in fact lead in productivity. However, they also work longer hours, and Europeans are discovering that longer hours does not mean more production. It's an old lesson. Heck - even the US military knows it. I'm sure American industry would do well to learn it too (as well as a few readers here).
  • Personally I'd rather log into an Apache webserver with no graphics and all high-quality *text-only* content than some huge, bandwidth-sucking, graphical "Art" site where it takes 10x as long to find what you're looking for.
    I agree to a point. I go to a site for its content. If the graphics aren't a part of that content, they have the potential of getting in the way. To be more specific, the design of the site depends on its purpose.

    If its intent is pure information, minimal graphics are required. For example, I find an Open Source project I'm curious about. I hit the project's site. Many of these sites are very bare and offer more information than splash (including an optional link to screenshots). Perfect. I'm there to find out about the project, not to see how many outputs of Photoshop plugins and Script-Fu can be strung together.

    However, some sites' entire purpose is graphics. When I head over to themes.org, I expect some snazz. And I get it. It's heavy on the graphics and layout. But, IMHO, it's clean and slick. I'll take the performance hit for the flash. I'm there for eye candy to begin with.

    Having said that, "art" has its limits. A bad layout destroys any reason to hit a site. This is only compounded if the images are huge and unwieldy. This is where we get into the "100-font document" syndrome experienced in desktop publishing (as mentioned elsewhere in this thread).

    The real "art" to web page design is the combination of technology and graphics to make a pleasant, easy to use interface to whatever information is being offered. The designer has to consider bandwidth restrictions, target audience's computer resource limitations, differences in browsers, etc, etc.

    Almost anyone can slap something on a server that'll produce output. Not everyone can do it well.

  • Yes, it's difficult to get a precise definition of productivity. But a 50% discrepancy shows something is astray that's hard to mask no matter how you cut the cheese. Maybe the "true" figure is only 25%, a factor of 2 less than in the CNN story. It's still a poor showing. I'm not surprised that the US-dominated readership of /. came to a consensus that dismissed the report. It would have been remarkable if they hadn't - can you really say that workers in the US IT industry are going to be the most objective commentators on that CNN story?
    Once again, I invite you to actually read the /. responses to that article. In summary, the point of contention seems to be the article claiming "lines of code" as a metric of productivity. First, the question asked is "what is considered a 'line of code'?" Then the discussion moves on to questioning whether counting lines of code is a true indication. The examples brought up usually involve plugging in toolkits, or sloppy code vs. optimized code (which leads to fewer lines of code). Both get a project working. But, according to this article, the optimized code is less productive.

    Would I trust the US IT industry dominated readership of /. to objectively comment on that piece? Sure. The arguments are valid. Non-US readers are more than welcome to poke holes in them. A peer discussion allows this. As an added bonus, there are some individuals who have no problem taking hits when criticism is warranted. Granted, not all do. Thankfully, posts that read "Americans Work Harder" seem to be in the minority.

    I think you have it backward when you say "Europeans are discovering that longer hours does not mean more production". I think Europeans have been aware of that for a long time.
    Yes, I think you're right there. I was completely wrong in that statement. European governments seem to be a lot more concerned with workers' quality-of-life issues. Good for them. I'm reminded of the old saying "work smarter, not harder." With statements like "Americans work harder" coupled with stories of "no personal life" lifestyles demanded by some IT shops... I wonder who really is working smarter these days.
  • Who do you think the companies will turn to, now?

    It's happening now. And as a freelancer who comes into a company and fixes someone else's mess, I guess the answer to your rhetorical question is me.

    The only thing that is going to prevent the collapse of the market for software developers is the fact that just about every little trinket has a microprocessor in it now. Unfortunately, though, I think what is going to speed this collapse along is those very same "idiot" programmers you allude to--I've seen more than one company that, when faced with a project that needed some custom programming, decided to scrap the entire project rather than face some 22-year-old with a diploma fresh from a diploma mill.

    *sigh* It's just all screwed up...
  • India was a British colony for many years. Most Indians who come to the U.S. speak excellent English, in addition to an Indian language.
  • ...it is actually over. Some people live their entire lives based on a fear of the future. After the Great Depression, many people took all their money and hid it around their houses. These people sometimes got ripped off by people robbing their houses, and their money would probably have been safer in the banks. I'm not going to drop out of USF's outdated, horribly taught computer science program because of this article. (Despite the fact that I have grown to truly hate the program, even though I'm getting average-to-good grades.) I think this is similar to people who figured Moore's Law would eventually level out. Well, as far as I know it's still holding true. (Even if it isn't, it held true for a lot longer than some people expected it to.)
    Of course, anyone who is really scared about this (and currently working in the industry) there is a solution, unionize. Do it now while there is a shortage, and management can't crush the new unions.
    As to me, I'll wait and see. I already know enough to build connections and save money. (I'd be doing that no matter where I was working or what I was doing.) No one is safe, you know, no matter what industry they are in. If you work with computers, you should be doing it because you love it, or at least like it. I believe this about any career, money should be secondary (as long as you can make a living at it.) And don't spend your life living in fear, it isn't worth it.
    Doomsday predictions should only be uttered along with solutions. Hey, if I ever think that the world is going to come to an end in a few years, I will at least write letters to all my congressmen suggesting we build colonies on Mars.
    Hey, I went through college once, got my BA in English (at Rutgers) and ended up working in K-Mart-type jobs for a long time before I decided to return to school. I'm doing better now than I was then (I still need my degree, unfortunately, so I can't say I'm doing great). The reason I didn't take computer science then was that it terrified me; I was used to being an A student (with no real effort) in high school, not struggling with the course material. So I took the "easy A" (for me) major, English, not because I liked literature any better than computers, but because I knew I would get great grades and keep my scholarship. (Incidentally, I think a lot of computer science professors at USF believe that intimidating their students is the best way to make sure that only the best students actually stick with the program. They probably "break" a lot of good, enthusiastic kids who just can't take these professors' open hostility. I truly hate people with this philosophy of life, but it obviously drives a lot of people in positions of power.)
  • by babbage ( 61057 ) <cdeversNO@SPAMcis.usouthal.edu> on Sunday October 10, 1999 @08:25AM (#1625326) Homepage Journal
    Many of the comments I've read seem to be taking away the wrong point. *Of course* programming is difficult. *Of course* you can weed out the good programmers from the bad. But guess what? Engineering is difficult too! And some people are good at it while others aren't, just as with programming.

    And that could be precisely the engine behind what Rob describes. Did the engineering profession disappear recently? Of course not. And programming won't go anywhere either. But there are probably fewer engineers out there today than there were then, and the ones that remain are probably the more skilled among them (I have no numbers to support this, only anecdotes). There isn't as much money sloshing around for the remaining engineers to grab either. What's to say that the bulk of today's coders won't be driven out as well, with only the very best remaining -- if even them?

    I think it's perfectly reasonable to assume that IT will go down the same way. Consider that much of the growth behind IT is the move to get businesses onto the internet, and to build the infrastructure to get people on at home at better speeds. In other words, we're in a building phase, much like countries went through when they joined the industrial revolution. But just as with that period, it will end. Eventually, there weren't as many new factories to build, the telegraph lines had been laid, and the rivers all had steamboats; while these things persisted, the building boom slacked off. So it may be here. Eventually, the fiber optic lines and satellites will all be in place; the companies will have their E-Commerce(tm) departments up and running; and the opportunities for new entry will, not disappear, but diminish.

    If you think this can't happen, you're delusional. Nothing lasts forever. We've got it good now, but something -- who knows what, who knows when or how soon -- will bring it all to an end. Plan for it. If you are not absolutely top notch, plan on a second career.

    One of my professors got his undergrad degree in aerospace engineering -- he worked on the Apollo program and helped send people to the moon. In his domain, he was great -- but one day we stopped sending people to the moon, and he had to find a new job. For a while, he bought a bar & lived as a bartender. Now he's a professor. But he'll probably never send people to the moon again.

    It's not pessimism guys, it's reality. Plan for it or get burned. Consider yourselves warned.

  • Perhaps you should tell that to all the headhunters out there who keep filling up my answering machine, despite the fact that I have not sent out a single resume in almost three years.

    Headhunters harvest names and phone numbers the way spammers collect e-mail addresses. Everyone ends up on their list sooner or later, and it's no great honor. Do you really think sending or not sending resumes makes any difference? They have their methods...

    And headhunters don't even care if you're suited to the job, and they're far too clueless to know one way or the other. A year after I left the profession, a headhunter tracked me down and wanted to set up an interview with a hotshot animation company just because I had a few animated GIFs on my webpage...

    Taking pride in having your voice mail filled up by headhunters is like taking pride in having your e-mailbox filled up with "make money fast" offers.

    Getting a call from a headhunter means nothing, and interviews are a dime a dozen. Getting a job offer to do exactly the same thing you're paid to do now is nice, but not exactly difficult in today's market.

    Branching out into something new, cutting-edge, with the opportunity to learn and grow your skills... that's where it's at, that's the reason most of us became programmers in the first place. But that's where the doors start closing when you get older. Of course you keep up with new developments on your own time and initiative, but that's considered "hobby" experience... all dressed up and nowhere to go, stuck in your day job.

    Why fight it? Stick a fork in yourself, you're done. Develop some business or entrepreneurial skills and get on with the rest of your life.

  • Someone asked me today what it was like to be a programmer. They supposed that it must be a lot of memorization, all those different languages. They knew a couple of people that did web design, you just have to learn HTML, right? And then they asked me about C++.

    I code in C++ every day. I thought for about a second about what makes my job hard. It certainly isn't anything that you can memorize (Design Patterns, anyone?). I memorized the syntax long ago; it's the logical flow that makes programming hard. On the simplest level the question is "Should I use a for or a while?". More complex questions like "Where does this go in my class hierarchy?" cannot be answered by someone without experience, good training, and most importantly, a logical mind. Complex applications involve a lot of problems, and they have to be broken down accordingly.

    I've wrestled with the question of whether just anyone can be a programmer for a long time. And I think that anyone can. But it takes a certain set of skills that you start developing at a very young age. You won't decide to be a Comp Sci major in college if you didn't decide to do "something technical" in grade school. And you won't have done that unless it really interests you.

    So anyone can write a 100 line perl script that does something useful. But you can't throw anyone at a large project (my project has about 800k lines, and I don't pretend to understand it all) and expect them to be productive, or even useful.

  • Intuitively, we believe that Linux is a superior solution. So prove it: fire up Excel and do a little NPV analysis. Show them with numbers that your alternative is viable and fits with the goals of the business. In your model's assumptions, explain the merits of the technology as best you can.

    It's not rocket science, and it would go a long way to helping foster acceptance of OSS by people who were previously clueless. Not to mention the fact that communicating effectively with management makes you that much more valuable to those you work for.
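
    For what it's worth, the NPV calculation being suggested is simple enough to sketch without Excel. This is just an illustration; the cash flows and the 10% discount rate below are invented for the example:

    ```python
    # Net present value: discount each year's cash flow back to today and sum.
    # cashflows[0] is the up-front cost (negative); later entries are yearly
    # net savings. All figures here are hypothetical.
    def npv(rate, cashflows):
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

    # Say a Linux rollout costs more up front but saves on licenses each year,
    # versus a cheaper NT deployment with smaller ongoing savings:
    linux = npv(0.10, [-50000, 30000, 30000, 30000])
    nt = npv(0.10, [-20000, 10000, 10000, 10000])
    print(linux, nt)  # the higher-NPV alternative is the better business case
    ```

    The point is only that the argument gets made in the numbers management already understands, not in the technical detail.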

    While I'm all in favor of everybody learning to communicate better, I think you have to recognize what it means to be a geek. A geek is typically a specialist in some technological field. Communication may not be their strong suit.

    A manager, particularly one who is described as a "Master of Business Administration", should have significant skills in the area of communication. If a manager's primary job is not to communicate with employees, both to set expectations and to learn about the business from them, then what is it?

    That being said, I think that the typical geek is far more able to communicate with the typical MBA than the typical MBA is to perform the geek's job.

    It's frustrating to me when managers claim that they can't communicate with their people. That's like geeks saying that they don't understand the technology they're using. It's an admission of incompetence.

    Look, I recognize that some geeks are EXTREMELY difficult to communicate with. In these cases, management needs to select or develop some geek interpreters among the technical staff. It's not acceptable to just give up and say that there are some geeks with whom you cannot communicate. If that were the case, how would you expect to communicate your expectations to these geeks?

    Sorry for the rant, but I too am tired of seeing superior solutions passed over because those in the know could not or would not make a proper business case to management.

    And I'm tired of management who expect me to do their job and "fire up Excel" to prove to them that what I'm saying is correct. When technical people are excited about a given technology, it's management's job to get down with the techies and pull the business case out of the technical detail. Developing business cases is one of those communication skills that MBAs have gone to school for, after all.

  • An interesting article. A bit all over the place, but I like that. I particularly liked the link [207.178.22.52] from Linux Journal.

    One thing that I can't agree with at all is:

    Believe me, somewhere in a secret cavern beneath the Wharton School of Business (which is to finance as Stanford is to Computer Science) or someplace similar, teams of fiery-eyed MBA candidates are plotting to take down today's computer professionals as hard as OPEC, engineers, and Louis XVI all got slammed in their respective days.

    This is just paranoia. Maybe it's just meant to be funny; after all, there's not really a secret cavern beneath the Wharton School. I don't believe that anybody "took down" OPEC and engineers in the '70s. I especially bristle at the comparison of MBA candidates to the downtrodden French peasants.

    It's popular to believe that cabals worked behind the scenes to destroy OPEC, but really, they did it to themselves. The poor countries in OPEC have never been able to resist the desire to profit at the expense of the rich. In addition, there have been important holdouts (like the North Sea) that have made the OPEC cartel less than dominant.

    Engineers in the '70s were brought down by a bad economy and a marked decrease in defense/aerospace spending. They were not targeted by a cabal of management types who were jealous of their market successes. Heck, these same forces cost a lot of middle managers their jobs in the '70s and '80s.

    You can bet that the first to get hit by a downturn in the IT economy today would be the MBA type middle-management. Upper management would first get rid of as much management as possible before cutting into the "productive" IT workforce.

    Having said all that, I do agree that any number of things can happen to cause IT workers to suffer fates similar to the engineers of the '70s. People shouldn't be smug and should be prepared. It may not be a conspiracy to "bring you down", but it feels the same when there's no work.

    Being prepared for a downturn is not showing your loyalty and commitment by working 80-hour weeks. Quite the opposite. Don't believe that management will appreciate it when you end up burned out and overly specialized. The real things you can do to improve your marketability are to sharpen your skills in a wide array of technologies and learn to communicate better. Although I've not tried it, participating in Open Source projects in your "spare time" might be a good way to sharpen your skills and practice written communication. (Note to those who want to improve their Karma: always working positive comments about Open Source and Linux into your posts is a good way, no matter how far afield from the subject at hand.)

  • by JordanH ( 75307 ) on Sunday October 10, 1999 @08:21PM (#1625351) Homepage Journal

    What Rob actually said wasn't that there was going to be a huge increase in the supply of programmers but that there was almost certain to be a structural shock to the industry... he specifically said that he could not predict the source of this shock.

    How can one argue that the boom market will continue indefinitely for IT workers? It won't.

    The number one thing we have to fear is simple economic downturn. Your list of concerns, however, makes no sense, and in fact you contradict yourself. You seem to be agreeing with Rob that there will not be an increase in the IT workforce to compete against, but then proceed to tell us about various ways that the IT workforce will be increased (new "lower quality" programmers, programmers from the 2nd and 3rd worlds, new graduates like yourself).

    I apologize for the long post, but here's my criticism, point by point:

    This could be that:

    a) More problems are solved by pre-packed solutions - hence less need for custom solutions; or, from a sys-admin point of view, maybe vast leaps forward are going to be made in reliability.

    I've never really noticed that pre-packed solutions have led to a decrease in IT workers. You can buy pre-packaged solutions for just about any business problem you can name today, and in fact, the development of pre-packaged solutions is the biggest growth market in existence, yet we are still (supposedly) suffering more and more acute shortages of IT workers.

    I can think of a number of reasons for this, but let me leave you with a few.

    First, the design and implementation of pre-packaged solutions tends towards huge DISeconomies of scale. Ever notice that one guy can knock out and support an application where a team cannot? Any time you try to build a standard application that fits a whole huge market segment, you are attempting to tackle a big "programming in the large" problem that leads to huge development times, ever-growing requirements, unbelievable lead times and often utter catastrophe.

    Second, even if you have a system that seems to fit your business's needs, you typically need many IT workers to install, configure, consult, train and support the use of this system. I don't have figures, but I would be surprised if SAP installations have had the net effect of fewer IT workers addressing the same need as before. Some businesses may claim that they've been able to get rid of some number of in-house developers due to a SAP installation, but that doesn't take into account the consultants and help desk people they've had to add to support the behemoth.

    b) A new language comes out which lower quality programmers can use to achieve equivalent results.

    Ah, the Silver Bullet finally arrives, eh? Well, I suppose anything is possible, but it's been a long time promised.

    This is essentially a twist on how Microsoft Marketing sells Windows to Corporate America. It's closely related to the possibility that pre-packed applications will lead to an IT worker glut.

    What you are saying is that with the correct technology X we'll be able to "deskill" the workforce, allowing just anyone to replace all those highly skilled workers we have now.

    While I've never seen a technology that empowers programmers of "lower quality" to produce equivalent results, I have seen technologies that empower programmers of "lower quality" to produce better results than they would otherwise.

    Know what? These technologies always allow the "higher quality" programmers to perform even better than they did before, completely outcompeting these "lower quality" programmers. The "lower quality" programmers are passed over for hot projects (and what project is not hot?) and the "higher quality" people gain more and more advantage of experience over the "lower quality" programmers. Ultimately, these "lower quality" programmers move on to another field that's less frustrating, or they find a niche where they can perform at their customary fraction of the highly skilled workforce's productivity. But there's no net decrease in the number of IT workers here. The highly skilled people you did have are still working. You've just added some "lower quality" people.

    It's not typically technologies that really put people out of work, not in the big picture. Sure, some workers are displaced by a given technology, but others are employed. The people who typically are displaced by new technologies are the lower skilled workers. If new technologies are introduced, they typically benefit the highly skilled IT workers, not endanger them.

    c) Large amounts of new labour become available - look at all the companies which have experimented with outsourcing their projects to 3rd world techies... these guys are just as bright, work just as hard (or harder) and cost fractions of a western worker.

    Check the literature. These projects have often experienced less than stellar results. It seems that close communication is really necessary for projects to succeed, for the most part, and this is something that suffers by putting your development off-shore. Even when you have good collaborative tools, there are the difficult problems of cultural differences, timezone differences and just the headaches of long range management (managing a project completely by paper) that make these outsourced projects so problematic.

    Even so, there has been a HUGE growth in outsourcing projects to India, etc. in the last 10 years (this is not a new idea, and it's hard to imagine that it will become even more popular than it already is) and yet the proclaimed IT shortage grows ever more acute. Even when these off-shore projects are a big success, it's been observed that you still need a lot of analysts (to communicate technical requirements), help desk, trainers, consultants, etc. etc. to support these new wares from abroad. In the end, it's not much of a net negative to IT workers in the US and Europe, at best.

    From what I've said above, it seems that there's just an ever-increasing demand for more and more and more IT out there. In fact, that seems to be the case. Every new technology geometrically increases the number of skilled people needed to support it. A new language comes out (and succeeds) and instantly there's a huge boom in interfacing this new language to all the legacy systems. You still have to have people to support all the legacy interfaces to the legacy systems as well. I saw a chart recently that showed that COBOL programmers (even past Y2K) will still be in demand at a slowly, linearly decreasing rate. So, adding C/C++, Java and all the rest has mostly just added new IT workers, not displaced those who program in COBOL, RPG, etc.

    The software "crisis" is really just a crisis in the minds of hucksters and hypesters. If there were truly full employment of IT workers, you'd see such wage inflation that it would make your head spin. But, there's not really. There's been some increase, but it's really similar to increases we see for MBAs or high-tech Marketing people over the same time frame. Nobody is complaining about a Management or Marketing "crisis".

    What I think is really being said when they say that we are in a software crisis is that we could be more productive if we could execute all of the projects that we can imagine. Middle management is in the business of justifying new projects, it's often their entire reason for being. If their grand schemes can't be carried out because it's not as simple to deploy the technology required as they would like, for whatever reason, including inability to staff them, then perhaps their scheme isn't so grand after all.

    Business is in a feeding frenzy for more and better information. The more they get, the more they want, and the more they want, the more information there seems to be. At some point, there will be an economic downturn which will pull the brakes on this spiral. Once things start to slow and they can find their way to actually lay off some middle management, there will be fewer managers asking for data. This will lead to the ability to lay off IT workers, and then fewer IT workers will need less management, who will need less information, who will... This is the way recessions work. Less begets less.

    I am concerned about a real economic downturn and what it would mean to the IT world and (gulp!) my job.

    My advice (as if anybody cares)? Don't become overspecialized in cutting edge technologies that require big infrastructures. In a recession, there will be a paring down of technologies supported. Management will want things to be reliable, stable, supportable and not requiring consultants from 3000 miles away (sound like OSS/Linux?).

    Here are the simple, supportable skills that would be paramount in this environment: C/C++, SQL, Perl, shell scripting, possibly VB. You should be able to set up systems (Unix and/or Windows) with Web Servers (including setting up CGI scripts; all you would really need for UI development on-the-cheap would be CGI, and UI development in a recession would definitely be on-the-cheap) and File Servers. Know something about computer network security. Know something about Cisco router configuration. Know something about system administration.

    In general, someone with a large set of diverse skills in relatively simple areas could replace a lot of specialists in a bind. Some DBAs will be needed in a recession, but new applications or ones where table and storage requirements change a lot will be at a minimum, thus requiring little DBA activity. If you already know how to do standard backup and other DB maintenance on Oracle or some RDBMS, that will be in demand along with those other skills listed above, but I wouldn't expect that a lot of arcane DB tuning and in-depth administration/setup knowledge would be in high demand. People would more likely suffer with poorly performing applications in a recession. New DB instances would be cloned from existing applications insofar as possible, requiring little DB Administration activity.

    Finally, a few stray comments on some things you've said:

    The likelihood remains that if this industry (IT et al.) were to remain structurally the same, then salaries for techies will level off and then fall in the medium term due to increased supplies of new graduates (of which I am one).

    I've been reading that there are currently fewer people getting CS and IT degrees. Your model seems to suggest that demand is static, that people don't move out of the IT field due to promotion, burn-out and retirement, and that the supply of new graduates is increasing.

    We are still an infantile industry - demand is high, prices are volatile, but can it really last indefinitely?

    The computer industry is almost 50 years old. Older than cell phones, microwave ovens, color TV, VCRs, CDs, DVDs and mini-discs. A lot of these industries seem to have fairly stable price structures. It seems to me that the industry was far more stable 40 years ago when you could really argue that it was in its infancy.

  • I cannot predict the future and I can't speak for entry-level admins, but I doubt there will be an end to the shortage of senior computer techies.

    Computers are becoming an integral part of everyday life. Soon, walking down the road, you will be able to read email, browse the web and run apps from your personal computer over some sort of remote display X term or network app. How soon? Less than 5 years. The world needs programmers and geeky admins to make sure all these services are running and bug-free.

    It's highly unlikely that someone is going to write smart software to replace the average programmer, and even if they do, we'll always need someone to optimize and debug that "smart" software.

    We cannot dispute that, on average, people are lazy and ignorant. Most people, 51%+, don't know anything, relatively speaking, about computers. After all, they are the reason AOL is doing so well. They couldn't possibly learn what it takes to take my job away. And the other half of the world, the professionals, are trying to find a place that suits them; not all of them are going to become programmers or network admins.

    There's even a huge difference between a sys admin and a network admin. Network professionals don't need to know the equipment that runs on their network, but they do need to know the various routing protocols, the topology of their network and security issues. Sys admins need to know the hardware and software every peon in the company uses as well as what servers need to be in place. And there are blurred lines between these positions, and even really senior people who can fill both, but those are extremely rare, even in the silicon valley.

    I bet it will be more competitive in the future; I will be forced to learn faster and work smarter. But without a major geological catastrophe, computers are not going to disappear off the face of the planet. Even a stock market crash couldn't put an end to computers. Look at the Y2k problems. If we didn't need them we would simply turn them off.
  • I agree. But a lot of those Programmer-in-a-Box types are already out there. A friend of mine wasn't happy with the income he made from his bachelor's degree - even though he was doing what he'd "Wanted to do since I was six years old". So he got a master's in Computer Information Systems. He now works for a well known, international company in their MIS department. His idea of programming is to copy blocks of code from the sample CDs that come at the back of programming books, without understanding what they do. Coding for him means moving them around, re-arranging stuff, and so on... but without a solid idea what the stuff does! He asked me to look at a computer program he was working on for class once. It was a C++ subroutine, and every four lines or so he had

    return 0;

    I asked him why he put that there, and he responded "Because it's in the book." He didn't understand that everything after the first return would never even be executed. He didn't understand call semantics. But he's now a "Developer".

    People like this are not a threat to the incomes of the readers of this board. We are a threat to their incomes, when their stuff stops working.

    I'm also tired of hearing how record numbers of people are entering CS programs. As Sam said, they're going in for the money... and they probably figure they know how to play Doom, so this must just be advanced Doom playing. These people are never going to last four years in a CS program.

  • I can't disagree more. After 6 years in the computer industry (sales, support, netadmin ... several hats) I have returned to school to get a so-called higher education and a CIS degree (they pay more if you have the paper :) I would say that only around 15% of the students, even in the higher level CIS courses, are real geeks. The non-geeks can't handle anything that isn't "in the book". People who can't think logically (in a computer sense) won't make it as net admins or programmers, or any other position that there is a demand for. The difference between the rebound in the supply of engineers and what may or may not happen in computers is that Engineering is a trade you can train for and with enough practice become good at. (Now you engineers, don't go flaming me. I am not talking about designing satellites, I am talking about the average engineering job.) In computers you can take as many courses and get as much training as you want, but unless you have "the knack", you're just not going to be anything but level I tech support. Crow
  • This is slightly off-topic, so moderate down as appropriate.

    I am a self-described geek, have an advanced degree, and am finishing up an MBA at one of the aforementioned satanic institutions, so take this FWIW.

    The hostility that I'm seeing from techies toward their PHBs really is (IMHO) about a communication gulf.

    Many of my peers in business school are getting involved in high-tech, obviously because that's where the money is.

    The more self-aware ones realize that they haven't got a clue about the underlying technologies, but they are making an attempt to learn. The M$ mentality still prevails, but the volume of people asking me about my linux boxen increases daily.

    I think that many business-types would appreciate the techies in their companies giving them the scoop on the latest technologies. These people are not idiots, but you need to do it in a language that they understand. Money.

    Want to use linux as a print server? Don't just tell your manager that Linux rules and NT is a bloated hog.

    Intuitively, we believe that Linux is a superior solution. So prove it: fire up Excel and do a little NPV analysis. Show them with numbers that your alternative is viable and fits with the goals of the business. In your model's assumptions, explain the merits of the technology as best you can.

    It's not rocket science, and it would go a long way to helping foster acceptance of OSS by people who were previously clueless. Not to mention the fact that communicating effectively with management makes you that much more valuable to those you work for. [An important skill, if Roblimo's hypothesis is correct.]

    Sorry for the rant, but I too am tired of seeing superior solutions passed over because those in the know could not or would not make a proper business case to management. To me, this is just as egregious as management forcing IT guys to wear ties :)

    docwolf

    [tieless, wearing a mumu like homer.]

  • I've put a lot of thought into these topics lately, so here are a couple of thoughts...

    Suits:

    Well, on the West Coast that is most definitely true. Almost every kind of office has transitioned to "business casual," even the German-owned insurance company my father is a director at, which previously required ties of even the technical people. However, many of my friends have noted that on the East coast, the traditional dress has for the most part remained, and in fact that they have to dress up more when visiting East Coast offices. It's a culture difference for the most part. On the East Coast you have many companies which are older than every Slashdot reader - on the West Coast, there's been a boom, and culture changes with a boom like that. When was the last time you went into a tech company and they didn't have free sodas? Were you surprised?

    IT shortage:

    Here's what it boils down to. As a network engineer I can only speak as to the network portion, but examine the field. It's only been around for 30 years. Before 30 years ago, *there were no network engineers* (not counting telco engineers, who have largely stayed with their telecom networks, and not transitioned to IP, though I have no doubt that will change...). We are second generation at best. We are one of the youngest industries around. So it only makes sense that there is a lack of trained professionals, or even of mechanisms for generating trained professionals. Secondly, examine the fact that there has been a higher growth rate for networks and the internet than probably almost any industry to date (perhaps excluding broadcast media such as TV, which grew almost as fast, I think... I don't have the facts to back this up, but it's not really important). Again, this means you have far too few people to run far too many networks.

    Will this be true in 40 years? No, it will be a fully formed industry, with channels for training, set career paths, and a lot of very senior people. That's part of the key - in 20 years, I'm going to be damn senior as network people go. That's because I choose to learn constantly and develop. I'm not concerned.

    Also, the fact that networks are for the most part a closed kind of community means there won't be a massive influx of people until schools start teaching it, and even then, we all know that schools often produce people with no practical technical knowledge at all. Most kids can't reasonably set up a routed network in their apartment or dorm. This makes the learning curve a bit harder to climb, as compared to system administration.

    I think the sysadmin world is a little bit different. The main issue being that anyone can boot up a free UNIX on their home computer, or NT for that matter, and administrate it. They can learn enough from that to admin a small company's boxes, and maybe a large company's. There *will* be a glut in the market for sysadmins, because it's so easy to learn. However, as with everything, specialization is the key to preserving prosperity. Sendmail gods will still be sendmail gods. Security wizards will still be security wizards. But if you bill yourself as a UNIX system administrator, you might want to consider adding a specific specialization to your resume - you're a dime a dozen, from my observation.

    flames to /dev/null
    email to grey@enigma.mips4.com
  • I attend Ohio State where I am a senior majoring in CIS. The powers that be surveyed the incoming OSU freshmen and found that approx. 25% of them plan on majoring in ... CIS. Dear god almighty - you say, eh? How will the IT industry survive such an influx of potential workers?

    Well, here's the thing: There's no way that many of these kiddies are going to make it through our CIS program. Why? A number of factors: 1) Universities are raising GPA requirements for potential CIS majors because CIS departments cannot accommodate the demand. 2) Many of these "potentials" think that they want to be programmers because they like Star Trek and they enjoy a good game of Quake. When they actually sit down to write a lab and it takes them 8 hours to finish it, they'll think twice. When they have to sit through a whole lecture on binary, or B-Trees, or regular expressions and finite automata, they'll think about it again. And some of them will drop.

    I graded for the department for 4 quarters, and I watched a lot of kids walk away. So, there _will_ be a significant increase in CIS graduates, but it will not be to the degree we've been hearing. The colleges can't graduate that many, because the infrastructure isn't there in the depts., nor will that many potential students be able to cope with the very mathematical, very dry nature of the course work.

  • by Chad Stansbury ( 100803 ) on Sunday October 10, 1999 @07:42AM (#1625404)
    While I agree that there is not a lack of IT workers out there, I would have to say that the percentage of *skilled* IT workers is very small. I can't tell you how many times I've talked to a highly-paid consultant about how I made some algorithm faster, gotten into big-O notation, and seen his/her eyes start to glaze over. Unfortunately for someone like myself, who cares about the efficiency of their algorithms, the huge advances in processor speed have rendered such details unnecessary in most business applications. 90% of the code I review nowadays is just total cr*p, and it's due to the attitude that everything can be fixed by throwing more hardware at the problem. I'm beginning to feel like an old man (remember the old days...) and I'm only 29...
