Education IT Technology

Top 10 Dead (or Dying) Computer Skills 766

Lucas123 writes "Computerworld reporter Mary Brandel spoke with academics and head hunters to compile this list of computer skills that are dying but may not yet have taken their last gasp. The article's message: Obsolescence is a relative — not absolute — term in the world of technology. 'In the early 1990s, it was all the rage to become a Certified NetWare Engineer, especially with Novell Inc. enjoying 90% market share for PC-based servers. "It seems like it happened overnight. Everyone had Novell, and within a two-year period, they'd all switched to NT," says David Hayes, president of HireMinds LLC in Cambridge, Mass.'"
This discussion has been archived. No new comments can be posted.


  • c ? really? (Score:5, Insightful)

    by stoolpigeon ( 454276 ) * <bittercode@gmail> on Thursday May 24, 2007 @05:55PM (#19260803) Homepage Journal
    Doesn't really match up with my experience. And putting it next to PowerBuilder? That's just not right.
  • by Sycraft-fu ( 314770 ) on Thursday May 24, 2007 @05:59PM (#19260865)
    But C? Really? I guess that the fact that nearly every game, every OS, almost every high performance computation tool and so on are written in it (or C++ which I keep under the same heading) doesn't count. While it certainly isn't the be-all, end-all, it is still widely used. Even games that make extensive use of scripting languages, such as Civilization 4, are still C/C++ for the core functions.

    Until there's enough spare processor cycles that it really doesn't matter how much CPU time you use, or a managed language gets as good at optimizing as a good C compiler/programmer combo (unlikely) I don't think C is going anywhere.
  • ColdFusion Dead? (Score:5, Insightful)

    by AKAImBatman ( 238306 ) * <[moc.liamg] [ta] [namtabmiaka]> on Thursday May 24, 2007 @05:59PM (#19260875) Homepage Journal
    I can only hope. Terrible, terrible language. Of course, these days it's actually a template engine for a J2EE server. So it's not nearly as bad as it once was. Unfortunately, most of the ColdFusion projects are massive, sprawling directories from the CF4/CF5 days. You're not likely to see a nicely packaged JAR here. :-/

    Also, what's with "PC Network Administrators"? TFA must be referring to a rather specialized form of administrator, because last I checked we still needed someone to keep the desktops configured, the networks running, the file servers sharing, the login servers logging people in, and the IIS servers serving.
  • by Rakishi ( 759894 ) on Thursday May 24, 2007 @05:59PM (#19260877)
    I mean, this is IT where things change quickly and at times unexpectedly. If you don't have at least a number of diverse skills then I can't say I feel sorry for you when your job gets axed. I may not be a guru in any one language but at least I won't be unemployed when that language dies out.
  • by Ckwop ( 707653 ) * on Thursday May 24, 2007 @06:01PM (#19260915) Homepage

    As the Web takes over, C languages are also becoming less relevant, according to Padveen. "C++ and C Sharp are still alive and kicking, but try to find a basic C-only programmer today, and you'll likely find a guy that's unemployed and/or training for a new skill," he says.


    What, the web can now allocate memory and talk to my hardware? Even if you're not a kernel programmer, the web has sucked and still sucks for application development. It will continue to suck for years, due to Internet Explorer. It's misleading to claim AJAX will solve all these problems because it won't. In fact, it might even cause a few problems of its own. For example, do you really think all that AJAX is secure? In short, I think the web is taking over what naturally comes to that medium. It is wrong to say it's displaced C.



    Does this guy forget that all of the GNU/Linux kernel and base system is written in C? You know, the operating system that powers most web servers? I'll tell you one thing: C will still be here in twenty years' time, when Ruby on Rails is talked about much the same way Blitz Basic is today. C is here to stay; it's immortal.



    Simon


  • by LWATCDR ( 28044 ) on Thursday May 24, 2007 @06:04PM (#19260969) Homepage Journal
    C++ is still alive and well.
    I think they are wrong since C is still used on a lot of embedded systems where C++ is too heavy.
    BTW a good number of HPC tools and applications are still written in FORTRAN.
  • by Anonymous Coward on Thursday May 24, 2007 @06:04PM (#19260971)
    1. secure software coding
      2. data management theory
      3. data modeling
      4. usability
      5. interface design
      6. use of testing, version control, refactoring, and other best practices
      7. space or time efficient algorithms
      8. general communications skills
      9. basic business concepts like ROI
    10. business ethics
  • Re:c ? really? (Score:5, Insightful)

    by WrongSizeGlass ( 838941 ) on Thursday May 24, 2007 @06:04PM (#19260985)
    'C' will never die. Period. It has so many uses from PC's & 'big iron' to embedded systems.
  • by khasim ( 1285 ) <brandioch.conner@gmail.com> on Thursday May 24, 2007 @06:06PM (#19261007)

    "C++ and C Sharp are still alive and kicking, but try to find a basic C-only programmer today, and you'll likely find a guy that's unemployed and/or training for a new skill," he says.

    Now I know some people who've learned on C#, but I'm sure that will change in the near future.

    Anyone who originally learned C, and is still writing code, has probably picked up a few other languages over the years.
  • Web Design (Score:5, Insightful)

    by happyfrogcow ( 708359 ) on Thursday May 24, 2007 @06:11PM (#19261089)
    Judging by their web page, all design jobs are dead too. We should all just write web pages to serve ads, because C is dead.

    This article is trash, even if it does have some technologies that are irrelevant. It has very little value to the reader. I'd rather read a 10 top list for reasons Paris Hilton should be locked up for life.

  • by Dionysus ( 12737 ) on Thursday May 24, 2007 @06:14PM (#19261147) Homepage

    a C programmer can move to C++ without a problem but the reverse is not true
    I find the opposite to be true. A C++ programmer is able to move to C without much trouble, but the reverse is just not true.
  • by ivanmarsh ( 634711 ) on Thursday May 24, 2007 @06:14PM (#19261151)
    Yep... after all, everyone knows that C# is the best language with which to program an embedded micro-controller.

    Technology reporting is certainly dying.
  • by khasim ( 1285 ) <brandioch.conner@gmail.com> on Thursday May 24, 2007 @06:17PM (#19261199)
    They phrased it very badly. C isn't going anywhere. But if all you know is C, then you are very rare.

    Most programmers who know C also know at least one other language.

    In any event, putting that on the list was just stupid.
  • Re:c ? really? (Score:5, Insightful)

    by fyngyrz ( 762201 ) * on Thursday May 24, 2007 @06:20PM (#19261249) Homepage Journal

    No, C isn't in any way going out. C produces fast, tight code that so far, C++ and C# can't even begin to match. C++ is just C with a lot of baggage, a great deal of which you can implement in C in a completely controllable, transparent and maintainable manner. We use the most important of those regularly in C code, specifically objects and objects with methods. We obtain better performance, smaller executables, and smaller memory footprints than any company that makes similar software using C++ or Objective C's add-on paradigms.

    C, and the C sub-domain of C++ and so on, is no more "going away" than C++ itself is. C occupies a unique niche between the metal of assembly and the (so far) considerably less efficient higher-level languages (I'm talking about results here, not code). I'm all for recognizing that a few lines of C++ are very convenient, but the cost of those lines is still too high to even think about abandoning C for performance applications. For many, the object isn't finding the absolute easiest way to write code, but finding a balance between portability, reasonable coding effort and high performance. C sits exactly in that niche.

    C++ is easier to write and almost as portable, but it produces applications with large footprints, inherited, unfixable problems inside non-transparent objects (like Microsoft's treeview, to name one), and a considerable loss of speed compared to a coder who has a good sense of what the C compiler actually does (which usually means a C coder with assembly experience and intimate knowledge of stacks, registers, heaps and so on).

    Speaking as the guy who does the hiring around here, If your resume shows C and assembler experience, you've made a great start. Even just assembler. C or C++ only, and your odds have dropped considerably. C, assembler and either a great math background or specifically signal processing, and now we're talking. C++ doesn't hurt your chances, but you won't get to use it around here. :)

  • Re:Delphi (Score:1, Insightful)

    by Anonymous Coward on Thursday May 24, 2007 @06:21PM (#19261269)
    I do! In fact, it displaced VB as my primary tool of choice back in 1997, for technical reasons (mainly performance, yet it is as readable as VB, imo).

    Delphi 2.0 swept the floor with BOTH MSVC++ & MSVB (of all places), in "VB Programmer's Journal" October 1997 issue entitled "INSIDE THE VB5 COMPILER ENGINE"!

    That's where Delphi absolutely blew away VB in ALL of the tests (except ActiveX form loads, which VB even took MSVC++ out in), & took C++ out on 8 of 10 of the tests!

    Most importantly, by HUGE margins (especially in math & strings work, which EVERY program does).

    Where Delphi did lose to MSVC++ (only 2 of 10 tests) it was by VERY SMALL MARGINS, far less than where it blew away MSVC++....

    It was enough for me to see that developing shareware @ least, Delphi rules. I like it a lot, & used it to create this tool (runs essentially unaltered since its birthdate in 1997 to this day, across ALL Win32 platforms):

    APK Registry Cleaning Engine 2002++ SR-7:

    http://www.techpowerup.com/downloads/389/foowhatevermakesgooglehappy.html [techpowerup.com]

    That IS the safest & most comprehensive/thorough registry cleaning program there is, bar none, to this day, even 5 years after I quit developing it. Enjoy it, if you try it.

    (Anyhow, the shareware/freeware I have done on the side is where I solely control the tools I use, unlike at work locations, where mgt. calls the shots on tools used, and today they follow "He who has the money, wins" because "nobody ever got fired for buying IBM" (replace IBM, today, with Microsoft).)

    Mgt., even though I showed them such results, was like "Well, can't argue the fact that Delphi IS the superior tool for performance AND rapid application development, but... Microsoft has the ca$h, & will be here tomorrow: WILL BORLAND BE?"

    You can't win there, not really, not on a technical superiority level. Much like VHS vs. BetaMax, the 'best man for the job' does NOT always win.

    APK
  • Re:dovetail (Score:5, Insightful)

    by MightyMartian ( 840721 ) on Thursday May 24, 2007 @06:21PM (#19261277) Journal
    I don't think you can justify C and Cobol. There are millions upon millions of lines of code in these two languages, and despite all the sexy new ones that have come along, these two still reign supreme; C is incredibly prevalent on dedicated systems and within a lot of operating systems, and mainframe Cobol code can still be found throughout the business world (though often cleverly disguised these days). I doubt a skilled Cobol programmer will be at risk of starving any time in the near future.
  • PC network admins? (Score:5, Insightful)

    by Volante3192 ( 953645 ) on Thursday May 24, 2007 @06:23PM (#19261305)
    With the accelerating move to consolidate Windows servers, some see substantially less demand for PC network administrators.

    Apparently this guy's never dealt with users. If there's a way to screw up a system, even a dumb terminal, they WILL find a way.
  • by JustNiz ( 692889 ) on Thursday May 24, 2007 @06:24PM (#19261321)
    Well, they are agents. If there's one group of people I've come across that don't understand technology, it's technical staffing agencies.
  • I guess this shouldn't surprise me. Even though Netware was pulling "five nines" (of reliability, for those not familiar with the term) long before anyone considered running any flavor of windows on a server, we see another article bashing Netware.

    Sure, its sales have declined drastically, but I wouldn't say that its relevance has. I'd be willing to bet that if we were to actually survey what file servers are still running out there, we'll see a much larger representation of NetWare. Just because people aren't buying the latest version doesn't necessarily mean that they aren't using the old ones.

    For two years, I managed the computer network of a daily newspaper - including through the election debacle of 2000 and the 9/11 events. We ran that network primarily off of four NetWare 4.11 (later NetWare 5.0) servers. One of those servers had been running for over 400 days continuously when I left, and it served files and print jobs. That kind of reliability is hard to match.

  • by Anonymous Coward on Thursday May 24, 2007 @06:30PM (#19261409)
    If you pay the market (equilibrium) wage, then you will find plenty of workers. However, most companies, just like your company, refuse to pay the market salary. They then cry, "There is a shortage of workers!"
  • Re:Raising the bar (Score:1, Insightful)

    by Anonymous Coward on Thursday May 24, 2007 @06:38PM (#19261515)
    You are far removed from reality. The ease of learning the basics of C doesn't make it less useful/powerful than more recent languages.
  • Re:c ? really? (Score:5, Insightful)

    by tha_mink ( 518151 ) on Thursday May 24, 2007 @06:39PM (#19261535)

    'C' will never die. Period. It has so many uses from PC's & 'big iron' to embedded systems.
    What is 'C'? Is that a language? Like latin?

    I'm kidding, but only partially. I was a COBOL developer for lots of years, and I thought that COBOL would never die either. I would say "Too many companies are too invested ..." blah blah blah. I think that I actually even used the 'big iron' quote too when telling my friend how secure COBOL was. Um... I was wrong. I found that out with plenty of time to learn other stuff like Java and so forth, but of course it's going to die. Just like C++ will die, just like Java will die, et al. If you've been in our business long enough, you should know better. Everything dies; it's just a matter of time. And if you think you're going to get 30 years out of the technologies that are new now, then you're wrong there too. That's the double-edged sword that is the IT business. Keep learning, keep growing or start flipping burgers.
  • by rs79 ( 71822 ) <hostmaster@open-rsc.org> on Thursday May 24, 2007 @06:46PM (#19261641) Homepage
    "Anyone who originally learned C, and is still writing code, has probably picked up a few other languages over the years."

    I learned assembly first and have programmed nearly every CPU except a VAX. I learned C second (in 1976) and still use it every day.

    I also know: apl, forth, snobol, fortran, rpg, cobol, lisp, smalltalk, pascal, algol and probably more I can't remember. I've never done much in them except for a tiny bit of forth. I know postscript pretty well and used it quite a bit. But I do C, day in day out and do not think very much of C++ or (worse) C#. The oddball languages have their place but for most it's a pretty narrow niche. I'd rather invert a matrix in Fortran than C though, I'll admit.

    Andrea Frankel said it best on Usenet in the '80s: "If you need to hire a programmer, ones with assembly are better than ones without."

    It worries me that people today don't actually know how computers work inside any more.
  • Re:c ? really? (Score:5, Insightful)

    by Chainsaw ( 2302 ) <jens.backman@ g m ail.com> on Thursday May 24, 2007 @06:47PM (#19261665) Homepage

    C produces fast, tight code that so far, C++ and C# can't even begin to match. C++ is just C with a lot of baggage, a great deal of which you can implement in C in a completely controllable, transparent and maintainable manner.

    Wow. You must have had some really shitty software engineers. It's very likely that you can create C++ code that is as fast or faster than C. Yes, I said it. Implementing virtual inheritance and method overloading in plain C is doable, but it will be very complex. Templates? Don't even want to think about it.

    C++ is easier to write, almost as portable, but produces applications with large footprints, inherited, unfixable problems inside non-transparent objects (like Microsoft's treeview, to name one), and a considerable loss of speed as compared to a coder who has a good sense of just what the C compiler actually does (which usually means a C coder that has assembly experience, intimate knowledge of stacks and registers and heaps and so on.)

    I have no idea what the MS Treeview problem is, but once again - the programmers that you have worked with must have sucked balls. I'm an old C coder, with some solid x86 assembler knowledge. As you say, it's possible to get very high performing applications using C. However, why would I do that, when I can create code that is just as fast and much more readable by using C++? Yes - even for embedded development, which is my dayjob.

  • experience (Score:3, Insightful)

    by DrYak ( 748999 ) on Thursday May 24, 2007 @06:48PM (#19261675) Homepage
    In a similar way:

    Non-relational DBMS
    Yes, maybe they don't play as important a role as before on big iron. But they are encountered painfully often in science, where databases usually grow slowly out of small projects that subsequently undergo numerous hacks.
    I'm studying bioinformatics and proteomics; non-relational DBMS are part of the standard curriculum (and often encountered in the wild).

    C programming
    Yes. Just try to tell that to the OSS community. Almost any cool piece of technology (most libraries) is coded in C. Not only that, it is an option almost any student in science can take.

    NetWare
    Once again, big iron vs. universities. There's still a lot of NetWare legacy in small businesses and universities, even if bigger corporations have moved to some Unix-based solutions or (the gods forbid) MS-based Active-something.NET solutions.
    Novell is still offering training for it, even if Novell would like to concentrate more on their Linux solutions.
    It'll end up going the way of the dodo. But just not yet.

    ...the so-so category...

    Non-IP networks
    This guy has never heard of something called Bluetooth. But on the other hand, courses, as far as I know, seem to be mostly TCP/IP oriented.

    ...the agree category...

    ColdFusion, PowerBuilder: they're dead and deserved it.
    OS/2: cue the "all 2 of them" jokes from the Bastard Operator From Hell.
  • Re:Raising the bar (Score:4, Insightful)

    by AuMatar ( 183847 ) on Thursday May 24, 2007 @06:52PM (#19261767)
    You don't master pointers until you learn assembly. Until then, you just don't truly understand addressing and memory use.
  • Re:c ? really? (Score:3, Insightful)

    by iamacat ( 583406 ) on Thursday May 24, 2007 @06:53PM (#19261775)
    a great deal of which you can implement in C in a completely controllable, transparent and maintainable manner

    So, when you have a small object with 10 methods, do you actually waste 40 bytes on 10 function pointers? Do you define your own virtual table structure, use var->vtable.move(var, x, y) kind of notation and require users to call non-virtual methods as ClassName_MethodName(obj, arg, ...)? What kind of C++ overhead are you avoiding here? How did you like the experience of incorporating a library whose developers implemented similar paradigms but different details and naming conventions into your code base?
  • Re:c ? really? (Score:5, Insightful)

    by WrongSizeGlass ( 838941 ) on Thursday May 24, 2007 @07:01PM (#19261897)

    And if you think you're going to get 30 years out of the technologies that are new now, then you're wrong there too.
    I've been coding in 'C' for 24 years, and unless OS's, drivers, embedded systems, et al., stop caring about performance, I think 'C' will outlast me in this industry (and probably outlive me, too).

    That's the double-edged sword that is the IT business. Keep learning, keep growing or start flipping burgers.
    I've coded in over 20 languages in my career, from assembly languages to proprietary 4GL's and everything in between, on more platforms than I have fingers. My library has programming texts older than most coders today. Keep learning? Great advice. Give up on 'C'? That's another story entirely ...
  • by unixpro ( 464350 ) on Thursday May 24, 2007 @07:04PM (#19261941)
    If you work in the kernel, you work in C.
  • by Anonymous Coward on Thursday May 24, 2007 @07:08PM (#19261973)
    "Dying" implies they were alive at some point.
  • by Neo Quietus ( 1102313 ) on Thursday May 24, 2007 @07:11PM (#19262033)
    At least in my case I also find this to be true. I first learned C++ way back in high-school, and just recently for my CS degree I took the "C/UNIX" class. I've barely opened the C book, because (although I realize that C++ came from C), C is just like C++, except that the useful classes and exceptions have been removed. Oh, and I have to define my indexing variable outside of my "for" loop.
  • Re:c ? really? (Score:5, Insightful)

    by grumbel ( 592662 ) <grumbel+slashdot@gmail.com> on Thursday May 24, 2007 @07:14PM (#19262075) Homepage
    ### C produces fast, tight code that so far,

    So can other languages, I don't think that is the main selling point of C. I think the main selling point of C is that it is by far the most "compatible" language of all. Python, Perl, Ruby and friends are all based on C, if you want to extend the languages, you do so by writing a module using their C API. If you want to simply call C functions, you can do so from many other languages as well be it Lisp, Ada or whatever. If you want to talk to the kernel you do so in C. No matter what language you use, sooner or later you come to a point where you have to fall back and either write C or interface with C code, since C is the 'real thing', while everything else is just an ugly wrapper around C code, trying to hide it, but often failing at doing so properly.

    As long as a ton of stuff is based on C code it isn't going away, especially not in the OpenSource world where basically everything is based on C.

    Maybe one day some Java or .net based OS will take over, but I don't see that happening for many years or decade(s?) to come.
  • by Anonymous Coward on Thursday May 24, 2007 @07:15PM (#19262099)
    Name one other system that does everything that Notes does. And I mean everything. That is, support not just email with databases tacked on, but the same kind of extensibility, replication and failover support, multi-platform support (native Windows, OSX and Linux clients), 300,000+ user level scalability and directory integration that Notes has.


    Exchange? Don't make me laugh.


    Name one.


    Yeah its easy to rag on Notes for poor UI design, but when it comes to an "enterprise wide" collaboration system, nothing else even comes close, whether you like it or not.

  • by QuantumG ( 50515 ) <qg@biodome.org> on Thursday May 24, 2007 @07:15PM (#19262101) Homepage Journal
    Everyone says they are dead, but they just won't go away!

    1. functional programming
    2. formal methods
    3. prolog
    4. LISP
    5. Scheme
    6. Smalltalk
    7. Pascal
    8. Tcl/Tk
    9. LALR parsing
    10. pre-bash shell scripting.

    and that's the real message here.. nothing is thrown away in computer science.. we're just too damn young a field to honestly say we've hit a dead end on any particular technology. Anything you can name, people have done work on it in the last 10 years.
  • by sootman ( 158191 ) on Thursday May 24, 2007 @07:16PM (#19262119) Homepage Journal
    1) knowing what extensions are
    - Both the fact that they exist in the first place AND what the different ones mean--"ooh, should I click on hotsex.jpg.doc.exe.scr.pif?"

    2) looking at the URL in the status bar before clicking on a link
    - Apple: I love you, but you SUCK for having the status bar off by default in Safari.

    3) knowing where downloaded files go
    - Every phone-based support call I've ever made:
    a) Painfully (see #4) navigate to a URL.
    b) Painfully (see #5) instruct user to download a file.
    c) Spend 5 minutes telling them where that file is on their computer

    4) the difference between \ and /
    - these people saw a backslash ONCE in their lives while using DOS about twenty years ago, and now every time I tell them an address, it's "Is that forward slash or backslash?" (Despite the fact that I've told them a million times that they'll pretty much NEVER see a \ in a URL.) This is usually followed by the question "Which one is slash?" God damn you, Paul [nytimes.com] Allen. [wired.com]

    5) the difference between click, right-click, and double-click
    "OK, right click on My Computer... no, close that window. Now, see the mouse? Press the RIGHT BUTTON..."

    6) the concept of paths, root directories, etc.
    - Why do I have to explain fifty times a day how to get from example.com/foo to example.com?

    Admins can get whatever skills they want--they picked the career, they can accept the fact that things change. The backends are usually handled by people with some know-how. It's the end-users that cause all the problems. It'd be like driving in a world where people didn't know how to use turn signals, didn't check their blind spots, didn't know they shouldn't talk on the phone while making complicated maneuvers--oh, wait, bad example.
  • by Opportunist ( 166417 ) on Thursday May 24, 2007 @07:24PM (#19262233)
    To quote my Guru: "When you learned C, and you mastered it, you have learned every (procedural) language there is, for it is easier to take a step down rather than up."

    It's pretty much true. Look at the other languages you "should" learn today. Perl, PHP, Python, C#, Java... When you know your C well, learning them is fairly easy.
  • Re:c ? really? (Score:4, Insightful)

    by jesuscyborg ( 903402 ) on Thursday May 24, 2007 @07:27PM (#19262267)

    I'm kidding, but only partially. I was a COBOL developer for lots of years, and I thought that COBOL would never die either. I would say "Too many companies are too invested ..." blah blah blah. I think that I actually even used the 'big iron' quote too when telling my friend how secure COBOL was. Um...I was wrong. I found that out with plenty of time to learn other stuff like Java and so forth but of course it's going to die.
    But you're forgetting the reason COBOL is dying is because the tech industry is moving away from the machines and operating systems that make use of COBOL. C will not die in the next half century because the tech industry is moving closer towards technologies that are built on C like GNU/Linux. If anything, C will become more prevalent.

    In the next fifty years I imagine C's role more or less becoming that of the "mother language". 90% of the time everyone will be using higher level languages like Perl, Ruby, and Haskell on their Linux computers, all of which are programmed in C. Programmers will only need C when they need to change their lower level system tools, or to write new ones that perform very efficiently.

    The only way I can see C dying is if a kernel comes along with a Linux compatible system interface that's written in a language suited better to the massively parallelized CPUs of the future. And once the kernel moves away from C, applications are bound to follow.
  • Re:c ? really? (Score:5, Insightful)

    by Plutonite ( 999141 ) on Thursday May 24, 2007 @07:57PM (#19262607)
    You are definitely right, and it's not just because what happened over the last 24 years of your engineering history is likely to carry over... it's also the nature of the language itself. There is a reason C (and C++) are so damn popular, and the reason is that they embody most, if not all, of what can be done with a general-purpose language. Things like Java and Python will stay for quite a while too, because the design there is more conforming to object orientation while keeping most of the general-purpose flexibility, but C and the various assembly languages will never die. It would require a rewrite of the entire architectural basis of computing to throw them out, and the theoretical part of computation theory does not need features that are unavailable here (yet). Anything that can be done in any language can be done (albeit less elegantly) with the aforesaid.

    No one shall expel us from the Paradise that Ritchie has created. (Apologies to David Hilbert.)
  • by Opportunist ( 166417 ) on Thursday May 24, 2007 @08:11PM (#19262785)
    A good network admin is sought after. And he will never be out of a job.

    Notice the "good" in the above statement, please!

    Unfortunately, network admins have already suffered for years from what we (programmers) are facing now: Clueless wannabes flooding the market. Sounds harsh, is harsh, but it's sadly true. Everyone who can spell TCP/IP and doesn't think it's the Chinese secret service calls himself a net admin. And since human resources usually can't tell a network cable from a phone cable, they hire the ones with the cutest looking tie. Or the one with the most unrelated certificates.

    Quite frankly, I have met so many people who claim to be net admins who know even LESS about networks than me. And I can barely cable my home net, and I can't solve the retransmission issues with my game machine that clog it. I do expect a lot from a net admin, granted, but for crying out loud, it's their JOB to know more about networks than I do, or I could do it myself!

    What you get today as a "network administrator" is some guy who can somehow, with a bit of luck, good fortune, a graphical interface and a step-by-step guide from the 'net, get the DHCP server on a Win2003 Server up and running. Don't bother trying to get a static IP or even a working DNS server from him. Not to mention that he'll look blankly at you when you ask him about splitting the 'net into smaller chunks. Anything in a netmask other than 00 or 0xFF (sorry: 0 and 255) is alien to him.

    That's not what I call a network administrator. That's what I call a clickmonkey.

    True network administrators who got more than an evening school degree are still rare. And they will have a job, with companies that know what to look for in a net admin.

    But the plague spreads. Recently we hired a "programmer" who doesn't know the difference between heap and stack. Or why inserting an inline assembler line of INT3 could do some good for his debugging problem.

    And we wonder about buffer overflow issues and other security problems in code? I stopped wondering.
  • by Tablizer ( 95088 ) on Thursday May 24, 2007 @08:18PM (#19262881) Journal
    I can only hope. Terrible, terrible language.

    In what way? I find it fairly nice for the following reasons:

    1. Makes it easy to integrate HTML with app code if you go that way (The "separation" argument is usually oversold in my opinion. The cost/benefit analysis often doesn't favor it if you do the probability estimation math. And one can still separate if you really want.)

    2. Very scriptish feel for rapid development. You don't have the verbose formality of some web languages and Java. (OO fans will not like it, I would note. I don't put much stock in OO. It is based on flawed assumptions about change patterns and developer psychology.)

    3. Pretty good database integration, almost plug-and-play.

    4. Graphing widget, editable grid widget, and tree widget are fairly good. (Its grid is far from perfect, but it is hard to find better web grids without spending a fortune.)

    5. Natural for web designers who know HTML to work with variable-embedded templates.

    I would recommend PHP over it because PHP probably has more staying power, but CF isn't bad for small and medium apps. It is how you program, not the language that makes the biggest difference.
           
  • Re:c ? really? (Score:5, Insightful)

    by Anonymous Coward on Thursday May 24, 2007 @08:29PM (#19262999)
    Several times I've been told "nobody does assembler anymore", and yet I still keep needing to use it. I'm not writing whole applications in assembler of course. But the reasons I keep having to do assembler are some of the same reasons that C is used. C and assembler may be less commonly used than in the past, but unlike COBOL, modern computer systems are still very heavily dependent upon them. Until we get a radically new form of computer architecture, which doesn't seem likely anytime soon, demand will remain for people who can write and maintain the guts of what happens underneath the applications. And you can't do that in Java or C# or Ruby.

    Computerworld Magazine, being an IT rag, is concerned about IT, not computer science or engineering. Thus it worries about product names, not categories. So they point out skills in SNA or Novell NetWare that aren't needed so much anymore, even though computer networking skills are even more popular and vital today than in the past. COBOL may be virtually dead, but dry and dusty applications for business purposes are alive and well and written by people who still wear ties.
  • Re:dovetail (Score:5, Insightful)

    by Watts Martin ( 3616 ) <layotl@gm[ ].com ['ail' in gap]> on Thursday May 24, 2007 @08:51PM (#19263257) Homepage

    I think you (and many others) are somewhat missing the point of the article, although the somewhat histrionic headline encourages a "miss the forest for the trees" reading.

    I don't think anyone is expecting C or even COBOL to vanish with the speed of PowerBuilder or NetWare; the issue is whether those are actually "growth markets" any more. The article is asserting they're not, and particularly in COBOL's case I'm pretty sure that's correct. COBOL will probably live on for quite some time, but you don't hear much about people deploying new COBOL projects -- you hear about them supporting existing ones that haven't been replaced.

    As for "but the OSes are written in C!" as a battle cry: well, yes, they are. But 25 years ago, they sure weren't: C was just too damn big and slow to write an operating system in. What's happened since then? Computers have gotten orders of magnitude faster, RAM and disk space have gotten orders of magnitude bigger, and of course compiler technology has also just gotten better. Couple that with the fact that operating systems and the computers they run on are just a lot more complicated -- having a higher-level language to deal with that, even at the system level, is a real advantage. There's nothing that prevents you from writing an operating system in assembly language now, but under most circumstances you probably wouldn't want to.

    The thing is, unless you want to assert that computers twenty years from now will not be much faster and have much more storage and be much more complicated, you can't assert that moving to a higher-level language than C will never be either practical or beneficial even at a system level. I don't expect C to go away or even be relegated to "has-been" status, but I suspect in the long term it isn't a growth skill. It's going to move more deeply into embedded systems and other arenas where small is not merely beautiful but necessary.

    The comparison with COBOL may be overstated, but it may not be completely inapt: the fact that there are still COBOL jobs out there and they may actually be fairly high-paying ones doesn't mean that going to school, in 2007, in preparation for a career as a COBOL developer is a bright idea. The same isn't as true for C, but I'm not convinced that's going to stay true for that much longer, let alone indefinitely.

  • by swordgeek ( 112599 ) on Thursday May 24, 2007 @08:58PM (#19263319) Journal
    I had the same thought about 'network admin' being on the list, but I'm curious about precisely what they mean by 'PC network admin.'

    A good network admin (hell, even a bad one) has enough equipment and required comprehension these days that they can't be too worried about the intricacies of some OS's non-standard quirks. In that sense, the PC network admin is going the way of the Appletalk admin.

    On the other hand, true network admins are absolutely crucial to most companies, and I've been lucky enough to work with a good number who understand their roles very well. Sounds like you haven't, which is a pity. Rest assured, they're out there.
  • Re:c ? really? (Score:5, Insightful)

    by 644bd346996 ( 1012333 ) on Thursday May 24, 2007 @09:07PM (#19263401)
    Sure, some things still need to be done in assembly, but they are always wrapped in a C api. And that is why C will not die until computer architectures change drastically. C is close enough to the hardware that you really know what is going on, and you can control it directly. Languages like Java, which eschew pointers and mandate that all code be in a class, can never replace C.

    In fact, no language that isn't pretty much C can replace C. If it doesn't give you the control over pointers and memory allocation that you have with C, it won't work as a replacement. If it does have those things, it is not going to replace C unless it is a backwards compatible extension like C++ or Obj-C.
  • Re:c ? really? (Score:5, Insightful)

    by fishbowl ( 7759 ) on Thursday May 24, 2007 @09:13PM (#19263447)

    >I'm surprised that Fortran didn't make the list.

    I would have been, but working in scientific research I've discovered a couple of things that are surprising:

    1. People actually do calculus on the whiteboard for reasons other than taking a math class.

    2. Lots of people actually use FORTRAN. Even people whose Java, C, C++, Perl, Ruby, etc. skills are such that I look up to them -- and they have solid arguments for using FORTRAN, at least for certain kinds of numerical computing.

    But here's the thing: There are separate worlds. In one world, the idea of using calculus on a daily basis is simply never a consideration. You learn enough to finish college, and that's the end of it. Likewise, there's a world where numerical computing and arbitrary precision and optimized complex arithmetic are actually primary considerations and not just hypothetical things.

    I never understood this until I found myself in that world. And I wouldn't have believed you if you told me that people who know other languages, choose FORTRAN even when given a choice.

    But what I take from it, is that there are requirements that are met by FORTRAN which are not met by languages that offer more comfortable grammars.

    People (myself included) will argue that, for instance, C can do anything that FORTRAN can do, in a much happier grammar (opinion, mine, widely shared), but the thing is... while that's strictly true, a lot of the things that seem tangential or irrelevant, turn out to be *crucial*, where seriously optimized math support is the core of the application. FORTRAN makes guarantees on the kinds of things that are implementation dependent in C.

    Anyway, there's no shortage of FORTRAN programmers. It's quite easy for a skilled programmer to learn FORTRAN, once you get past the 'WTF' factor and can accept that it's relevant in today's world, at least when your problem space is a good fit for the language.

    COBOL or PL/1 and the like, make another story entirely. My experience has been that the role of COBOL has been replaced by the combination of transitions to modern RDBMS, decentralized business processes as a side-effect of the whole ubiquitous "PC" adoption, and the adoption of, for example, Enterprise Java. That covers one end of the spectrum, and the other end (the big corporate end) is covered by the evolution of vertical systems providers (e.g., Peoplesoft, SAP, SAIC).

    Back on topic: If there's a university CS program that gives degrees without courses in Operating System and Compiler design taught in C, I'd love to hear about it. No way are C programmers in decreasing supply. If nothing else, the million or so open source projects have created a whole generation of self-taught folks who know C.
  • by fishbowl ( 7759 ) on Thursday May 24, 2007 @09:50PM (#19263873)
    How do you get out of university without taking an architecture course that gives some assembly language, at least for a hypothetical machine?

    If you claim to have a BS in CS at the interview table, but didn't suffer through, e.g., a computer organization course (like Hennessy and Patterson style, which is common these days), didn't have a course where you developed an operating system, didn't design a language starting from BNF and build a compiler for it, didn't take 2 years of a lab science, didn't at least come close to a math minor, didn't have at least 4 courses at various levels of discrete math, automata, algorithm analysis, and didn't have a course that, as a final project, you deliver a significant user app in a high level language (on the order of an original multiplayer game, let's say)... I'd say your school has some explaining to do.

    Seriously, what school can you go to and somehow avoid having a significant background in several languages and paradigms, including but certainly not limited to asm, C, and either C++ or Java, if not both? I don't expect everyone to take an elective where they compare Lisp, Scheme, Ruby, Haskell, and Icon. And I realize that there's often a variety of senior-year choices, and some people don't take Databases choosing HighPerf/Parallel/Distributed computing, say, or 3D Graphics. But there are some basic things that you'd better have done, or else, despite having the piece of paper hanging on your office wall, you're not done with school!
  • Re:c ? really? (Score:5, Insightful)

    by atomicstrawberry ( 955148 ) on Thursday May 24, 2007 @10:06PM (#19264013)
    When you say 'based on C', do you mean that the compiler / interpreter is written in C, or that the language itself is derived from C? Because technically Ruby is based off Perl and Smalltalk, not C. The Perl side of things can be traced back to C, but Smalltalk's origins are in Lisp.

    However they are all implemented in C, as is PHP. In fact, I'm reasonably confident you'll find all of the web languages that the article declares are taking over are implemented using C. As is Apache, which is the backbone of the majority of internet servers. In fact, pretty much everything that provides important infrastructure is written in C.

    There may be demand right now for programmers that know the latest fad high-level language, but the demand for competent C programmers has hardly disappeared. The only reason that C would die is if another fast, portable, general-purpose language like it came along that offered significant benefits over C. I can't personally see that happening any time soon.
  • by glwtta ( 532858 ) on Thursday May 24, 2007 @10:19PM (#19264143) Homepage
    Neither are C, ColdFusion, or NetWare certification - programming and software design are skills, as is network administration; what they list are called tools.
  • Re:c ? really? (Score:4, Insightful)

    by Khashishi ( 775369 ) on Thursday May 24, 2007 @11:10PM (#19264629) Journal
    Any programmer worth his salt can pick up a new language in a couple hours. Hell, most languages today are just ALGOL with some syntactical refinements, and if you know one, you know them all. I'm not worried if Java or C or Matlab dies out. What separates programmers is ability, not language experience.
  • by syousef ( 465911 ) on Friday May 25, 2007 @12:08AM (#19265121) Journal
    Learning languages quickly isn't hard. It's the libraries and the patterns that you must follow that require months of experience. The standard C libraries are very small compared to what you're expected to know say on a J2EE project today.
  • by adamjaskie ( 310474 ) on Friday May 25, 2007 @12:14AM (#19265169) Homepage
    I don't understand this whole "computers are faster; why bother making things run fast?" thing. Why can't we keep writing efficient code, run it on the faster modern machines, and have things actually GO FASTER? It seems that as computers get faster, application programmers get lazier, and everything runs at the same pace. What used to take 20 cycles now takes 4000 cycles, but those 4000 cycles happen in the same time as the 20 cycles. Is that an improvement? Not in my book.
  • by neonsignal ( 890658 ) on Friday May 25, 2007 @01:45AM (#19265975)
    How about a top 10 list of computer skills we'd like to see die?

    1. Mass marketing (also known by the fuzzy name 'spam').
    2. Ability to piss someone off with an email that was meant to be friendly.
    3. Documenting with the text "someone needs to fill this bit out".
    4. Finding the Caps-Lock; wasted brainspace for a useless key.
    5. Coding of Flash advertising.
    6. Writing bubblesorts... and inline.
    7. Industrial design that puts the reset button near one's knee.
    8. Being able to press the Ctrl-Alt-Key without thinking.
    9. Internal cable engineering that enables leads to be plugged in reverse.
    10. COBOL; because it is the vampire that needs a stake through the heart.

    Flip, why stop there. Let's go for the top 100.
  • Re:c ? really? (Score:5, Insightful)

    by Kjella ( 173770 ) on Friday May 25, 2007 @02:35AM (#19266309) Homepage
    Any programmer worth his salt can pick up a new language in a couple hours.

    If you mean whether it's curly braces or brackets or none at all and the syntax of basic control flow, then yes. If you mean being familiar with the standard library, the development tools and all the specific bits (Java generics, C++ templates, take your pick)? No.
  • by mcvos ( 645701 ) on Friday May 25, 2007 @03:47AM (#19266719)

    Why can't we keep writing efficient code?

    Economics. Computers are cheaper than programmers, so efficiently writing code is more important than writing efficient code.

    Actually, Java is hardly slower than C++ these days, so for most purposes, you can write pretty efficient code in higher level languages. C/C++ will remain for the really low-level stuff that you simply can't do in Java, and for the high-performance libraries where even the slightest speed gain will pay off in the end.

  • Re:c ? really? (Score:5, Insightful)

    by shutdown -p now ( 807394 ) on Friday May 25, 2007 @04:44AM (#19267003) Journal

    Any programmer worth his salt can pick up a new language in a couple hours. Hell, most languages today are just ALGOL with some syntactical refinements, and if you know one, you know them all.
    Surely you meant to write, "all ALGOL-family languages"? Which includes most mainstream ones: C/C++, Java, C#, Python, Perl, BASIC etc. But you can't easily jump from C++ to Lisp, Erlang, Prolog or FORTH (just to name a few) easily, because they are different. Then there are people having troubles moving from class-based OOP (Java) to prototype-based (JavaScript). Etc... there is a lot out there, and it does differ.
  • by Old Wolf ( 56093 ) on Friday May 25, 2007 @04:57AM (#19267069)
    Don't get me wrong, I love C, but there is absolutely no reason to _still_ be using C in the _21st century_ when you could be using Embedded C++

    Embedded C++ is the worst of both worlds, IMO. It is more like C with some syntactic sugar. It removes all of the good features of C++ such as namespaces and templates.

    to get rid of the stupid "typedef struct" type declarations,

    You don't have to write "typedef struct" in C. Simply 'struct X { ...... };' works just fine.

    and other C idioms such as implicit int, no proper bool support, limited variable declarations, etc.

    Those things were all corrected by ISO/IEC 9899:1999, which came out just a few months after the C++ standard (and long before EC++).

    Depending on your real-time nature constraints, you'll want to turn off RTTI, Exceptions, Virtual Funcs, and Multiple Inheritance and use C++ as a better C to _at least _ get some better compile time type safety.

    People often say that, but I've yet to see any code example of how C++ has better compile-time type safety (assuming you are not talking about the use of templates for generic programming). The only thing that comes to mind is that in C++ you can not implicitly convert from (void *) to some other pointer type, but in C++ you would almost never use a (void *) anyway so it seems rather moot.
  • by swilver ( 617741 ) on Friday May 25, 2007 @05:12AM (#19267155)
    Actually, I think it's not as bad as you think -- you assume that a 10 times faster CPU actually performs 10 times faster in every way; unfortunately it usually means that the CPU wastes 10 times more cycles waiting on stuff. Yes, there are lots of things that used to take 20 cycles that now take 4000 cycles, but mostly those things are not time-critical anyway (like user interaction, anything that has to access the network or hard disk -- webservers spring to mind). For example, does it really matter if my GUI code runs in 20 cycles vs 4000 cycles? No, it won't make any difference at all -- the bottleneck here is how fast a user can click (and how fast the graphics card can render all those "special" effects users seem to love these days). Optimizing that code to run in 20 cycles won't give any performance increase at all -- optimizing it would not only be a waste of time, but would be like fixing something that already works -- the best case you can hope for is to get a faster piece of code with as many bugs as the original... usually however it will introduce new bugs.

    Now let's take this a bit further -- how much of a performance hit do you take when you access memory that is not in the CPU's cache (or 2nd level cache)? The CPU will have to wait for the memory to be available... optimizing code that frequently accesses memory outside the cache would be useless (and would just mean the CPU has to wait a bit longer). Let's take quicksort, the algorithm isn't particularly hard but accesses memory a lot. Would it matter if one iteration takes 20 cycles or 40 cycles on a modern CPU (let's assume that's the difference between C and Java)? It will make little difference, the CPU has to wait for data anyway. In the end, even in such a low level algorithm, it will make little difference whether we used a very efficient piece of code, or a slightly less efficient one -- the bottleneck is the memory. In other words, as long as the algorithm you use is the same, both pieces of code should be about as efficient.

    The only time optimizing is still worth it is when you are doing stuff in tight loops that isn't randomly accessing memory for all kinds of reasons (and which of course is used to do a lot of bulk processing, like video encoding) -- it's hard to even think of a good example, but I suppose it might be worth using more efficient code in signal processing, compression/decompression and rendering applications. Even in those cases however a lot of stuff is handled in optimized libraries for higher level languages.. I mean, it won't make any difference if I use Bash (horribly inefficient!) to call my favourite Unzip program to unzip a multi-megabyte file, or whether I wrote a C program to do the same. It would still take as long.

  • by bscott ( 460706 ) on Friday May 25, 2007 @05:19AM (#19267189)
    0. Tweaking IRQs on PC clones to let soundcards work with any other card
    1. Knowing how to drop certain types of home computer to re-seat the chips
    2. Inserting 64k RAM chips with your bare hands to expand memory
    3. Cutting a notch in 5-1/4" floppies to use the other side
    4. Adjusting graphics by hand to NTSC-legal colors for decent video output
    5. Editing config.sys to push drivers into HIMEM in order to free up memory
    6. Crimping your own RJ45 connectors to save money
    7. PEEK and POKE locations to do cool stuff on the Commodore 64
    8. Manually configuring a SLIP connection to connect to the Internet (in pre-Winsock days)
    9. Removing adjectives and punctuation from code comments to fit into 1k of RAM
  • Re:c ? really? (Score:3, Insightful)

    by Sir Toby ( 660923 ) on Friday May 25, 2007 @06:04AM (#19267401) Homepage


    I think you will only need to wait till 2100

    Or 2050, when all of the following logic begins to fall to pieces:

    if ( x > 50 )
          return 1900 + x;
    else
          return 2000 + x;
  • by ShannaraFan ( 533326 ) on Friday May 25, 2007 @07:50AM (#19267899)
    I wish I could shake your hand... In my company, we have monthly production outages during which system maintenance is performed and new code is deployed. During the last one, we also upgraded the production database server to a new 16-core 128GB RAM SAN-attached IBM server running SQL Server 2005. Big ol' nasty machine, new database engine, updated DB statistics. Sunday morning, as systems were brought back online and the "world" was beginning to reconnect to the database, Perfmon began to show high CPU utilization. This continued until all 16 cores were pegged at 98-99%. We're all tired, crabby, disappointed, and now management level folks are waking up and beginning to panic.

    The problem turned out to be a new table/stored procedure combo that was part of the new code deployment. The table was missing a critical index. A simple 40-second CREATE INDEX statement produced a near-vertical drop of the CPU metric in PerfMon, from 98% to less than 10%, where it has remained most of the week. Faster hardware is not always the answer!!!!

  • Re:c ? really? (Score:3, Insightful)

    by Junta ( 36770 ) on Friday May 25, 2007 @09:31AM (#19268861)

    Hard core? It's only a compiler. Besides, there are only two architectures that matter, any more: x86 and x86-64.
    Yes, because the 17 billion dollar a year industry in Sparc/Power based systems doesn't exist.
  • by cabazorro ( 601004 ) on Friday May 25, 2007 @09:48AM (#19269085) Journal
    If programs were houses, C would be the hammer.

    A developer who doesn't know C is a contractor fumbling with a hammer.

    Don't get me started on which language is the nail-gun, but the idea that the
    hammer is going away is, by any stretch of the imagination, purely idiotic.

  • by nametaken ( 610866 ) on Friday May 25, 2007 @10:52AM (#19270001)
    Excellent point.

    It also broadens the pool of available programmers. I work for a small business. I know I'm not a great (or good, probably) programmer, but I write all kinds of applications for the company I work at. I certainly try, but I know there are probably 1,000 ways to do what I do better.

    So why does the company allow me to write our stuff? Because we're a small company and we could never justify hiring those great programmers for every little thing we'd like to have. It's either me, the guy who probably doesn't always know the best way, or not having it at all. In the meantime, like you said, a workstation costs what a workstation costs... it's not like we're dumping extra money into hardware because of my code.

    And the people who use my software? They love it. It gets the job done well (because it was designed the way they want it) and it all works fast enough. Geocoding software, log parsing and reporting, trivia engines w/ web services for multiple locations, automated RFP systems that integrate with SalesForce, mailing apps, shopping carts, document libraries, etc... all things they've gotten in the last 11 months that they probably wouldn't have purchased or hired someone to develop, but I can knock out for them in no time while still fulfilling my actual job duties. That makes me pretty damn affordable, considering I'm already worth my salary for my regular job there.

    BTW, you can all blame VisualStudio and the .Net framework for making situations like this commonplace... I've seen it all over. It's really TOO easy to write software nowadays! Oh, and nobody panic, I know my code isn't good enough for the public... you won't be dealing with any of my bugs anytime in the near future. :)
  • My predicition: (Score:3, Insightful)

    by Ihlosi ( 895663 ) on Friday May 25, 2007 @10:53AM (#19270009)
    C will still be dying when most languages that are alive and kicking right now are already buried.
  • Re:c ? really? (Score:2, Insightful)

    by spirit of reason ( 989882 ) on Friday May 25, 2007 @11:13AM (#19270311)
    C is used all over in systems-level programming and in applications where speed is critical (but not so much as to merit the much-lengthened development time of using assembly). There's more to the computer than your silly "web 2.0" applications; they're just one layer of the abstraction.
