Education IT Technology

Top 10 Dead (or Dying) Computer Skills

Lucas123 writes "Computerworld reporter Mary Brandel spoke with academics and head hunters to compile this list of computer skills that are dying but may not yet have taken their last gasp. The article's message: Obsolescence is a relative — not absolute — term in the world of technology. 'In the early 1990s, it was all the rage to become a Certified NetWare Engineer, especially with Novell Inc. enjoying 90% market share for PC-based servers. "It seems like it happened overnight. Everyone had Novell, and within a two-year period, they'd all switched to NT," says David Hayes, president of HireMinds LLC in Cambridge, Mass.'"
  • True story... (Score:5, Interesting)

    by KingSkippus ( 799657 ) * on Thursday May 24, 2007 @06:02PM (#19260925) Homepage Journal

    When I started working at the huge multinational company I work at now, there were three things that I had very little experience with that everyone swore would last at the company for decades to come: Token Ring, Netware, and Lotus Notes. I insisted that within the next few years, these technologies would be dead and the company would have to change, and I was constantly reminded of the millions of dollars invested in them.

    It's eight years later. We have no Token Ring network. We have no Netware servers. I'm doing my damned best to convince people of how badly Lotus Notes sucks, and most everyone agrees, but we have a Notes support team that really likes their jobs and somehow manages to convince upper level management that it would cost billions of dollars to change to a real e-mail and collaboration solution. But I'm still holding out hope.

    God willing, Lotus Notes will soon be on this list as well.

  • Delphi (Score:3, Interesting)

    by CrazyTalk ( 662055 ) on Thursday May 24, 2007 @06:08PM (#19261047)
    Anyone out there still use Delphi? Does it even exist anymore? I'm a bit nostalgic for it - that was my first professional programming gig.
  • by Danga ( 307709 ) on Thursday May 24, 2007 @06:23PM (#19261313)
    > COBOL may be dying, but it's lingering on...

    You are 100% correct, and as the article mentioned, COBOL is not only still used at many companies but also taught in some universities' Computer Science programs, including the one I graduated from, Northern Illinois University in DeKalb. Here are two examples:

    http://www.cs.niu.edu/undergrad/coursecat.html#250 [niu.edu]
    http://www.cs.niu.edu/undergrad/coursecat.html#465 [niu.edu]

    There are A LOT of companies that still use COBOL out there (I saw many of them at every job fair I went to) and the language is far from dead. Thankfully I didn't have to go the route of being a COBOL programmer and found a job I love doing C/C++ development, but at least I have the option, and I definitely did learn a lot about the language as well as mainframe programming from taking the COBOL classes.

    Another great class they teach at NIU is Assembler on an IBM System 390. That class was HARD, but I loved the experience and knowledge it gave me regarding how a computer works at the lower levels, and I wouldn't trade that experience for anything. Here is more info on the assembler class:

    http://www.cs.niu.edu/undergrad/coursecat.html#360 [niu.edu]

    While I am not exactly happy that COBOL is still around, the fact is that it is going nowhere anytime soon.
  • I'd vote for FORTH (Score:1, Interesting)

    by Anonymous Coward on Thursday May 24, 2007 @06:24PM (#19261323)
    In the '80s I became a professional developer, primarily using FORTH in embedded systems. Unfortunately, after spending about 8 years becoming a master FORTH programmer, I had to move on to another company. Sure, I've become fluent in C, C++, Python, Ruby. But none of these other 'languages', not a single one, allows me to express the solution to a problem as succinctly, elegantly, and beautifully as FORTH. I know there are still a few FORTH jobs around, just none where I happen to be living these days. And to me this is a crying shame.
  • by jd ( 1658 ) <imipak@yahoGINSBERGo.com minus poet> on Thursday May 24, 2007 @06:26PM (#19261339) Homepage Journal
    Cobol has died back as much as it's going to, same as Fortran. It won't reduce in scale any further, because of maintenance requirements, so it is meaningless to say it is "dying". It's a stagnant segment, but it's a perfectly stable segment.

    Non-IP networks are dying? Must tell that to makers of Infiniband cards, who are carving out a very nice LAN niche and are set on moving into the WAN market. Also need to tell that to xDSL providers, who invariably use ATM, not IP. And if you consider IP to mean IPv4, then the US Government should be informed forthwith that its migration to IPv6 is "dead". Oh, and for satellite communication, they've only just got IP to even work. Since they weren't using string and tin cans before, I can only assume most in use are controlled via non-IP protocols and that this will be true for a very long time. More down-to-earth, PCI's latest specs allow for multiple hosts, and it is becoming a LAN protocol. USB, FireWire and Bluetooth are all networks of a sort - Bluetooth has a range of a mile, if you connect the devices via rifle.

    C programming. Well, yes, the web is making pure C less useful for some applications, but I somehow don't think pure C developers will be begging in the streets any time soon. Device driver writers are in heavy demand, and you don't get far with those if you're working in Java. There are also an awful lot of patches/additions to Linux (a pure C environment), given this alleged death of C. I'd love to see someone code a hard realtime application (again, something in heavy demand) in AJAX. What about those relational databases mentioned earlier in the story? Those written in DHTML? Or do I C an indication of other languages at work?

    Netware - well, given the talk about non-IP dying, this is redundant and just a filler. It's probably right, but it has no business being there with the other claim. One should go.

    What should be there? Well, Formal Methods is dying, replaced by Extreme Programming. BSD is dying, but only according to Netcraft. Web programming is dying - people no longer write stuff, they use pre-built components. Pure parallel programming is dying -- it's far more efficient to have the OS divide up the work and rely on multi-CPU, multi-core, hyperthreaded systems to take care of all the tracking than it is to mess with very advanced programming techniques, message-passing libraries and the inevitable deadlock issues. Asynchronous hardware is essentially dead. Object-Oriented Databases seem to be pretty much dead. 3D outside of games seems to be dead. Memory-efficient and CPU-efficient programming methods are certainly dead. I guess that would be my list.

  • Re:dovetail (Score:5, Interesting)

    by StarvingSE ( 875139 ) on Thursday May 24, 2007 @06:32PM (#19261433)
    I've said it before, and I'll say it again... lists like this are ridiculously stupid and not thought out. It's like "hey, this is old, it must be obsolete."

    The first two items on the list made me not want to take the author seriously. The financial business is run on COBOL and flat files, and will continue to be for some time. The language is not pretty, but it was made for a specific purpose and it does it well. In fact, seven years after Y2K, demand for COBOL programmers has risen dramatically as people retire. I know people who were asked to come out of retirement to work on COBOL again, for very high salaries, because it is not taught to us youngens anymore.
  • ColdFusion (Score:4, Interesting)

    by Goyuix ( 698012 ) on Thursday May 24, 2007 @06:42PM (#19261581) Homepage
    Wow. I didn't actually expect this to be on the list, but I am not at all surprised. We use it at my work as the primary web platform, and I can assure you of two things regarding the language: 1) it is very hard to find someone with development skills using it, and 2) the ones who do have the skills are VERY expensive. That seems to go along nicely with the theme of the article that it is in fact a dying skill. While I personally have never developed much of a taste for it (I do post on /. after all - it would be like heresy / blasphemy), there are a few long-time developers here that have an unholy allegiance to it, almost completely unwilling to even look at alternate environments or frameworks. I would guess that is probably similar for many of the languages/skills on this list and their long-time supporters.
  • Typing (Score:3, Interesting)

    by radarsat1 ( 786772 ) on Thursday May 24, 2007 @06:45PM (#19261625) Homepage
    I was just copying down a quote from an article I was reading, and I realized that my typing skills haven't really been up to scratch, even though I spend hours and hours on the computer every day. For programming and general writing, I spend a lot more time thinking than actually writing, plenty of time to fix typing mistakes. Rarely do I ever just copy something directly, but this time I happened to be putting a long block quote into my document.

    It got me thinking.. secretaries used to be hired just based on their typing skills. Speed & accuracy. I remember when I took a typing class in high school the teacher made us cover the keyboard so we couldn't look at it while we were typing, and we especially weren't allowed to use the delete key so she could mark us on how many errors we made.

    But it's funny, that's so backward, of course. Since typewriters are no longer used, your typing speed _includes_ the time it takes to hit the delete key and fix what you did wrong. Your time further increases if you have to look at the screen and then find your place in the text. So typing speed is now the only thing that counts...

    Now add into that the fact that the days of the boss dictating memos to the secretary are mostly gone, and typing is really a skill that no longer matters. It certainly helps in day-to-day computer tasks, but it's no longer a make or break skill for IT and office people.
  • Re:dovetail (Score:3, Interesting)

    by MightyMartian ( 840721 ) on Thursday May 24, 2007 @06:54PM (#19261797) Journal
    Talk to me again in thirty years. I never said Cobol wasn't being phased out, but the sheer mass of code written in it means that Cobol programmers have probably got it very safe for a few decades to come.
  • Re:ColdFusion Dead? (Score:4, Interesting)

    by funkdancer ( 582069 ) <funkyNO@SPAMfunkdancer.com> on Thursday May 24, 2007 @07:03PM (#19261929)
    I beg to differ. Whilst I agree that there are some shocking solutions out there, I develop applications using the http://mach-ii.com/ [mach-ii.com] framework and it makes for a great development platform; since version 6 (MX), ColdFusion has supported components that allow for using object-oriented development principles.

    If one were to use crappy solutions as an argument, how come anyone is still using PHP, ASP, etc.? I think ColdFusion has copped it more than most due to its low barrier to entry, but just because one _can_ make a shit solution using a platform doesn't mean the platform is shit.

    Have been using ColdFusion for 11 years now and it keeps getting better. It's a great second language to know in addition to .net or Java. Unfortunately & ironically, the low availability of developers pushes the case for enterprises to phase the solutions out, as it is too hard to get people who are good at it - even though it is a perfect fit for a lot of the work.
  • Re:c ? really? (Score:4, Interesting)

    by Rakshasa Taisab ( 244699 ) on Thursday May 24, 2007 @07:04PM (#19261937) Homepage
    We've already gotten 35 years out of C, and it is still going strong. Not as much used as it used to be, but not insignificant.

    In addition to that, you could say that the C we have today is an old 'stable' fork of that language, which is now moving ahead in the form of C++.
  • 30 year technologies (Score:3, Interesting)

    by Colin Smith ( 2679 ) on Thursday May 24, 2007 @07:09PM (#19261997)
    Off the top of my head:

    Unix, shell scripting, C. There must be more.

    Just a thought, but it makes sense to invest skills in technologies with proven survivability.

     
  • by Anonymous Brave Guy ( 457657 ) on Thursday May 24, 2007 @07:10PM (#19262005)

    The curse of good presentation skills is that no-one ever notices that you've used them, because you're good at presentation. :-)

    I'm currently having a similar debate at the office. We're working on a new tool, effectively a web front end for a simple database with a few handy UI gimmicks. In the grand scheme of things, it's a pretty simple tool, but it's part of a routine business process and literally thousands of people are going to be using it for a few hours each month.

    At a progress meeting yesterday, one of our (cross-discipline but manager-dominated) team suggested that we should skip the next couple of weeks of detailed design work and just go with something OK, because it will save a few days of our time and so be available faster. It doesn't seem to occur to them that all that time is the equivalent of mere minutes for each person who will be using the tool. If we identify a simple usability improvement that saves every user ten seconds the first time they use the tool, just because they find something a little quicker, then that has bought us the equivalent of about two solid weeks of one person's time to spend designing and implementing that improvement.

    So it goes with typography and graphic design in documents. Poor choice of fonts or use of whitespace = hard to read on-screen = wasted staff time. Awkward page layout = people can't follow the text = wasted staff time. Poorly drawn diagram = distracted reader = wasted staff time. Poorly typeset equation = delayed understanding while the reader figures out what it says = wasted staff time. And so on...

    Unfortunately, presentation skills are one of those things where people don't really notice the subtleties. If something is poorly presented then the viewer will still read something more slowly, or misunderstand what it says, or not remember it as well later, but they probably won't realise what they're missing.

  • Re:dovetail (Score:3, Interesting)

    by Chibi Merrow ( 226057 ) <mrmerrow AT monkeyinfinity DOT net> on Thursday May 24, 2007 @07:14PM (#19262069) Homepage Journal

    > I'll give you a dollar if you can find me ONE SINGLE deployment of mainframe COBOL that is being used for business that is not in the process of being phased out and redeveloped. Also, find me one person that knows and writes COBOL that doesn't hate his/her job. Worst language EVER. (Hollerith cards aside)


    You owe me a dollar. The IT department I used to work at runs most of the organization-wide stuff on an IBM OS/390 monolith, written in legacy COBOL code. Any of the programmers that worked there that did hate their job didn't hate it because of COBOL, believe me. Instead of "phasing out" the system, they got a nice new flashy IBM server to run a virtualized version of the mainframe seamlessly. Will the code be phased out one day? Probably. Not because of the language, but because the organization has scaled so much and the communication capacity available to it has grown exponentially, so problems like this can be approached from new directions. Plus, COBOL coders are getting harder to find. But any COBOL gurus could probably eat very well for the next 10 years there, since that's probably the earliest that system will be "phased out".

    And calling it the worst language EVAR is a bit disingenuous... It's very very good at what it does: business logic. I wouldn't want to write an operating system or a thermonuclear explosion simulation in it, but I wouldn't want to write check printing routines in C, either. :)
  • I dare to disagree (Score:3, Interesting)

    by Opportunist ( 166417 ) on Thursday May 24, 2007 @07:22PM (#19262193)
    Cobol had one huge disadvantage over C: it was not a "system" language. It was an application language. Whether it was good at that is up for debate - it certainly was better than most alternatives there were - but it was dependent on the applications it was used for.

    When the applications died, the language followed. I dare say, ABAP is going to suffer the same fate as soon as SAP wanes and The Next Big Thing comes along. 'til then, it is a get-rich-quick scheme in IT if there ever was one, granted.

    C, in its "pure" form, is certainly not going to last forever either. But C-derivates will be driving the systems for the forseeable future. C (and C++) offers a good balance between closeness to hardware, so you can actually write low level programs in it, and readability, while still maintaining some basic inter-platform compatibility (unless you want to get really, really close to the system). Even if you can't port a program seamlessly, you can at the very least apply your skills to the compiler on a different platform, something that's not a given with languages that get closer to the core.

    C is probably not going to last. But its successors, C++ and C#, will. Well, C++ will, for sure. Whether C# is just another fad that gets a lot of hype now and gets dumped soon, time will tell.
  • Re:Raising the bar (Score:4, Interesting)

    by Opportunist ( 166417 ) on Thursday May 24, 2007 @07:30PM (#19262305)
    Well, not necessarily so. It does help to see the "wiring under the board", granted, and to see the difference between static and dynamic allocation, especially in subroutines (given the number of buffer overflow exploits, I guess quite a few people at MS still don't know it). But to understand pointers, or why they can come in very handy, you should rather learn some theory about trees, lists and so on. It will not "show" you where your memory is allocated, but it will certainly give you an idea of just why and how pointers are very useful.
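    To make that concrete, here is a minimal sketch (names invented purely for illustration) of the classic list exercise: every node holds a pointer to the next node, and insertion is nothing more than rewiring those pointers, which is exactly the kind of thing that makes pointers click.

        // Illustrative sketch only: a tiny singly linked list in C++.
        #include <iostream>

        // One node of a singly linked list: a value plus a pointer to the next node.
        struct Node {
            int value;
            Node *next;
        };

        // Push a new value onto the front of the list by rewiring a single pointer.
        Node *push_front(Node *head, int value) {
            return new Node{value, head};
        }

        int main() {
            Node *head = nullptr;               // empty list
            head = push_front(head, 3);
            head = push_front(head, 2);
            head = push_front(head, 1);         // list is now 1 -> 2 -> 3

            // Traverse the list by following the next pointers.
            for (Node *n = head; n != nullptr; n = n->next)
                std::cout << n->value << " ";
            std::cout << "\n";

            // Free the nodes.
            while (head != nullptr) {
                Node *next = head->next;
                delete head;
                head = next;
            }
            return 0;
        }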
  • Re:c ? really? (Score:2, Interesting)

    by 91degrees ( 207121 ) on Thursday May 24, 2007 @07:30PM (#19262315) Journal
    Nope. I know a C programmer. He developed code that was launched on an Ariane 5 a couple of months ago. As long as you're good, there's a C programming job. You'll probably need to know a little assembly as well, but I don't know many C programmers without at least some idea of the code their compiler produces.

    I could do that sort of thing but felt it was worth my while learning C++.
  • by xenocide2 ( 231786 ) on Thursday May 24, 2007 @08:00PM (#19262639) Homepage
    Ironically, most of Google's code has been rewritten to MapReduce style. Interestingly, the core functionality is written in C++. I guess it helps make the function-oriented system a bit simpler to implement. The end result is that most of their programmers don't focus on savoring every cycle. Today's computers can run something like a million instructions in the time it takes to serve a single disk read; making code that can be run on 1800 nodes is far more important than how many cycles your uniprocessor code can shave off. Fewer instructions is better, but there's far more performance gain to be had from better locality -- process the data on the node that holds it.

    The rest of the things you've mentioned all predate C++, let alone newer languages. And most don't use assembly at all (apache doesn't, and I doubt BIND or sendmail does). The UNIX kernels don't use assembly because it's faster (though it helps), but because there are things that must be done that are platform-specific, and C's provided way of handling those cases is assembly. But generally, x86 computers are changing too rapidly to make any optimizations worthwhile. Just write clear code and let the compiler handle the dirty work. If you don't think it's good enough, improve the compiler. But usually the people who claim to "write their inner loops in ASM" have trouble making solid contributions to compiler technology.
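    For anyone who hasn't seen the MapReduce style mentioned above, a minimal single-process word-count sketch of the decomposition might look like the following (names are invented, and all of the distributed machinery - partitioning across nodes, re-execution on failure - is left out; the point is only the map/shuffle/reduce split).

        // Illustrative sketch only: the map/shuffle/reduce decomposition on one machine.
        #include <iostream>
        #include <map>
        #include <sstream>
        #include <string>
        #include <utility>
        #include <vector>

        // Map: emit a (word, 1) pair for every word in one input record.
        std::vector<std::pair<std::string, int>> map_phase(const std::string &record) {
            std::vector<std::pair<std::string, int>> out;
            std::istringstream in(record);
            std::string word;
            while (in >> word)
                out.emplace_back(word, 1);
            return out;
        }

        // Shuffle: group all intermediate values by key.
        std::map<std::string, std::vector<int>> shuffle(
                const std::vector<std::pair<std::string, int>> &pairs) {
            std::map<std::string, std::vector<int>> grouped;
            for (const auto &kv : pairs)
                grouped[kv.first].push_back(kv.second);
            return grouped;
        }

        // Reduce: combine the values for one key (here, just sum the counts).
        int reduce_phase(const std::vector<int> &counts) {
            int total = 0;
            for (int c : counts)
                total += c;
            return total;
        }

        int main() {
            std::vector<std::string> records = {"the cat sat", "the cat ran"};
            std::vector<std::pair<std::string, int>> intermediate;
            for (const auto &r : records)
                for (const auto &kv : map_phase(r))
                    intermediate.push_back(kv);
            for (const auto &group : shuffle(intermediate))
                std::cout << group.first << " " << reduce_phase(group.second) << "\n";
            return 0;
        }

    In the real thing, each map and reduce call runs on whichever node already holds the data, which is where the locality gain mentioned above comes from.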
  • Re:Memory Tuning (Score:4, Interesting)

    by camperdave ( 969942 ) on Thursday May 24, 2007 @08:04PM (#19262685) Journal
    I remember being able to squeeze 704K of conventional memory out of some systems (shadow RAM at $A0000, and a video card that sat at $B0000), then have certain programs complain that there was insufficient memory... because their counters wrapped at 640. Good times indeed.
  • Re:c ? really? (Score:3, Interesting)

    by multipartmixed ( 163409 ) on Thursday May 24, 2007 @08:16PM (#19262851) Homepage
    So, Haskell only runs on one architecture? Or they write a different Haskell compiler for every architecture they support?

    Damn, that's hard core. Not even GCC is that close to the metal! (except for a very, VERY tiny piece...)
  • by Locutus ( 9039 ) on Thursday May 24, 2007 @08:19PM (#19262895)
    That's right, after Microsoft shipped Windows 95, they dumped hundreds of millions on pushing Windows NT at the server markets. It was a full-blown marketing attack on UNIX, Netware, and Lan Manager/OS/2, and we know it is marketing which won the day and admins who lost. How many UNIX servers turned into a dozen WinTel PCs after they found out one WinTel PC couldn't run a few server processes and had to be split into one service per PC? Then they had to pull in replication to get anything close to the 99.9999% uptime of the UNIX systems.

    Yup, it's interesting how snake oil still gets sold year after year but only under a different name. IMO.

    Oh, and virtualization, that's all about moving all those single tasking servers back into one box where one crash won't take out the others. That's innovation for ya. Go Microsoft! :-/

    LoB
  • Re:dovetail (Score:2, Interesting)

    by rujholla ( 823296 ) on Thursday May 24, 2007 @08:29PM (#19263009)
    I am a contractor, and occasionally I get contracts doing Cobol work. I can think of at least two of the companies that I have worked for that still actively use Cobol with no plans to move off it. Another of the companies that I worked for wanted to get rid of their Cobol application to escape the oppressive mainframe license fees, but after supporting a rewrite project for 3 years and several million dollars, they came to the conclusion that it was easier to get a mainframe emulator and host the Cobol app on a Unix box, and there it continues to run 3 years later.

    I think until it becomes so difficult to find people to program Cobol applications that it costs more to hire people than it does to rewrite the applications, those applications will remain. There are so many years of business processes built into the systems that rewriting them becomes more a process of changing the business processes to match a new enterprise application than a program rewrite. And it is much more difficult to change the business process than it is to rewrite the application.

    BTW -- I don't hate my job when working with Cobol. It's just another language. Sure, it's a lot easier to program the same thing in a more modern language, but working with Cobol applications is almost never writing from scratch.
  • by gmack ( 197796 ) <gmack@@@innerfire...net> on Thursday May 24, 2007 @08:34PM (#19263087) Homepage Journal
    Phone systems are meant to just work, and often the idea is that if it's still working it should be left that way. I contract for an ISP that has its own ADSL equipment, and I have an access card that gets me into several Bell Canada buildings in Montreal and one in Toronto.

    The telephone world is a weird mix of the state of the art and old.

    I regularly see software that comes on 9-track reels, and other ancient equipment. My biggest shock was seeing equipment in downtown Toronto that still uses vacuum tubes.

  • Other dead skills (Score:4, Interesting)

    by Orion Blastar ( 457579 ) <`orionblastar' `at' `gmail.com'> on Thursday May 24, 2007 @09:11PM (#19263431) Homepage Journal
    Turbo Pascal, phased out with Delphi and Free Pascal/Lazarus replacing it. I still know people who know Turbo Pascal and I learned Turbo Pascal in 1985.

    LANTastic, I recall some people were experts with this network. I can recall that when Windows for Workgroups came out with built-in networking, LANTastic went into decline.

    DBase and Clipper, I can recall porting databases and code written in them to MS-Access in 1996-1997.

    Wordperfect 5.0/6.0 macro writing. I know some small law firms that still have document templates automated with Wordperfect 5.0 for DOS or Windows. Hardly anyone else uses Wordperfect anymore; most have moved to MS-Word and use VBA for macros.

    AmigaDOS/AmigaOS, it used to be the bee's knees for video and multimedia in the late 1980's. I am one of the few left that still has Amiga skills on my resume. AmigaOS reached 4.0 quite some time ago, but hardly anyone uses it anymore except in Europe for various niche markets.

    ProDOS, AppleDOS, I think the Apple // series is long since dead and buried, but still alive in some poor school districts that couldn't afford to replace them.

    Mac OS9 and earlier, I think Mac OSX is the top dog now. The Classic MacOS is no longer in demand, and 68K Macs are only used in school districts that couldn't afford to replace them.

    BeOS, despite attempts to bring it back from the dead using open source. BeOS used to be popular in the late 1990's and used to run on PowerPC Macs and Intel PCs. I can recall some of my friends used to develop software for BeOS, but not anymore.

    Wang, some people I know still list Wang skills on their resume. It used to be in high demand, but once Windows NT 4.0 and Windows 2000 Server came out, there was a mass migration from Wang, after Wang got shrunk and almost went out of business. They did have a Visual BASIC graphic tool called Wang ImageBasic, but I think Internet Explorer 4.0 or 5.0 broke it, and so did Visual BASIC 6.0. I think Leadtools replaced it.

    8 Bit Computers, nobody really uses them anymore. Big Businesses only used the Apple // or CP/M systems and the Atari, Commodore, Sinclair/Timex, etc were used in the home mostly.

    The Apple Newton, the Palm Pilot and Windows CE devices replaced it.

    Arcnet and Starnet cards, Ethernet replaced them. Token Ring is almost dead, but some die-hard IBM Fans still use it at their companies. Anyone remember making twisted pair and coaxial cable network wires for Arcnet and Starnet networks? I do.

    MS-DOS 6.X and Windows 3.X and earlier, like OS/2, deserve to be mentioned. I think some older charities and non-profit organizations still use them on old 286/386 systems that cannot run even Windows 95, and they use a Netware 3.X server to share files with them.

    MS-Foxpro, does anyone still use it? After MS-Access got upgraded, and MS-SQL Server had more features added to it, MS-Foxpro became redundant.

    Assembly Language, Machine Language, remember writing native code for the 8088, 68000, 6502, 6809, IBM Mainframes, etc? Hardly any company wants us to write in Assembly or Machine language anymore. It seems like only hackers use these to do exploits and write malware.

    FORTRAN, I think BASIC and C sort of replaced it, and then C++ and Java replaced them. FORTRAN got NASA to the moon, but NASA uses Java or Python now.
  • Re:c ? really? (Score:3, Interesting)

    by iamacat ( 583406 ) on Thursday May 24, 2007 @09:23PM (#19263549)
    Umm... Thanks for the insight into image filtering, but how does your approach avoid the overhead of equivalent C++ code? If you wrote your filters as C++ classes with virtual methods, you would be limited to 4/8 bytes of overhead (with single inheritance) no matter how many methods you have. You would have a consistent syntax for virtual and non-virtual methods. You would be able to more easily handle cleanup with stack-based destructors and auto pointers. And your code would still be as small and fast as now. Perhaps even a bit faster, because you would be able to inline small functions and the compiler would optimize some virtual calls into non-virtual ones.

    All C++ compilers on the market have the ability to turn off exceptions, which is the only feature that inherently generates larger/slower code (but not slower than "if (e 0) goto cleanup" after every function call) even when not used. It would be trivial to institute a coding convention of making all constructors explicit and add -Doperator=foobar -Dtemplate=foobar to your Makefiles to avoid generating unwanted code.
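    For what it's worth, a minimal sketch of the scheme being suggested above (filter names invented for illustration) looks like this; the 4/8 bytes mentioned is the single vtable pointer each object carries, regardless of how many virtual methods the class declares.

        // Illustrative sketch only: image filters as C++ classes with one virtual method.
        #include <iostream>
        #include <vector>

        // Abstract base: one virtual call site; each object pays one vtable pointer.
        struct Filter {
            virtual ~Filter() = default;
            virtual unsigned char apply(unsigned char pixel) const = 0;
        };

        struct Invert : Filter {
            unsigned char apply(unsigned char pixel) const override { return 255 - pixel; }
        };

        struct Threshold : Filter {
            unsigned char cutoff;
            explicit Threshold(unsigned char c) : cutoff(c) {}
            unsigned char apply(unsigned char pixel) const override {
                return pixel >= cutoff ? 255 : 0;
            }
        };

        // Run a chain of filters over an image buffer through the base interface.
        void run(const std::vector<const Filter *> &chain, std::vector<unsigned char> &image) {
            for (unsigned char &p : image)
                for (const Filter *f : chain)
                    p = f->apply(p);
        }

        int main() {
            std::vector<unsigned char> image = {0, 64, 128, 255};
            Invert invert;
            Threshold threshold(100);
            run({&invert, &threshold}, image);
            for (unsigned p : image)
                std::cout << p << " ";
            std::cout << "\n";
            return 0;
        }

    Whether this ends up faster or slower than a hand-rolled dispatch scheme depends on the compiler and the data, so it is only meant to show the shape of the code, not to settle the argument.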
  • by Jimmy King ( 828214 ) on Thursday May 24, 2007 @09:49PM (#19263859) Homepage Journal

    > Not that simple. Companies want *paid* experience above all else. If you don't have paid experience, it often doesn't matter much if you memorized the book (unless you get lucky or find a backdoor).

    This is all too true. I had to fight with this problem for about 6 months awhile back. I got hired as a tier 2-3ish support guy who would also do some shell (primarily Korn and sh) and Tcl coding (along with some Javascript and had I stayed longer, possibly some C and some VB eventually). I was very lucky and they picked me for it due to my hobby perl and bash experience. A year and a half later I was lucky and was hired away (that last one was contract and going to end sometime) into a full time Perl dev position due to my hobby Perl experience and professional Tcl experience. Then there was the great layoff of '06.

    I went home, saw there was fuck all demand locally for Perl, Tcl, or C (unless you had 3-5 years experience with embedded C apps in robotics, which I only had some fairly basic hobby experience in C). There was quite the demand for C# and ASP.Net stuff, though. I thought to myself "Cool, I'll learn me some of that newfangled C# stuff. I'd been wanting to for fun anyway and now I've got both a reason and time to do it". No one cared. All of my C# was hobby programming. It didn't matter that I had code examples and a couple years professional experience developing in other languages, I didn't have professional experience in C#. No, that wasn't an assumption I made due to not getting jobs, that's what I was directly told by multiple recruiters, in case that's what anyone here was thinking.

    I spent the next 6 months honing my Perl, C, C#, and PHP skills while getting turned down left and right for everything that wasn't Perl due to lack of professional experience and seeing very little in the way of Perl unless I was willing to relocate halfway across the country for a 3 month contract. Fortunately, I was picked up again by the place that had laid me off in a more senior level Perl dev position.

    Now I'm trying to solve the problem of how to get my employer to pay me to improve my skills in or pick up another language so that if I have to look for work again I have something more than Perl and Tcl to bring to the table as languages I've got professional experience with.
  • Re:c ? really? (Score:5, Interesting)

    by SadGeekHermit ( 1077125 ) on Thursday May 24, 2007 @10:05PM (#19264001)
    I think they're confused, anyway -- they're writers, not programmers. I bet I can even guess how they did their research: they called up all the recruiters they could find and asked each one to list the languages he/she thought were dead or dying. Then they compared notes on all the responses they got, and built their final list.

    I think the list should be called "top 10 languages recruiters don't want to hear about" because that would be more accurate.

    Realistically, as far as C goes I think the following factors should be considered before declaring it a dead language:

    1. Most of the more popular object-oriented languages (Java, C#, C++) use C syntax. C++ is largely a superset of C.

    2. Java can use compiled C modules as an analog to C's old "escape to assembler" technique. In other words, you can call C code from Java when you have something you want to get "close to the metal" on. Thus, a "Java Programmer" may very well ALSO be a C programmer, even if technically that isn't on his resume or job description. I can do this; I imagine most other Java programmers can as well. What's funny is that, once you're calling C code, you can turn around and use the C code to call assembler, Fortran, or whatever else you like! What a weird world this is! (A minimal sketch of the native side of such a call appears after this comment.)

    (Links for the skeptical):
    http://www.csharp.com/javacfort.html [csharp.com] (Ironic that it's on a CSharp site, no?)
    http://www.mactech.com/articles/mactech/Vol.13/13.09/CallingCCodefromJava/index.html [mactech.com]
    http://java.sun.com/developer/onlineTraining/Programming/JDCBook/jniexamp.html [sun.com]

    3. Linux is still written in C, I believe. As are its drivers, KDE-related programs, Gnome-related programs, and whatnot.

    4. C is the modern version of assembler, isn't it?

    ANYway, I don't think C's going anywhere. You might not be able to get PAID for doing it, as your main speciality will probably be something more buzzword-heavy, but you'll probably be doing some of it as a part of whatever other weird and mysterious things you do in the ITU.

    Poor journalists... One suspects they're rather easily confused these days.
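    For the curious, a minimal sketch of the native side of such a Java-to-C call might look like the following. The class and method names are invented for illustration, the matching Java declaration is shown only in a comment, and the file would be compiled into a shared library that the JVM loads.

        // Illustrative sketch only: the C/C++ half of a JNI call.
        //
        // Assumed (hypothetical) Java side:
        //     package com.example;
        //     public class Native {
        //         static { System.loadLibrary("native"); }
        //         public static native int add(int a, int b);
        //     }
        //
        // Build this file into libnative.so (or native.dll) and put it on the
        // JVM's library path; the JVM resolves the method by this mangled name.

        #include <jni.h>

        extern "C" JNIEXPORT jint JNICALL
        Java_com_example_Native_add(JNIEnv *env, jclass clazz, jint a, jint b) {
            (void)env;   // gives access to JVM objects when needed; unused here
            (void)clazz; // the Java class object, since the method is static
            return a + b;
        }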

  • Re:dovetail (Score:2, Interesting)

    by fishbowl ( 7759 ) on Thursday May 24, 2007 @10:08PM (#19264029)
    >I just started working in the actuarial department of an insurance company. Almost all of our code is in Fortran.

    Ah, you work in the OTHER place where people routinely do calculus on the whiteboard and people who have a choice and know lots of languages, program in FORTRAN.

    Surprised the hell out of me too.

    For number-theoretic approaches to certain classes of problems, FORTRAN gives some guarantees, offers optimizations, has the widest range of libraries available, and scales in ways that aren't even a consideration in other idioms.

    I hate FORTRAN as a grammar, but I certainly now have an appreciation of why it's used by the people who use it. Know what it took to make me realize this? I needed to be told by someone whose skills in other languages meet or exceed my own. Then I understood; it's not that you have a golden hammer, know only one thing, and stick to it (that's what I thought FORTRAN was!). It's something else, and it has to do with the fact that we are easily deluded into thinking that a modern grammar equals an object better suited to the task. It turns out to be true in some cases, but life-and-death-NOT-TRUE in others.

    It would be scary to have someone naively think he could duplicate a math-intensive FORTRAN module with C or Java. How certain are you of the behavior of your language's exponentiation operator between quadruple precision floating point and an integer? Willing to bet your life on it? What does your language have built in for arbitrary precision? Willing to bet your life on that too?
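    As one small, concrete example of that kind of subtlety (a sketch, not a claim about any particular Fortran compiler): C++ has no exponentiation operator at all, and a library pow() call is not required to produce the same bits as the repeated multiplication a numerics programmer may have been relying on.

        // Illustrative sketch only: two ways to raise a long double to an integer
        // power, which are not guaranteed to agree in the last bits.
        #include <cmath>
        #include <cstdio>

        // Integer power by repeated multiplication: a fixed, predictable
        // sequence of roundings.
        long double int_pow(long double base, int exp) {
            long double result = 1.0L;
            for (int i = 0; i < exp; ++i)
                result *= base;
            return result;
        }

        int main() {
            long double x = 1.0000001L;
            // std::pow is typically computed via exp/log and is not required to be
            // correctly rounded, so the two results may differ slightly - which
            // matters when a downstream calculation amplifies the difference.
            std::printf("pow     : %.21Lg\n", std::pow(x, 25));
            std::printf("int_pow : %.21Lg\n", int_pow(x, 25));
            return 0;
        }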

  • by multipartmixed ( 163409 ) on Thursday May 24, 2007 @10:27PM (#19264225) Homepage
    > Think of how you would react if some Perl/Python programmer said
    > "Sure, I don't know LISP or C ... but I could probably get to
    > moderate proficiency in two weeks".

    I would consider this individual a dedicated, professional programmer.

    Note that there is a HUGE difference between "moderate proficiency" and "very proficient". I figure it took me about 3,000 hours of C programming to get to the "very proficient" tier (it was also my first professional gig).

    Moderate proficiency is what half the OSS C code out there is. (I'm not counting the sendmails and apaches of the OSS world either.) It works, it's free of obvious bugs, it's not elegant, and it's probably nowhere near optimal -- and it desperately needs peer review from an expert to point out where improvements can be made.

    But it should be functional, not reinvent core elements of the language, perform the task correctly in the language's native paradigm, and be legible to a skilled programmer.

    Note, also, that I differentiate strongly between "core language" and common libraries available for use. Java does not imply J2EE, struts, and whatnot. JavaScript does not imply W3C DOM (but it does imply Math and Date). Perl does not imply CPAN. That said, C++ *does* imply stdlib, and C [to me] implies unistd, stdlib, stdio, strings, etc, but not xpg4, sockets, iconv, blah de blah.
  • my experiment (Score:4, Interesting)

    by wellingj ( 1030460 ) on Thursday May 24, 2007 @11:09PM (#19264615)
    I went to dice.com and started a blank search.
    The number of jobs (posted in the last 30 days) listed if I picked C as a skill?
    Answer: 17139 jobs

    Java?
    Answer: 15760 jobs

    So.....Myth-busted?
  • by KingSkippus ( 799657 ) * on Friday May 25, 2007 @01:44AM (#19265967) Homepage Journal

    Wow, to the poster above, thank you, that's a fantastic analogy!

    I've been beaten over the head with the "it does a LOT of things!" stick so many times it makes me sick. The problem is that it really sucks at all of them!

    It's really comical. Here's a typical me/Notes goober conversation:

    Me: The Notes client truly sucks as an e-mail client. It doesn't adhere to any OS's standard conventions, and it crashes a lot.
    Them: Well, Notes does kind of suck on the client side, but the servers are where it counts, and it's really stable.
    Me: Okay, well explain to me why we have at least one or two servers crash every week, and we have to schedule a reboot once a week then. Oh, and what happened to my e-mail? It's all gone!
    Them: Oh, sometimes databases just eat themselves. Don't worry, I'll restore everything up through last night from backup. But the rest of the time, it's stable! And besides, it's more than just an e-mail system. It's also a database!
    Me: Oh! Well, in that case, I have these two related tables that I need to store in an--
    Them: It's not a relational database, just a database.
    Me: Come again?
    Them: You can't actually relate the information from one table in another. They're just flat tables. No relations.
    Me: So, for most practical purposes, it's just a storage bucket that can't do what even Microsoft Access can then?
    Them: Oh, it can do rapid application development too, though. Yeah, it's a development environment, that's the ticket!
    Me: Oh! In that case, I'd like to create some kind of form where I can enter this information and store it. Then when I click that button, send an e-mail to those people with the information in it.
    Them: No problema.
    Me: Okay, that's a title, so it needs to be bold text--
    Them: Oh, that's a rich text field.
    Me: Yeah, so how do I code up a rich text field?
    Them: Well, that's kind of a beast. You can't really code it up directly, you have to create another object to store the information in and... Well... I don't really understand rich text fields myself. It's best just to avoid them. Even professional Notes developers know that.
    Me: So, it sucks as an e-mail system, it sucks as a database system, I can't even send out a frickin' formatted e-mail... Is there anything Notes does do well? Anything at all?
    Them: Oh, yes! Replication and security!
    Me: Fuck you. I'm using a Gmail account.

    As a technical professional with a strong background in systems architecture and server administration, I would highly advise any serious businessperson to avoid Lotus Notes like the plague. Ignore me at your and your career's peril.

  • Re:c ? really? (Score:3, Interesting)

    by TheRaven64 ( 641858 ) on Friday May 25, 2007 @11:20AM (#19270447) Journal
    Multicore CPUs are likely to kill C. C is a great language for writing high-performance code on an architecture that is semantically similar to a (fast) PDP-11. The more chips become unlike a PDP-11, the less useful C is. You can do a lot of optimisations in a language that has single-assignment variables and immutable data types that you can't do with C (and you can't adapt C to support them without making it a completely different language). These aren't so important in a single thread, but they are when you introduce parallelism. You can't get this kind of thing in C without dropping into assembly, which is what C was meant to avoid. On the subject of assembly, I find that I don't write pure C anymore. Anything that can afford a slight performance hit is done in Objective-C, which is a lot easier. Everything else needs to take advantage of CPU-specific features, and so ends up having macros or inline functions that wrap snippets of inline assembly scattered throughout it.

    The only reason C is still alive today in the application space is that UNIX didn't define a cross-language ABI. Microsoft had COM, which let any languages with C++-like semantics share objects (now they have .NET, which does the same thing for languages with C#-compatible semantics). VMS had something similar for sharing functions across procedural languages. UNIX just expects people to expose C functions and write a wrapper for their own language.
