Top 10 Dead (or Dying) Computer Skills
Lucas123 writes "Computerworld reporter Mary Brandel spoke with academics and head hunters to compile this list of computer skills that are dying but may not yet have taken their last gasp. The article's message: Obsolescence is a relative — not absolute — term in the world of technology. 'In the early 1990s, it was all the rage to become a Certified NetWare Engineer, especially with Novell Inc. enjoying 90% market share for PC-based servers. "It seems like it happened overnight. Everyone had Novell, and within a two-year period, they'd all switched to NT," says David Hayes, president of HireMinds LLC in Cambridge, Mass.'"
True story... (Score:5, Interesting)
When I started working at the huge multinational company I work at now, there were three things that I had very little experience with that everyone swore would last at the company for decades to come: Token Ring, Netware, and Lotus Notes. I insisted that within the next few years, these technologies would be dead and the company would have to change, and I was constantly reminded of the millions of dollars invested in them.
It's eight years later. We have no Token Ring network. We have no Netware servers. I'm doing my damned best to convince people of how badly Lotus Notes sucks, and most everyone agrees, but we have a Notes support team that really likes their jobs and somehow manages to convince upper-level management that it would cost billions of dollars to change to a real e-mail and collaboration solution. But I'm still holding out hope.
God willing, Lotus Notes will soon be on this list as well.
Delphi (Score:3, Interesting)
Re:COBOL as number one? (Score:3, Interesting)
You are 100% correct, and as the article mentioned, COBOL is not only still used at many companies but also taught in some universities' Computer Science programs, including the one I graduated from: Northern Illinois University in DeKalb. Here are two examples:
http://www.cs.niu.edu/undergrad/coursecat.html#25
http://www.cs.niu.edu/undergrad/coursecat.html#46
There are A LOT of companies that still use COBOL out there (I saw many of them at every job fair I went to), and the language is far from dead. Thankfully I didn't have to go the route of being a COBOL programmer and found a job I love doing C/C++ development, but at least I have the option, and I definitely learned a lot about the language as well as mainframe programming from taking the COBOL classes.
Another great class they teach at NIU is Assembler on an IBM System 390. That class was HARD, but I loved the experience and knowledge it gave me regarding how a computer works at the lower levels, and I wouldn't trade that experience for anything. Here is more info on the assembler class:
http://www.cs.niu.edu/undergrad/coursecat.html#36
While I am not exactly happy that COBOL is still around, the fact remains that it is going nowhere anytime soon.
I'd vote for FORTH (Score:1, Interesting)
I disagree with some of the list. (Score:5, Interesting)
Non-IP networks are dying? Must tell that to the makers of Infiniband cards, who are carving out a very nice LAN niche and are set on moving into the WAN market. Also need to tell that to xDSL providers, who invariably use ATM, not IP. And if you consider IP to mean IPv4, then the US Government should be informed forthwith that its migration to IPv6 is "dead". Oh, and for satellite communication, they've only just got IP to even work. Since they weren't using string and tin cans before, I can only assume most satellites in use are controlled via non-IP protocols and that this will be true for a very long time. More down-to-earth, PCI's latest specs allow for multiple hosts, and PCI is becoming a LAN protocol. USB, FireWire and Bluetooth are all networks of a sort - Bluetooth has a range of a mile, if you connect the devices via rifle.
C programming. Well, yes, the web is making pure C less useful for some applications, but I somehow don't think pure C developers will be begging in the streets any time soon. Device driver writers are in heavy demand, and you don't get far with those if you're working in Java. There are also an awful lot of patches/additions to Linux (a pure C environment), given this alleged death of C. I'd love to see someone code a hard realtime application (again, something in heavy demand) in AJAX. What about those relational databases mentioned earlier in the story? Those written in DHTML? Or do I C an indication of other languages at work?
Netware - well, given the talk about non-IP dying, this is redundant and just filler. It's probably right, but it has no business being there alongside the other claim. One should go.
What should be there? Well, Formal Methods is dying, replaced by Extreme Programming. BSD is dying, but only according to Netcraft. Web programming is dying - people no longer write stuff, they use pre-built components. Pure parallel programming is dying -- it's far more efficient to have the OS divide up the work and rely on multi-CPU, multi-core, hyperthreaded systems to take care of all the tracking than it is to mess with very advanced programming techniques, message-passing libraries and the inevitable deadlock issues. Asynchronous hardware is essentially dead. Object-Oriented Databases seem to be pretty much dead. 3D outside of games seems to be dead. Memory-efficient and CPU-efficient programming methods are certainly dead. I guess that would be my list.
Re:dovetail (Score:5, Interesting)
The first two items on the list made me not want to take the author seriously. The financial business is run on COBOL and flat files, and will continue for some time. The language is not pretty, but it was made for a specific purpose and it does it well. In fact, demand for COBOL programmers has risen dramatically as people retire, and it is 7 years after Y2K. I know people who were asked to come out of retirement to work on COBOL again, for very high salaries, because it is not taught to us youngens anymore.
ColdFusion (Score:4, Interesting)
Typing (Score:3, Interesting)
It got me thinking.. secretaries used to be hired just based on their typing skills. Speed & accuracy. I remember when I took a typing class in high school the teacher made us cover the keyboard so we couldn't look at it while we were typing, and we especially weren't allowed to use the delete key so she could mark us on how many errors we made.
But it's funny, that's so backward, of course. Since typewriters are no longer used, your typing speed _includes_ the time it takes to hit the delete key and fix what you did wrong. Your time further increases if you have to look at the screen and then find your place in the text. So overall typing speed is now the only thing that counts...
Now add into that the fact that the days of the boss dictating memos to the secretary are mostly gone, and typing is really a skill that no longer matters. It certainly helps in day-to-day computer tasks, but it's no longer a make or break skill for IT and office people.
Re:dovetail (Score:3, Interesting)
Re:ColdFusion Dead? (Score:4, Interesting)
If one were to use crappy solutions as an argument, how come anyone is still using PHP, ASP, etc.? I think ColdFusion has copped it more than any due to its low threshold of entry, but just because one _can_ make a shit solution using a platform doesn't mean the platform is shit.
Have been using ColdFusion for 11 years now and it keeps getting better. It's a great second language to know in addition to
Re:c ? really? (Score:4, Interesting)
In addition to that, you could say that the C we have today is an old 'stable' fork of that language, which is now moving ahead in the form of C++.
Comment removed (Score:5, Interesting)
30 year technologies (Score:3, Interesting)
Unix, shell scripting, C. There must be more.
Just a thought, but it makes sense to invest skills in technologies with proven survivability.
The curse of good presentation skills... (Score:3, Interesting)
The curse of good presentation skills is that no-one ever notices that you've used them, because you're good at presentation. :-)
I'm currently having a similar debate at the office. We're working on a new tool, effectively a web front end for a simple database with a few handy UI gimmicks. In the grand scheme of things, it's a pretty simple tool, but it's part of a routine business process and literally thousands of people are going to be using it for a few hours each month.
At a progress meeting yesterday, one of our (cross-discipline but manager-dominated) team suggested that we should skip the next couple of weeks of detailed design work and just go with something OK, because it will save a few days of our time and so be available faster. It doesn't seem to occur to them that all that time is the equivalent of mere minutes for each person who will be using the tool. If we identify a simple usability improvement that saves every user ten seconds the first time they use the tool, just because they find something a little quicker, then that has paid back the equivalent of about two solid weeks of one person's time spent designing and implementing that improvement.
So it goes with typography and graphic design in documents. Poor choice of fonts or use of whitespace = hard to read on-screen = wasted staff time. Awkward page layout = people can't follow the text = wasted staff time. Poorly drawn diagram = distracted reader = wasted staff time. Poorly typeset equation = delayed understanding while the reader figures out what it says = wasted staff time. And so on...
Unfortunately, presentation skills are one of those things where people don't really notice the subtleties. If something is poorly presented then the viewer will still read something more slowly, or misunderstand what it says, or not remember it as well later, but they probably won't realise what they're missing.
Re:dovetail (Score:3, Interesting)
You owe me a dollar. The IT department I used to work at runs most of the organization-wide stuff on an IBM OS/390 monolith, written in legacy COBOL code. Any of the programmers there who did hate their jobs didn't hate them because of COBOL, believe me. Instead of "phasing out" the system, they got a flashy new IBM server to run a virtualized version of the mainframe seamlessly. Will the code be phased out one day? Probably. Not because of the language, but because the size of the organization has scaled so much, and the communication capacity available to them has grown exponentially, so problems like this can be approached from new directions. Plus, COBOL coders are getting harder to find. But any COBOL gurus could probably eat very well for the next 10 years there, since that's probably the earliest that system will be "phased out".
And calling it the worst language EVAR is a bit disingenuous... It's very very good at what it does: business logic. I wouldn't want to write an operating system or a thermonuclear explosion simulation in it, but I wouldn't want to write check printing routines in C, either.
I dare to disagree (Score:3, Interesting)
When the applications died, the language followed. I dare say, ABAP is going to suffer the same fate as soon as SAP wanes and The Next Big Thing comes along. 'til then, it is a get-rich-quick scheme in IT if there ever was one, granted.
C, in its "pure" form, is certainly not going to last forever either. But C derivatives will be driving the systems for the foreseeable future. C (and C++) offers a good balance between closeness to hardware, so you can actually write low-level programs in it, and readability, while still maintaining some basic inter-platform compatibility (unless you want to get really, really close to the system). Even if you can't port a program seamlessly, you can at the very least apply your skills to the compiler on a different platform, something that's not a given with languages that get closer to the core.
C is probably not going to last. But its successors, C++ and C#, will. Well, C++ will, for sure. Whether C# is just another fad that gets a lot of hype now and gets dumped soon, time will tell.
Re:Raising the bar (Score:4, Interesting)
Re:c ? really? (Score:2, Interesting)
I could do that sort of thing but felt it was worth my while learning C++.
Re:Some of the list looks good (Score:3, Interesting)
The rest of the things you've mentioned all predate C++, let alone newer languages. And most don't use assembly at all (Apache doesn't, and I doubt BIND or sendmail does). The UNIX kernels don't use assembly because it's faster (though it helps), but because there are things that must be done that are platform specific, and C's provided way of handling those cases is assembly. But generally, x86 computers are changing too rapidly to make any optimizations worthwhile. Just write clear code and let the compiler handle the dirty work. If you don't think it's good enough, improve the compiler. But usually the people who claim to "write their inner loops in ASM" have trouble making solid contributions to compiler technology.
Re:Memory Tuning (Score:4, Interesting)
Re:c ? really? (Score:3, Interesting)
Damn, that's hard core. Not even GCC is that close to the metal! (except for a very, VERY tiny piece...)
a few hundred million in marketing gets customers (Score:5, Interesting)
Yup, it's interesting how snake oil still gets sold year after year but only under a different name. IMO.
Oh, and virtualization, that's all about moving all those single tasking servers back into one box where one crash won't take out the others. That's innovation for ya. Go Microsoft!
LoB
Re:dovetail (Score:2, Interesting)
I think those Cobol applications will remain until it becomes so difficult to find people to program Cobol that hiring them costs more than rewriting the applications. There are so many years of business processes built into the systems that rewriting them becomes more a process of changing the business processes to match a new enterprise application than a program rewrite. And it is much more difficult to change the business process than it is to rewrite the application.
BTW -- I don't hate my job when working with Cobol. It's just another language. Sure, it's a lot easier to program the same thing in a more modern language, but working with Cobol applications is almost never writing from scratch.
Re:IT is More Than Software (Score:4, Interesting)
The telephone world is a weird mix of the state of the art and old.
I regularly see software that comes on 9-track reels and other ancient equipment. My biggest shock was seeing equipment in downtown Toronto that still uses vacuum tubes.
Other dead skills (Score:4, Interesting)
LANTastic, I recall some people were experts with this network. When Windows for Workgroups came out with built-in networking, LANTastic went into decline.
DBase and Clipper, I can recall porting databases and code written in them to MS-Access in 1996-1997.
Wordperfect 5.0/6.0 macro writing. I know some small law firms that still have document templates automated with Wordperfect 5.0 for DOS or Windows. Hardly anyone uses Wordperfect anymore; most have moved to MS-Word and use VBA for macros.
AmigaDOS/AmigaOS used to be the bee's knees for video and multimedia in the late 1980's; I am one of the few left that still has Amiga skills on my resume. AmigaOS reached 4.0 quite some time ago, but hardly anyone uses it anymore except in Europe for various niche markets.
ProDOS, AppleDOS, I think the Apple II is long gone.
Mac OS9 and earlier, I think Mac OSX is the top dog now. The Classic MacOS is no longer in demand, and 68K Macs are only used in school districts that couldn't afford to replace them.
BeOS, despite attempts to bring it back from the dead using open source. BeOS used to be popular in the late 1990's and ran on PowerPC Macs and Intel PCs. I can recall some of my friends used to develop software for BeOS, but not anymore.
Wang, some people I know still list Wang skills on their resume. It used to be in high demand, but once Windows NT 4.0 and Windows 2000 Server came out, there was a mass migration from Wang, after Wang shrank and almost went out of business. They did have a Visual BASIC graphic tool called Wang ImageBasic, but I think Internet Explorer 4.0/5.0 and Visual BASIC 6.0 broke it. I think Leadtools replaced it.
8-bit computers, nobody really uses them anymore. Big businesses only ever used the Apple II.
The Apple Newton, the Palm Pilot and Windows CE devices replaced it.
Arcnet and Starnet cards, Ethernet replaced them. Token Ring is almost dead, but some die-hard IBM Fans still use it at their companies. Anyone remember making twisted pair and coaxial cable network wires for Arcnet and Starnet networks? I do.
MS-DOS 6.X and Windows 3.X and earlier, like OS/2 they deserve to be mentioned. I think some older charities and non-profit organizations still use them on old 286/386 systems that cannot run even Windows 95, with a Netware 3.X server to share files.
MS-Foxpro, does anyone still use it? After MS-Access got upgraded, and MS-SQL Server had more features added to it, MS-Foxpro became redundant.
Assembly Language, Machine Language, remember writing native code for the 8088, 68000, 6502, 6809, IBM Mainframes, etc? Hardly any company wants us to write in Assembly or Machine language anymore. It seems like only hackers use these to do exploits and write malware.
FORTRAN, I think BASIC and C sort of replaced it, and then C++ and Java replaced them. FORTRAN got NASA to the moon, but NASA uses Java or Python now.
Re:c ? really? (Score:3, Interesting)
All C++ compilers on the market have the ability to turn off exceptions, which is the only feature that inherently generates larger/slower code even when not used (but not slower than "if (e != 0) goto cleanup" after every function call). It would be trivial to institute a coding convention of making all constructors explicit and add -Doperator=foobar -Dtemplate=foobar to your Makefiles to avoid generating unwanted code.
Re:Wait, there IT people who specialize this much? (Score:3, Interesting)
This is all too true. I had to fight with this problem for about 6 months awhile back. I got hired as a tier 2-3ish support guy who would also do some shell (primarily Korn and sh) and Tcl coding (along with some Javascript and had I stayed longer, possibly some C and some VB eventually). I was very lucky and they picked me for it due to my hobby perl and bash experience. A year and a half later I was lucky and was hired away (that last one was contract and going to end sometime) into a full time Perl dev position due to my hobby Perl experience and professional Tcl experience. Then there was the great layoff of '06.
I went home, saw there was fuck all demand locally for Perl, Tcl, or C (unless you had 3-5 years experience with embedded C apps in robotics, which I only had some fairly basic hobby experience in C). There was quite the demand for C# and ASP.Net stuff, though. I thought to myself "Cool, I'll learn me some of that newfangled C# stuff. I'd been wanting to for fun anyway and now I've got both a reason and time to do it". No one cared. All of my C# was hobby programming. It didn't matter that I had code examples and a couple years professional experience developing in other languages, I didn't have professional experience in C#. No, that wasn't an assumption I made due to not getting jobs, that's what I was directly told by multiple recruiters, in case that's what anyone here was thinking.
I spent the next 6 months honing my Perl, C, C#, and PHP skills while getting turned down left and right for everything that wasn't Perl due to lack of professional experience and seeing very little in the way of Perl unless I was willing to relocate halfway across the country for a 3 month contract. Fortunately, I was picked up again by the place that had laid me off in a more senior level Perl dev position.
Now I'm trying to solve the problem of how to get my employer to pay me to improve my skills in or pick up another language so that if I have to look for work again I have something more than Perl and Tcl to bring to the table as languages I've got professional experience with.
Re:c ? really? (Score:5, Interesting)
I think the list should be called "top 10 languages recruiters don't want to hear about" because that would be more accurate.
Realistically, as far as C goes I think the following factors should be considered before declaring it a dead language:
1. Most of the more popular object-oriented languages (Java, C#, C++) use C syntax. C++ is (nearly) a superset of C.
2. Java can use compiled C modules as an analog to C's old "escape to assembler" technique. In other words, you can call C code from Java when you have something you want to get "close to the metal" on. Thus, a "Java Programmer" may very well ALSO be a C programmer, even if technically that isn't on his resume or job description. I can do this; I imagine most other Java programmers can as well. What's funny is that, once you're calling C code, you can turn around and use the C code to call assembler, Fortran, or whatever else you like! What a weird world this is!
(Links for the skeptical):
http://www.csharp.com/javacfort.html [csharp.com] (Ironic that it's on a CSharp site, no?)
http://www.mactech.com/articles/mactech/Vol.13/13
http://java.sun.com/developer/onlineTraining/Prog
3. Linux is still written in C, I believe. As are its drivers, KDE-related programs, Gnome-related programs, and whatnot.
4. C is the modern version of assembler, isn't it?
ANYway, I don't think C's going anywhere. You might not be able to get PAID for doing it, as your main speciality will probably be something more buzzword-heavy, but you'll probably be doing some of it as a part of whatever other weird and mysterious things you do in the ITU.
Poor journalists... One suspects they're rather easily confused these days.
Re:dovetail (Score:2, Interesting)
Ah, you work in the OTHER place where people routinely do calculus on the whiteboard, and where people who have a choice and know lots of languages program in FORTRAN.
Surprised the hell out of me too.
For number-theoretic approaches to certain classes of problems, FORTRAN gives some guarantees, offers optimizations, has the widest range of libraries available, and scales in ways that aren't even a consideration in other idioms.
I hate FORTRAN as a grammar, but I certainly now have an appreciation of why it's used by the people who use it. Know what it took to make me realize this? I needed to be told by someone whose skills in other languages meets or exceeds my own. Then I understood; it's not that you have a golden hammer, know only one thing, and stick to it (that's what I thought FORTRAN was!). It's something else, and it has to do with the fact that we are easily deluded into thinking that a modern grammar equals an object better suited to task. It turns out to be true in some cases, but life-and-death-NOT-TRUE in others.
It would be scary to have someone naively think he could duplicate a math-intensive FORTRAN module with C or Java. How certain are you of the behavior of your language's exponentiation operator between quadruple precision floating point and an integer? Willing to bet your life on it? What does your language have built in for arbitrary precision? Willing to bet your life on that too?
Re:It's not C. It's the C only programmer. (Score:3, Interesting)
> "Sure, I don't know LISP or C [...] moderate proficiency in two weeks".
I would consider this individual a dedicated, professional programmer.
Note that there is a HUGE difference between "moderate proficiency" and "very proficient". I figure it took me about 3,000 hours of C programming to get to the "very proficient" tier (it was also my first professional gig).
Moderate proficiency is what half the OSS C code out there is. (I'm not counting the sendmails and apaches of the OSS world either.) It works, it's free of obvious bugs, it's not elegant, and it's probably nowhere near optimal -- and it desperately needs peer review from an expert to point out where improvements can be made.
But it should be functional, not reinvent core elements of the language, perform the task correctly in the language's native paradigm, and be legible to a skilled programmer.
Note, also, that I differentiate strongly between "core language" and common libraries available for use. Java does not imply J2EE, struts, and whatnot. JavaScript does not imply W3C DOM (but it does imply Math and Date). Perl does not imply CPAN. That said, C++ *does* imply stdlib, and C [to me] implies unistd, stdlib, stdio, strings, etc, but not xpg4, sockets, iconv, blah de blah.
my experiment (Score:4, Interesting)
The number of jobs (posted in the last 30 days) listed if I picked C as a skill?
Answer: 17139 jobs
Java?
Answer: 15760 jobs
So.....Myth-busted?
Re:Amateur's make me laugh. (Score:5, Interesting)
Wow, to the poster above, thank you, that's a fantastic analogy!
I've been beaten over the head with the "it does a LOT of things!" stick so many times it makes me sick. The problem is that it really sucks at all of them!
It's really comical. Here's a typical me/Notes goober conversation:
As a technical professional with a strong background in systems architecture and server administration, I would strongly advise any serious businessperson to avoid Lotus Notes like the plague. Ignore me at your own and your career's peril.
Re:c ? really? (Score:3, Interesting)
The only reason C is still alive today in the application space is that UNIX didn't define a cross-language ABI. Microsoft had COM, which let any languages with C++-like semantics share objects (now they have .NET, which does the same thing for languages with C#-compatible semantics). VMS had something similar for sharing functions across procedural languages. UNIX just expects people to expose C functions and write a wrapper for their own language.