Top 10 Dead (or Dying) Computer Skills
Lucas123 writes "Computerworld reporter Mary Brandel spoke with academics and head hunters to compile this list of computer skills that are dying but may not yet have taken their last gasp. The article's message: Obsolescence is a relative — not absolute — term in the world of technology. 'In the early 1990s, it was all the rage to become a Certified NetWare Engineer, especially with Novell Inc. enjoying 90% market share for PC-based servers. "It seems like it happened overnight. Everyone had Novell, and within a two-year period, they'd all switched to NT," says David Hayes, president of HireMinds LLC in Cambridge, Mass.'"
c ? really? (Score:5, Insightful)
Re:c ? really? (Score:5, Insightful)
I'm kidding, but only partially. I was a COBOL developer for lots of years, and I thought that COBOL would never die either. I would say "Too many companies are too invested
Re:c ? really? (Score:5, Funny)
Re:c ? really? (Score:5, Insightful)
No one shall expel us from the Paradise that Ritchie has created (Apologies to David Hilbert)
Re:c ? really? (Score:4, Funny)
Re:c ? really? (Score:5, Insightful)
Computerworld Magazine, being an IT rag, is concerned about IT, not computer science or engineering. Thus it worries about product names, not categories. So they point out skills in SNA or Novell NetWare that aren't needed so much anymore, even though computer networking skills are more popular and vital today than ever. COBOL may be virtually dead, but dry and dusty applications for business purposes are alive and well and written by people who still wear ties.
Re:Ha! C != performance (Score:5, Informative)
I've coded some fairly complicated high performance multi-threaded applications in C. It's not easy, but it's not easy in C# or Java either. There have been many minor improvements to the required syntax, but that has never been the hard part of multi-threaded development. Parallelizing (sp, I know) the problem is a conceptual problem unrelated to language.
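To illustrate (a minimal pthreads sketch of my own, not the poster's code): the thread-management boilerplate below is the easy part. Deciding how to partition the data so the threads don't step on each other is the conceptual work that no language does for you.

    #include <pthread.h>
    #include <stdio.h>

    #define NTHREADS 4
    #define N 1000000

    static double data[N];
    static double partial[NTHREADS];

    /* Each thread sums its own slice and writes its own slot,
       so no locking is needed -- that partitioning decision is
       the actual design work. */
    static void *sum_chunk(void *arg)
    {
        long id = (long)arg;
        long lo = id * (N / NTHREADS);
        long hi = (id == NTHREADS - 1) ? N : lo + N / NTHREADS;
        double s = 0.0;
        for (long i = lo; i < hi; i++)
            s += data[i];
        partial[id] = s;
        return NULL;
    }

    int main(void)
    {
        pthread_t t[NTHREADS];
        for (long i = 0; i < N; i++)
            data[i] = 1.0;
        for (long i = 0; i < NTHREADS; i++)
            pthread_create(&t[i], NULL, sum_chunk, (void *)i);
        double total = 0.0;
        for (long i = 0; i < NTHREADS; i++) {
            pthread_join(t[i], NULL);
            total += partial[i];
        }
        printf("total = %f\n", total);  /* expect 1000000.0 */
        return 0;
    }

Build with gcc -pthread; the same decomposition question comes up in any language.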
Re:I agree totally.... BUT (Score:4, Insightful)
Re:I agree totally.... BUT (Score:5, Informative)
Simple economics: developer time is expensive, and its cost rises at least as fast as inflation, so in real terms developer time keeps getting more expensive.
Meanwhile, hardware continues to drop in price in both nominal (not inflation-adjusted) and real (inflation-adjusted) terms.
It's cheaper to implement for a 16 core, 8GByte RAM box than it is to pay a developer to optimize the code so it can run on a single 486DX2/66...
Re:I agree totally.... BUT (Score:5, Informative)
You _have_ to write efficient code for those. The laws of physics say that these small processors will *not* get substantially faster, because they need to be very low power and have very small die sizes, so you can't just throw MHz and extra transistors at them to compensate for software bloat. Anybody working with embedded computers still has to write efficient code, and get as close to the metal as they can. This means assembly language or C.
The cost of developer time on an embedded device that will ship millions of units is trivial compared to having to use a more powerful microcontroller to compensate for bloated code. In the PC world, of course, the opposite holds true: since the developer ships only software, they can rely on the customer to buy beefier hardware at no cost to the developer. Embedded developers cannot push the cost of bloat onto their customers without losing out to their competitors.
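For anyone who hasn't seen it, this is roughly what "close to the metal" C looks like on a small micro. A sketch only: the register names and addresses are invented for illustration; the real ones come from the part's datasheet.

    #include <stdint.h>

    /* Hypothetical memory-mapped UART registers (addresses made up). */
    #define UART_STATUS (*(volatile uint8_t *)0x4000u)
    #define UART_DATA   (*(volatile uint8_t *)0x4001u)
    #define TX_READY    0x01u

    /* Busy-wait transmit: no heap, no runtime library, a handful of
       bytes of code -- which is why C (or assembly) owns this space. */
    static void uart_putc(uint8_t c)
    {
        while (!(UART_STATUS & TX_READY))
            ;                   /* spin until the transmitter is free */
        UART_DATA = c;
    }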
Re:I agree totally.... BUT (Score:5, Insightful)
Now let's take this a bit further -- how much of a performance hit do you take when you access memory that is not in the CPU's cache (or second-level cache)? The CPU has to wait for the memory to become available, so optimizing code that frequently accesses memory outside the cache is largely useless: the saved cycles just get spent waiting on memory anyway. Take quicksort: the algorithm isn't particularly hard, but it accesses memory a lot. Would it matter if one iteration takes 20 cycles or 40 cycles on a modern CPU (let's assume that's the difference between C and Java)? It will make little difference; the CPU has to wait for data anyway. In the end, even in such a low-level algorithm, it matters little whether we used a very efficient piece of code or a slightly less efficient one -- the bottleneck is the memory. In other words, as long as the algorithm is the same, both pieces of code should be about equally efficient.
The only time optimizing is still worth it is when you are doing stuff in tight loops that isn't randomly accessing memory (which is how a lot of bulk processing, like video encoding, works). It's hard to even think of a good example, but I suppose it might be worth using more efficient code in signal processing, compression/decompression and rendering applications. Even in those cases, however, a lot of the work is handled by optimized libraries called from higher-level languages. I mean, it won't make any difference whether I use Bash (horribly inefficient!) to call my favourite unzip program on a multi-megabyte file, or whether I write a C program to do the same. It would take just as long.
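A quick way to see this for yourself -- a toy benchmark of my own, and the numbers will vary by machine: the two loops below do the same arithmetic, but the second defeats the cache and typically runs several times slower.

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    #define N (1 << 24)

    int main(void)
    {
        int *a = malloc(N * sizeof *a);
        int *idx = malloc(N * sizeof *idx);
        if (!a || !idx)
            return 1;
        for (int i = 0; i < N; i++) {
            a[i] = 1;
            idx[i] = i;
        }
        /* Shuffle the indices to force a cache-hostile access pattern. */
        for (int i = N - 1; i > 0; i--) {
            int j = rand() % (i + 1);
            int t = idx[i]; idx[i] = idx[j]; idx[j] = t;
        }

        long sum = 0;
        clock_t t0 = clock();
        for (int i = 0; i < N; i++)
            sum += a[i];                /* sequential: cache-friendly */
        clock_t t1 = clock();
        for (int i = 0; i < N; i++)
            sum += a[idx[i]];           /* scattered: mostly cache misses */
        clock_t t2 = clock();

        printf("sum=%ld  sequential=%.3fs  scattered=%.3fs\n", sum,
               (double)(t1 - t0) / CLOCKS_PER_SEC,
               (double)(t2 - t1) / CLOCKS_PER_SEC);
        return 0;
    }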
Re:c ? really? (Score:4, Interesting)
In addition to that, you could say that the C we have today is an old 'stable' fork of that language, which is now moving ahead in the form of C++.
Re:c ? really? (Score:5, Insightful)
In fact, no language that isn't pretty much C can replace C. If it doesn't give you the control over pointers and memory allocation that you have with C, it won't work as a replacement. If it does have those things, it is not going to replace C unless it is a backwards-compatible extension like C++ or Obj-C.
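A concrete (and purely illustrative) example of the kind of control meant here: handing out memory from one exact buffer, which you simply can't express in a language without raw pointers.

    #include <stddef.h>

    /* A trivial bump allocator over a fixed buffer (sketch only). */
    static unsigned char arena[4096];
    static size_t used;

    void *arena_alloc(size_t n)
    {
        n = (n + 7) & ~(size_t)7;       /* keep 8-byte alignment */
        if (used + n > sizeof arena)
            return NULL;                /* arena exhausted */
        void *p = arena + used;
        used += n;
        return p;
    }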
Comment removed (Score:5, Interesting)
Re:IT is More Than Software (Score:4, Interesting)
The telephone world is a weird mix of the state of the art and old.
I regularly see software that comes on 9-track reels, and other ancient equipment. My biggest shock was seeing equipment in downtown Toronto that still uses vacuum tubes.
Re:IT is More Than Software (Score:4, Funny)
I bet it has a lovely rich, warm sound, though...
30 year technologies (Score:3, Interesting)
Unix, shell scripting, C. There must be more.
Just a thought, but it makes sense to invest skills in technologies with proven survivability.
I dare to disagree (Score:3, Interesting)
When the applications died, the language followed. I dare say, ABAP is going to suffer the same fate as soon as SAP wanes and The Next Big Thing comes along. 'til then, it is a get-rich-quick scheme in IT if there ever was one, granted.
C, in its "pu
Re:c ? really? (Score:4, Insightful)
In the next fifty years I imagine C's role more or less becoming that of the "mother language". 90% of the time everyone will be using higher-level languages like Perl, Ruby, and Haskell on their Linux computers, all of which are implemented in C. Programmers will only need C when they have to change their lower-level system tools, or to write new ones that perform very efficiently.
The only way I can see C dying is if a kernel comes along with a Linux compatible system interface that's written in a language suited better to the massively parallelized CPUs of the future. And once the kernel moves away from C, applications are bound to follow.
Re:c ? really? (Score:5, Funny)
Wait until Y3K. Then everyone will come crawling back, offering COBOL programmers big bucks.
Re:c ? really? (Score:5, Informative)
Well, yeah, every language will eventually fade out. But C is still going strong, as it's still the language of choice for many low-level applications. I just searched Monster.com and found over 2500 jobs referencing C [monster.com] (it's possible that some of the results are because the term "C" is too generic, but most of the titles indicate that C programming is actually part of the job), while Python gets 419 [monster.com], Ruby gets 168 [monster.com], PHP gets 612 [monster.com], and JavaScript gets 1736 [monster.com]. How the hell can C be considered dead if it's one of the most popular languages around, and probably still the best available choice for a huge class of applications (just not web applications)?
And in fact even the "dead and buried" COBOL is still alive, with 174 jobs [monster.com]. Now, it's not as many as the more popular languages, but it's still more than Ruby, which is supposed to be the next big thing.
Anyways, from TFA:
Despite what this guy thinks, web programming hasn't "taken over", and never will. Yes, it has a large niche, but there are many systems out there that are not, nor never will be web applications. Unfortunately some people (like this guy, he owns some dumb .com company that no one has ever heard of, how does that make him an expert on the subject) have tunnel vision and think that since they work on web applications, everyone else must as well.
Re:c ? really? (Score:4, Funny)
WTF? I mean, I can understand learning Java to update your skills, but Forth? And how does learning Java lead to learning Forth anyway?
Must be one of those ancient COBOL codgers who's lost his marbles. ;-P
Re:c ? really? (Score:4, Insightful)
Re:c ? really? (Score:5, Insightful)
If you mean whether it's curly braces or brackets or none at all and the syntax of basic control flow, then yes. If you mean being familiar with the standard library, the development tools and all the specific bits (Java generics, C++ templates, take your pick)? No.
Re:c ? really? (Score:5, Insightful)
>I'm surprised that Fortran didn't make the list.
I would have been, but working in scientific research I've discovered a couple of things that are surprising:
1. People actually do calculus on the whiteboard for reasons other than taking a math class.
2. Lots of people actually use FORTRAN. Even people whose Java, C, C++, Perl, Ruby, etc. skills are such that I look up to them -- and they have solid arguments for using FORTRAN, at least for certain kinds of numerical computing.
But here's the thing: There are separate worlds. In one world, the idea of using calculus on a daily basis is simply never a consideration. You learn enough to finish college, and that's the end of it. Likewise, there's a world where numerical computing and arbitrary precision and optimized complex arithmetic are actually primary considerations and not just hypothetical things.
I never understood this until I found myself in that world. And I wouldn't have believed you if you'd told me that people who know other languages choose FORTRAN even when given a choice.
But what I take from it is that there are requirements met by FORTRAN which are not met by languages that offer more comfortable grammars.
People (myself included) will argue that, for instance, C can do anything FORTRAN can do, in a much happier grammar (opinion: mine, widely shared). But while that's strictly true, a lot of the things that seem tangential or irrelevant turn out to be *crucial* when seriously optimized math support is the core of the application. FORTRAN makes guarantees about the kinds of things that are implementation-dependent in C.
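The standard concrete case behind that last point is aliasing: FORTRAN is allowed to assume that array arguments don't overlap, while a C compiler must assume they might, unless you make the promise yourself with C99's restrict. A sketch:

    /* Without `restrict`, the compiler must assume x and y can
       overlap, which inhibits vectorization of this loop. */
    void axpy(int n, double a,
              const double *restrict x,
              double *restrict y)
    {
        for (int i = 0; i < n; i++)
            y[i] += a * x[i];
    }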
Anyway, there's no shortage of FORTRAN programmers. It's quite easy for a skilled programmer to learn FORTRAN, once you get past the 'WTF' factor and can accept that it's relevant in today's world, at least when your problem space is a good fit for the language.
COBOL or PL/1 and the like are another story entirely. My experience has been that the role of COBOL has been replaced by a combination of transitions to modern RDBMSes, decentralized business processes as a side-effect of ubiquitous "PC" adoption, and the adoption of, for example, Enterprise Java. That covers one end of the spectrum; the other end (the big corporate end) is covered by the evolution of vertical systems providers (e.g., PeopleSoft, SAP, SAIC).
Back on topic: If there's a university CS program that gives degrees without courses in Operating System and Compiler design taught in C, I'd love to hear about it. No way are C programmers in decreasing supply. If nothing else, the million or so open source projects have created a whole generation of self-taught folks who know C.
Re: (Score:3, Informative)
C as a vehicle for embedded programming is very much alive. I work as an embedded programmer on devices ranging from 8-bit PICs to DSPs and most things in between.
How would you like to code a TCP/IP stack in asm? It's not entertaining. And as low-power, low-cost embedded devices increasingly have Ethernet MAC and PHY layers built into them, C programming for these devices becomes more and more important. At the point where a $1.50 micro can
Re:c ? really? (Score:5, Interesting)
I think the list should be called "top 10 languages recruiters don't want to hear about" because that would be more accurate.
Realistically, as far as C goes I think the following factors should be considered before declaring it a dead language:
1. Most of the more popular object-oriented languages (Java, C#, C++) use C syntax. C++ is (very nearly) a superset of C.
2. Java can use compiled C modules as an analog to C's old "escape to assembler" technique. In other words, you can call C code from Java when you have something you want to get "close to the metal" on (a minimal sketch appears after this list). Thus, a "Java programmer" may very well ALSO be a C programmer, even if technically that isn't on his resume or job description. I can do this; I imagine most other Java programmers can as well. What's funny is that, once you're calling C code, you can turn around and use the C code to call assembler, Fortran, or whatever else you like! What a weird world this is!
(Links for the skeptical):
http://www.csharp.com/javacfort.html [csharp.com] (Ironic that it's on a CSharp site, no?)
http://www.mactech.com/articles/mactech/Vol.13/13
http://java.sun.com/developer/onlineTraining/Prog
3. Linux is still written in C, I believe. As are its drivers, KDE-related programs, Gnome-related programs, and whatnot.
4. C is the modern version of assembler, isn't it?
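To make point 2 concrete, here's the C side of a hypothetical Java native method (class and method names are mine). You compile it as a shared library and load it with System.loadLibrary:

    #include <jni.h>

    /* Implements: class FastMath { static native long sumSquares(int n); }
       The function name follows JNI's Java_<class>_<method> convention. */
    JNIEXPORT jlong JNICALL
    Java_FastMath_sumSquares(JNIEnv *env, jclass cls, jint n)
    {
        (void)env; (void)cls;           /* unused in this example */
        jlong s = 0;
        for (jint i = 1; i <= n; i++)
            s += (jlong)i * i;
        return s;
    }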
ANYway, I don't think C's going anywhere. You might not be able to get PAID for doing it, as your main speciality will probably be something more buzzword-heavy, but you'll probably be doing some of it as a part of whatever other weird and mysterious things you do in the ITU.
Poor journalists... One suspects they're rather easily confused these days.
They said something else. (Score:4, Insightful)
Now I know some people who've learned on C#, but I'm sure that will change in the near future.
Anyone who originally learned C, and is still writing code, has probably picked up a few other languages over the years.
Re:They said something else. (Score:4, Insightful)
I learned assembly first and have programmed nearly every CPU except a VAX. I learned C second (in 1976) and still use it every day.
I also know: APL, Forth, SNOBOL, FORTRAN, RPG, COBOL, Lisp, Smalltalk, Pascal, ALGOL and probably more I can't remember. I've never done much in them except for a tiny bit of Forth. I know PostScript pretty well and used it quite a bit. But I do C, day in, day out, and do not think very much of C++ or (worse) C#. The oddball languages have their place, but for most it's a pretty narrow niche. I'd rather invert a matrix in FORTRAN than C, though, I'll admit.
Andrea Frankel said it best on Usenet in the '80s: "If you need to hire a programmer, ones with assembly are better than ones without."
It worries me that people today don't actually know how computers work inside any more.
Re:They said something else. (Score:4, Insightful)
If you claim to have a BS in CS at the interview table, but didn't suffer through, e.g., a computer organization course (Hennessy and Patterson style, which is common these days), didn't have a course where you developed an operating system, didn't design a language starting from BNF and build a compiler for it, didn't take 2 years of a lab science, didn't at least come close to a math minor, didn't have at least 4 courses at various levels of discrete math, automata, and algorithm analysis, and didn't have a course where, as a final project, you deliver a significant user app in a high-level language (on the order of an original multiplayer game, let's say)... I'd say your school has some explaining to do.
Seriously, what school can you go to and somehow avoid having a significant background in several languages and paradigms, including but certainly not limited to asm, C, and either C++ or Java, if not both? I don't expect everyone to take an elective where they compare Lisp, Scheme, Ruby, Haskell, and Icon. And I realize that there's often a variety of senior-year choices, and some people skip Databases in favor of HighPerf/Parallel/Distributed computing, say, or 3D Graphics. But there are some basic things that you'd better have done, or else, despite having the piece of paper hanging on your office wall, you're not done with school!
Re:They said something else. (Score:5, Funny)
No, I'm clean shaven and have all of my long blond hair. I like long walks on the beach and pina coladas. I'm good with kids and dogs.
You better be a chick, bitch.
Re:They said something else. (Score:5, Insightful)
It's pretty much true. Look at the other languages you "should" learn today. Perl, PHP, Python, C#, Java... When you know your C well, learning them is fairly easy.
Re:Raising the bar (Score:4, Insightful)
Re:Raising the bar (Score:4, Interesting)
dovetail (Score:5, Informative)
Here's a link to the print version [computerworld.com] for those who dislike clicking 18 times to read a news piece.
And for those not wanting to feed the gossiping trolls altogether, here's the (pointless) "Top 10" list in short form.
1. Cobol
2. Nonrelational DBMS
3. Non-IP networks
4. cc:Mail
5. ColdFusion
6. C programming
7. PowerBuilder
8. Certified NetWare Engineers
9. PC network administrators
10. OS/2
You may now return to the
Re:dovetail (Score:5, Insightful)
I think you (and many others) are somewhat missing the point of the article, although the histrionic headline encourages a "miss the forest for the trees" reading.
I don't think anyone is expecting C or even COBOL to vanish with the speed of PowerBuilder or NetWare; the issue is whether those are actually "growth markets" any more. The article is asserting they're not, and particularly in COBOL's case I'm pretty sure that's correct. COBOL will probably live on for quite some time, but you don't hear much about people deploying new COBOL projects -- you hear about them supporting existing ones that haven't been replaced.
As for "but the OSes are written in C!" as a battle cry: well, yes, they are. But 25 years ago, they sure weren't: C was just too damn big and slow to write an operating system in. What's happened since then? Computers have gotten orders of magnitude faster, RAM and disk space have gotten orders of magnitude bigger, and of course compiler technology has also just gotten better. Couple that with the fact that operating systems and the computers they run on are just a lot more complicated -- having a higher-level language to deal with that, even at the system level, is a real advantage. There's nothing that prevents you from writing an operating system in assembly language now, but under most circumstances you probably wouldn't want to.
The thing is, unless you want to assert that computers twenty years from now will not be much faster and have much more storage and be much more complicated, you can't assert that moving to a higher-level language than C will never be either practical or beneficial even at a system level. I don't expect C to go away or even be relegated to "has-been" status, but I suspect in the long term it isn't a growth skill. It's going to move more deeply into embedded systems and other arenas where small is not merely beautiful but necessary.
The comparison with COBOL may be overstated, but it may not be completely inapt: the fact that there are still COBOL jobs out there and they may actually be fairly high-paying ones doesn't mean that going to school, in 2007, in preparation for a career as a COBOL developer is a bright idea. The same isn't as true for C, but I'm not convinced that's going to stay true for that much longer, let alone indefinitely.
Re: (Score:3, Interesting)
You owe me a dollar. The IT department I used to work at runs most of the organization-wide stuff on an IBM OS/390 monolith, written in legacy COBOL code. Any of the programmers that worked there that did
Re:dovetail (Score:5, Interesting)
The first two items on the list made me not want to take the author seriously. The financial business is run on COBOL and flat files, and will continue for some time. The language is not pretty, but it was made for a specific purpose and it does it well. In fact, demand for COBOL programmers has risen dramatically as people retire, and it is 7 years after Y2K. I know people who were asked to come out of retirement to work on COBOL again, for very high salaries, because it is not taught to us youngens anymore.
Re:dovetail (Score:5, Funny)
20 GOTO 10
Re:c ? really? (Score:5, Insightful)
No, C isn't in any way going out. C produces fast, tight code that, so far, C++ and C# can't even begin to match. C++ is just C with a lot of baggage, a great deal of which you can implement in C in a completely controllable, transparent and maintainable manner. We use the most important of those regularly in C code, specifically objects and objects with methods. We obtain better performance, smaller executables, and smaller memory footprints than any company that makes similar software using C++ or Objective-C's add-on paradigms.

C, and the C sub-domain of C++ and so on, is no more "going away" than C++ itself is. C occupies a unique niche between the metal of assembly and the (so far) considerably less efficient higher-level languages — I'm talking about results here, not code. I'm all for recognizing that a few lines of C++ are very convenient, but the cost of those lines is still too high to even think about abandoning C for performance applications.

For many, the object isn't finding the absolute easiest way to write code, but finding a balance between portability, reasonable coding effort and high performance. C sits exactly in that niche. C++ is easier to write and almost as portable, but it produces applications with larger footprints, inherited and unfixable problems inside non-transparent objects (like Microsoft's treeview, to name one), and a considerable loss of speed compared to what a coder with a good sense of what the C compiler actually does can achieve (which usually means a C coder with assembly experience and intimate knowledge of stacks, registers, heaps and so on).
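For readers who haven't seen "objects in C": the usual shape of the technique is a struct of function pointers, i.e. a hand-rolled vtable. A generic sketch of the idea (mine, not this poster's actual code):

    #include <stdio.h>

    /* A hand-rolled vtable: the "objects with methods" technique. */
    typedef struct Shape Shape;
    struct Shape {
        double (*area)(const Shape *self);
        const char *name;
    };

    typedef struct {
        Shape base;             /* base "class" must come first */
        double w, h;
    } Rect;

    static double rect_area(const Shape *self)
    {
        const Rect *r = (const Rect *)self;
        return r->w * r->h;
    }

    int main(void)
    {
        Rect r = { { rect_area, "rect" }, 3.0, 4.0 };
        Shape *s = (Shape *)&r; /* "upcast" works because base is first */
        printf("%s area = %.1f\n", s->name, s->area(s));
        return 0;
    }

Everything is visible and under your control, which is exactly the transparency argument being made here.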
Speaking as the guy who does the hiring around here, if your resume shows C and assembler experience, you've made a great start. Even just assembler. C or C++ only, and your odds have dropped considerably. C, assembler and either a great math background or specifically signal processing, and now we're talking. C++ doesn't hurt your chances, but you won't get to use it around here. :)
Re:c ? really? (Score:5, Insightful)
Wow. You must have had some really shitty software engineers. It's very likely that you can create C++ code that is as fast or faster than C. Yes, I said it. Implementing virtual inheritance and method overloading in plain C is doable, but it will be very complex. Templates? Don't even want to think about it.
I have no idea what the MS Treeview problem is, but once again - the programmers that you have worked with must have sucked balls. I'm an old C coder, with some solid x86 assembler knowledge. As you say, it's possible to get very high performing applications using C. However, why would I do that, when I can create code that is just as fast and much more readable by using C++? Yes - even for embedded development, which is my dayjob.
Re: (Score:3, Insightful)
So, when you have a small object with 10 methods, do you actually waste 40 bytes on 10 function pointers? Do you define your own virtual table structure, use var->vtable.move(var, x, y) kind of notation and require users to call non-virtual methods as ClassName_MethodName(obj, arg,
Re:c ? really? (Score:5, Insightful)
So can other languages; I don't think that is the main selling point of C. I think the main selling point of C is that it is by far the most "compatible" language of all. Python, Perl, Ruby and friends are all implemented in C; if you want to extend those languages, you do so by writing a module using their C API. If you want to simply call C functions, you can do so from many other languages as well, be it Lisp, Ada or whatever. If you want to talk to the kernel, you do so in C. No matter what language you use, sooner or later you come to a point where you have to fall back and either write C or interface with C code, since C is the 'real thing', while everything else is just an ugly wrapper around C code, trying to hide it, but often failing to do so properly.
As long as a ton of stuff is based on C code it isn't going away, especially not in the OpenSource world where basically everything is based on C.
Maybe one day some Java or
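To ground the "C API" point above, here's a minimal CPython extension module in C -- a sketch using the current CPython API, with module and function names of my own invention:

    #include <Python.h>

    static PyObject *fast_add(PyObject *self, PyObject *args)
    {
        long a, b;
        if (!PyArg_ParseTuple(args, "ll", &a, &b))
            return NULL;
        return PyLong_FromLong(a + b);
    }

    static PyMethodDef Methods[] = {
        {"add", fast_add, METH_VARARGS, "Add two integers in C."},
        {NULL, NULL, 0, NULL}
    };

    static struct PyModuleDef moduledef = {
        PyModuleDef_HEAD_INIT, "fastmod", NULL, -1, Methods
    };

    PyMODINIT_FUNC PyInit_fastmod(void)
    {
        return PyModule_Create(&moduledef);
    }

Once compiled, Python code just says: import fastmod; fastmod.add(2, 3). Perl's XS and Ruby's C API follow the same pattern.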
Re:c ? really? (Score:5, Insightful)
However they are all implemented in C, as is PHP. In fact, I'm reasonably confident you'll find all of the web languages that the article declares are taking over are implemented using C. As is Apache, which is the backbone of the majority of internet servers. In fact, pretty much everything that provides important infrastructure is written in C.
There may be demand right now for programmers that know the latest fad high-level language, but the demand for competent C programmers has hardly disappeared. The only reason that C would die is if another fast, portable, general-purpose language like it came along that offered significant benefits over C. I can't personally see that happening any time soon.
experience (Score:3, Insightful)
Non-relational DBMS
Yes, maybe they don't play as important a role as before on big iron. But they are encountered painfully often in science, where databases usually grow slowly out of small projects that subsequently undergo numerous hacks.
I'm studying bioinformatics and proteomics; non-relational DBMSs are part of the standard curriculum (and often encountered in the wild).
C programming
Yes. Just try to tell it to the OSS community. Almost any cool piece of technology (most librar
Then Pay the Market Wage (Score:5, Insightful)
Some of the list looks good (Score:5, Insightful)
Until there are enough spare processor cycles that it really doesn't matter how much CPU time you use, or a managed language gets as good at optimizing as a good C compiler/programmer combo (unlikely), I don't think C is going anywhere.
Re:Some of the list looks good (Score:5, Insightful)
I think they are wrong since C is still used on a lot of embedded systems where C++ is too heavy.
BTW a good number of HPC tools and applications are still written in FORTRAN.
Re: (Score:3, Informative)
And before someone jumps in to say, "Oh, but all the open source developers, blah blah," there honestly aren't that many compared even to C# developers. Honestly. And I say this as someone who has contributed his share of open sou
Re: (Score:3, Interesting)
It's not C. It's the C only programmer. (Score:5, Insightful)
Most programmers who know C also know at least one other language.
In any event, putting that on the list was just stupid.
Re: (Score:3, Funny)
ColdFusion Dead? (Score:5, Insightful)
Also, what's with "PC Network Administrators"? TFA must be referring to a rather specialized form of administrator, because last I checked we still needed someone to keep the desktops configured, the networks running, the file servers sharing, the login servers logging people in, and the IIS servers serving.
Re:ColdFusion Dead? (Score:4, Interesting)
If one were to use crappy solutions as an argument, how come anyone is still using PHP, ASP, etc.? I think ColdFusion has copped it more than any due to its low barrier to entry, but just because one _can_ make a shit solution using a platform doesn't mean the platform is shit.
Have been using ColdFusion for 11 years now and it keeps getting better. It's a great second language to know in addition to
Wait, there IT people who specialize this much? (Score:5, Insightful)
C? You must be kidding (Score:5, Insightful)
What, the web can now allocate memory and talk to my hardware? Even if you're not a kernel programmer, the web has sucked and still sucks for application development. It will continue to suck for years, thanks to Internet Explorer. It's misleading to claim AJAX will solve all these problems, because it won't. In fact, it might even cause a few problems of its own. For example, do you really think all that AJAX is secure? In short, I think the web is taking over what naturally suits that medium. It is wrong to say it has displaced C.
Does this guy forget that the GNU/Linux kernel and base system are written in C? You know, the operating system that powers most web servers? I'll tell you one thing: C will still be here in twenty years' time, when Ruby on Rails is talked about much the same way Blitz Basic is today. C is here to stay; it's immortal.
Simon
Re: (Score:3, Funny)
"We don't need sockets anymore, everything is going to Web now..."
LaTeX (Score:5, Informative)
Re:LaTeX (Score:5, Funny)
The curse of good presentation skills... (Score:3, Interesting)
The curse of good presentation skills is that no-one ever notices that you've used them, because you're good at presentation. :-)
I'm currently having a similar debate at the office. We're working on a new tool, effectively a web front end for a simple database with a few handy UI gimmicks. In the grand scheme of things, it's a pretty simple tool, but it's part of a routine business process and literally thousands of people are going to be using it for a few hours each month.
At a progress meeting yesterd
Re: (Score:3, Informative)
True story... (Score:5, Interesting)
When I started working at the huge multinational company I work at now, there were three things that I had very little experience with that everyone swore would last at the company for decades to come: Token Ring, Netware, and Lotus Notes. I insisted that within the next few years, these technologies would be dead and the company would have to change, and I was constantly reminded of the millions of dollars invested in them.
It's eight years later. We have no Token Ring network. We have no Netware servers. I'm doing my damned best to convince people of how bad Lotus Notes sucks, and most everyone agrees, but we have a Notes support team that really likes their jobs and somehow manages to convince upper level management that it would cost billions of dollars to change to a real e-mail and collaboration solution. But I'm still holding out hope.
God willing, Lotus Notes will soon be on this list as well.
Re:Amateur's make me laugh. (Score:4, Funny)
Matt's a flaming asshole who manages to make everyone miserable and lose us business, but we can't fire him.
Name one other person who can kind of deliver the mail, kind of answer the phones, do some of the bookkeeping, unclog the toilet, and kind of drive an 18-wheeler?
Of course, a reasonable manager who understood some of these things would conclude that since these days it's possible to hire well trained, specialized and pleasant professionals to do all of these things, some of which are business critical, it makes sense to break up this arbitrary collection of tasks which have no real synergy, fire Matt, and hire modern, well adjusted professionals to run the business.
Re:Amateur's make me laugh. (Score:5, Interesting)
Wow, to the poster above, thank you, that's a fantastic analogy!
I've been beaten over the head with the "it does a LOT of things!" stick so many times it makes me sick. The problem is that it really sucks at all of them!
It's really comical. Here's a typical me/Notes goober conversation:
As a technical professional with a strong background in systems architecture and server administration, I would highly advise any serious businessperson to avoid Lotus Notes like the plague. Ignore me at your own peril, and your career's.
If only... (Score:5, Funny)
...writing unreliable, poorly-documented, just-about-does-the-job-and-only-if-you-get-lucky code would go out of fashion.
Sadly it seems to be here to stay. In fact with the better availability/quality of scripting languages it is, if anything, becoming more popular...
10 more dying computer skills (Score:5, Insightful)
2. data management theory
3. data modeling
4. usability
5. interface design
6. use of testing, version control, refactoring, and other best practices
7. space or time efficient algorithms
8. general communications skills
9. basic business concepts like ROI
10. business ethics
Re:10 more dying computer skills (Score:4, Funny)
10 undead computer skills (Score:4, Insightful)
1. functional programming
2. formal methods
3. prolog
4. LISP
5. Scheme
6. Smalltalk
7. Pascal
8. Tcl/Tk
9. LALR parsing
10. pre-bash shell scripting.
and that's the real message here: nothing is thrown away in computer science. We're just too damn young a field to honestly say we've hit a dead end on any particular technology. Anything you can name, people have done work on it in the last 10 years.
Delphi (Score:3, Interesting)
Re: (Score:3, Informative)
Also, take a look at Lazarus [freepascal.org]. It's a multiplatform and open source Delphi clone that brought the beauty of Delphi to Linux.
Note that it's 100% native on all platforms and produces 100% native cod
Web Design (Score:5, Insightful)
This article is trash, even if it does list some technologies that are irrelevant. It has very little value to the reader. I'd rather read a top 10 list of reasons Paris Hilton should be locked up for life.
C and PC network administrators? (Score:4, Informative)
There just aren't that many people who know networking outside of IT, and there are still a lot of people who get confused about what is going on. I have seen many people kludge together a network at their office, but then they find out it sucks after a while, so they have to call somebody in to look at it.
C programming is going away? I'm always seeing algorithms with some part of C in them. Partly because these guys with VB skills say, hey, there's no reason to learn all that hard stuff, we'll just get more/bigger hardware. So far they have spent $300K on hardware and 5 man-years of programming. They've got a lot of code but nothing to show for it. It runs fast and cranks through a lot of data, but nobody can figure out what it's good for.
PC network admins? (Score:5, Insightful)
Apparently this guy's never dealt with users. If there's a way to screw up a system, even a dumb terminal, they WILL find a way.
I disagree with some of the list. (Score:5, Interesting)
Non-IP networks are dying? Must tell that to the makers of InfiniBand cards, who are carving out a very nice LAN niche and are set on moving into the WAN market. Also need to tell that to xDSL providers, who invariably use ATM, not IP. And if you consider IP to mean IPv4, then the US Government should be informed forthwith that its migration to IPv6 is "dead". Oh, and for satellite communication, they've only just got IP to even work. Since they weren't using string and tin cans before, I can only assume most satellites in use are controlled via non-IP protocols and that this will be true for a very long time. More down-to-earth, PCI's latest spec allows for multiple hosts and is becoming a LAN protocol. USB, FireWire and Bluetooth are all networks of a sort - Bluetooth has a range of a mile, if you connect the devices via rifle.
C programming. Well, yes, the web is making pure C less useful for some applications, but I somehow don't think pure C developers will be begging in the streets any time soon. Device driver writers are in heavy demand, and you don't get far with those if you're working in Java. There are also an awful lot of patches/additions to Linux (a pure C environment), given this alleged death of C. I'd love to see someone code a hard realtime application (again, something in heavy demand) in AJAX. What about those relational databases mentioned earlier in the story? Those written in DHTML? Or do I C an indication of other languages at work?
Netware - well, given the talk about non-IP dying, this is redundant and just a filler. It's probably right, but it has no business being there with the other claim. One should go.
What should be there? Well, Formal Methods is dying, replaced by Extreme Programming. BSD is dying, but only according to Netcraft. Web programming is dying - people no longer write stuff, they use pre-built components. Pure parallel programming is dying -- it's far more efficient to have the OS divide up the work and rely on multi-CPU, multi-core, hyperthreaded systems to take care of all the tracking than it is to mess with very advanced programming techniques, message-passing libraries and the inevitable deadlock issues. Asynchronous hardware is essentially dead. Object-Oriented Databases seem to be pretty much dead. 3D outside of games seems to be dead. Memory-efficient and CPU-efficient programming methods are certainly dead. I guess that would be my list.
Yet more netware bashing... (Score:3, Insightful)
Sure, its sales have declined drastically, but I wouldn't say that its relevance has. I'd be willing to bet that if we were to actually survey what file servers are still running out there, we'd see a much larger representation of NetWare. Just because people aren't buying the latest version doesn't necessarily mean that they aren't using the old ones.
For two years, I managed the computer network of a daily newspaper - including through the election debacle of 2000 and the 9/11 events. We ran that network primarily off of four NetWare 4.11 (later NetWare 5.0) servers. One of those servers had been running for over 400 days continuously when I left, serving files and print jobs the whole time. That kind of reliability is hard to match.
F77/F90/F95 aren't on the list. (Score:5, Funny)
"What will the language of the year 2000 look like? Nobody knows, but it will be called FORTRAN." John W. Backus
Memory Tuning (Score:5, Funny)
I carried a specially tuned DOS disk around with me, and would whip it out whenever anyone complained that a certain program wouldn't load. Boot off the floppy (with around 630KB conventional memory available after all drivers loaded), run the program with no problem, deliver the classic "It works for me" tech support line, slip the boot disk back into my pocket, and leave the user convinced they're doing something wrong.
Ah, good times, good times....
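For the youngsters, such a boot disk's CONFIG.SYS looked something like this (from memory; paths and drivers varied by machine, so treat it as a sketch):

    DEVICE=C:\DOS\HIMEM.SYS
    DEVICE=C:\DOS\EMM386.EXE NOEMS
    DOS=HIGH,UMB
    DEVICEHIGH=C:\DOS\ANSI.SYS
    FILES=30
    BUFFERS=20

HIMEM.SYS plus DOS=HIGH,UMB pushed DOS itself into the high memory area, and DEVICEHIGH (with LH for TSRs in AUTOEXEC.BAT) loaded drivers into upper memory blocks, leaving conventional memory nearly free.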
Re:Memory Tuning (Score:4, Interesting)
ColdFusion (Score:4, Interesting)
Typing (Score:3, Interesting)
It got me thinking: secretaries used to be hired just based on their typing skills. Speed & accuracy. I remember when I took a typing class in high school, the teacher made us cover the keyboard so we couldn't look at it while we were typing, and we especially weren't allowed to use the delete key, so she could mark us on how many errors we made.
But it's funny, that's so backward, of course. Since typewriters are no longer used, your typing speed _includes_ the time it takes to hit the delete key and fix what you did wrong. Your time further increases if you have to look at the screen and then find your place in the text. So typing speed is now the only thing that counts...
Now add into that the fact that the days of the boss dictating memos to the secretary are mostly gone, and typing is really a skill that no longer matters. It certainly helps in day-to-day computer tasks, but it's no longer a make or break skill for IT and office people.
I'm more concerned with dead USER skills (Score:5, Insightful)
1) file extensions
- Both the fact that they exist in the first place AND what the different ones mean--"ooh, should I click on hotsex.jpg.doc.exe.scr.pif?"
2) looking at the URL in the status bar before clicking on a link
- Apple: I love you, but you SUCK for having the status bar off by default in Safari.
3) knowing where downloaded files go
- Every phone-based support call I've ever made:
a) Painfully (see #4) navigate to a URL.
b) Painfully (see #5) instruct user to download a file.
c) Spend 5 minutes telling them where that file is on their computer
4) the difference between \ and /
- these people saw a backslash ONCE in their lives while using DOS about twenty years ago, and now every time I tell them an address, it's "Is that forward slash or backslash?" (Despite the fact that I've told them a million times that they'll pretty much NEVER see a \ in a URL.) This is usually followed by the question "Which one is slash?" God damn you, Paul [nytimes.com] Allen. [wired.com]
5) the difference between click, right-click, and double-click
"OK, right click on My Computer... no, close that window. Now, see the mouse? Press the RIGHT BUTTON..."
6) the concept of paths, root directories, etc.
- Why do I have to explain fifty times a day how to get from example.com/foo to example.com?
Admins can get whatever skills they want--they picked the career, they can accept the fact that things change. The backends are usually handled by people with some know-how. It's the end-users that cause all the problems. It'd be like driving in a world where people didn't know how to use turn signals, didn't check their blind spots, didn't know they shouldn't talk on the phone while making complicated maneuvers--oh, wait, bad example.
I highly disagree with number 9! (Score:5, Insightful)
Notice the "good" in the above statement, please!
Unfortunately, network admins have already suffered for years from what we (programmers) are facing now: Clueless wannabes flooding the market. Sounds harsh, is harsh, but it's sadly true. Everyone who can spell TCP/IP and doesn't think it's the Chinese secret service calls himself a net admin. And since human resources usually can't tell a network cable from a phone cable, they hire the ones with the cutest looking tie. Or the one with the most unrelated certificates.
Quite frankly, I have met so many people who claim to be net admins who know even LESS about networks than me. And I can barely cable my home net, and I can't solve the retransmission issues with my game machine that clog it. I do expect a lot from a net admin, granted, but for crying out loud, it's their JOB to know more about networks than I do, or I could do it myself!
What you get today as a "network administrator" is some guy who can somehow, with a bit of luck, good fortune, a graphical interface and a step-by-step guide from the 'net, get the DHCP server on a Win2003 Server up and running. Don't bother trying to get a static IP or even a working DNS server from him. Not to mention that he'll look blankly at you when you ask him about splitting the 'net into smaller chunks. Anything in a netmask other than 00 or 0xFF (sorry: 0 and 255) is alien to him.
That's not what I call a network administrator. That's what I call a clickmonkey.
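Since netmasks came up: the arithmetic a net admin is expected to do in his head is one AND per address. A sketch in C, using the /26 mask 255.255.255.192 -- exactly the "neither 0 nor 255" case:

    #include <stdio.h>
    #include <stdint.h>

    /* Two addresses are on the same subnet iff they match after
       ANDing with the mask. */
    static int same_subnet(uint32_t a, uint32_t b, uint32_t mask)
    {
        return (a & mask) == (b & mask);
    }

    int main(void)
    {
        uint32_t mask = 0xFFFFFFC0;     /* 255.255.255.192, a /26 */
        uint32_t h1 = 0xC0A80105;       /* 192.168.1.5  */
        uint32_t h2 = 0xC0A80142;       /* 192.168.1.66 */
        printf("%s\n", same_subnet(h1, h2, mask) ? "same" : "different");
        return 0;                       /* prints "different" */
    }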
True network administrators who got more than an evening school degree are still rare. And they will have a job, with companies that know what to look for in a net admin.
But the plague spreads. Recently we hired a "programmer" who doesn't know the difference between heap and stack. Or why inserting an inline assembler line of INT3 could do some good for his debugging problem.
And we wonder about buffer overflow issues and other security problems in code? I stopped wondering.
a few hundred million in marketing gets customers (Score:5, Interesting)
Yup, it's interesting how snake oil still gets sold year after year but only under a different name. IMO.
Oh, and virtualization, that's all about moving all those single tasking servers back into one box where one crash won't take out the others. That's innovation for ya. Go Microsoft!
LoB
Other dead skills (Score:4, Interesting)
LANtastic, I recall some people were experts with this network. When Windows for Workgroups came out with built-in networking, LANtastic went into decline.
dBase and Clipper, I recall porting databases and code written in them to MS Access in 1996-1997.
WordPerfect 5.0/6.0 macro writing. I know some small law firms that still have document templates automated with WordPerfect 5.0 for DOS or Windows. Hardly anyone else uses WordPerfect; most have moved to MS Word and use VBA for macros.
AmigaDOS/AmigaOS, it used to be the bee's knees for video and multimedia in the late 1980s. I am one of the few left who still has Amiga skills on my resume. AmigaOS reached 4.0 quite some time ago, but hardly anyone uses it anymore except in Europe for various niche markets.
ProDOS, AppleDOS, I think the Apple
Mac OS 9 and earlier, I think Mac OS X is the top dog now. The classic Mac OS is no longer in demand, and 68K Macs are only used in school districts that couldn't afford to replace them.
BeOS, despite attempts to bring it back from the dead using open source, BeOS used to be popular in the late 1990s and ran on PowerPC Macs and Intel PCs. I can recall some of my friends used to develop software for BeOS, but not anymore.
Wang, some people I know still list Wang skills on their resume. It used to be in high demand, but once Windows NT 4.0 and Windows 2000 Server came out, there was a mass migration away from Wang, after Wang shrank and almost went out of business. They did have a Visual BASIC graphics tool called Wang ImageBasic, but I think Internet Explorer 4.0 or 5.0 broke it, as did Visual BASIC 6.0. I think LEADTOOLS replaced it.
8 Bit Computers, nobody really uses them anymore. Big Businesses only used the Apple
The Apple Newton, the Palm Pilot and Windows CE devices replaced it.
Arcnet and Starnet cards, Ethernet replaced them. Token Ring is almost dead, but some die-hard IBM Fans still use it at their companies. Anyone remember making twisted pair and coaxial cable network wires for Arcnet and Starnet networks? I do.
MS-DOS 6.x and Windows 3.x and earlier, like OS/2 they deserve to be mentioned. I think some older charities and non-profit organizations still use them on old 286/386 systems that cannot even run Windows 95, with a NetWare 3.x server to share files.
MS FoxPro, does anyone still use it? After MS Access got upgraded and MS SQL Server had more features added, FoxPro became redundant.
Assembly Language, Machine Language, remember writing native code for the 8088, 68000, 6502, 6809, IBM Mainframes, etc? Hardly any company wants us to write in Assembly or Machine language anymore. It seems like only hackers use these to do exploits and write malware.
FORTRAN, I think BASIC and C sort of replaced it, and then C++ and Java replaced them. FORTRAN got NASA to the moon, but NASA uses Java or Python now.
COBOL is not a skill (Score:5, Insightful)
my experiment (Score:4, Interesting)
The number of jobs (posted in the last 30 days) listed if I picked C as a skill?
Answer: 17139 jobs
Java?
Answer: 15760 jobs
So.....Myth-busted?
BAH - 10 already-dead skills... (Score:5, Insightful)
1. Knowing how to drop certain types of home computer to re-seat the chips
2. Inserting 64k RAM chips with your bare hands to expand memory
3. Cutting a notch in 5-1/4" floppies to use the other side
4. Adjusting graphics by hand to NTSC-legal colors for decent video output
5. Editing config.sys to push drivers into HIMEM in order to free up memory
6. Crimping your own RJ45 connectors to save money
7. PEEK and POKE locations to do cool stuff on the Commodore 64
8. Manually configuring a SLIP connection to connect to the Internet (in pre-Winsock days)
9. Removing adjectives and punctuation from code comments to fit into 1k of RAM
Re:Number 1 (Score:5, Funny)
Re:They are nuts on the C front. (Score:5, Insightful)
Re:You have to admit (Score:4, Insightful)
Technology reporting is certainly dying.
Re: (Score:3, Interesting)
You are 100% correct, and like the article mentioned, COBOL is not only still used at many companies but also taught in some universities' computer science programs, including the one I graduated from: Northern Illinois University in DeKalb. Here are two examples:
http://www.cs.niu.edu/undergrad/coursecat.html#250 [niu.edu]
http://www.cs.niu.edu/undergrad/coursecat.html#465 [niu.edu]
There are A LOT of companies that still use COBOL out there (I saw many of them at every job fair I
Re: (Score:3, Insightful)