Top 10 Dead (or Dying) Computer Skills
Lucas123 writes "Computerworld reporter Mary Brandel spoke with academics and head hunters to compile this list of computer skills that are dying but may not yet have taken their last gasp. The article's message: Obsolescence is a relative — not absolute — term in the world of technology. 'In the early 1990s, it was all the rage to become a Certified NetWare Engineer, especially with Novell Inc. enjoying 90% market share for PC-based servers. "It seems like it happened overnight. Everyone had Novell, and within a two-year period, they'd all switched to NT," says David Hayes, president of HireMinds LLC in Cambridge, Mass.'"
LaTeX (Score:5, Informative)
print edition (Score:1, Informative)
http://www.computerworld.com/action/article.do?co
dovetail (Score:5, Informative)
Here's a link to the print version [computerworld.com] for those who dislike clicking 18 times to read a news piece.
And for those not wanting to feed the gossiping trolls altogether, here's the (pointless) "Top 10" list in short form.
1. Cobol
2. Nonrelational DBMS
3. Non-IP networks
4. cc:Mail
5. ColdFusion
6. C programming
7. PowerBuilder
8. Certified NetWare Engineers
9. PC network administrators
10. OS/2
You may now return to the
C programming? (Score:1, Informative)
C and PC network administrators? (Score:4, Informative)
There just aren't that many people who know networking outside of IT, and there are still a lot of people who get confused about what is going on. I have seen many people kludge together a network at their office, only to find out it sucks after a while, so they have to call somebody in to look at it.
C programming is going away? I'm always seeing algorithms with some part of C in them. Part of the problem is these guys with VB skills saying, "Hey, there's no reason to learn all that hard stuff; we'll just get more/bigger hardware." So far they have spent $300K on hardware and five man-years of programming. They've got a lot of code but nothing to show for it: it runs fast and cranks through a lot of data, but nobody can figure out what it's good for.
Re:LaTeX (Score:3, Informative)
Re:dovetail (Score:1, Informative)
It's pretty restrictive to work in an environment like this after having been raised on more modern languages, but to be honest I can see how there isn't really anything to be gained by replacing the existing code. All it has to do is make sure that customer service has access to the stuff they need and then all we do is generate reports or pull samples from the raw data.
Everyone seems to be content here. A former IT guy moved to actuarial a few months before me and he regards the mainframe as a beautiful thing, even with its warts.
Re:LaTeX (Score:3, Informative)
Re:c ? really? (Score:3, Informative)
C as a vehicle for embedded programming is very much alive. I work as an embedded programmer on devices ranging from 8-bit PICs to DSPs and most things in between.
How would you like to code a TCP/IP stack in asm? It's not entertaining. And as low-power, low-cost embedded devices increasingly ship with Ethernet MAC and PHY layers built in, C programming for these devices becomes more and more important. At the point where a $1.50 micro can give you 10+ MIPS, asm programming is in more danger of dying than C is.
If you have an embedded monitoring app, I challenge you to find a more cost-effective solution than a PIC from the PIC18F97J60 family, and you had better be comfortable with ANSI C to do anything really useful with it. At $5 a chip there is no easier way to add TCP/IP support to a low-bandwidth application (the chip can sustain around 5-6 Mbit/s while running useful code).
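As a concrete illustration (a generic sketch, not PIC-specific code), the ones'-complement checksum from RFC 1071, computed over every IP, TCP, and UDP packet, is exactly the kind of routine that is miserable to maintain in asm but trivial in portable C:

```c
#include <stdint.h>
#include <stddef.h>

/* Ones'-complement checksum as used by IP/TCP/UDP (RFC 1071).
   Plain portable C: a compiler for an 8-bit PIC or a DSP maps the
   16/32-bit arithmetic to whatever the hardware actually offers. */
static uint16_t inet_checksum(const uint8_t *data, size_t len)
{
    uint32_t sum = 0;

    while (len > 1) {            /* sum 16-bit big-endian words */
        sum += ((uint32_t)data[0] << 8) | data[1];
        data += 2;
        len  -= 2;
    }
    if (len)                     /* odd trailing byte, zero-padded */
        sum += (uint32_t)data[0] << 8;

    while (sum >> 16)            /* fold carries back into low 16 bits */
        sum = (sum & 0xFFFF) + (sum >> 16);

    return (uint16_t)~sum;       /* ones' complement of the sum */
}
```

Verifying a received header then amounts to checksumming it with the checksum field included and comparing against zero.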
I think the REAL point this article should make is that if you are only capable of programming in one language or API, your days are numbered no matter what. To exist in the engineering world of today you need to understand COMPUTING at the hardware level; at that point, the language or API you use is trivial.
If you can't pick up a language or API after a week of working with it, you are simply in the wrong field.
-xian
Re:How to learn COBOL (Score:1, Informative)
000100 IDENTIFICATION DIVISION.
000200 PROGRAM-ID. USETHESOURCELUKE.
000300 DATE-WRITTEN. 24/05/107 21:04.
000400* AUTHOR SLASHDOT_COWARD
000500 ENVIRONMENT DIVISION.
000600 CONFIGURATION SECTION.
000700 SOURCE-COMPUTER. UGH-COBOL.
000800 OBJECT-COMPUTER. UGH-COBOL.
000900
001000 DATA DIVISION.
001100 FILE SECTION.
001200
100000 PROCEDURE DIVISION.
100100
100200 MAIN-LOGIC SECTION.
100300 BEGIN.
100400 VISIT LIBRARY ADDING BOOKS TO STACK GIVING MATERIAL
100500 READ BOOKS INCREASING KNOWLEDGE GREATER THAN BEFORE
100501 GOTO GOOGLE
100600 STOP RUN.
100700 MAIN-LOGIC-EXIT.
100800 EXIT.
Okay, so that's stolen. Feh! Bet you can't guess what from!
Re:Delphi (Score:3, Informative)
Also, take a look at Lazarus [freepascal.org]. It's a multiplatform and open source Delphi clone that brought the beauty of Delphi to Linux.
Note that it's 100% native on all platforms and produces 100% native code: no Wine, no emulation. Young but already powerful, and damn fun to use!
Re:Some of the list looks good (Score:3, Informative)
And before someone jumps in to say, "Oh, but all the open source developers, blah blah," there honestly aren't that many, even compared to C# developers. Honestly. And I say this as someone who has contributed his share of open source C.
So, there is a lot of legacy C around, and there is still lots being written in niche fields (embedded). But for active, mainstream, ground-up development, there isn't too much happening. No big deal, really, that's the nature of technology.
Re:c ? really? (Score:5, Informative)
Well, yeah, every language will eventually fade out. But C is still going strong, as it's still the language of choice for many low-level applications. I just searched Monster.com and found over 2500 jobs referencing C [monster.com] (it's possible that some of the results appear because the term "C" is too generic, but most of the titles indicate that C programming is actually part of the job), while Python gets 419 [monster.com], Ruby gets 168 [monster.com], PHP gets 612 [monster.com], and JavaScript gets 1736 [monster.com]. How the hell can C be considered dead if it's one of the most popular languages around, and probably still the best available choice for a huge class of applications (just not web applications)?
And in fact even the "dead and buried" Cobol is still alive, with 174 jobs [monster.com]. That's not as many as the more popular languages, but it's still more than Ruby, which is supposed to be the next big thing.
Anyways, from TFA:
Despite what this guy thinks, web programming hasn't "taken over", and never will. Yes, it has a large niche, but there are many systems out there that are not, and never will be, web applications. Unfortunately some people (like this guy, who owns some dumb .com company no one has ever heard of; how does that make him an expert on the subject?) have tunnel vision and think that since they work on web applications, everyone else must as well.
Re:ColdFusion Dead? (Score:2, Informative)
I forgot to mention in my reply that CF4 and CF5 lacked full-blown user-defined functions, which often resulted in file proliferation. When full-blown functions were added in later versions, it became a lot easier to organize code and create reusable libraries. So your criticism is correct for those versions but no longer applies; the file sprawl did, however, give it a bad name. (I don't need Fusebox anymore now that there are functions.)
Re:c ? really? (Score:3, Informative)
Re:Ha! C != performance (Score:5, Informative)
I've coded some fairly complicated high-performance multi-threaded applications in C. It's not easy, but it's not easy in C# or Java either. There have been many minor improvements to the required syntax, but syntax has never been the hard part of multi-threaded development. Parallelizing the problem is a conceptual challenge unrelated to language.
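To the parent's point, here is a minimal pthreads sketch (thread count and workload invented for illustration). The API calls are the easy part; the real design work is deciding how to partition the problem so the slices share no mutable state:

```c
#include <pthread.h>
#include <stdint.h>

#define NTHREADS 4
#define N        1000000

/* Each worker sums a disjoint slice of 1..N. No locks are needed
   because the partitioning removes all shared mutable state; that
   decision, not the pthread calls, is the hard part. */
struct slice { int64_t lo, hi, sum; };

static void *worker(void *arg)
{
    struct slice *s = arg;
    s->sum = 0;
    for (int64_t i = s->lo; i < s->hi; i++)
        s->sum += i;
    return NULL;
}

int64_t parallel_sum(void)
{
    pthread_t tid[NTHREADS];
    struct slice sl[NTHREADS];
    int64_t total = 0;

    for (int t = 0; t < NTHREADS; t++) {
        sl[t].lo = 1 + (int64_t)t * (N / NTHREADS);
        sl[t].hi = 1 + (int64_t)(t + 1) * (N / NTHREADS);
        pthread_create(&tid[t], NULL, worker, &sl[t]);
    }
    for (int t = 0; t < NTHREADS; t++) {
        pthread_join(tid[t], NULL);
        total += sl[t].sum;          /* combine after join: no race */
    }
    return total;
}
```

The same decomposition looks nearly identical in Java or C#; only the thread-spawning syntax changes.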
Re:c ? really? (Score:3, Informative)
"Hey, Peter. Whaaat's happening?"
Re:IT is More Than Software (Score:3, Informative)
Even in the cities stuff was built to last - because until very recently upgrade cycles were measured in years, if not decades. Certainly not the annual or quarterly cycles so common today, even in infrastructure.
I own a Bell System familiarization manual from the early 80's, and about the only type of switching system not covered in it was the manually operated local plugboard. The only type specifically mentioned as being phased out was the manually operated switchboards (actually semi-automatic, as the operator punched buttons rather than patched cables) used for connecting to and cross-connecting between long-distance trunks. The tacit assumption (even for the remaining mechanical systems, which were already obsolescent by then) was "this stuff is out there, and it's going to be for a while yet, so you'd better know about it".
Re:They are nuts on the C front. (Score:2, Informative)
There are some very idiomatic elements to C++ that are not of obvious utility from a C programmer's point of view. This can even escape people who *teach* C++. Some differences between C and C++ look tiny but have enormous implications.
Consider a couple: const correctness and function-style casts.
There is only a short list of specific things where C++ differs from C, so a programmer can basically write C in a C++ environment and get away with it. He can even make use of the type/class/object system, heap-based memory allocation, etc. But these are still superficial differences. The real differences don't present themselves as syntactic distinctions, and it is quite obvious when a C programmer writes "C in C++".
Likewise, there are some big hurdles a Java programmer has to get over before being a really effective C++ programmer, although in that case the whole OO-design idea has usually taken root, sometimes even more than is typically idiomatic for C++; you can tell when someone is thinking Java and writing C++, too.
Like I said, the differences can be subtle but fundamental, and it really jumps off the screen when an experienced C++ programmer sees the work of another experienced C++ programmer, as opposed to the C++ of a Java or C programmer. Hard to explain, but I suspect you know what I'm saying if you are one.
not my experience (Score:3, Informative)
Re:c ? really? (Score:3, Informative)
Re:ColdFusion Dead? (Score:2, Informative)
I completely agree with your point about processing moving to the client side. I want to point out that ColdFusion is embracing that paradigm shift, especially in the upcoming release, ColdFusion 8, which is slated to have a lot of nice, easy AJAX integration. CF 8 is more than just pretty stuff, though; it has a lot of fine-grained feature improvements as well.
I think you would be hard pressed to find a platform that has been as adaptive as ColdFusion has been. It's come a lot further than the CFQUERY tag (not to diminish that though). Consider for a minute its move from CF 5, written in C++, to CF 6 written in Java. Let's see Ruby switch off of Rails to something like "roads" or "orbits".
ColdFusion has been proven in rapid application development (even before that was a buzz-term), but still has the power and extensibility to accomplish some pretty intense things. If you can't figure out how to do something with the proper ease or efficiency you need in ColdFusion, go write something in Java, C++, or even
Maybe ColdFusion guys are a dying breed, but one possible reason is that they are picking up other languages along the way to marry with their ColdFusion skills. Once you know Java, it seems more chic to call yourself a "Java Developer" than a "ColdFusion Developer". That fact alone may invalidate Mr. Foote's (see TFA) data-collection methods; I can't really comment decisively, given the lack of information about the data.
ColdFusion's reputation has always been plagued by several problems: its uniqueness, its failure to lend itself to any particular design patterns, cost, a total lack of presence in the education field, etc. None of those issues has stopped it from becoming a viable technology used in real-world applications. It's been around for over a decade now, and I don't see any new threat to its existence on the horizon; in the absence of such a threat, I can hardly describe it as "dying".
Re:c ? really? (Score:3, Informative)
Mine did. If I had chosen to, I could have made it through the entire curriculum primarily using Java.
My Operating Systems course had an assignment writing multi-threaded applications using semaphores and a simulated scheduler. You could write it in C or in Java; the option was there because about half the students never took C as part of their 100- and 200-level core (you could take either Java or C). Compiler design was covered in one class, but it was entirely theoretical bookwork with no implementation, IIRC (it's been a few years).
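For anyone curious, that kind of assignment reduces to something like the classic bounded-buffer exercise. A generic POSIX sketch (not the course's actual assignment, and the simulated scheduler is assumed, not shown):

```c
#include <pthread.h>
#include <semaphore.h>

/* Classic bounded buffer: 'empty_slots' counts free slots and
   'full_slots' counts filled ones, so producer and consumer block
   at the right moments without busy-waiting. */
#define SLOTS 4
#define ITEMS 20

static int   buf[SLOTS];
static int   in_pos, out_pos;
static sem_t empty_slots, full_slots;
static long  consumed_total;

static void *producer(void *arg)
{
    (void)arg;
    for (int i = 1; i <= ITEMS; i++) {
        sem_wait(&empty_slots);      /* block while buffer is full */
        buf[in_pos] = i;
        in_pos = (in_pos + 1) % SLOTS;
        sem_post(&full_slots);
    }
    return NULL;
}

static void *consumer(void *arg)
{
    (void)arg;
    for (int i = 0; i < ITEMS; i++) {
        sem_wait(&full_slots);       /* block while buffer is empty */
        consumed_total += buf[out_pos];
        out_pos = (out_pos + 1) % SLOTS;
        sem_post(&empty_slots);
    }
    return NULL;
}

long run_bounded_buffer(void)
{
    pthread_t p, c;
    sem_init(&empty_slots, 0, SLOTS);
    sem_init(&full_slots, 0, 0);
    pthread_create(&p, NULL, producer, NULL);
    pthread_create(&c, NULL, consumer, NULL);
    pthread_join(p, NULL);
    pthread_join(c, NULL);
    return consumed_total;           /* 1+2+...+ITEMS */
}
```

With one producer and one consumer, each index is touched by only one thread, so the two counting semaphores are the only synchronization required.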
Some courses required C or C++, but you could avoid them if you wanted to. Sure, you got "exposure" to C/C++ in the 200-level Intro to Programming Languages, but it was cursory at best unless you were self-directed. Even 3D graphics allowed the Java OpenGL wrapper. Yuck.
My point is, you could have made it through with maybe one semester's worth of very basic C. But like any educational experience, what you get out of it is what you put into it. I know I got a lot more out of it than many of my peers who took CS because "computer stuff pays well." I'm not so sure it's paying them well now, unless they took jobs managing geeks like me instead of being geeks themselves.
Hey, wait a minute....
Re:I agree totally.... BUT (Score:5, Informative)
Simple economics: developer time is expensive, and its cost keeps rising with inflation, if not beating it, so developer time keeps getting more expensive in real terms.
Meanwhile, hardware continues to drop in price in both nominal (not inflation-adjusted) and real (inflation-adjusted) terms.
It's cheaper to implement for a 16 core, 8GByte RAM box than it is to pay a developer to optimize the code so it can run on a single 486DX2/66...
Re:I agree totally.... BUT (Score:5, Informative)
You _have_ to write efficient code for those. The laws of physics say that these small processors will *not* get substantially faster, because they need to be very low power and have very small die sizes, so you can't just throw MHz and extra transistors at them to compensate for software bloat. Anybody working with embedded computers still has to write efficient code, and get as close to the metal as they can. This means assembly language or C.
The cost of developer time for an embedded device that will ship millions of units is trivial compared to having to use a more powerful microcontroller to compensate for bloated code. In the PC world, of course, the opposite holds true: since the developer ships only software, they can rely on the customer to buy beefier hardware at no cost to the developer. Embedded developers cannot push the cost of bloat onto their customers without losing out to their competitors.
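A small example of what "efficient, close to the metal" means in practice (an illustrative sketch, not from any particular product): scaling an ADC reading in fixed point, because on a part without an FPU a single floating-point operation can drag in a whole software float library.

```c
#include <stdint.h>

/* Convert a 10-bit ADC reading (0..1023, 3.3 V reference) to
   millivolts: mv = adc * 3300 / 1023, rounded. One 32-bit multiply
   and one divide, no floating point, no library calls. */
static uint16_t adc_to_millivolts(uint16_t adc)
{
    return (uint16_t)(((uint32_t)adc * 3300u + 511u) / 1023u);
}
```

The widened intermediate (`uint32_t`) is the whole trick: the product of a 10-bit value and 3300 overflows 16 bits, and an 8-bit compiler will happily generate the multi-byte arithmetic for you.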
Re:c ? really? (Score:5, Informative)
Re:c ? really? (Score:3, Informative)
Yes, the Linux kernel is written mostly in C (with a few modules in assembler). KDE, however, is C++. The basic GUI widgets are written as C++ classes. Gnome's widgets are written in C, but API bindings for C++, Java, Perl and Python are part of the project.
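For anyone wondering how widgets get written "in C" at all: toolkits like Gnome's fake classes using structs and function pointers. A toy sketch of the idea (not the actual GObject API):

```c
/* Toy object system in plain C: per-instance data in one struct,
   shared behavior in a "class" struct of function pointers. This is
   the general idea behind C toolkits, not the real GObject API. */
struct widget;

struct widget_class {
    const char *name;
    int (*get_width)(const struct widget *);
};

struct widget {
    const struct widget_class *klass;   /* pointer to shared vtable */
    int width;
};

static int plain_width(const struct widget *w)  { return w->width; }
static int padded_width(const struct widget *w) { return w->width + 8; }

static const struct widget_class label_class  = { "Label",  plain_width };
static const struct widget_class button_class = { "Button", padded_width };

/* "Virtual" dispatch: the caller never names the concrete type. */
int widget_get_width(const struct widget *w)
{
    return w->klass->get_width(w);
}
```

C++ bakes this pattern into the language (the compiler generates the vtable); in C you wire it up by hand, which is exactly what the Gnome widget code does at much greater scale.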
Chris Mattern
Re:c ? really? (Score:3, Informative)