Top 10 Dead (or Dying) Computer Skills
Lucas123 writes "Computerworld reporter Mary Brandel spoke with academics and head hunters to compile this list of computer skills that are dying but may not yet have taken their last gasp. The article's message: Obsolescence is a relative — not absolute — term in the world of technology. 'In the early 1990s, it was all the rage to become a Certified NetWare Engineer, especially with Novell Inc. enjoying 90% market share for PC-based servers. "It seems like it happened overnight. Everyone had Novell, and within a two-year period, they'd all switched to NT," says David Hayes, president of HireMinds LLC in Cambridge, Mass.'"
c ? really? (Score:5, Insightful)
Some of the list looks good (Score:5, Insightful)
Until there are enough spare processor cycles that it really doesn't matter how much CPU time you use, or a managed language gets as good at optimizing as a good C compiler/programmer combo (unlikely), I don't think C is going anywhere.
ColdFusion Dead? (Score:5, Insightful)
Also, what's with "PC Network Administrators"? TFA must be referring to a rather specialized form of administrator, because last I checked we still needed someone to keep the desktops configured, the networks running, the file servers sharing, the login servers logging people in, and the IIS servers serving.
Wait, there are IT people who specialize this much? (Score:5, Insightful)
C? You must be kidding (Score:5, Insightful)
What, the web can now allocate memory and talk to my hardware? Even if you're not a kernel programmer, the web has sucked and still sucks for application development. It will continue to suck for years, due to Internet Explorer. It's misleading to claim AJAX will solve all these problems, because it won't. In fact, it might even cause a few problems of its own. For example, do you really think all that AJAX is secure? In short, I think the web is taking over what naturally comes to that medium. It is wrong to say it's displaced C.
Does this guy forget that the GNU/Linux kernel and base system are written in C? You know, the operating system that powers most web servers? I'll tell you one thing: C will still be here in twenty years' time, when Ruby on Rails is talked about much in the same way Blitz Basic is today. C is here to stay; it's immortal.
Simon
Re:Some of the list looks good (Score:5, Insightful)
I think they are wrong since C is still used on a lot of embedded systems where C++ is too heavy.
BTW a good number of HPC tools and applications are still written in FORTRAN.
10 more dying computer skills (Score:5, Insightful)
2. data management theory
3. data modeling
4. usability
5. interface design
6. use of testing, version control, refactoring, and other best practices
7. space or time efficient algorithms
8. general communications skills
9. basic business concepts like ROI
10. business ethics
Re:c ? really? (Score:5, Insightful)
They said something else. (Score:4, Insightful)
Now I know some people who've learned on C#, but I'm sure that will change in the near future.
Anyone who originally learned C, and is still writing code, has probably picked up a few other languages over the years.
Web Design (Score:5, Insightful)
This article is trash, even if it does name some technologies that are irrelevant. It has very little value to the reader. I'd rather read a top 10 list of reasons Paris Hilton should be locked up for life.
Re:They are nuts on the C front. (Score:5, Insightful)
Re:You have to admit (Score:4, Insightful)
Technology reporting is certainly dying.
It's not C. It's the C only programmer. (Score:5, Insightful)
Most programmers who know C also know at least one other language.
In any event, putting that on the list was just stupid.
Re:c ? really? (Score:5, Insightful)
No, C isn't in any way going out. C produces fast, tight code that, so far, C++ and C# can't even begin to match. C++ is just C with a lot of baggage, a great deal of which you can implement in C in a completely controllable, transparent and maintainable manner. We use the most important of those regularly in C code, specifically objects and objects with methods. We obtain better performance, smaller executables, and smaller memory footprints than any company that makes similar software using C++ or Objective-C's add-on paradigms.

C, and the C sub-domain of C++ and so on, is no more "going away" than C++ itself is. C occupies a unique niche between the metal of assembly and the (so far) considerably less efficient higher-level languages; I'm talking about results here, not code. I'm all for recognizing that a few lines of C++ are very convenient, but the cost of those lines is still too high to even think about abandoning C for performance applications. For many, the object isn't finding the absolute easiest way to write code, but instead trying to find a balance between portability, reasonable coding effort and high performance. C sits exactly in that niche.

C++ is easier to write and almost as portable, but produces applications with large footprints, inherited, unfixable problems inside non-transparent objects (like Microsoft's treeview, to name one), and a considerable loss of speed as compared to a coder who has a good sense of just what the C compiler actually does (which usually means a C coder with assembly experience and intimate knowledge of stacks, registers, heaps and so on).
Speaking as the guy who does the hiring around here: if your resume shows C and assembler experience, you've made a great start. Even just assembler. C or C++ only, and your odds have dropped considerably. C, assembler, and either a great math background or specifically signal processing, and now we're talking. C++ doesn't hurt your chances, but you won't get to use it around here. :)
Re:Delphi (Score:1, Insightful)
Delphi 2.0 swept the floor with BOTH MSVC++ & MSVB in "VB Programmer's Journal" (of all places), in the October 1997 article entitled "INSIDE THE VB5 COMPILER ENGINE"!
That's where Delphi absolutely blew away VB in ALL of the tests (except ActiveX form loads, where VB even took MSVC++ out), & took C++ out on 8 of 10 of the tests!
Most importantly, by HUGE margins (especially in math & strings work, which EVERY program does).
Where Delphi did lose to MSVC++ (only 2 of 10 tests), it was by VERY SMALL MARGINS, far smaller than the margins by which it blew away MSVC++....
It was enough for me to see that for developing shareware @ least, Delphi rules. I like it a lot, & used it to create this tool (which has run essentially unaltered since its birthdate in 1997 to this day, across ALL Win32 platforms):
APK Registry Cleaning Engine 2002++ SR-7:
http://www.techpowerup.com/downloads/389/foowhate
That IS the safest & most comprehensive/thorough registry cleaning program there is, bar-none, even 5 years after I quit developing it. Enjoy it, if you try it.
(Anyhow/anyway - the shareware/freeware I have done on the side is where I solely control the tools I use, unlike @ work locations, where mgt. calls the shots on tools used & today follows "He who has the money, wins", because "nobody ever got fired for buying IBM" (replace IBM today with Microsoft)).
Mgt., even though I showed them such results, was like "Well, can't argue the fact that Delphi IS the superior tool for performance AND rapid application development, but... Microsoft has the ca$h, & will be here tomorrow: WILL BORLAND BE?"
You can't win there, not really, not on a technical superiority level. Much like VHS vs. BetaMax, the 'best man for the job' does NOT always win.
APK
Re:dovetail (Score:5, Insightful)
PC network admins? (Score:5, Insightful)
Apparently this guy's never dealt with users. If there's a way to screw up a system, even a dumb terminal, they WILL find a way.
Re:C, dead or dying!?!?! OUTRAGEOUS!!! (Score:3, Insightful)
Yet more netware bashing... (Score:3, Insightful)
Sure, its sales have declined drastically, but I wouldn't say that its relevance has. I'd be willing to bet that if we were to actually survey what file servers are still running out there, we'd see a much larger representation of NetWare. Just because people aren't buying the latest version doesn't necessarily mean that they aren't using the old ones.
For two years, I managed the computer network of a daily newspaper - including through the election debacle of 2000 and the 9/11 events. We ran that network primarily off of four NetWare 4.11 (later NetWare 5.0) servers. One of those servers had been running for over 400 days continuously when I left, and it served files and print jobs. That kind of reliability is hard to match.
Then Pay the Market Wage (Score:5, Insightful)
Re:Raising the bar (Score:1, Insightful)
Re:c ? really? (Score:5, Insightful)
I'm kidding, but only partially. I was a COBOL developer for lots of years, and I thought that COBOL would never die either. I would say "Too many companies are too invested
Re:They said something else. (Score:4, Insightful)
I learned Assembly first and have programmed nearly every CPU except a VAX. I learned C second (in 1976) and still use it every day.
I also know: apl, forth, snobol, fortran, rpg, cobol, lisp, smalltalk, pascal, algol and probably more I can't remember. I've never done much in them except for a tiny bit of forth. I know postscript pretty well and used it quite a bit. But I do C, day in day out and do not think very much of C++ or (worse) C#. The oddball languages have their place but for most it's a pretty narrow niche. I'd rather invert a matrix in Fortran than C though, I'll admit.
Andrea Frankel said it best on Usenet in the 80s: "If you need to hire a programmer, ones with assembly are better than ones without".
It worries me that people today don't actually know how computers work inside any more.
Re:c ? really? (Score:5, Insightful)
Wow. You must have had some really shitty software engineers. It's very likely that you can create C++ code that is as fast or faster than C. Yes, I said it. Implementing virtual inheritance and method overloading in plain C is doable, but it will be very complex. Templates? Don't even want to think about it.
I have no idea what the MS Treeview problem is, but once again - the programmers that you have worked with must have sucked balls. I'm an old C coder, with some solid x86 assembler knowledge. As you say, it's possible to get very high performing applications using C. However, why would I do that, when I can create code that is just as fast and much more readable by using C++? Yes - even for embedded development, which is my dayjob.
experience (Score:3, Insightful)
Non-relational DBMS
Yes, maybe they don't play as important a role as they used to on big iron. But they are actually encountered painfully often in science, where databases usually grow slowly out of small projects that subsequently undergo numerous hacks.
I'm studying bioinformatics and proteomics; non-relational DBMS are part of the standard curriculum (and often encountered in the wild).
C programming
Yes. Just try telling that to the OSS community. Almost any cool piece of technology (most libraries) is coded in C. Not only that, but it is an option that almost any student in science can take.
NetWare
Once again big iron vs. universities. There's still a lot of NetWare legacy in small businesses and universities, even if bigger corporations have moved to some Unix-based solutions or (the gods forbid) MS-based Active-something.NET solutions.
Novell is still offering training for it, even if Novell would like to concentrate more on their Linux solutions.
It'll end up going the way of the dodo. But just not yet.
Non-IP networks
This guy has never heard of something called Bluetooth. But on the other hand, courses, as far as I know, seem to be mostly TCP/IP oriented.
ColdFusion, PowerBuilder: they're dead and deserved it.
OS/2: cue in "all 2 of them" jokes from Bastard Operator From Hell.
Re:Raising the bar (Score:4, Insightful)
Re:c ? really? (Score:3, Insightful)
So, when you have a small object with 10 methods, do you actually waste 40 bytes on 10 function pointers? Do you define your own virtual table structure, use var->vtable.move(var, x, y) kind of notation and require users to call non-virtual methods as ClassName_MethodName(obj, arg,
Re:c ? really? (Score:5, Insightful)
There's one *very* important place where C is used (Score:2, Insightful)
Re:10 more dying computer skills (Score:1, Insightful)
Re:They are nuts on the C front. (Score:2, Insightful)
Re:c ? really? (Score:5, Insightful)
So can other languages; I don't think that is the main selling point of C. I think the main selling point of C is that it is by far the most "compatible" language of all. Python, Perl, Ruby and friends are all implemented in C; if you want to extend those languages, you do so by writing a module using their C API. If you want to simply call C functions, you can do so from many other languages as well, be it Lisp, Ada or whatever. If you want to talk to the kernel, you do so in C. No matter what language you use, sooner or later you come to a point where you have to fall back and either write C or interface with C code, since C is the 'real thing', while everything else is just an ugly wrapper around C code, trying to hide it, but often failing at doing so properly.
As long as a ton of stuff is based on C code it isn't going away, especially not in the OpenSource world where basically everything is based on C.
Maybe one day some Java or
Amateur's make me laugh. (Score:1, Insightful)
Exchange? Don't make me laugh.
Name one.
Yeah its easy to rag on Notes for poor UI design, but when it comes to an "enterprise wide" collaboration system, nothing else even comes close, whether you like it or not.
10 undead computer skills (Score:4, Insightful)
1. functional programming
2. formal methods
3. prolog
4. LISP
5. Scheme
6. Smalltalk
7. Pascal
8. Tcl/Tk
9. LALR parsing
10. pre-bash shell scripting.
and that's the real message here.. nothing is thrown away in computer science.. we're just too damn young a field to honestly say we've hit a dead end on any particular technology. Anything you can name, people have done work on it in the last 10 years.
I'm more concerned with dead USER skills (Score:5, Insightful)
1) knowing what file extensions are
- Both the fact that they exist in the first place AND what the different ones mean--"ooh, should I click on hotsex.jpg.doc.exe.scr.pif?"
2) looking at the URL in the status bar before clicking on a link
- Apple: I love you, but you SUCK for having the status bar off by default in Safari.
3) knowing where downloaded files go
- Every phone-based support call I've ever made:
a) Painfully (see #4) navigate to a URL.
b) Painfully (see #5) instruct user to download a file.
c) Spend 5 minutes telling them where that file is on their computer
4) the difference between \ and /
- these people saw a backslash ONCE in their lives while using DOS about twenty years ago, and now every time I tell them an address, it's "Is that forward slash or backslash?" (Despite the fact that I've told them a million times that they'll pretty much NEVER see a \ in a URL.) This is usually followed by the question "Which one is slash?" God damn you, Paul [nytimes.com] Allen. [wired.com]
5) the difference between click, right-click, and double-click
"OK, right click on My Computer... no, close that window. Now, see the mouse? Press the RIGHT BUTTON..."
6) the concept of paths, root directories, etc.
- Why do I have to explain fifty times a day how to get from example.com/foo to example.com?
Admins can get whatever skills they want--they picked the career, they can accept the fact that things change. The backends are usually handled by people with some know-how. It's the end-users that cause all the problems. It'd be like driving in a world where people didn't know how to use turn signals, didn't check their blind spots, didn't know they shouldn't talk on the phone while making complicated maneuvers--oh, wait, bad example.
Re:They said something else. (Score:5, Insightful)
It's pretty much true. Look at the other languages you "should" learn today. Perl, PHP, Python, C#, Java... When you know your C well, learning them is fairly easy.
Re:c ? really? (Score:4, Insightful)
In the next fifty years I imagine C's role more or less becoming that of the "mother language". 90% of the time everyone will be using higher level languages like Perl, Ruby, and Haskell on their Linux computers, all of which are programmed in C. Programmers will only need C when they need to change their lower level system tools, or to write new ones that perform very efficiently.
The only way I can see C dying is if a kernel comes along with a Linux compatible system interface that's written in a language suited better to the massively parallelized CPUs of the future. And once the kernel moves away from C, applications are bound to follow.
Re:c ? really? (Score:5, Insightful)
No one shall expel us from the Paradise that Ritchie has created. (Apologies to David Hilbert.)
I highly disagree with number 9! (Score:5, Insightful)
Notice the "good" in the above statement, please!
Unfortunately, network admins have already suffered for years from what we (programmers) are facing now: Clueless wannabes flooding the market. Sounds harsh, is harsh, but it's sadly true. Everyone who can spell TCP/IP and doesn't think it's the Chinese secret service calls himself a net admin. And since human resources usually can't tell a network cable from a phone cable, they hire the ones with the cutest looking tie. Or the one with the most unrelated certificates.
Quite frankly, I have met so many people who claim to be net admins who know even LESS about networks than me. And I can barely cable my home net, and I can't solve the retransmission issues with my game machine that clog it. I do expect a lot from a net admin, granted, but for crying out loud, it's their JOB to know more about networks than I do, or I could do it myself!
What you get today as a "network administrator" is some guy who can somehow, with a bit of luck, good fortune, a graphical interface and a step-by-step guide from the 'net, get the DHCP server on a Win2003 Server up and running. Don't bother trying to get a static IP or even a working DNS server from him. Not to mention that he'll look blankly at you when you ask him about splitting the 'net into smaller chunks. Anything in a netmask other than 00 or 0xFF (sorry: 0 and 255) is alien to him.
That's not what I call a network administrator. That's what I call a clickmonkey.
True network administrators who got more than an evening school degree are still rare. And they will have a job, with companies that know what to look for in a net admin.
But the plague spreads. Recently we hired a "programmer" who doesn't know the difference between heap and stack. Or why inserting an inline assembler line of INT3 could do some good for his debugging problem.
And we wonder about buffer overflow issues and other security problems in code? I stopped wondering.
Re:ColdFusion Dead? (Score:3, Insightful)
In what way? I find it fairly nice for the following reasons:
1. Makes it easy to integrate HTML with app code if you go that way (The "separation" argument is usually oversold in my opinion. The cost/benefit analysis often doesn't favor it if you do the probability estimation math. And one can still separate if you really want.)
2. Very scriptish feel for rapid development. You don't have the verbose formality of some web languages and Java. (OO fans will not like it, I would note. I don't put much stock in OO. It is based on flawed assumptions about change patterns and developer psychology.)
3. Pretty good database integration, almost plug-and-play.
4. Graphing widget, editable grid widget, and tree widget are fairly good. It is hard to find decent edit-grid widgets without spending a fortune. (Its grid is far from perfect, but it is hard to find better web grids without spending a fortune.)
5. Natural for web designers who know HTML to work with variable-embedded templates.
I would recommend PHP over it because PHP probably has more staying power, but CF isn't bad for small and medium apps. It is how you program, not the language, that makes the biggest difference.
Re:c ? really? (Score:5, Insightful)
Computerworld Magazine, being an IT rag, is concerned about IT, not computer science or engineering. Thus it worries about product names, not categories. So they point out skills in SNA or Novell NetWare that aren't needed so much anymore, even though computer networking skills are even more popular and vital today than in the past. COBOL may be virtually dead, but dry and dusty applications for business purposes are alive and well and written by people who still wear ties.
Re:dovetail (Score:5, Insightful)
I think you (and many others) are somewhat missing the point of the article, although the somewhat histrionic headline encourages a "miss the forest for the trees" reading.
I don't think anyone is expecting C or even COBOL to vanish with the speed of PowerBuilder or NetWare; the issue is whether those are actually "growth markets" any more. The article is asserting they're not, and particularly in COBOL's case I'm pretty sure that's correct. COBOL will probably live on for quite some time, but you don't hear much about people deploying new COBOL projects -- you hear about them supporting existing ones that haven't been replaced.
As for "but the OSes are written in C!" as a battle cry: well, yes, they are. But 25 years ago, they sure weren't: C was just too damn big and slow to write an operating system in. What's happened since then? Computers have gotten orders of magnitude faster, RAM and disk space have gotten orders of magnitude bigger, and of course compiler technology has also just gotten better. Couple that with the fact that operating systems and the computers they run on are just a lot more complicated -- having a higher-level language to deal with that, even at the system level, is a real advantage. There's nothing that prevents you from writing an operating system in assembly language now, but under most circumstances you probably wouldn't want to.
The thing is, unless you want to assert that computers twenty years from now will not be much faster and have much more storage and be much more complicated, you can't assert that moving to a higher-level language than C will never be either practical or beneficial even at a system level. I don't expect C to go away or even be relegated to "has-been" status, but I suspect in the long term it isn't a growth skill. It's going to move more deeply into embedded systems and other arenas where small is not merely beautiful but necessary.
The comparison with COBOL may be overstated, but it may not be completely inapt: the fact that there are still COBOL jobs out there and they may actually be fairly high-paying ones doesn't mean that going to school, in 2007, in preparation for a career as a COBOL developer is a bright idea. The same isn't as true for C, but I'm not convinced that's going to stay true for that much longer, let alone indefinitely.
Re:I highly disagree with number 9! (Score:3, Insightful)
A good network admin (hell, even a bad one) has enough equipment and required comprehension these days that they can't be too worried about the intricacies of some OS's non-standard quirks. In that sense, the PC network admin is going the way of the Appletalk admin.
On the other hand, true network admins are absolutely crucial to most companies, and I've been lucky enough to work with a good number who understand their roles very well. Sounds like you haven't, which is a pity. Rest assured, they're out there.
Re:c ? really? (Score:5, Insightful)
In fact, no language that isn't pretty much C can replace C. If it doesn't give you the control over pointers and memory allocation that you have with C, it won't work as a replacement. If it does have those things, it is not going to replace C unless it is a backwards-compatible extension like C++ or Obj-C.
Re:c ? really? (Score:5, Insightful)
>I'm surprised that Fortran didn't make the list.
I would have been, but working in scientific research I've discovered a couple of things that are surprising:
1. People actually do calculus on the whiteboard for reasons other than taking a math class.
2. Lots of people actually use FORTRAN. Even people whose Java, C, C++, Perl, Ruby, etc. skills are such that I look up to them -- and they have solid arguments for using FORTRAN, at least for certain kinds of numerical computing.
But here's the thing: There are separate worlds. In one world, the idea of using calculus on a daily basis is simply never a consideration. You learn enough to finish college, and that's the end of it. Likewise, there's a world where numerical computing and arbitrary precision and optimized complex arithmetic are actually primary considerations and not just hypothetical things.
I never understood this until I found myself in that world. And I wouldn't have believed you if you'd told me that people who know other languages choose FORTRAN even when given a choice.
But what I take from it is that there are requirements that are met by FORTRAN which are not met by languages that offer more comfortable grammars.
People (myself included) will argue that, for instance, C can do anything that FORTRAN can do, in a much happier grammar (opinion, mine, widely shared), but the thing is... while that's strictly true, a lot of the things that seem tangential or irrelevant, turn out to be *crucial*, where seriously optimized math support is the core of the application. FORTRAN makes guarantees on the kinds of things that are implementation dependent in C.
Anyway, there's no shortage of FORTRAN programmers. It's quite easy for a skilled programmer to learn FORTRAN, once you get past the 'WTF' factor and can accept that it's relevant in today's world, at least when your problem space is a good fit for the language.
COBOL or PL/1 and the like are another story entirely. My experience has been that the role of COBOL has been replaced by the combination of transitions to modern RDBMS, decentralized business processes as a side-effect of the whole ubiquitous "PC" adoption, and the adoption of, for example, Enterprise Java. That covers one end of the spectrum, and the other end (the big corporate end) is covered by the evolution of vertical systems providers (e.g., Peoplesoft, SAP, SAIC).
Back on topic: If there's a university CS program that gives degrees without courses in Operating System and Compiler design taught in C, I'd love to hear about it. No way are C programmers in decreasing supply. If nothing else, the million or so open source projects have created a whole generation of self-taught folks who know C.
Re:They said something else. (Score:4, Insightful)
If you claim to have a BS in CS at the interview table, but didn't suffer through, e.g., a computer organization course (like Hennessy and Patterson style, which is common these days), didn't have a course where you developed an operating system, didn't design a language starting from BNF and build a compiler for it, didn't take 2 years of a lab science, didn't at least come close to a math minor, didn't have at least 4 courses at various levels of discrete math, automata, and algorithm analysis, and didn't have a course that, as a final project, has you deliver a significant user app in a high-level language (on the order of an original multiplayer game, let's say)... I'd say your school has some explaining to do.
Seriously, what school can you go to and somehow avoid having a significant background in several languages and paradigms, including but certainly not limited to asm, C, and either C++ or Java, if not both? I don't expect everyone to take an elective where they compare Lisp, Scheme, Ruby, Haskell, and Icon. And I realize that there's often a variety of senior-year choices, and some people don't take Databases choosing HighPerf/Parallel/Distributed computing, say, or 3D Graphics. But there are some basic things that you'd better have done, or else, despite having the piece of paper hanging on your office wall, you're not done with school!
Re:c ? really? (Score:5, Insightful)
However they are all implemented in C, as is PHP. In fact, I'm reasonably confident you'll find all of the web languages that the article declares are taking over are implemented using C. As is Apache, which is the backbone of the majority of internet servers. In fact, pretty much everything that provides important infrastructure is written in C.
There may be demand right now for programmers that know the latest fad high-level language, but the demand for competent C programmers has hardly disappeared. The only reason that C would die is if another fast, portable, general-purpose language like it came along that offered significant benefits over C. I can't personally see that happening any time soon.
COBOL is not a skill (Score:5, Insightful)
Re:c ? really? (Score:4, Insightful)
Re:It's not C. It's the C only programmer. (Score:2, Insightful)
Re:I agree totally.... BUT (Score:4, Insightful)
skills we want to kill (Score:2, Insightful)
1. Mass marketing (also known by the fuzzy name 'spam').
2. Ability to piss someone off with an email that was meant to be friendly.
3. Documenting with the text "someone needs to fill this bit out".
4. Finding the Caps-Lock; wasted brainspace for a useless key.
5. Coding of Flash advertising.
6. Writing bubblesorts... and inline.
7. Industrial design that puts the reset button near one's knee.
8. Being able to press Ctrl-Alt-Del without thinking.
9. Internal cable engineering that enables leads to be plugged in reverse.
10. COBOL; because it is the vampire that needs a stake through the heart.
Flip, why stop there? Let's go for the top 100.
Re:c ? really? (Score:5, Insightful)
If you mean whether it's curly braces or brackets or none at all and the syntax of basic control flow, then yes. If you mean being familiar with the standard library, the development tools and all the specific bits (Java generics, C++ templates, take your pick)? No.
Re:I agree totally.... BUT (Score:3, Insightful)
Economics. Computers are cheaper than programmers, so efficiently writing code is more important than writing efficient code.
Actually, Java is hardly slower than C++ these days, so for most purposes, you can write pretty efficient code in higher level languages. C/C++ will remain for the really low-level stuff that you simply can't do in Java, and for the high-performance libraries where even the slightest speed gain will pay off in the end.
Re:c ? really? (Score:5, Insightful)
Re:it's the perception of its importance that died (Score:3, Insightful)
Embedded C++ is the worst of both worlds, IMO. It is more like C with some syntactic sugar. It removes all of the good features of C++ such as namespaces and templates.
to get rid of the stupid "typedef struct" type declarations,
You don't have to write "typedef struct" in C. Simply declare 'struct X { ... };' and refer to the type as 'struct X'.
and other C idioms such as implicit int, no proper bool support, limited variable declarations, etc.
Those things were all corrected by ISO/IEC 9899:1999, which came out just a few months after the C++ standard (and long before EC++).
Depending on your real-time constraints, you'll want to turn off RTTI, exceptions, virtual functions, and multiple inheritance, and use C++ as a better C to at least get some better compile-time type safety.
People often say that, but I've yet to see any code example of how C++ has better compile-time type safety (assuming you are not talking about the use of templates for generic programming). The only thing that comes to mind is that in C++ you can not implicitly convert from (void *) to some other pointer type, but in C++ you would almost never use a (void *) anyway so it seems rather moot.
Re:I agree totally.... BUT (Score:5, Insightful)
Now let's take this a bit further -- how much of a performance hit do you take when you access memory that is not in the CPU's cache (or 2nd level cache)? The CPU will have to wait for the memory to be available... optimizing code that frequently accesses memory outside the cache would be useless (and would just mean the CPU has to wait a bit longer). Let's take quicksort, the algorithm isn't particularly hard but accesses memory a lot. Would it matter if one iteration takes 20 cycles or 40 cycles on a modern CPU (let's assume that's the difference between C and Java)? It will make little difference, the CPU has to wait for data anyway. In the end, even in such a low level algorithm, it will make little difference whether we used a very efficient piece of code, or a slightly less efficient one -- the bottleneck is the memory. In other words, as long as the algorithm you use is the same, both pieces of code should be about as efficient.
The only time optimizing still pays off is in tight loops that aren't randomly accessing memory all over the place and that do a lot of bulk processing, like video encoding. It's hard to think of many examples, but signal processing, compression/decompression, and rendering probably qualify. Even in those cases, much of the work is handled by optimized libraries called from higher-level languages. It won't make any difference whether I use Bash (horribly inefficient!) to call my favourite unzip program on a multi-megabyte file or write a C program to do the same; it would take just as long either way.
BAH - 10 already-dead skills... (Score:5, Insightful)
1. Knowing how to drop certain types of home computer to re-seat the chips
2. Inserting 64k RAM chips with your bare hands to expand memory
3. Cutting a notch in 5-1/4" floppies to use the other side
4. Adjusting graphics by hand to NTSC-legal colors for decent video output
5. Editing config.sys to push drivers into HIMEM in order to free up memory
6. Crimping your own RJ45 connectors to save money
7. PEEK and POKE locations to do cool stuff on the Commodore 64
8. Manually configuring a SLIP connection to connect to the Internet (in pre-Winsock days)
9. Removing adjectives and punctuation from code comments to fit into 1k of RAM
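For item 5, a typical CONFIG.SYS from the MS-DOS 5/6 era looked something like this (exact paths and drivers varied by install):

```
DEVICE=C:\DOS\HIMEM.SYS
DEVICE=C:\DOS\EMM386.EXE NOEMS
DOS=HIGH,UMB
DEVICEHIGH=C:\DOS\ANSI.SYS
```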
Re:c ? really? (Score:3, Insightful)
I think you will only need to wait until 2100:
int year_from_two_digits(int x)   /* x is a two-digit year */
{
    if (x > 50)
        return 1900 + x;
    else
        return 2000 + x;
}
Re:I agree totally.... BUT (Score:2, Insightful)
The problem turned out to be a new table/stored procedure combo that was part of the new code deployment. The table was missing a critical index. A simple 40-second CREATE INDEX statement produced a near-vertical drop of the CPU metric in PerfMon, from 98% to less than 10%, where it has remained most of the week. Faster hardware is not always the answer!!!!
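The fix itself was a one-liner along these lines (table, column, and index names here are hypothetical):

```sql
CREATE INDEX IX_Orders_CustomerId ON dbo.Orders (CustomerId);
```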
Re:c ? really? (Score:3, Insightful)
It goes like this: (Score:2, Insightful)
A developer who doesn't know C is a contractor fumbling with a hammer.
Don't start me on which language is the nail-gun, but the idea that the
hammer is going away is, by any stretch of the imagination, purely idiotic.
Re:I agree totally.... BUT (Score:3, Insightful)
It also broadens the pool of available programmers. I work for a small business. I know I'm not a great (or even good, probably) programmer, but I write all kinds of applications for the company I work at. I certainly try, but there are probably a thousand ways to do what I do better.
So why does the company allow me to write our stuff? Because we're a small company and we could never justify hiring those great programmers for every little thing we'd like to have. It's either me, the guy who probably doesn't always know the best way, or not having it at all. In the meantime, like you said, a workstation costs what a workstation costs... it's not like we're dumping extra money into hardware because of my code.
And the people who use my software? They love it. It gets the job done well (because it was designed the way they want it) and it all works fast enough. Geocoding software, log parsing and reporting, trivia engines w/ web services for multiple locations, automated RFP systems that integrate with SalesForce, mailing apps, shopping carts, document libraries, etc... all things they've gotten in the last 11 months that they probably wouldn't have purchased or hired someone to develop, but I can knock out for them in no time while still fulfilling my actual job duties. That makes me pretty damn affordable, considering I'm already worth my salary for my regular job there.
BTW, you can all blame VisualStudio and the
My prediction: (Score:3, Insightful)
Re:c ? really? (Score:2, Insightful)