Pentagon to Significantly Cut CS Research
GabrielF writes "Over the last few decades, DARPA, the Defense Advanced Research Projects Agency, has funded some of the most successful computer science research projects in history, such as the Internet. However, according to the New York Times, DARPA has recently decided to significantly cut funding of open-ended computer science research projects in favor of projects that will yield short-term military results. Leading computer scientists, such as David Patterson, the head of the ACM, are outraged and worried."
Time for a fed Dept of Information Technology (Score:3, Interesting)
Re:Technology (Score:1, Interesting)
Fighting the last war (Score:3, Interesting)
On the plus side, by the time we fight the Mongolian Khanate in 2037 we'll have the best network firewalls in the world. :)
short sighted (Score:2, Interesting)
Re:Twilight of the empire (Score:3, Interesting)
Or do you call 50% of your loans being held by Chinese banks an American success story?

Dude, we're in TRILLIONS of dollars of debt, and the boomers are about to bankrupt the rest of the budget.

Japan's got a few problems with banking. We've got systemic failures.

I suppose you can look at the numbers today and say the US is better off. But the US is better off because the government borrowed trillions of dollars and pumped it into the economy. If the Japanese did the same, they'd look great today as well. But in 10-15 years, when those bonds come due... look out.
Facts about Iraq and Al Qaeda (Score:2, Interesting)
2. To where did Ramzi Yousef flee after the first WTC attack? Iraq.
3. Where did Zarqawi go to hide after he got chased out of Afghanistan after 9/11/2001? Iraq.
4. In what country is Salman Pak [google.com], a training camp where teams of four or five terrorists were taught to hijack civilian airliners with small knives? Iraq.
Go ahead, keep fooling yourself that there was no connection between Saddam Hussein and Al Qaeda. Just because Al Qaeda is based on fanatical Islam doesn't mean someone like Saddam Hussein couldn't use them.
Re:Should I be worried? (Score:3, Interesting)
*Ahem* From your own link: The Web can be traced back to a project at CERN in 1989.
CS research is worthless
Didn't say that. I did say that there's not as much value as there used to be. The field is well saturated, so there's less likely to be much gained through expensive research. And as I also said, there's still research that's valuable, just far less overall.
real progress comes from companies like Google or Akamai. Oh wait... both came to us straight from the university (Stanford and MIT, respectively).
And how many millions of dollars did it take for PageRank to go from the start of research to an algorithm on paper? (Actually, I'd be quite interested to know. I'd expect that it probably wasn't more than a few thousand dollars.)
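(For a sense of scale: the core of PageRank is compact enough to sketch in a few lines. The following is an illustrative power-iteration sketch in Python with a made-up toy link graph -- not Page and Brin's actual code.)

```python
# Minimal PageRank via power iteration on a tiny made-up link graph.
# Illustrative only -- not Google's implementation.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for p, outgoing in links.items():
            if not outgoing:                 # dangling page: spread its rank evenly
                for q in pages:
                    new_rank[q] += damping * rank[p] / n
            else:
                for q in outgoing:
                    new_rank[q] += damping * rank[p] / len(outgoing)
        rank = new_rank
    return rank

if __name__ == "__main__":
    toy_web = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
    for page, score in sorted(pagerank(toy_web).items()):
        print(page, round(score, 3))
```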
Rome (Score:1, Interesting)
Greece produced a wealth of culture. From Plato and Aristotle's philosophy on governments we derive our idea of the republic, where every individual's rights are important, and democracy, where the people rule. From Euclid and Pythagoras we know the principles of geometry and that the earth is round. These were discovered around 300-500BC.
Then the Romans came and put an end to this period of amazing discovery. It would be a millennium before the Renaissance, when science would be born. In Rome, everything was put in terms of fighting war. Math was only useful if it could be directly used in battle. It was taught to students in forms like "a phalanx eight deep and twenty wide consists of how many warriors?" People were not encouraged to learn for learning's sake.
Let's not make the same mistake.
Re:Well... (Score:4, Interesting)
It's the people outside the Pentagon pointing out that the money spent on futuristic weapons systems will hurt the ability to find funding for shorter-term but still rather useful projects.
Re:sigh... (Score:3, Interesting)
I think you're forgetting that a lot goes into this. If a professor gets a grant, he pays the school and his department for hosting him, for his own time, and for post-doc, graduate, and undergraduate students to work on the project. I would guess that the majority of the cost isn't in hardware, but in people's time. Who cares what kind of hardware is available if the project won't help pay your tuition? No money, no students, no research.
Re:Technology (Score:2, Interesting)
Re:Should I be worried? (Score:3, Interesting)
I haven't seen a commercial product that paid the slightest attention to CompSci foundations in years, leaving them with the sort of saleable "usefulness" that pleases the marketing department, but a bit lacking in the sort of usefulness that "gets shit done."
While I agree with your general complaint, allow me to point something out: All modern computer products are based upon the CompSci foundations laid out by researchers years ago. They don't have to pay much attention to CompSci theory, because the APIs, hardware, OSes, and virtual machines do all the work for them.
That being said, there are a lot of idiots in the field who cheated or slept their way through CompSci. (Or perhaps they were taught the "marketable" brand of "Comp[Not]Sci") That, however, is a separate problem from scientific research.
It could move CompSci research back to an academic field conducted in the universities, if the universities themselves hadn't already forgotten what CompSci was and devolved into Java trade schools, because Java is "useful."
The sad truth, however, is that it's happening in ALL fields. For example, most of the crack aerospace and nuclear engineers I've talked to have voiced the same complaint as you. The only difference is that they're speaking about their own field instead of CompSci. Feel safer about flying yet?
Re:Budget Defecit (Score:1, Interesting)
WTF. No money will be saved as a result of this, it's just being spent somewhere else.
Does it suck? Sure. But America has shown in elections it doesn't want European-style high taxes to pay for stuff, and when you can't pay for stuff, you can't have stuff.
What? We have all the stuff we need plus more, and we don't pay for it anyways. Check out our deficit sometime!
We should pull all of our troops out of Afghanistan, Iraq, Korea, Europe, the entire Pacific Rim. NOW THAT WOULD BE SAVING US SOME $$$.
Instead, people say "our social security is being financed by foreigners." No, our military is. That's my perspective, like it or leave it, bitch.
Re:Darpa should have cut spending years ago (Score:1, Interesting)
High tech is currently mature? I'm sure our children and grandchildren will disagree if we give them the chance.
We should spend on basic research because it provides the foundations for the applied research. Sure, we might have enough "basic material" to go on for a while, but eventually, we're going to need more. We could wait until we need to do more basic research, but there will be a delay and then we'll stagnate for a while. Rather than stagnate, let's keep momentum, or at least try to not stagnate as much.
You're suggesting we live off the fruits of our parents' and grandparents' basic research labor while not contributing anything for our children and grandchildren.
Do you listen to CDs? Do you ever store data on CDs? Do you watch DVDs? Do you ever store data on DVDs?
If so, then you benefit from basic research that was done over the last few hundred years. If for nothing else, we should spend on basic research for the same reasons that we appreciate people having done basic research in the past: basic research provides the foundations for applied research which provides the deliverables.
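(One concrete example of that chain, added here for illustration: the error correction that lets a scratched disc still play grew out of decades of abstract coding theory. Real CDs and DVDs use Reed-Solomon codes; the minimal Hamming(7,4) sketch below in Python just shows the basic idea of spending redundant bits to repair errors.)

```python
# Hamming(7,4): encode 4 data bits into 7 bits, then correct any single-bit error.
# A toy stand-in for the (far more powerful) Reed-Solomon codes used on real discs.

def encode(d):                       # d = [d1, d2, d3, d4]
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def decode(c):                       # c = 7 received bits
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3  # 1-based position of the flipped bit, 0 if none
    if syndrome:
        c = c[:]
        c[syndrome - 1] ^= 1         # repair the single-bit error
    return [c[2], c[4], c[5], c[6]]

if __name__ == "__main__":
    data = [1, 0, 1, 1]
    word = encode(data)
    word[4] ^= 1                     # simulate a "scratch" flipping one bit
    assert decode(word) == data
    print("corrected:", decode(word))
```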
Anybody who questions the value of basic research should read this:
http://camel.math.ca/vault/future/moody/moody.htm
If everybody just bought off-the-shelf components instead of trying to come up with something new, we'd still be listening to music on a wax cylinder and watching B&W silent movies, or worse.
Personally, I'm very glad SONY and Philips decided to not continue buying off-the-shelf.
Re:To GabrielF or the /. editor (Score:3, Interesting)
Re:Technology (Score:2, Interesting)
Computer technology spans many "core" disciplines: processing (figuring things out), communication (hooking people and things together), visualization (graphics & simulations), etc. But one of the most significant has always been information: instant automated access to the who, what, and when of the world. ("Why," for the nonce, remains left to us mere mortals :-)
But while computers provide wonderful database access to that information, we still need someone to type most of it in. You still can't reliably OCR an invoice without someone keying or verifying it; eye-safety and unobstructed line-of-sight requirements sharply bound barcode scalability; and ATR (automatic target recognition) "isn't there yet" in a big way.
If you want real-time access to real-world data about real-world objects, then you need some kind of inexpensive wireless automated tracking/ID mechanism which can be remotely queried through simple obstacles (cloth, paper, etc). RFID provides that.
Combined with wireless networking, RFID opens up a Pandora's Box of interesting new possibilities -- some wonderful, some frightening, especially from the privacy standpoint. But just because the "killer apps" haven't yet been identified and married to effective markets doesn't mean they aren't there.
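(To make the "real-time access to real-world data" idea concrete, here is a deliberately tiny sketch -- every name in it is hypothetical, not a real RFID API: raw tag reads from fixed readers get collapsed into a live last-known-location table.)

```python
# Toy model of RFID-based tracking: collapse a stream of (tag, reader, time)
# read events into a last-known-location table. All names are hypothetical;
# real deployments involve reader protocols, dedup windows, and far more noise.

from dataclasses import dataclass

@dataclass
class ReadEvent:
    tag_id: str        # ID burned into the RFID tag
    reader_id: str     # which fixed reader saw it (i.e., a location)
    timestamp: float   # seconds since epoch

def last_known_locations(events):
    """Return {tag_id: (reader_id, timestamp)} keeping only the newest read per tag."""
    locations = {}
    for e in events:
        current = locations.get(e.tag_id)
        if current is None or e.timestamp > current[1]:
            locations[e.tag_id] = (e.reader_id, e.timestamp)
    return locations

if __name__ == "__main__":
    stream = [
        ReadEvent("pallet-42", "dock-door-1", 100.0),
        ReadEvent("pallet-42", "aisle-7", 160.0),
        ReadEvent("crate-9", "dock-door-1", 120.0),
    ]
    for tag, (where, when) in last_known_locations(stream).items():
        print(f"{tag}: last seen at {where} (t={when})")
```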
Many other computer technologies languished on the sidelines for a few years before the groundbreaking new applications "clicked." Don't give up on these three just yet!
Re:Twilight of the empire (Score:2, Interesting)
Re:Should I be worried? (Score:2, Interesting)
I'll name just one since you just need an example: pbits -- http://www.cc.gatech.edu/news/palem.pdf [gatech.edu]
Stuff that the grandparent missed:
1. Quantum computing
2. Formal verification of systems: born in the 60s and 70s with the likes of Dijkstra and Lamport, revived in the late 80s and early 90s, used in hardware design today. Still a lot to do for software design. (A toy illustration follows at the end of this comment.)
Yes, compilers, wired networks, and OSes have been pretty stagnant as far as research is concerned -- and that's what computer engineers think computer science is. And sci-fi fans add AI to the list.
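(A toy illustration of what point 2 buys you -- a from-scratch sketch, not any real verification tool: exhaustively enumerate every reachable state of a tiny two-process mutual-exclusion protocol and check an invariant over all of them, instead of just testing a few runs.)

```python
# Toy explicit-state model checking (in the spirit of the work mentioned above,
# but a from-scratch sketch, not any real tool). Two processes share a 'turn'
# variable; we exhaustively enumerate reachable states and check that both
# processes are never in the critical section at once.

from collections import deque

def successors(state):
    turn, pc = state                               # pc[i] is "idle" or "critical"
    for i in (0, 1):
        if pc[i] == "idle" and turn == i:          # enter only when it's your turn
            new_pc = list(pc); new_pc[i] = "critical"
            yield (turn, tuple(new_pc))
        elif pc[i] == "critical":                  # leave and hand over the turn
            new_pc = list(pc); new_pc[i] = "idle"
            yield (1 - i, tuple(new_pc))

def check(initial, invariant):
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        if not invariant(state):
            return False, state
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return True, None

mutual_exclusion = lambda s: not (s[1][0] == "critical" and s[1][1] == "critical")
ok, bad = check((0, ("idle", "idle")), mutual_exclusion)
print("invariant holds over all reachable states" if ok else f"violated in {bad}")
```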
Software Reliability Crisis & DARPA (Score:3, Interesting)
In a way, DARPA is right to cut funding to academia. Over the last forty years, scientists have made a complete mess of programming. We now have a world full of incompatible operating systems and programming languages, a veritable tower of Babel. Yet, software is as failure prone as ever. Software disaster stories are now making the evening news on a regular basis. Does academia take the blame, even partially? Don't count on it. They've invented every excuse in the book, from "there is no silver bullet" to "we don't have enough funding." It's sickening.
I say, unless the computer science community gets off its spoiled collective ass and comes up with a solution to the software reliability problem, it deserves to get its funding decreased. Drastically.
Re:Technology (Score:2, Interesting)
Re:sigh... (Score:5, Interesting)
I'm going to go out on a limb here and say "research." I've never seen an educational institution that was wasteful about its funding (maybe Harvard).
Then you've never seen how research happens at a major university. Waste happens *differently* than at major corporations, but it happens in vast amounts, often in the form of wasted time.
At a private company I used to work at, when there was a minor problem with my working environment (too cold), it took a day or two to fix. At a top-rated university, a more serious problem (lights that turn off by themselves every ten minutes) took seven months to fix.
At the same company, security was taken very seriously. When the door to the server room was being repainted, we had a security guard stand there, literally watching paint dry. At the major university, we had five break-ins to our building last semester and yet it's still possible to break in in 15 seconds with nothing more than a newspaper. (The last of those break-ins cost the university about $10,000 in computer equipment, and it took four months to get the computers replaced and running again).
I haven't even started on the amount of time wasted on pointless administrative tasks (e.g. two weeks telling payroll how to do their jobs).
The professors and grad students are paid wages that nobody in the private sector would accept. They don't have crazy offices or private jets or 100,000 dollar golf club memberships.
Professors don't get crazy bonuses, but the top administrators get pretty hefty salaries and bonuses (like a beautiful house on campus). Compensation for administrators is approaching corporate levels.
Plus, universities find lots of ways to siphon off federal grant money. Any major purchase or salary coming from a federal grant gets a ~50% "overhead" charge tacked on--that money goes to the university.
It literally hurts me to see DARPA cut funding to universities (my group took a hit), but I can understand why it's happening.
Re:Pure Research (Score:3, Interesting)
Our basic research situation was bad enough 10 years ago that NEC started buying up the scientists from the other labs that were laying them off, and running its own basic research facility at Princeton.
U.S. research used to be a three-sided affair, with the government labs, private industry, and academia all doing some mixture of applied and basic research, and passing ideas and people between them. Every now and then an idea got loose and became a real product. Now, we're in the grips of a mindset that believes that the world is too complicated for their undereducated minds to understand, and that a bonus today is worth the entire company tomorrow. Therefore, we're not putting money into forward-looking research, and we're not encouraging people to go into the technical fields, have dreams, and then work to make them real. We're back to the basics: entertainment, overconsumption, dogma, and War without end.
After we finally wipe ourselves out, and the raccoons evolve to replace us, I hope they're more farsighted than we were.
Re:Twilight of the empire (Score:3, Interesting)
Agriculture, entertainment and industry account for a huge chunk of the US economy [doc.gov].
It's twilight of the empire, folks. If you're young, start learning another language.
Half empty, eh?
Re:sigh... (Score:3, Interesting)
Total tangent here, but my father used to work for Hughes Aircraft Company (back when they still existed) and the numbnuts facilities manager of the building in which he worked, in an attempt to "save electricity" and earn some brownnose points, decided to replace all the office light switches with motion sensor switches to turn off the lights when no one was there. Well, in a building full of engineers where they frequently spent hours at a time making notes or calculations by hand on paper (this was the 70's), those motion sensors would shut off the lights because an engineer writing at his desk wasn't moving enough to trigger the sensor. In the end, two hundred-odd irate engineers made a variety of breeze-driven "movement generators"-- everything from a single sheet of paper on a string to complex windmills and mobiles-- and hung them from the AC/heat vents. In order for them to work, they had to have constant air flow, so they kept the blower fans running all day. So the final result was a net LOSS, as the lights still stayed lit, but the fans ran all day every day instead of intermittently. Sometimes nothing wastes like conservation.
Military Research Solves the Wrong Problems (Score:5, Interesting)
We do occasionally get good things out of it, and it does let bright people develop ideas and technologies that have broader uses, but mostly it develops better and better technology for killing people. Sure, we've gotten communications satellites, and the Internet does things that UUCP-net didn't do. But there's a huge amount of solar energy research that simply didn't get done because the college kids who were good at thermodynamics went to work developing aerospace technology instead. And while that aerospace technology has civilian applications, much more of it is for jumbo jets than for small private aircraft and free-flight navigation that would make air travel more practical and decentralized. (I *still* want my flying car :-)
Some of the agricultural research has been seriously useful. But too much of it has been directed in ways that support big agribusiness quasi-industrial farms instead of family farms: toward pesticides that enable mass production, toward genetically modifying plants to make them more resistant to pesticides so that they're more practical for pesticide-based farming, and toward monocultures rather than increased diversity. And if you thought software patents were nasty, you should go look at the biological patent explosions of the last 20-30 years.
Medical research seems like it wouldn't have this problem, and while it's nowhere near as bad, it's still a mixed bag. Most medical techniques that are useful on battlefields are useful for other trauma too, yet more Americans are still killed every year by the side effects of the War on Drugs than by the wars for oil, and far more by car accidents than by either one. But government-funded medical research has unfortunate interactions with the FDA's regulation of new drug development: the regulatory barriers make it economically difficult to develop drugs with less than a billion-dollar market, and the government funding tends to encourage large labs and to paper over some of the regulatory problems by funding universities that can avoid those barriers, rather than fixing the barriers themselves.
Short-term military-focused research interferes with the evolution of our economy far more than long-term mixed-use research does. But they're both bad.
Re:Technology (Score:2, Interesting)
Funny you mention the word "monopoly."
With Microsoft owning the applications, OS, development tools, and (by extension) the methodologies the overwhelming majority of the populace uses, researchers are effectively prevented from getting their ideas into the software most people see. There's been a complete lack of visibility for this research for the past fifteen years because of Microsoft's monopoly. The cutting of projects is the obvious next step.
The nationalistic decision by other countries to block Microsoft is what will allow those countries to eventually reap the benefits of higher-end research. The prime reason Linux is taking off is that researchers can implement their ideas in it; and if they want to replace the kernel wholesale, the GNU tools are there to turn the idea into an OS.
Next thing you know, we Americans are wondering, "Why is all of the cool stuff on my computer coming from [insert foreign country here] when ten years ago it was all American?" Just like what happened in the home electronics industry in the 80's, when suddenly all the great stuff was Japanese...
Seen KDE 3.4? It's already beyond Windows XP's interface, and it's starting to look and feel better than Panther's interface. This is not an American invention...
Re:Should I be worried? (Score:3, Interesting)
>>> - breakthroughs in graphics
> All designed in the 60's through 80's, but lacking in powerful enough hardware until the late 90's.
Total nonsense. Most of the recent advances---such as fluid sims, deformable objects, motion capture, and the like---were made possible because of better algorithms---i.e., research---rather than any advance in hardware. I can guarantee you running algorithms from the 60's-80's on modern hardware wouldn't give you the kinds of results the multi-billion-dollar entertainment industries are looking for.
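(As one concrete example of "better algorithms, not faster hardware" -- my illustration, not any studio's code: the semi-Lagrangian advection idea popularized by Stam's 1999 stable-fluids paper traces each sample backwards through the velocity field and interpolates, and stays stable at time steps where naive explicit schemes blow up.)

```python
# 1D semi-Lagrangian advection: move a smoke-density profile through a velocity
# field by tracing backwards and interpolating. Stable for large time steps,
# which is an algorithmic property, not something faster hardware gives you.
# (Illustrative sketch only; real fluid solvers also handle pressure, diffusion, etc.)

import numpy as np

def advect(density, velocity, dt, dx):
    n = len(density)
    x = np.arange(n) * dx
    departure = x - velocity * dt               # where each sample "came from"
    departure = np.clip(departure, 0.0, (n - 1) * dx)
    return np.interp(departure, x, density)     # linear interpolation at departure points

if __name__ == "__main__":
    n, dx, dt = 100, 1.0, 5.0                   # deliberately large time step
    density = np.exp(-0.02 * (np.arange(n) - 20.0) ** 2)  # a blob near cell 20
    velocity = np.full(n, 2.0)                  # uniform flow to the right
    for _ in range(6):
        density = advect(density, velocity, dt, dx)
    # expect ~80: started at 20, moved 2 * 5 * 6 = 60 cells
    print("blob peak is now near cell", int(np.argmax(density)))
```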
>>> - breakthroughs in vision
> ???
Did you get mail today? How do you think it got sorted? Computer vision algorithms started doing that in the last two decades.
Most uses of vision in industry are pretty low-profile---things like automatic verification of manufactured component quality---but are neither trivial nor ancient.
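(If you want to see the mail-sorting point for yourself, the toy version is a few lines with off-the-shelf tools -- obviously not the postal service's actual system; scikit-learn's bundled 8x8 digit images stand in for scanned envelopes.)

```python
# Tiny handwritten-digit recognizer: the toy cousin of the vision systems that
# sort mail by reading ZIP codes. Uses scikit-learn's small built-in digit set.

from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

digits = load_digits()                                  # 8x8 grayscale digit images
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0)

model = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)
print("accuracy on held-out digits:", round(model.score(X_test, y_test), 3))
```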
>>> - stunning advancements in computer architecture
> Eh? What stunning advancements? Most of the architectures in use today go all the way back to the early 70's.
You'd be a fool to think that a P4 is 70's technology just because an 8086 was designed a long time ago. Building a computer with modern lithography and 70's-era designs would be a laughable failure; caching, for example, has improved hugely since then, as has the parallel decoding, fetching, and execution of individual instructions, backed by extensive branch prediction and speculative prefetching.
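(To put one of those ideas in concrete terms, here's a toy simulation of the textbook 2-bit saturating-counter branch predictor -- a sketch of the generic scheme, not any particular CPU: on a loop branch it mispredicts roughly half as often as a naive predict-whatever-happened-last-time scheme.)

```python
# Toy branch predictor comparison (textbook schemes, not any real CPU's design).
# 1-bit: predict whatever the branch did last time.
# 2-bit saturating counter: 0-1 predict not-taken, 2-3 predict taken; nudge toward
# the actual outcome each time, so a single loop exit doesn't flip the prediction.

def mispredicts_1bit(outcomes):
    last, misses = True, 0
    for taken in outcomes:
        if last != taken:
            misses += 1
        last = taken
    return misses

def mispredicts_2bit(outcomes):
    counter, misses = 2, 0
    for taken in outcomes:
        if (counter >= 2) != taken:
            misses += 1
        counter = min(counter + 1, 3) if taken else max(counter - 1, 0)
    return misses

if __name__ == "__main__":
    # Bottom-of-loop branch: taken 7 times, then not taken once, repeated 100 times.
    pattern = ([True] * 7 + [False]) * 100
    print("1-bit predictor misses:", mispredicts_1bit(pattern))   # ~2 per loop
    print("2-bit counter misses:  ", mispredicts_2bit(pattern))   # ~1 per loop
```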
Part of the problem is, earth-shaking discoveries don't spring fully-formed from a computer scientist's brow. Each one is built up over years of painstaking work, carefully laying the groundwork necessary to get there.
That's the reason you can point to much earlier precursors of "recent" advances, and also the reason you can't point to truly recent ones---the research that's being done right now is too abstract and specialized for you to know about it, and by the time it's something that you'd have heard of, it's probably no longer new.
Essentially, your complaint is "why haven't I heard of all the new advances at the cutting edge of computer science???" My response is "why should we go out of our way to tell you what we're working on if you can't be bothered to look for yourself?"
You haven't heard because you haven't looked hard enough. The only one to blame for that is you.