All Science is Computer Science [Y/N]?

angainor sent in this interesting piece: "There is an article in the NY Times which claims that, in fact, all science is computer science. The author makes some small talk about the fields of modern science where computers have been successfully used, but that's it. Does he really know what he is talking about? Read this piece, but don't be proud just because you too are a computer 'scientist'." The writer has a good point about new advances in many fields being due to large amounts of computing power being applied.
This discussion has been archived. No new comments can be posted.

  • Larry Ellison didn't go into computing.

    He went into sales, and decided that he'd make the most money shilling computers.
  • I don't think that's being discounted, but simply using mathematics doesn't make one a scientist. A computer programmer uses mathematics in his programming in much the same way that a civil engineer uses mathematics in designing a bridge. Neither of them is really a "scientist," but rather an engineer who applies scientific knowledge to particular problems.
  • by Phaid ( 938 ) on Saturday March 24, 2001 @07:21AM (#342937) Homepage
    Most people who get CS degrees are the farthest thing from being actual computer "scientists". Real computer science is basically mathematics - whether it is finite automata or database normalization, it boils down to math.

    On the other hand, computer programming, which is really what the vast majority of CS people do, is the farthest thing from science. If it were done with discipline and planning, you might be able to call it engineering, but really when you look at the way software is actually created, it can't even be called that.

    So let's not flatter ourselves. The fact that you use computers as a tool in true scientific research, or you program computers to do specific tasks, in no way makes you a computer scientist.
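
    As a toy illustration of the "real computer science is basically mathematics" point above: a finite automaton is nothing but a finite state set, an alphabet, and a transition function -- pure math that happens to be executable. (This machine and its state names are my own example, not from the comment.)

    ```python
    # A DFA is just math: states, an alphabet, a transition function,
    # a start state, and a set of accepting states. This toy machine
    # accepts binary strings containing an even number of 1s.

    TRANSITIONS = {
        ("even", "0"): "even",
        ("even", "1"): "odd",
        ("odd", "0"): "odd",
        ("odd", "1"): "even",
    }

    def accepts(s):
        state = "even"                     # start state
        for ch in s:
            state = TRANSITIONS[(state, ch)]
        return state == "even"             # accepting state

    print(accepts("1010"))  # True: two 1s
    print(accepts("111"))   # False: three 1s
    ```

    The program adds nothing the mathematical definition doesn't already contain, which is rather the point: the science is in the definition and its theorems, not in the typing.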
  • If his reasoning held up one could, with a little effort, argue that computing (due to the work of Turing, Church, etc.) is based upon mathematics which, in turn (due to the work of Russell, Frege, etc.) is based upon logic which, classically speaking, is a branch of philosophy.

    Thus everyone is doing philosophy. And indeed, the degree they get for their first bout of postgrad research is a Doctor of Philosophy.

    Thus the whole world's turning into a bunch of philosophers.

    Even then, we'd go back to the usual subject classifications just to tell people who 'did philosophy' in different areas apart from one another.

    Basically, the dependence on computers underscores their importance, and consequently that of mathematics, logic and all the other branches of 'classical science' that computer science/computing-in-general draws its inspiration from.
    John
  • This article is typical of an overload of the term "computer science" by people who don't draw distinctions finely enough.

    There are (at least) three distinct areas where you need a term using words like computer and science. They are:

    • Computing Science (or Computer Science) -- typical CS stuff; algorithms research, working on good models of computation. Kind of a cross between pure mathematics and Operations Research; lots of good stuff here, but not really science in the hypothesis/test/conclusions sense.
    • Scientific Computing -- The part of CS particularly focused on scientific applications. Numerical methods, especially for PDE solving or simulation; efficient parallel (or serial) algorithms for typical science applications; maybe even including things like parallel I/O.
    • Computational Science -- Like theoretical science or experimental science or observational science, this is science done using a particular tool; here, computer simulations or calculations. The first two were focused on computation, possibly with science as an application; this one is focused on science, with computing as a tool.
    The distinctions are important, because otherwise it's hard to talk intelligently about science and computing, as we see in the NYT article. If the article read "All Science is Computational Science", it'd be an overstatement but at least make some sense. As it is, the article is clearly nonsense.
  • If "insightful" means "trying to use psychic powers of sight to guess what the article says", then sure, this deserves +100.

    In fact, what the article says is that all sciences are becoming computer-dependent -- not "computer science" the field, but "computer-science" (as in, science done with computers), thus leading companies and researchers in other sciences to invest in computer science (the field).
  • I agree. They don't really mean "computer science", they mean "computer-science". (That is, not CS per se, but rather that science dependent on computers.) A lot of non-geeks don't really understand what CS actually is.

    That said, the article _does_ imply that because of this need, science-based companies like Celera are beginning to invest in the actual field of computer science.
  • Ahh, but all science _IS_ quantum physics, in that if you apply the rules of QM to your system, it will give the correct answer.

    Please apply those ideas to gravity. :-)

    Fun aside, unless someone comes up with the GUT, putting together the particle/field ideas from the quantum theories of electro-weak and strong interaction with the geometric ideas from general relativity, I have serious doubts that quantum mechanics (which one? :-) covers all of science.

    As for math on the other hand, it is true that nearly all science involves math, but if you just go by the math equations, you can sometimes get non-physical solutions.

    Of course. Mathematics provides a lot of consistent models (heck, even a lot of consistent mathematics, depending on how you choose the axioms), but picking the model that models reality is physics.

    On a side note, it is rather interesting how many professors of theoretical computer science seem not to care much about quantum computing.

    The basic models from the theory of computation on which various famous theorems rely (including the halting problem) all rest on a model of a machine governed by the laws of classical mechanics, which is only an approximation of course, as nature is mostly quantum mechanical. Thus computation models that model a quantum mechanical machine are supposed to show different computational behaviour.

    When talking about such systems, the usual computer scientist gives you a look like you were talking about warp drives.

  • Science with Computers !~ Computer Science. A physicist might only be able to interface with the large pieces of technology he or she uses to do research, but the core piece of computer science research (algorithms) is either a minor element of the work or absent entirely. Same with so many other computer-enhanced fields.
  • Like so many journalists who are ignorant of the actual process of science, this author cannot understand the difference between using a tool and studying the nature of a thing.

    Computer science is the study of computing--the theory and practice that makes the machines work.

    Physics (for example) uses computers as a tool to study the laws of the universe.

    The former is interested in computers themselves. The latter is interested in computers as a tool to study the primary discipline.

  • Math is a useful *tool* for science, but science and math are in fact entirely different. Math is *true*.
    All science, sooner or later, is *false*. That's because theorems can be proven, but hypotheses can only be disproven.
  • by Jonathan ( 5011 ) on Saturday March 24, 2001 @07:18AM (#342946) Homepage
    And I, being a bioinformatician, am one of the sorts of people the article is talking about! I did my doctorate in a microbiology department (doing bioinformatics really) and did a postdoctoral stint in a computer science department, so I can compare the two fields.

    While I think that working in a computer science department gave me an interesting perspective on problem solving, the fact is that computer science really doesn't deal with making actual programs that do things, but with more esoteric things like proving problems to be NP-hard. The sorts of applied knowledge that are useful to other fields aren't really central to the aims of computer science as such. This isn't a slam on computer science -- you can make a similar claim about the difference between the basic science of microbiology and the applied knowledge useful for treating infections.
  • Sooner or later, all Sciences reduce to the study of Mathematics.

    So, every scientific field builds on the advances of all other scientific fields.

    So... what's the point???

    --

  • Perhaps for some theoretical non-human scientist that has a consciousness structured so that all facts are always explicit, who can simultaneously sustain conscious awareness of every level of analysis, and then communicate this to their peers, the idea behind this statement might be true. But insofar as human science refers to knowledge possessed by humans, this statement is incorrect.
  • Thus, all science is math. So we can lump all science together, but certainly not as computer 'science'. Actually, all science is auto mechanics, because mathematicians drive cars.

    It's been said that any discipline that has the word 'science' added to it isn't 'real' science. That phrase, of course, was no doubt coined by a physicist. But to gain a perspective on the nature of a given theoretical discipline, look at what the discipline in question has produced.

    The study of physics has given rise to modern power systems, telecommunications, and nuclear power, to name just a few. Those engaged in the study of biology have discovered selective breeding, penicillin, and the nature of how disease is spread and treated. Chemistry has given rise to most of the materials that make up most of what we use in the course of our daily pursuits, including computers.

    From the study of algorithms and data structures has come . . . Microsoft Windows and Office. An industry where 'standards' are all but non-existent and most of the products are of a quality so bad that they can no longer be sold. Software makers must get people addicted to their software and then charge for rent and repair. The computer industry is advancing not because software (algorithms and data structures) is improving, but because the hardware is improving. If computer hardware didn't improve, computer software wouldn't improve. Computer science has very little, if anything at all, to do with computer hardware.

    And if you use the argument that computers are being used to design computers, remember that the Pentium IV isn't much faster than a Pentium III - in fact, for some tasks, it's slower.

    Physics and chemistry is what builds computers. I have a friend who owns a company that sells a .07 micron process to chip manufacturers. I asked him how he achieved that elusive goal. He said, "I have a very good physicist."
  • Science *wasn't* "dependant" (if you can read so well, why can't you spell? - oops, flamebait) on computers in the past. But things change. Science is becoming increasingly dependEnt on computers to do its work. Why is this a bad thing? I don't think the author of the editorial was saying "science is impossible without computers". However, walk into *any* scientific research institute and you'll see computers. Lots of them. They sure make life easier. Sometimes. :)
  • Software engineering isn't Engineering either.
  • Uh. All science is not computer science. The article essentially says that a lot of scientific fields rely on massive computations performed by computers. Massive computations != computer science. CS has a lot more in common with logic and math, and formal theories of computation. Fundamentally, it has nothing to do with vanilla number crunching. Following the lead of this article, one might as well say "all science is engineering", because nearly all the sciences rely on well-engineered pieces of equipment to test their theories, gather their data, and a myriad of other things. I think a better way of phrasing it would be that 'computers have become a fundamental part of modern science (and modern life for most of the western world)' -Laxitive
  • Computers are *tools*. There is a science of how to make/apply these tools effectively, and there are some sciences that are "adjacent" to that science, but that is not the same as science being enabled BY the tools.
  • God forbid you actually go into a field you enjoy rather than one in which you hope to become famous.

    Besides, last I checked Larry Ellison didn't know much about computer science or genetic engineering.

    The real problem with computers is now that they are so dang popular the real advances don't get the press that a new release of Linux 2.4 or a new Athlon does. People aren't interested in trying out new language paradigms because, gall darnit, if C was good enough for K&R it's good enough for them. People aren't interested in trying out new kinds of software because they're comfortable with the old kind.

    It's kinda like saying there hasn't been any development in automotive technology when what you really mean is that the cool developments take decades to actually be implemented (if they ever make it), so you never hear about them.

    Pie in the sky fields like genetic engineering can fill their press releases with things like "some day we may be able to use this technology to cure congenital birth defects". More established industries like computers that already have shipping products have to be slightly more...pragmatic.

    Has genetic engineering really made big advances in the past few years? Or is that just the spin that biotech companies have put on it? Or is it our own biases based on our ingrained awe for biology and contempt for mere machines? Even if it has made great leaps in the past, is genetic engineering likely to do so in the next few years as well? How can you even begin to quantify what counts as a "big advance" except through hindsight? I think it was Yogi Berra who said, "It's hard to predict things. Especially the future."
  • Computer science is a part of discrete math (actually it is the part of discrete math dealing with computation, things derived from lambda- and combinator-calculus). How could this be the same as, say physics, which (most of the time) deals with algebra?

    Oh, I see, he means computer science, not computing science!
  • This is just the NY Times doing a typical media headline troll. The article doesn't actually say what the headline implies.

    Computers have become an incredible and indispensable tool in the advancement of all the sciences, but that doesn't make "all science computer science". One could just as easily say that "all science is quantum physics" or "all science is math" and it would have the same degree of truth, i.e. some but not enough to be considered a generally true statement.

  • > Also it would take a LONG time to work out the biology of a human being from quantum physics, but it could be done. :-)

    That may be true (although I'd like to see you prove it), but would it be meaningful or useful? As another reply has pointed out, you're expressing a standard reductionist position, but it's one that isn't even held by most good physicists. For example, here's a quote from quantum physics professor Howard Georgi of Harvard (taken from here [magna.com.au]):

    ...the statement, "chemistry and biology are branches of physics" is not true. It *is* true that in chemistry and biology one does not encounter any new physical principles. But the systems on which the old principles act differ in such a drastic and qualitative way in the different fields that it is simply not *useful* to regard one as a branch of another. Indeed, the systems are so different that `principles' of new kinds must be developed, and it is the principles which are inherently chemical or biological which are important.
    -- Howard Georgi, "Effective quantum field theories", in The New Physics, ed. Paul Davies

    Another good intro to these kinds of issues is Murray Gell-Mann's book, "The Quark and the Jaguar". Gell-Mann's credentials as a quantum physicist are beyond reproach, but he is by no means a reductionist, and has a keen appreciation for the unique properties of complex systems - the jaguar in the title of his book is a metaphor for this.

    Since many other physicists and philosophers more qualified than I have written on this topic, I'll restrict my response to a freewheeling, extended analogy: quantum physics can be compared to a CPU's instruction set, or "machine code". On top of that, we layer assemblers, and then compilers and interpreters for various languages. Using compilers and interpreters, we build various systems and applications. Since ultimately, all of these things are done using machine code, is it meaningful to say that all applications are "just machine code"? There's a sense in which this is true, but let's examine it further.

    With the CPU analogy, we can do something we can't do in our single physical universe: we can take an application and compile it on a different type of CPU - a CPU with a different instruction set. Compiled for this CPU, the application still behaves identically. So by claiming that an application is machine code, we're clearly missing an important point, since the same application functionality can be achieved with completely different machine code. [Of course, both CPUs follow a more fundamental set of information theory laws, but that's not important to the argument.] The point is that complex systems exhibit "emergent properties", characteristics which arise from interactions between components of the system in question, and which can't be meaningfully analyzed, or even easily inferred, from the perspective of more basic, underlying systems.

    To cut this short (well, shorter than it would be otherwise), I'm going to make a few leaps. Imagine for a moment that we could build a toy universe in the laboratory, with different physical laws than our own. Even though its physical laws are different, it's not impossible - in fact it's quite likely - that complex systems in that universe could share some of the properties of complex systems in our universe. To take an extreme example (as I said, I'm leaping), imagine an intelligence in this other universe, and assume we could communicate with it somehow. We would probably find that we share some basic characteristics with this alien intelligence. For example, it is a common characteristic of living systems that they have a strong bias toward survival, simply because those that don't, die out. This survival instinct is something that's not a direct or obvious consequence of quantum mechanics - it's actually rooted in simple logic (perhaps all science is logic?!)

    Even if you could somehow come up with a QM model for the survival instinct, it would miss the point, since it's quite conceivable that a survival instinct could arise in a universe not based on QM - it really has nothing to do with QM. The survival instinct is just one example of an emergent property of complex systems - in this case a living system - that has little or nothing to do with the physical construction of the system.

  • Please apply those ideas to gravity. :-)

    Good catch!

    When talking about [quantum computing] systems, the usual computer scientist gives you a look like you were talking about warp drives.

    I think the warp drive analogy is fairly accurate. Building quantum computers is a task somewhat outside the realm of computer science, at least as it's usually defined. And, since no useful quantum computer has yet been built, we don't know for sure what physical limits we're going to run into with them. A lot of the discussion about quantum computing to date has been about unrestricted theoretical possibilities. Sure, we could develop a mathematical computing theory based on the imagined properties of a perfect quantum computer - but what are the chances that a real-world quantum computer will be as unconstrained as we would like it to be? I suggest we revisit this discussion in 25 years, when I predict that quantum computers will be "about five years" from being viable...

  • by dillon_rinker ( 17944 ) on Saturday March 24, 2001 @07:28AM (#342959) Homepage
    AP, UPI (NY)

    In a stunning announcement today, the American Automobile Association stated "All science is actually automotive engineering." A recent study has shown that over 98% of scientific developments required the use of a car. "Without these amazing machines, I'd have to walk 45 miles to work, uphill both ways!" one scientist was quoted as saying.

    In a related event, the Carpenters Union announced "All science is actually construction. Nearly 100% of lab experiments take place in buildings that were built by builders. Without us, there'd be no labs, no checking of theories - in short, no scientific advancement." Theoretical mathematicians scoffed at the announcement, but other scientists confirmed that most labs are not in caves or other natural structures.

    Meanwhile, representatives of the National Restaurant Association are preparing press releases to explain that all science is actually eating. 100% of scientists contacted by this reporter confirmed that they would be unable to do science if it weren't for food.
  • But... but... the bridge didn't collapse in the simulation! How would I know that people would put vehicles on top of it. Sheesh. Besides... it wasn't my fault... it was a software glitch. Blame the app-vendor.

    ;)


    ---
    I came here for a good argument!
    No you didn't! No, you came here for an argument.
  • Most programmers use (formally or informally) the scientific process to debug software defects. They witness unexpected behaviour, create a reproducible test, postulate a cause, compare the new test results with the expected/control results. Are programmers now scientists when debugging, but not when writing new code?
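
    The debugging loop that comment describes can be made concrete with a small sketch (the functions and the bug here are invented for illustration, not taken from any comment):

    ```python
    # Scientific-method debugging, sketched:
    # 1. Observation: mean-of-empty-list crashed in production.
    # 2. Reproducible test: capture the failing input.
    # 3. Hypothesis: the code divides by len(xs) without checking for empty input.
    # 4. Experiment: apply the fix, re-run the test, compare with expected results.

    def buggy_mean(xs):
        return sum(xs) / len(xs)      # ZeroDivisionError when xs is empty

    def fixed_mean(xs):
        return sum(xs) / len(xs) if xs else 0.0

    # The reproducible test doubles as the experiment's control:
    try:
        buggy_mean([])
        reproduced = False
    except ZeroDivisionError:
        reproduced = True

    print(reproduced)           # True: the hypothesis-triggering input is confirmed
    print(fixed_mean([]))       # 0.0: the fix handles the failing case
    print(fixed_mean([2, 4]))   # 3.0: old behaviour is preserved
    ```

    Whether that makes the programmer a scientist is the thread's question; at minimum, the hypothesize-test-compare cycle is the same shape.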
  • The naive ignorance and general gullibility of the public never ceases to amaze me. Claiming that everyone who knows how to drive is also a mechanic does not compute. Very broadly speaking, there are 3 branches of mathematics which feed into computers:
    - statistical = accounting = infosys
    - discrete = binary/automata = computer science
    - continuous = scientific fields = computational science

    Basically computers have matured to the stage where nowadays CSEE are nothing more than software engineering techniques, but the level varies according to the stage of hardware->firmware->software->wetware. (as Intel? CEO once said, hardware is nothing but frozen software). Computers are useful because they act as mental accelerators allowing you to do stuff overnight or in between coffee breaks or QUAKE sessions. But by itself, the theory is rooted in various branches of maths split into the business of computing (variations of the accounting equation), art of computing (Knuth/algorithms/etc) and the science of computing (complex systems/quantum effects/etc). For some strange reason fun and money seem to have an inverse relationship along this continuum.

    For the average layperson who barely recognises how to access the internet (gee-whiz, moving text) the distinctions are superfluous but it doesn't help when the media confuses mathematics with their applications.

    LL
  • Ahh, but all science _IS_ quantum physics, in that if you apply the rules of QM to your system, it will give the correct answer. Of course there are all sorts of shortcuts like Newton's Laws, or the Universal Gas Law or Ohm's law or what have you. Also it would take a LONG time to work out the biology of a human being from quantum physics, but it could be done. :-)

    As for math on the other hand, it is true that nearly all science involves math, but if you just go by the math equations, you can sometimes get non-physical solutions.

    Anyway IAAP (I am a physicist) :-)
  • As I see it, CS is just a tool to automate mathematics on computers. It will never replace biology, physics, or mathematics.

    ---------
    A soon-to-be Biologist
  • All science is math. In the theory sense, not just calculus and number crunching.
  • Unless something's changed since I stopped paying attention, QCs are Turing-equivalent -- no more, possibly less.

    I thought so too, then I saw a paper by Calude, Dinneen, and Svozil.

    Using the counterfactual effect, we demonstrate that with better than 50% chance we can determine whether an arbitrary universal Turing machine will halt on an arbitrarily given program. As an application we indicate a probabilistic method for computing the busy beaver function--- a classical uncomputable function. These results suggest a possibility of going beyond the Turing barrier.
    Counterfactual Effect, the Halting Problem, and the Busy Beaver Function (1999) [nec.com]
  • The headline was just an eye grabber, and the Slashdot submitter managed to grossly misinterpret the article's intent. The purpose of the article was to suggest that science is a great deal more dependent on computers now than it ever has been in the past, primarily due to the availability of extremely fast machines.

    It's worth mentioning that the NYT article was drastically better written and better researched than any link to somebody else's article that I've ever found on Slashdot.

    Don't bash NYT just because people have no attention span.

    -rt
    ======
    Now, I think it would be GOOD to buy FIVE or SIX STUDEBAKERS

  • The word he puts in quotes (presumably meaning a direct quote from the article) only occurs once in the article, and it doesn't say that all science is dependent on CS, somehow or otherwise; he simply says it is increasingly dependent. He then goes on to cite examples of recent major scientific breakthroughs which would have been unimaginable without CS. He doesn't say physicists are doing code, with the possible exception of the Larry Ellison bit at the end, and that was just a journalistic flourish. It's a god damned editorial, for chrissakes, and it's not a bad one.

    Try reading it.

    -rt
    ======
    Now, I think it would be GOOD to buy FIVE or SIX STUDEBAKERS

  • Well, my subject about says it all. Just because most scientists can and do use cars and pencils, we don't refer to them as racing drivers or pencil-operators.

    "All scientists are programmers" would have been a truer headline, as would "All programmers are not computer scientists".
  • by SpinyNorman ( 33776 ) on Saturday March 24, 2001 @07:34AM (#342970)
    "If I were 21 years old," he said at a company conference in New Orleans, "I probably wouldn't go into computing. The computing industry is about to become boring. I'd go into genetic engineering."

    This rings true to me. Much of Comp Sci (my chosen profession, though I suck at it) seems to have a lack of discovery and/or invention these days, with the exception of nanocomputing. Much of the rest of it is innovation, not invention/discovery. How many Turings do we have in Comp Sci now?


    Computing / computer science is a skill rather than an industry. While I'm a programmer, and have worked for a computer company (Acorn), I've also worked for a medical company and a couple of communications companies. I'm sure I could get a programming job at a medical company doing genetic research if I set my mind to it.

    Secondly, how many people working in genetics are making fundamental discoveries, and how many are just grunts doing their job? For that matter, how many people's jobs in *any* field allow them to do blue skies research of the type that may lead to fundamental discovery?

    I've long ago realized I had to separate my intellectual interests from my job. While I've been lucky to have extremely interesting work assignments, it's at home that I become the "mad scientist". :-)

  • You're overestimating the potential impact of quantum computers. Unless something's changed since I stopped paying attention, QCs are Turing-equivalent -- no more, possibly less. At best, they're nondeterministic (or do a good impression), but that's hardly a breakthrough for theory. Everything we know about algorithms, formal languages, and computation still applies. On the other hand, they're hard to build, fragile, and almost useless as general purpose computers (I haven't yet heard a developed proposal for I/O, which is vital to just about everything most computers do).
  • All science is computer science is true in as far as it goes. The more accurate assertion would be to say that computer science is a new way of looking at information theory (which pre-dates even Babbage).

    For example, high energy physics is the process of deducing, without knowing the underlying properties of the Universe, what behavior we will see when we "crank up the heat" of the universe. If we knew the underlying properties, however, math could tell us the rest. This math, it seems, is more complex than the pre-Dirac world had thought. It does, in fact, seem to involve some rudimentary logic. Hence, the study of the universe is the study of an information system with logic, math and vast "memory", which is not unlike Turing's paper tape.

    Computer science is math, and math is the Universe. As computer science expands and more generically encompasses all of mathematics, the lines get grayer. If it is fair to say that all science is math (and I think it is), it is getting increasingly more accurate to say that all science is computer science.
  • Why would math not describe the natural world? In fact, I think the worst culs-de-sac of mathematics have been the pure-math-because-I-can-do-the-proof sort of things where there is no real-world application (e.g. complex math, which just forces a certain polynomial rigor that you don't need *i* for in the first place).

    Did you think that I was trying to bad-mouth math at any point in my post?
  • I got this quote from someone here on slashdot.

    Ironically, it's too true.

    "The entire body of computer science can be viewed as nothing more than the development of efficient methods for the storage, transportation, encoding, and rendering of pornography."
  • Just at the time I gave up being an academic ('Savant Fellow in Computing', no less) to start my first startup, the Department of Mathematics at the University at which I worked was moved in to the School of Computing. I said at the time, and I still say, that that is like putting the Department of English Literature into the School of Penmaking.

    A computer is a tool. Its use, like its construction, is a technique. In the early days of computing it made sense to pull together multidisciplinary teams from mathematics, physics, philosophy and engineering together to make the things work in the first place. That's been done.

    There are still interesting things to be done in Physics and Engineering which may, in the fullness of time, lead to better hardware, and there are still interesting things being done in Mathematics, Linguistics and Philosophy which will, in the fullness of time, lead to better software.

    But there is fundamentally no such thing as Computer Science.

  • by Multics ( 45254 ) on Saturday March 24, 2001 @10:01AM (#342977) Journal
    Computers are only a tool

    ... until they become sentient.

    -- Multics

  • I bet most the scientists drive cars (or use public transportation) between their place of residence and their place of work. In fact, many important discoveries would not have been possible if they had no way to transport themselves to the lab. Does this make them car designers or mechanics?

    I didn't think so.

    --
    SecretAsianMan (54.5% Slashdot pure)
  • I'm no math whiz, but isn't CS basically all based on number theory, computability, and mathematics that were around before anybody actually assembled a physical computer? (Computers themselves were just thought experiments until they became a physical reality, right?)

    And in turn, we just found that math is basically "full of holes" [slashdot.org].
  • Precisely. The phrase "Computer Science" refers to a math/logic subject which was given that name. But "computer" can also describe science, the same way any noun can be used as an adjective. Such "computer science" isn't the same thing, so the headline is bad work.

    The rest of the article is good, though, in pointing out how compute-intensive much modern science is. Still, it's not CS; computers are just a powerful tool.
  • Most people who get CS degrees are the farthest thing from being actual computer "scientists".
    Yes, and most people who get Life Sciences degrees aren't "scientists" either -- they're MDs. But that doesn't mean that computer scientists aren't "real" scientists... it's just that most people who study computer science end up leaving the field and becoming programmers instead.

    It's a subtle distinction, but being a "real" computer scientist myself, I'm sensitive about it.
  • I agree with the article that you are seeing lots more simulation in science, and regardless of whether you think that is a good thing, it's happening. The problem as I see it is that there aren't many people out there who know how to solve these types of problems compared to the people who know how to do web or UI stuff.


    Not just in science, but marketing, economics, manufacturing, and lots of other fields, people want huge numbers crunched in order to either figure out how something works, or to optimize some sort of process. For the past seven years or so you have seen the evolution of the computer as a communications device. Now it's time to do some computing.


    The kind of person who is going to succeed in this new area is not the computer-centric person that used to rule in the Internet age. Society is now calling for people who can talk to scientists and experts in the field. This person needs to understand the field well enough to get a handle on the problem, and then apply his knowledge of algorithms, and raw programming ability to the task of solving the problem.


    My advice is to learn all you can about algorithms, and have a solid understanding of math. If you can talk in mathematics, and write in code, you have a bright future.

  • Well, I think it's more than just computational physicists. Certainly Physics has some great software available for QCD and other types of simulations. However, there is a lot more out there than just physics. All kinds of fields now need basically two things. Simulation and Optimization. Simulation to understand how a complex system works and an environment to try new things out before you have to commit to reality, and optimization for things where you have a model and need to find out the best way to accomplish a task.

    Airlines, for example, have been on the cutting edge of this. They have very elaborate resource scheduling programs to put their pilots, other staff, and airplanes in the right place at the right time to make the most money. Also they have the (much derided) pricing systems which determine the highest price they can charge for a ticket and still fill the plane. Both of these systems have made the airline industry much more profitable. In finance, computational models have made a big splash in the past fifteen years or so, and in fact it's physicists that are doing a lot of the work.

    My broader point, though, is that you're going to see the PHP/Perl/ASP people occupying the same position in a few years that HTML people did a few years ago. Their skills are widespread and mostly fungible. The people in huge demand are those that can model and optimize problems with big iron. I certainly expect there to be a growing number of modeling and optimization consultancies popping up soon, just as you saw the web development consultancies pop up in the 1995-1996 timeframe.

  • All "hard science" is math. What is CS based on if not math?
  • Next you'll be telling us that, since art can be produced, manipulated, reproduced and analysed with the aid of computers, that all art is computer art.

    Since speech can be produced, manipulated, reproduced and analysed with the aid of computers, that all speech is computer speech.

    Since music can be produced, manipulated, reproduced and analysed with the aid of computers, that all music is computer music.

    That is just ridiculous. Computers are simply our way of patching our brains to make up for the difficulty most of us have with performing sustained, repetitive calculations. Cept we haven't managed to 'open the source' to our brains and compile in the changes yet, which is why we're fucking around with these dynamically linked modules we call computers.

  • I am sorry, computer science is an oxymoron! A science is a field of study in which you attempt to deduce the laws of the universe. Sure, the computer universe has laws... but they are mutable... all you have to do is hack them into the kernel. Computer "Science" is not a science. It is, however, applied math, in the case of databases and encryption etc., or applied psychology, in games or web pages etc. Computer science does not exist, therefore it cannot be the basis for all other sciences. Clonan
  • I use physics every day. Without it, I couldn't move, walk, live. Therefore, everyone who lives, and directly or indirectly uses the results of some physicist's work, is now a physicist.

    I realize that the headline and last sentence of the article weren't really the point of the article, but they still made me take what the author said with a grain of salt. Basically, he said that many fields of science are using computers these days, nothing more revolutionary than that. Perhaps the headline was just an attention grabber, or just a grossly uninformed statement.
    ---------------
  • Exactly correct. Computer Science is the science of computers, not the science of fungi. Now had the author said that all sciences were math, he may have had a point (not too sure about this myself, but I throw it out there).
  • Utilizing computational power is NOT computer science. Designing processors is computer science (and computer engineering). Designing computationally efficient algorithms is computer science. Analyzing algorithm complexity is computer science. It's hilarious that so many people equate computer science to "using computers." It's like equating "hacking" and "cracking."

    Computer science (and computer engineering) lays the foundations for other fields to effectively use computers. Where would physicists and biologists be if significant time and effort had never been invested in developing programming languages, communication protocols and designing processors?

    Anyway, that NYT article is just plain silly. The Larry Ellison quote tops it off. I'll agree that much of the computing industry is boring, but computer science is an academic field and it'll be a l--o--n--g time before CS begins to get boring. Go ask good ol' George Johnson what he thinks of Artificial Intelligence...

    Jason
  • by diablovision ( 83618 ) on Saturday March 24, 2001 @09:22AM (#342991)
    Real computer scientists don't use computers.
    :P
  • "This rings true to me. Much of Comp Sci (my chosen profession, though I suck at it) seems to have a lack of discovery and / or innovation these days, with the exception of nanocomputing. Much of the rest of it is innovation, not invention / discovery. How many Turings do we have in Comp Sci now? "

    I disagree. Computer science is about to get much more interesting in the way you mention, when quantum computing starts getting taken seriously. The entire field of algorithms needs to be rewritten for quantum computers. The fields of cryptography, compiler design, languages, and even the theory of computation need to be rewritten. NP-hard doesn't necessarily mean what it used to (it doesn't make a problem intractable with quantum computers). The whole hierarchy of decidability has to be looked at a little differently.
  • by Noer ( 85363 ) on Saturday March 24, 2001 @07:06AM (#342993)
    Most (if not all) sciences now use computers as tools, but that's no different from using calculators as tools, or calculus as a tool, or statistical analysis as a tool. That does not mean that all sciences are mathematics or engineering. Physicists now need to be able to write code and use computers in fairly sophisticated ways, but they do NOT need to be computer scientists. Computer scientists do NOT just write code; they're generally developing more theoretical stuff, such as the theory of computation, or artificial intelligence, or advanced operating system design. It would be like calling someone who uses physics on a daily basis (gee, pretty much everyone, though I had in mind someone like a radiologist) a physicist.

    The difference is between using tools and theories (which does not make someone a scientist in that discipline, in this case computer science), and DEVELOPING those tools and theories, which is the job of scientists in various disciplines.
  • isn't all computer science state machine theory? so all science is really about the study of the turing machine?

    "all your science belong to us"
    yeah right

  • by dwj ( 91832 )
    Actually, an increasingly common use of computer simulations is for a theorist to quickly run a model and then fix his theory as required. For example, the NYT article mentions cosmology simulation, which due to the dearth of (and questionable reliability of) observational data is almost entirely simulation only. And yet it is enough to drive the theorists on, until a new batch of real observational data arrives. Everyone would agree that the simulation is not true experimental data, but it is still data. The old cliche that the lines are blurred is true again in this case. Simulations can provide the same serendipitous results as real data and can thus inspire new theories in the same way.

    -dwj
  • I completely agree. "Computer science" was probably an unfortunate (deliberate?) misnomer in this case, bringing down the wrath of many CS veterans of slashdot.

    You note the lack of innovation in CS. I have no idea if there are some universities out there who really don't treat CS as "training for a tech job" but I've been getting that impression. CS theory seems to be languishing, with a few valiant attempts on the "is P=NP?" problem appearing now and then with predictable results. Even quantum computing hasn't really rocked the boat yet. There seems to be some work in computational linguistics but it didn't seem like such a fruitful field. I might even go so far as to say these are all true, and that's because CS is an early science. Would anyone care to correct this impression? I'm anxious to hear about counterexamples. I'm another one of those "computer science physicists."

    -dwj

  • by dwj ( 91832 )
    Right, I wasn't trying to suggest simulation replace real experiments and actual observational data. I do not have access to the journal you've quoted, so I hesitate, but it does sound rather conceited. Dozens of theories have been invalidated by what actually happened instead; I need not give examples.

    On the other hand, some theories are hamstrung by the requirement of checking millions of data points (e.g., galaxy clusters--my previous work--QCD, etc.), and to make at least some progress practically demands the integration of the computer into your theoretical work. There is the risk that you've been wasting your time when the real data waltzes in, but that seems to be the danger with any theory, and the short time spent on simulations (relative to, say, waiting for the next satellite to launch) combined with improved quality (due to fresh contributions from the CS field) tends to justify the effort. If at least one of these efforts turns out to be the real thing, then it is probably worth it.

    -dwj
  • Interesting.

    As much as I believe computers are simply tools, physics is just a tool - almost everything is a tool.

    Look around your room - what is NOT a tool? The can of Coke and the orange. Makes me wonder - does saying "[insert your noun] is just a tool" serve any practical purpose at all?
  • because they used pencils to write down calculations and observations. The computer, like the pencil, is a tool. Scientists use their minds and tools together to create new things or discover existing facts.

  • Absolutely agreed. The argument that most CS or even IT grads give is based on the concept of "applied science." This is a loose description of engineering in most people's language. Unfortunately, I remain doubtful of this. Engineers operate in a world with rules, but not all of those rules are understood. The rules of an operating environment such as a language or even defined hardware are pretty nailed down.

    Don't consider this flamebait, but there's little more "applied science" in programming one language for one or two platforms than there is in baking a cake. There's a bunch of rules and a relatively defined environment. The reason I bring up a domestic activity like cooking is that the "scientific method", or "check and test", is equally well argued in making something new of any sort. I think the phrase "scientific" needs a slightly better meaning than simply a method whereby you don't accept that doing something one way is necessarily the best way.

  • He'd be trying to convice us that'd be so much better (for his profit margins) if we gave up on being individual humans and settled on being tapeworms in the gut of his overpriced "host humans".

  • computers are just tools in other sciences. Computer Science is its own field. I can't even begin to fathom the idiocy of the author in making this claim. Did we call physics calculator science, or even slide-rule science, when those were the tools we used for computation? Besides:

    "Computer Science is no more about computers than astronomy is about telescopes." - Dijkstra

    I'm getting tired of the confusion about what is and isn't Computer Science. Software Engineering is not computer science. I am a Software Engineer, am I also a Computer Scientist in that role? To a very limited extent, yes. I do use tools and methods from computer science on occasion, but rarely in any deeper a fashion than a carpenter uses physics. Yes computer science permeates every facet of what I do, but I am not doing Computer Science. Now in my own work I do take the role of Computer Scientist, many times without even touching a computer.

    I've always made the distinction between the practical and the theoretical when it comes to computers and Computer Science. Sure the line may be blurry, and it's difficult (if not impossible) to be a purely practical person without having a grounding and theory, and vice versa, to be good at what you do at least.

    Personally I believe that Computer Science, as theory is of higher order than the practical. However, this is not to say that the practical is bad. I pride myself on my, and give others credit for their, practical skills, but it is not Computer Science.

  • engineering (ĕn′jə-nîr′ĭng)
    n. Abbr. e., E., eng.


    The application of scientific and mathematical principles to practical ends such as the design, manufacture, and operation of efficient and economical structures, machines, processes, and systems.

    I'd say that pretty much sums up Software Engineering.

  • Every science deals with the interaction of particles. Without our knowledge of physics, computer science would be void in the real world; rather, it would remain a branch of theoretical math.

  • ...and if you can't blind them with science, baffle them with bullshit.
  • Computers are only a tool
  • >There are many math theories and acts that just aren't physical in the real world.

    Physics could be a subset of math and satisfy:
    "Physics is math".
  • Accepting the Times' definition of computer science, we still have a few sciences that can't be classified as computer science. They include mathematics, and statistics...

    ...oh yeah, and computer science.

  • One of my chemistry professors once claimed that everything boils down to chemistry. One of my physics professors said that in acuality, everything boils down to physics. I'll always remember those two statements, but the statement that I find rings truer than those is one by Galileo himself:

    The book of nature is written in mathematical equations.

    A rough quote from memory, so excuse me if it's not exactly correct. It seems that everything can be expressed in mathematical terms. In a sense, it is the mother of all sciences. Does this mean that all science is computer science? Possibly. Computers, at their core, are simply playing with numbers. Heck, there's a reason that it's called a computer. Saying that all science is computer science might also be giving too much credit to the computer itself, if you ask me. I believe a more honest statement would be "All science is mathematical in nature. Computers are mathematical devices. Therefore, assuming an appropriate computer, all science can be expressed in computational terms."

  • computer science (programming) is more like engineering: you build something and work out the kinks through logical analysis. Science is formulating a hypothesis and running experiments to validate your theory (the scientific method). I don't know about you, but when I start coding I don't set out to prove anything, I just want to make something and make it work.

    According to Webster's, science is: Knowledge; knowledge of principles and causes; ascertained truth of facts.

    Physics is science: you learn about laws already ascertained and go about smashing atoms to invent new laws.
    Biology is science: you learn about various species and invent things like Darwinism.

    Although I haven't actually read the article, I'm going to assume these scientific strides are things like cloning, genetically modified species, and stuff like that. Those are in fact bio-engineering tasks, not really science. To make a goat that makes spider silk is a lot like making a complex protein or a program. You have to take what you know and apply it; if it doesn't work, kill the goat and try again. (:

    In conclusion: Science is observation. Engineering is making/doing. Programming is engineering.


    Roy Miller
    :wq! DOH!
  • There have been many comments that say all science is math. There have also been quite a few that say all science is physics. I think we should just ignore this article and simply say that (all science == all science). A marine biologist isn't going to have a new insight into string theory, and a mathematician isn't going to fix the problem of cloned animals having damaged DNA. Every science is its own realm, and in order to be successful, you often have to draw from other sciences. It's that simple. After all, it's not the computer scientist who comes up with new ideas in neurobiology; it's the computer scientist who aids the neurobiologist in building a computer model of his theory.


    /whois John Galt
  • Computer Science is a mathematical field, not a scientific field. Not sure who coined the name, but it is definitely a bad one. Computer science has more to do with proof theory than with any scientific occupation.
  • I think he's ridiculing Slashdot and the various people submitting AND passing articles like this.

    And I think that ridicule is justified.

    Because of the bad Slashdot title AND blurb, many of the posts would be a waste of time.

    Slashdot is often a waste of time given the "quality" control in the articles...

    I don't really care if the user posts are stupid, but the articles should be much better.

    Link.
  • I do astronomy research for a living. Of course this is completely dependent on computers nowadays (in addition to telescopes & detectors). Most of my job involves coding (in a high-level language), and the people in my research group can easily be described as software & hardware hackers.

    But this isn't computer science because our goal is to learn about astronomy, not to learn how computers work.

  • Computer science is the science of computers and technology. Other sciences, like social science, quantum science, and political science, are totally different. Just because a poly-sci professor uses his computer to type up a term paper doesn't make him part of the computer science realm.

  • I forget what Science Fiction I read that in...

    But the running joke was that a group of scientists decided that since all human thought was produced in the brain, then all science must perforce be related to a neurological function... Hence all science is actually Neuro-Science. And if you were actually working with/on the brain, it would of course become Neuro-Neuro-Science. Run that Neuro-joke ad infinitum, and you get the general Neuro-idea of how silly that Neuro-proposition turns out to be.
  • I do systems work for a math department at a large university. One of our professors is very involved in using computation and simulation to study various bio-medical phenomena. Simulation is an important part of her work. By taking experimental data, she can create simulation models that closely approximate the "real" world. Experimental data is needed to create these models, and then to validate results, but simulation is useful for running multiple scenarios to see where to look for experimental research. I don't think that simulations will ever totally replace experimentation, but they will increasingly focus it, and reduce the amount of experimentation necessary to gather specific data.

  • The role of computation in science is magnified beyond its usefulness because futzing with computers is fun, easy, and something to do when you're out of other ideas.

    One example is global climate modeling. The predictions of these computer models are cited all the time, but no one really knows if they're putting out valid results or garbage. (Since these models can't predict the weather 10 days out, one must wonder about their century-term results.) That's not real science.

    Another example, I'll bet, is the computational archaeology mentioned in the article. It is easy to imagine these guys assigning variables to a lot of inexactly quantifiable phenomena, writing equations for things that are not precisely equatable, and plugging in estimates for unknowns. Garbage in, garbage out. That's futzing, not science.

    (For that matter, has the Santa Fe Institute ever produced any useful science? As far as I can tell, they're a sensational press release factory.)


  • One is often led to believe that these scientists sit down and say... "Oh, I think I'll code up a simulator in my spare time and use simulations to do my research." As if the researcher has any idea how to properly handle all the details.

    While there are exceptions, most scientists work with computer scientists. The guy conceiving of the research is unlikely to be the guy looking for that extraneous malloc() with no matching free(), any more than the auto company designer who says "Hey, let's control everything with little computers" is likely to be the same guy who chooses the microcontroller and bus architecture used.

    Things are getting out of hand this way a lot. It used to be that someone claiming to be very knowledgeable about computers understood them in general. These days, give a guy a web browser and a way to get to the Internet, and he figures he has a real good understanding of computers. Make the OS Linux and poof... you have an instant "expert level" computer guy.

    I guess what I'm saying is that all science is computer science like all drivers of new cars are computer users. It sounds good, until you think about how the implied meaning doesn't map remotely to the literal interpretation of the thought.

  • Apart from the current trendy number-crunching fields (genetics, bioinfo ...) I'd say that most current interesting scientific work is independent from the advancement of CS.

    For example in environmental sciences (integrating biology, chemistry, earth sciences, and everything else that may be useful), where you are actually trying to figure out how real-world systems work, the most advanced use of computing is for implementing models to check if your understanding of the matter matches reality. In order to build working models, you're pulling together knowledge from all kinds of disciplines, from observation and experiments. Computers are only needed because spreadsheets come in handy.

    I can't think of any problem in that realm where our knowledge would be limited by CS-related problems.

  • The problem with CS is that it isn't really one subject, but an accumulation of several, some of which could be described as part of maths (combinatorics, proof, algorithmics), some of which are more engineering (systems architecture), and some of which are only vaguely related to computers and are as much about psychology (intelligent tutoring systems, which is what my fourth-year project is on: a combination of AI, psychology, and HCI). Even within these individual disciplines there is no agreement about whether they are science or engineering. Is formal methods (a subject York, where I am, is quite big on) science, engineering, or maths?

    However, the same problem exists in other areas. For example, is maths a science? This basically comes down to what we define as a science. Is science discovery, or invention, or a combination of both? Do mathematicians discover or create mathematical constructs? That of course is an entire philosophy PhD, and an argument that has been raging for years.

    I have to agree with you on your first point about most CS degrees. I meet so many CS people from respected universities who go on about how they get taught Java/C++ etc. (insert latest fad language here). When I was looking for a university to go to, one of the things I looked for was one that wasn't obsessed with the technologies they use, but with the theory, which is why I came to York. The only high-level language we were taught properly was Ada95 (the language of choice in this uni because of the work done on formal methods (inc. SPARK Ada) and realtime stuff). Instead we get exposed to lots of languages (Occam anyone? :->) that are used for different things. I have used Prolog in at least one module a year for the last three years (NLP, constraint logic programming, etc.). In total we have used/been exposed to about a dozen languages. The simple reason for this is that they are tools, something that people seem to forget. Far too many unis teach how to use the tools, not the theory behind them. The former change every few years; the latter remains pretty timeless. Sensibly, my course is called Computer Systems and Software Engineering, reflecting the combination of computer science theory (we do the CS BSc followed by a masters year in software engineering) and engineering.

    Interestingly, if you look at the research groups in the department, many of them contain no or very few CS graduates; many of them are full of maths people. The simple reason is that most CS people's maths is not good enough to do the really complex stuff with image recognition, neural/Bayesian nets, etc.

    Anyway, I have rambled far too long (shame I can't actually go rambling, bloody foot and mouth), back to work.


  • You're wrong. The term for people nowadays who do "finite automata or database normalization" is "computer theorist". Finite automata comes into play with compiler construction (a programmer's field), and database normalization is taught in any basic data structures class.

    The difference between a computer theorist, a computer programmer, and a "computer scientist" is arbitrary. Science, according to Webster, is "The observation, identification, description, experimental investigation, and theoretical explanation of phenomena." You wouldn't call programming simulations of real-life events an "explanation of phenomena"?

  • Most programming is usually less rigorous than that.

    I'd argue against that, but I'm too tired right now to make my point.

  • the author says: In fact, as research on so many fronts is becoming increasingly dependent on computation, all science, it seems, is becoming computer science.

    The logical flaw is supposing that all computation is computer science.

    There is the science of the problem you are trying to solve, and then there is the science of the tools you use to solve the problem. The two are not the same.

    Solving the human genome is different from programming the computer to analyse the data.

    But there is an overlap, in the same way that it helps to have business and accounting experience to be a systems analyst in a business (although a lot of systems analysts do not have that either).

  • That's why, when considering a career in the sciences, you really need to weigh interest a lot more heavily than the job market. Right now I'm seeing a lot of people around me trying to jump into fields that are "hot" at the moment. But, like you said, what's hot and what's not will change before they're done with their schooling.

    That's why I chose an area of study that might never be hot [probably because it's so complicated that it makes normal biologists' heads swim (signal transduction and biochemistry)], but happens to interest me greatly. True, I'll probably never make $100,000+ a year. But at least I'll always be happy doing what I do. And after living a student's lifestyle ($8,000 per year), even $30,000 per year will seem like the high life.

    ---
  • Now that genome projects are hot stuff, people have started to take a good look at the biotech industry. And if they think what's happening now is exciting, wait until they see what's in store.

    I'm willing to bet that the protein folding problem will be solved in the next 50 years. Soon after, we should start to see protein design hitting its stride. What does this all mean?
    Take a bunch of E. coli bacteria. Use the genomic info you already have to insert a new gene for ProteinX that you've designed. The bacteria then make ProteinX in a huge vat, churning out billions of copies of the protein you need within a few days.

    Think nanomachines are hot stuff? ProteinX is only a few nanometers in diameter, has no conventional moving parts (just changes in conformation), and can be regulated just by adding different chemicals into the mix.
    OR
    Think spider silk is strong? ProteinX could be modified silk fibroin, designed for more elasticity and higher tensile strength.

    The sciences will always come out with incredible discoveries. Companies that use these discoveries will always have stock that's worthwhile to own. Maybe in the short-run things might dip, but it'll always make a come back.

    ---
  • Most people who get CS degrees are the farthest thing from being actual computer "scientists". Real computer science is basically mathematics

    That's a very interesting point that you make there. It was about the first thing that popped into my head when I read the article. What use has the average "computer scientist" really in the mentioned scientific fields? In my experience, practically none!

    I attend the University of Utrecht (in the Netherlands) for an education in so-called `Computational Science'. It's about 40% physics, 40% math, 20% computer science in the first two years, and thereafter it's about 80% math, 20% application of the computer science we were taught in the first two years. We also tend to call our field "large-scale computing". So in fact it's not computer science, but computational science (that's the name we use to refer to this field of expertise; I don't know if it has any meaning abroad) that's starting to get involved in almost every field of science.

    It might be different in other countries or even at other universities, but in my experience a computer scientist never spends any time learning how to solve huge systems of coupled equations, solve differential equations (numerically or symbolically), or even approximate an integral (with methods more advanced than the trapezoidal rule). I'm willing to bet that if an average person with a CS degree were told to solve a system of 10000 (or some other huge number of) equations, they would use Gaussian elimination. That's because it's not their field of expertise. (An iterative method derived from the Richardson iteration, such as CG or GMRES, would of course be the preferred way of solving such a huge system.) And because solving these kinds of systems and solving differential equations numerically (which in the end are closely related) is essential to almost every computer model that I have encountered to date, I tend to disagree with the statement that it's computer science that is becoming a significant part of almost every other science. (Unless you call computational science a subset of computer science.)
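    As a toy illustration of the point above (my own sketch, not part of the original comment): the conjugate gradient method solves a large symmetric positive-definite system using nothing but matrix-vector products, where Gaussian elimination would cost O(n^3) operations and destroy any sparsity. The 200x200 well-conditioned matrix below is a made-up example, not anyone's actual coursework.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=500):
    """Plain (unpreconditioned) CG for a symmetric positive-definite A."""
    x = np.zeros_like(b)
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

# A made-up well-conditioned SPD system: M M^T is positive semi-definite,
# and adding 200*I keeps the condition number small so CG converges quickly.
rng = np.random.default_rng(0)
M = rng.standard_normal((200, 200))
A = M @ M.T + 200 * np.eye(200)
b = rng.standard_normal(200)

x = conjugate_gradient(A, b)
print(np.allclose(A @ x, b, atol=1e-6))
```

    In real large-scale computing one would of course use a sparse matrix format and a preconditioner rather than a dense random matrix, but the algorithmic skeleton is the same.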

    That's at least my take on it, based on my experience with both fields. If people disagree, I'd be happy to hear their story. Obviously I don't have all the answers either.

    --
    Matthijs
  • I guess all the scientists during World War II and before were "chalkboard scientists", because the use of the chalkboard permeated all fields of science, from physics and chemistry to biology and medicine.

    Wait, they were all "pencil and paper" scientists as well. Damn, were they well-educated or what?

    And Archimedes was a "stick and dirt" scientist, right?

    Give me a break. The computer is a tool. A very powerful tool, in fact indispensable now, but a tool nonetheless. I'm an Electrical Engineering researcher, and I spend a lot of time writing computer programs for my research in a variety of languages. Please don't call me a Computer Scientist, though, or I might just throw up. I use oscilloscopes a lot too. Does that make me an "oscilloscope scientist"?

    Computer science is a well developed discipline in which very smart people devise new ways to solve problems. People in other fields, like me, use what computer scientists come up with. We are not computer scientists in our own right.

  • Ahh, but all science _IS_ quantum physics, in that if you apply the rules of QM to your system, it will give the correct answer.

    Ahh yes, the reductionist theory of science. Are you familiar with the field of non-linear dynamics (chaos theory)? Maybe you could work out human biology using quantum physics, but there are two serious problems. First, we don't have the physics yet to do it, even with infinitely fast computers. It may "exist" (in the Platonic sense), but we don't have it. Second, you are mixing quantum theory with Descartes' "clockwork universe". Basically, maybe we could figure everything out with math, but only if we knew initial conditions to infinite precision. Impossible.
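    A tiny numerical sketch of that last point (my own illustration, not the poster's): iterate the logistic map x -> 4x(1-x), a standard chaotic toy system, from two starting values that differ by only 1e-10. The trajectories soon disagree completely, so any finite-precision knowledge of the initial condition eventually tells you nothing.

```python
# Sensitive dependence on initial conditions in the logistic map (r = 4).
# Two trajectories start 1e-10 apart; after a few dozen iterations the
# separation has grown by many orders of magnitude.
x, y = 0.3, 0.3 + 1e-10
max_sep = abs(x - y)
for step in range(60):
    x, y = 4 * x * (1 - x), 4 * y * (1 - y)
    max_sep = max(max_sep, abs(x - y))

print(max_sep)  # vastly larger than the 1e-10 starting gap
```

    The separation roughly doubles each step (Lyapunov exponent ln 2), which is exactly why "infinite precision" would be required to predict the system arbitrarily far ahead.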

    Also, quantum theory isn't the end-all-be-all. We don't know how to use it to explain charge-parity (CP) violation yet, for example. There are theories, but none proven.

    If you really think we can do everything with quantum mechanics, I dare you to solve some fluid-dynamics problems with an abacus.

  • by Kasreyn ( 233624 ) on Saturday March 24, 2001 @07:09AM (#343056) Homepage
    Last month a leader in the software industry, Larry Ellison, the chief executive of Oracle, predicted that the focus of the intellectual excitement will shift again.

    "If I were 21 years old," he said at a company conference in New Orleans, "I probably wouldn't go into computing. The computing industry is about to become boring. I'd go into genetic engineering."


    This rings true to me. Much of Comp Sci (my chosen profession, though I suck at it) seems to lack real discovery these days, with the exception of nanocomputing. Much of the rest is incremental innovation, not invention or discovery. How many Turings do we have in Comp Sci now?

    Now genetics, this stuff is freaking AMAZING. My girlfriend is going into it, and I'm regularly amazed by the discoveries that are being made in the field. It may well be that computer science is no longer the frontier of human knowledge; I don't know.

    The article is, of course, dead wrong. Mr. Johnson needs to have his head examined if he thinks that just because computers are used as tools in many professions, all professionals are thereby computer scientists. He wrote an article for the NYTimes online, probably using a word processor - thus by his definition he can claim to be a computer scientist.

    The thing he's dimly perceiving, but failing to adequately put into words, is how computers have become ubiquitous in the professional and academic world, and how a working knowledge of how to USE computers is fast becoming utterly essential. However, he fails to see the vast difference between being a competent end user, and being a discoverer, an inventor, a creator-of-new-things in the computer world.

    So all in all, the article is only interesting in that the author accidentally brings up something else that's worth thinking about: computers and their involvement in genetics research. Now what I want to see is more development in the field of biological computing... the day when genetics and microbiology combine with comp sci and nanotechnology / nanorobotics will be a portentous day.

    -Kasreyn

  • I remember a different time, not so long ago, when all science was Slide Rule Science. And my grandaddy told me of a day when all science was Abacus Science.
    -----------
    If I were 21 and wanted to get into genetic engineering, I would need a Ph.D. to do anything above being a tech who follows other people's directions.

    Five years later, I would have my Ph.D., and would find that I need to do a two or three year post-doc before anyone will consider me seriously.

    Once that was done, I might find that Genetic Engineering was no longer hot, and I have no job prospects. Or I might find that so many other people had the same idea, and so few Ph.D.'s are actually needed, that there aren't a lot of job prospects for anyone.

    Unlikely? It's what happened to me, but replace "Genetic Engineering" with "Toxicology". What happened (and is still happening) was a lot of mergers in the pharmaceutical industry. It dumped a lot of skilled toxicologists on the market, and it doesn't take a lot to saturate that segment of the market. I can see the same thing happening in Biotech in the future, where Amgen and others buy up smaller firms first, then merge with peers in order to stay competitive.

    Besides, anyone who thinks being a gene jock is exciting has never done it.

  • that must mean when I was 8 and I got one of those home science kits for my birthday that I was actually doing computer science. I gotta update my resume for that extra 5 years of CS experience I have now. Big raise here I come!

    I'd like to see you tell that to Galileo or Newton or any of the hundreds of brilliant scientists who were around before computers.

    The Aztecs had an advanced number system, and developed many of the algebraic formulas we use today. They were also amazing astronomers who made many wondrous discoveries about the heavens. All without the help of a computer.
