Top Ten Algorithms of the Century
brian writes "'Great algorithms are the poetry of
computation,' says Francis Sullivan of the Institute for Defense Analyses'
Center for Computing Sciences in Bowie, Maryland. He and Jack Dongarra of the
University of Tennessee and Oak Ridge National Laboratory have put together a
sampling that might have made Robert Frost beam with pride--had the poet been a
computer jock. Their list of 10 algorithms having 'the greatest influence
on the development and practice of science and engineering in the 20th
century' appears in the January/February issue of Computing in Science
& Engineering. If you use a computer, some of these algorithms are no doubt
crunching your data as you read this."
Re:Now that's interesting ... (Score:1)
Like it or not, "The Fortran Optimizing Compiler" is not an algorithm. There are numerous Fortran compilers available, and each has its own algorithms (note the 's') for performing optimization. It's like asking "what's the best vehicle of the 20th century" and answering "interstate freeways". Yes, they help transportation, but...
Re:Translation (Score:2)
Not really. They are much more important than that. Essential for putting man on the Moon, solving air flows to optimize the shape of a jet fighter, protein folding calculations, you know, this kind of stuff. These algorithms made it possible to solve large equations back in the 60's (imagine the processing power back then...)
They missed the paperclip. (Score:2)
Re:Why I have no faith in this list... (Score:2)
--
No more e-mail address game - see my user info. Time for revenge.
Re:Most commercially valuable algorithm (Score:1)
suggested using 2 tablets for faster relief.
And showed 2 tablets falling into the water, etc.
Sales did nearly double, however. True story!
Re:Microsoft's Compression Algorithm (Score:2)
---
They should add Shor's algorithm (Score:1)
Regular expressions? (Score:2)
To me, it's a glaring omission not to have any string matching algorithms. In other words, where are the regular expressions?!? One of the most impressive algorithms I have ever seen is related to regular expressions. There are two types of regular expression engines. One follows directly from the regular expression, and the complexity of the expression directly correlates with the running time. With the second type, the running time is proportional to the length of the string (and the length of the regular expression), so you can always predict how long it will take to run, no matter how complicated the pattern. The first kind is an NFA, or non-deterministic finite automaton. The second kind is a DFA, i.e. a deterministic finite automaton. Anyway, it is really easy to build an NFA, but building a DFA is much more involved; it uses this awesome thing called the ε-closure. The subset-construction algorithm that produces a DFA via the ε-closure is the impressive algorithm I think should have been on the list.
I heard about this algorithm in the book Compilers: Principles, Techniques and Tools (the Dragon Book, Second Edition). It doesn't give the algorithm a name, though. Anyone who is interested in regular expressions, parsers, lexers, compilers, interpreters and validators should definitely check out this book.
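Since the post above walks through the NFA-to-DFA idea, here's a minimal Python sketch of the ε-closure and subset construction, simulating the DFA states lazily. The Thompson-style NFA is hand-coded for the Dragon Book's classic example pattern (a|b)*abb; the state numbering and helper names are my own, not from the book:

```python
# Hand-coded Thompson NFA for (a|b)*abb (Dragon Book example).
# Keys are (state, symbol); EPS marks epsilon transitions.
EPS = None
NFA = {
    (0, EPS): {1, 7}, (1, EPS): {2, 4},
    (2, 'a'): {3},    (4, 'b'): {5},
    (3, EPS): {6},    (5, EPS): {6},
    (6, EPS): {1, 7},
    (7, 'a'): {8},    (8, 'b'): {9}, (9, 'b'): {10},
}
START, ACCEPT = 0, 10

def eps_closure(states):
    """Every NFA state reachable from `states` via epsilon moves alone."""
    stack, seen = list(states), set(states)
    while stack:
        for t in NFA.get((stack.pop(), EPS), ()):
            if t not in seen:
                seen.add(t)
                stack.append(t)
    return frozenset(seen)

def move(states, ch):
    """All NFA states reachable by consuming `ch` from `states`."""
    return {t for s in states for t in NFA.get((s, ch), ())}

def match(text):
    """Subset construction on the fly: each DFA state is an eps-closure."""
    state = eps_closure({START})
    for ch in text:
        state = eps_closure(move(state, ch))
    return ACCEPT in state
```

Running time is a fixed amount of work per input character regardless of how gnarly the pattern is, which is exactly the property the post describes; a real implementation would cache the closure sets so each DFA state is built only once.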
Re:Now that's interesting ... (Score:1)
Holy Eigenvalue, Batman! (Score:1)
Re:Why I have no faith in this list... (Score:1)
Gauss invented the Fourier transform *and* the FFT before Fourier invented the Fourier transform. Fourier never figured out the FFT, as one poster has pointed out. If that was unclear from my original text, then sorry about that.
Gauss basically came up with the FFT out of necessity... he was processing a ton of planetary observation data by hand.
Anyway, you say you "don't know how much of my comment to believe", but that's why I included references, eh. The FFT has been invented many times in the 20th century, all of them rediscoveries of what Gauss did. Check any signal processing book that has historical notes.
-N.
Where's A* ? (Score:1)
The magazine printed this big poster of these top-ten algorithms, and a numerical computation person across the hall from us has had it pinned to her door for a month now. We've had so much fun mocking it these last few weeks.
So here's a short list of algorithms that, I guarantee, are far more important than Fast Multipole. :-)
significance of the fortran compiler (Score:5)
it was also instrumental in the vast popularity of the (IBM?) computer it was originally developed for and sold with.
disclaimer: this is all stuff that i remember from my programming languages class last year. it may not be completely correct, as i don't really know anything about fortran other than what i learned in that class...
Arithmetic Encoding (Score:1)
although it's inspirational. Arithmetic coding has beautiful properties that allow for partial bit codes.
Re:FORTRAN compiler?! (Score:1)
How To Write A Letter
1) Get Pen
2) Get Paper
3) Apply pen to paper
4) Move pen according to the following movement pattern:
5) (etc)
The above is certainly an algorithm. But try porting it to X-Windows. Just because an algorithm relies on certain other things being present (pen and paper in this case, specific computer architecture in the other) doesn't make it less of an algorithm.
Everyone who's complained about the Fortran compiler, stop it: an algorithm takes inputs and produces outputs based on those inputs in a deterministic way. Quicksort is an algorithm, even though it *contains* other algorithms, such as those for iterating through a list and comparing elements. A Fortran compiler is an algorithm too, even though it contains many smaller algorithms.
My favorite (Score:4)
10 PRINT "RADIO SHACK SUCKS!!!!!"
20 GOTO 10
Watch out (Score:1)
--
"And is the Tao in the DOS for a personal computer?"
Re:Forgot one. (Score:1)
The most important one is sleep(-1). It's tricky to implement, but when used properly, it turns all other algorithms into O(1). NP-Completeness, my ass!
Re:Sentimental Favourite........ (Score:2)
Some commentary (Score:5)
The Metropolis Algorithm for Monte Carlo: Based on the thermodynamic analogy of crystals forming in cooling solids. Unlike a naive optimization algorithm, the Metropolis algorithm sometimes accepts a step even when it worsens the objective function, with a probability that decays exponentially as the system is "cooled". _Numerical Recipes_ has a nice description, although their example is quite abysmal. The GNU Scientific Library (sourceware.cygnus.com/gsl) has an implementation of the algorithm, referred to as simulated annealing.
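As a toy illustration (not the GSL version), here's the Metropolis acceptance rule inside a bare-bones simulated annealing loop minimizing a one-dimensional function; the objective, step size, and cooling schedule are all made up for the example:

```python
import math
import random

def anneal(f, x0, steps=20000, t0=1.0, cooling=0.999, seed=0):
    """Minimize f via Metropolis-style annealing; returns the best x seen."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(steps):
        x_new = x + rng.uniform(-1.0, 1.0)       # random trial move
        f_new = f(x_new)
        delta = f_new - fx
        # Metropolis rule: always accept improvements; accept a worse
        # move with probability exp(-delta / t).
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            x, fx = x_new, f_new
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling                             # cool down gradually
    return best

# e.g. anneal(lambda x: (x - 3.0) ** 2, x0=10.0) lands near x = 3
```

The willingness to occasionally go uphill is what lets the walk escape local minima; as t shrinks, it degenerates into plain greedy descent.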
The Fortran Optimizing Compiler: compiler techniques in general have improved enormously since the 1950s... this selection doesn't really make much sense to me, as this field has been more about continuous and gradual improvement than the development of some breakthrough optimization technique. If anything, the pioneering work of Zhang and Roberts establishing the computational infeasibility of a "universal optimizer" is a much more interesting result.
QR Algorithm for Computing Eigenvalues: Elegant method for determining the values \lambda_i for which A*x_i = \lambda_i*x_i for a square matrix A. The eigenvalues and corresponding eigenvectors (the vectors belonging to the nullspace of (A - \lambda_i*I)) are, in a certain sense, the natural frequencies of a matrix. Determining the eigenvalues without solving the characteristic polynomial (as the QR method does) is critical for avoiding numerical-analysis problems.
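A minimal illustration of the idea: unshifted QR iteration via classical Gram-Schmidt, which converges for this symmetric example (production code uses Householder reflections and shifts, so treat this purely as a sketch):

```python
def qr_decompose(A):
    """Classical Gram-Schmidt QR factorization of a small square matrix."""
    n = len(A)
    Q = [[0.0] * n for _ in range(n)]
    R = [[0.0] * n for _ in range(n)]
    for j in range(n):
        v = [A[i][j] for i in range(n)]            # j-th column of A
        for k in range(j):
            R[k][j] = sum(Q[i][k] * A[i][j] for i in range(n))
            v = [v[i] - R[k][j] * Q[i][k] for i in range(n)]
        R[j][j] = sum(x * x for x in v) ** 0.5
        for i in range(n):
            Q[i][j] = v[i] / R[j][j]
    return Q, R

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def qr_eigenvalues(A, iters=60):
    """Iterate A_{k+1} = R_k Q_k; the iterates drift toward a triangular
    matrix whose diagonal holds the eigenvalues."""
    for _ in range(iters):
        Q, R = qr_decompose(A)
        A = matmul(R, Q)
    return sorted(A[i][i] for i in range(len(A)))

# [[2,1],[1,2]] has eigenvalues 1 and 3
```

Each step is a similarity transform (R*Q = Q^T*A*Q), so the eigenvalues never change while the off-diagonal entries melt away.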
Some notable algorithms which seem to be missing:
-Sieving techniques for the decomposition of large composites into prime factors
-RSA/other asymmetric key exchange algorithms
-Huffman codes, LZ sliding-window compression techniques, arithmetic encoding
-MDS matrices, error correction codes, cyclic redundancy checks
-The linear feedback shift register
-Predictor/corrector methods, adaptive numerical integration and approximation of ODEs and PDEs
-Lenstra-Lenstra-Lovasz lattice basis reduction algorithm
-Berlekamp's Q-matrix algorithm for the factorization of polynomials
Any runners-up? (Score:1)
What other algorithms would you folks suggest?
I do a lot of computer graphics work, so the FFT is among my top 10 for that field. Marching cubes, phong shading, environment mapping, bilinear interpolation, BSP & portal PVS are also on top of my list.
What a perfect nerd question: what's your favorite algorithm, baby?!
magic
Re:Translation (Score:1)
In the old days, chicken was expensive. Then Purina (and probably other feed companies) discovered Linear Programming. The chickens have certain nutrient requirements which must be met by some combination of ingredients in the chicken feed. Various ingredients have different amounts of nutrients at prices determined by the current market. Solving the Linear program finds (one of) the cheapest ways to make chicken feed. The feed companies helped set up poultry farmers in a high volume low margin business. Incidentally, solving a Linear Program also gives the cost of each constraint.
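The feed-blend problem above is the classic two-variable LP, small enough to solve by brute force over corner points (an LP optimum always sits at a vertex of the feasible region; the simplex method just walks those vertices cleverly). All the prices and nutrient numbers below are invented for illustration:

```python
from itertools import combinations

cost = [1.0, 2.0]             # made-up price per kg: corn, soy
A = [[2.0, 6.0],              # protein supplied per kg of each ingredient
     [3.0, 2.0]]              # energy supplied per kg
b = [12.0, 8.0]               # minimum protein / energy required

# Constraint boundaries: the nutrient lines plus the two axes (x, y >= 0).
lines = A + [[1.0, 0.0], [0.0, 1.0]]
rhs = b + [0.0, 0.0]

def intersect(l1, r1, l2, r2):
    """Solve the 2x2 system l1.p = r1, l2.p = r2 (None if parallel)."""
    det = l1[0] * l2[1] - l1[1] * l2[0]
    if abs(det) < 1e-12:
        return None
    return ((r1 * l2[1] - r2 * l1[1]) / det,
            (l1[0] * r2 - l2[0] * r1) / det)

def feasible(p):
    return (p[0] >= -1e-9 and p[1] >= -1e-9 and
            all(A[i][0] * p[0] + A[i][1] * p[1] >= b[i] - 1e-9
                for i in range(len(A))))

# Cheapest feasible vertex = optimal blend.
best_cost, best_mix = min(
    (cost[0] * p[0] + cost[1] * p[1], p)
    for i, j in combinations(range(len(lines)), 2)
    if (p := intersect(lines[i], rhs[i], lines[j], rhs[j])) and feasible(p))
```

With these numbers both nutrient constraints bind, so the optimum lands at their intersection; that "both constraints tight" situation is also where the shadow prices the post mentions come from.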
Some of the algorithms I don't recognize, but they all seem to make major (orders of magnitude) differences in what can be computed.
You can't add things to a top 10 list! (Score:1)
"Someone left out error-correcting codes!"
"Someone left out crypto!"
"Someone left out the following list of algorithms you learn in the third year of a CS program!"
"Someone left out LLL!"
Well, whoop-dee-do. If we added all of your "GLARING" omissions, we'd have a top 100 list. The problem here is that there are too many really good, beautiful, poetic algorithms out there, and no objective way to compare them.
Someone pointed out that this list was biased towards numerical computation, and someone else defended this, saying that they wanted practical algorithms anyway -- which is a bit silly, because different people have different practical needs. Some of us really don't have any use for the Fast Fourier Transform. Some of us think that an O(N^(log2 7)) matrix multiplication algorithm (Strassen's) is kinda cute, but don't really give a rat's ass about it.
I suggest that this list does not achieve its goal. Perhaps it publicizes a couple of numerical methods that people might not have heard of (except LLL), but it certainly doesn't live up to the title "Top Ten Algorithms of the Century". I don't think such a list could really exist. If it could, perhaps Donald Knuth wouldn't be spending so much time on that series of books he's so passionate about...
Donny
Re:Sentimental Favourite........ (Score:1)
I'm amazed none of you low-level hardware hacker guys have got this one - of the sorts that have been mentioned (QS, Insertion, Merge, Selection etc) - Bubble is the only one which doesn't need extra memory. All the others need a target buffer, while Bubble works in place.
Or am I wrong? I'm not sure....please feel free to correct.
My top 3 algorithms (Score:2)
The next important algorithm that I came across but unfortunately can't describe properly here is used to produce molecular shapes from X-ray diffraction patterns. This helped mankind to understand DNA.
My third most important algorithm is the state engine that drives the TCP/IP stack. This is what's bringing my burblings to you now.
I'm happy with quite a few of the algorithms in the top 10 list - fourier transforms in particular but some of them seem rather esoteric.
Biased to numerical algorithms? (Score:2)
I was also disappointed not to see a symbolic logic algorithm such as the Resolution Theorem Prover [mit.edu] (Robinson, 1965) or Circumscription (McCarthy [umass.edu], 1980). I'd like to have been able to point to something of Jon Barwise's [indiana.edu] Situation Semantics, but couldn't think of a specific algorithm to highlight.
Re:Data compression and encryption (Score:2)
Disqualified! (Score:2)
Interesting (Score:4)
Re:Most commercially valuable algorithm (Score:2)
Double the sales? If people actually followed the directions, they'd repeat until they ran out of shampoo. It's an infinite loop!
Meanwhile, we can take over the world?
Need RSA in there! (Score:3)
And finally, DES *has* to at least be an "honorable mention". DES, despite being dated on the day it was created with its 56-bit keys, was the first publicly-implemented Feistel-class symmetric cipher, and brought S-boxes and other encryption techniques now commonplace out of the world of spooks and into common parlance. All ciphers since DES have been either inspired by DES or have been a reaction to what people view as flaws in DES. If that isn't "influential", I don't know what is!
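The structural trick DES popularized is easy to show in miniature. Below is a toy 4-round Feistel network on 32-bit blocks -- the round function and keys are arbitrary placeholders with zero cryptographic value, but it demonstrates the property that made the construction famous: decryption is the same loop with the round keys reversed, no matter what F is:

```python
MASK = 0xFFFF  # split each 32-bit block into two 16-bit halves

def fround(half, key):
    """Toy round function; in a real Feistel cipher (DES, Blowfish, ...)
    all the cryptographic strength lives here."""
    return ((half * 31 + key) ^ (half >> 3)) & MASK

def feistel(block, keys):
    L, R = (block >> 16) & MASK, block & MASK
    for k in keys:
        L, R = R, L ^ fround(R, k)   # the Feistel swap-and-mix step
    return (R << 16) | L             # undo the final swap

def encrypt(block, keys):
    return feistel(block, keys)

def decrypt(block, keys):
    return feistel(block, keys[::-1])  # same code, reversed key schedule
```

Because each round only XORs F's output into one half, every round is invertible even when F itself is not -- which is why Feistel designs never need F to be a permutation.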
-E
Forgot one. (Score:2)
wait();
And the winner is... (Score:2)
1. Turn on TV
2. Watch Until bored
3. Change channel
4. Go back to 2
Re:Some commentary (Score:2)
Fortran Optimizing compiler? (Score:2)
I can see how all of these got in here, but I would expect the greatest algorithms of all time to last a bit longer than that. It's not as if the Fortran compiler reached its pinnacle of evolution in 1957 and has remained relatively unchanged since.
Other entries like the matrix computation methods, fast fourier transform, etc are still in wide use today because we haven't found (or there aren't) any better algorithms we can use.
Maybe its just me...
--
Eric is chiseled like a Greek Goddess
Where to find algorithms... (Score:5)
Suddenly I feel very young (Score:3)
On an unrelated note - I only recognize three of the ones on their list. I would have liked to see a description of what each one does, what impact it had, and perhaps how it works. That would make for interesting reading.
--
grappler
Re:Translation (Score:2)
Also, we should mention (although I've been skimming previous comments, so maybe this is redundant) that Gauss used the FFT to compute Fourier series -- he didn't make a big deal out of it, as he no doubt considered it a straightforward application of basic principles. Smart man, that.
Re:Strong Numerical computation bias... (Score:3)
Don't forget Diffie-Hellman(sp?) key Exchange, DES, etc.
These crypto algorithms pretty much made secure e-commerce of the 90's possible.
Re:Disqualified! (Score:2)
I ported that algorithm to the UK and "Laskys", our local equivalent of Electrode Hut.
Laskys, being a bit more up-market, sold HP-85s with built-in thermal printers. If you used Basic's normal "PRINT" function instead of "DISPLAY" (Rocky Mountain Basic!), you could spew out rolls of thermal paper. Friends would compete for the shortest marquee-display programs that would print large letters sideways onto this stuff, and could hopefully be typed in and run before the shop's sellrats could turf us off.
Not much of a BREAK key on a HP-85, as I remember...
Kalman Filter (Score:2)
Where's the Kalman filter ? We couldn't have got to the Moon without it.
Equally, we'd never have ICBMs and an arms race without it, but when did morality ever stop geeks inventing stuff ?
Silly list. (Score:2)
As in everyday algorithms used all the time that actually make commonplace software work well.
I'm surprised that data compression hasn't been mentioned, nor public key cryptography.
What about the algorithms used in networking? Surely TCP/IP has a greater impact than some obscure matrix multiplication.
And the winner is: (Score:2)
Re:This is the real link! (Score:2)
I looked around a bit in my region; I can get articles either via the local university library for about a 10 cent copy fee per xeroxed page, or delivered via email as TIFF files from the JASON [uni-bielefeld.de] server for about $1.50 per article. This link [uni-bielefeld.de] will send a test article to your email address. You then need a uudecode utility to turn the attachment into a binary, and the result (a self-extracting ARJ for DOS) can be unpacked with the unarj utility, which is available for UNIX too.
I believe you should find similar offers.
Re:Sorting out sorting (Score:2)
Re:Why Public-Key Crypto Isn't On The List (Score:2)
Eric
Re:Also, since when is a compiler... (Score:2)
Re:Translation (Score:2)
However, I don't think 10 algorithms is enough to describe the greatest achievements of computer science: binary search; binary tree manipulation; data structure algorithms such as hashing, B-trees, heaps, queues, stacks, and the various forms of linked lists; searching algorithms, including heuristically bounded searches such as chess game algorithms; Dijkstra's algorithm; the Travelling Salesman problem (unsolved); recursive algorithms; etc.
The list is far from exhaustive, but the 10 algorithms chosen are definitely not the top-priority ones (except quicksort).
Re:Strong Numerical computation bias... (Score:3)
As for that FORTRAN thing, I guess that was mentioned because it was the *first* optimising compiler, ever, and it served as an example to every compiler that was made ever since. Talk about influence now...
Oh, and of course I love Binary Space Partitioning. That's the most important algorithm ever. (So what if DOOM is old? It's still fun, and at least it has an atmosphere...)
Re:Sorting out sorting (Score:2)
Re:Now that's interesting ... (Score:2)
Dana
Re:Now that's interesting ... (Score:2)
Re:Translation (Score:2)
Re:Most commercially valuable algorithm (Score:2)
It's an infinite loop if you read it like a moron.
Somebody should have stopped at half a cup eh? It's only a joke!
Re:Also, since when is a compiler... (Score:2)
I dunno, sounds a lot like an algorithm to me...
Reed Solomon Error Correction, and DCT (Score:2)
My memory of it is a bit shaky (I studied it somewhat a few years back when I picked up a copy of Digital Audio Processing), but it essentially specifies a method by which an arbitrary number of bits in any block can be in error, and the algorithm itself will A) detect that error in the bitstream and B) correct that error, provided the number of misread bits is within the specifiable boundaries of the algorithm. So you have situations where merely n extra bytes appended to an m byte block will automatically correct for some amount of error *anywhere* within that block.
Also a nice touch is that one can easily cap the number of consecutive identical bit values--it turns out that CDs cannot write more than eleven 0s in a row, or the head loses the ability to follow the "groove". No problem at all for this code.
This is essentially <i>the</i> system by which arbitrary-quality error correction is implemented. I'm not saying that the selection of algorithms was in error--but Reed-Solomon does deserve some kind of honorary mention.
Oh well, at least it gets a better treatment than the Discrete Cosine Transform, which (literally) was <i>rejected by the NSF because it was considered too simple.</i> The DCT is at the core of both JPEGs (it compresses 8x8 blocks of YUV-domain pixels into a quantizable domain) and MP3s (it compresses each of 32 frequency subbands into 18 frequency coefficients).
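"Too simple" is right -- the DCT fits in a few lines. A naive O(N^2) DCT-II and its inverse in Python (real codecs use a fast factorization and 2-D 8x8 blocks, but the math is the same):

```python
import math

def dct(x):
    """Naive DCT-II: the transform at the heart of JPEG and MP3."""
    N = len(x)
    return [sum(x[n] * math.cos(math.pi * k * (2 * n + 1) / (2 * N))
                for n in range(N))
            for k in range(N)]

def idct(X):
    """DCT-III with the usual 2/N scaling, which inverts dct()."""
    N = len(X)
    return [(X[0] / 2 + sum(X[k] * math.cos(math.pi * k * (2 * n + 1) / (2 * N))
                            for k in range(1, N))) * 2 / N
            for n in range(N)]
```

For smooth inputs the energy piles up in the first few coefficients; quantize or drop the rest and you have the lossy half of JPEG/MP3 in a nutshell.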
Yours Truly,
Dan Kaminsky
DoxPara Research
http://www.doxpara.com
Sentimental Favourite........ (Score:4)
Seriously, it's the first sort most of us learn, and it has the cutest name.
One of the profs who teaches an algorithms course at my university always shows his class a program that displays performance graphs as various sorts chew through large data sets. Invariably, most of the students cheer for bubble sort, even though we know it will lose. Favouring the underdog, I guess.
Dana
Top Ten by *DATE* (Score:2)
Re:Forgot one. (Score:4)
You forgot the most widely used algorithm on the planet....
10 IN
20 OUT
30 GOTO 10
======
"Rex unto my cleeb, and thou shalt have everlasting blort." - Zorp 3:16
Most commercially valuable algorithm (Score:5)
1. Lather
2. Rinse
3. Repeat
I seem to recall that Nolan Bushnell did a similar thing in the instructions for Pong. Does anyone recall what those were?
The most innovative, by far. (Score:3)
algorithm in modern times:
Amazon's One-Click shopping.
Seriously, I guess this list shows the value of unpatented computer algorithms. Any of these can be reimplemented, and they are actually innovative -- unlike the software patents dispensed for such silly ideas as saving credit card info.
Re:Sentimental Favourite........ (Score:2)
I saw a code example in x86 assembler once that implemented bubble sort. It was written by one of the guys who did Tomb Raider - I believe it's buried somewhere in tr.exe and actually does something useful.
and nothing about security algorithms? (Score:2)
asymmetrical key encryption algorithms? Any hope you have of privacy now or in the near future is going to be based on this.
Fortran Optimizing Compiler. (Score:2)
Sheesh...now just another glorified IT company.
Re:Regular expressions? (Score:2)
Two reasons:
Which I found pretty bizarre, since I'm not aware of any good non-computer use of DFAs and NFAs, but I'm not much for non-computer things anyway...
Note, if you poke a bit farther into the dragon book, there is an algorithm to construct a DFA directly from a regexp, without building the NFA first. I think the NFA->DFA route also yields a more compact DFA, though (and no, I don't recall the name either, and I just coded one up last year).
Also note, the dragon book is definitely not the be-all-end-all source of NFA/DFA/regex knowledge. There is another book by Aho (and I think Ullman) that covers more, such as it being an intractable problem to find the minimal regex matching a given set of strings (exponential in both space and time).
Careful about where you use Quicksort. (Score:3)
However, it has a really bad worst case, which happens to be fairly common: when the data to be sorted is already more or less in order. Then it performs about the same as bubble sort [O(N^2)]. It also uses O(N) levels of recursion in the worst case, which may cause stack overflow.
For that reason I personally wouldn't put it in a library routine, because some programmer will eventually use it to sort an array of keys, collect a few more keys, and re-sort the array. I'd only use it where I was fairly sure I was collecting keys whose order had considerable entropy.
Quicksort may be influential because of its simplicity (I believe it is even easier to understand than bubble sort) and widely used because it is easy to code, but it is hardly the best all-around sorting algorithm.
My personal favorite sorting algorithm is Shell sort; this is not to say it is the best algorithm in existence, but it is easy to code and performs just a tad slower than Quicksort in the average case. Most importantly, the spread between worst case and best case is small. It is also well suited for internal sorting. The only drawback is that it is not stable, that is to say that two keys of equal value will sometimes end up in a different relative order from the one they started in. This is easy to fix, however.
Quicksort works because it gets rid of large amounts of entropy fast. Where entropy is low (e.g. the keys are already ordered), this advantage is neutralized. It is possible to code hybrid algorithms that use Quicksort at the outset to divide the key file into a number of smaller sets, and then use a different algorithm when the key files become relatively small. I've heard that some people combine Quicksort with linear insertion sort when the key file gets to be ten keys or less, because while linear insertion sort is generally slow [O(N^2)], its simplicity makes it faster on very small key files. Perhaps starting with Quicksort and switching to Shell sort or some other algorithm at some key file size (perhaps 100 or so) would result in a large decrease in worst-case times and similar performance in the best case.
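Here's roughly what such a hybrid looks like, sketched in Python: median-of-three pivoting to dodge the already-sorted worst case, a cutoff to insertion sort for small runs, and recursion only into the smaller half so the stack stays O(log n). The cutoff value is a guess, not a tuned constant:

```python
CUTOFF = 10  # below this size, insertion sort wins on constant factors

def insertion_sort(a, lo, hi):
    for i in range(lo + 1, hi + 1):
        key, j = a[i], i - 1
        while j >= lo and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key

def hybrid_quicksort(a, lo=0, hi=None):
    if hi is None:
        hi = len(a) - 1
    while lo < hi:
        if hi - lo < CUTOFF:
            insertion_sort(a, lo, hi)
            return
        # Median-of-three pivot defuses the sorted/reversed worst case.
        pivot = sorted((a[lo], a[(lo + hi) // 2], a[hi]))[1]
        i, j = lo, hi
        while i <= j:
            while a[i] < pivot:
                i += 1
            while a[j] > pivot:
                j -= 1
            if i <= j:
                a[i], a[j] = a[j], a[i]
                i += 1
                j -= 1
        # Recurse into the smaller partition, iterate on the larger one,
        # keeping recursion depth logarithmic even in the worst case.
        if j - lo < hi - i:
            hybrid_quicksort(a, lo, j)
            lo = i
        else:
            hybrid_quicksort(a, i, hi)
            hi = j
```

Note it still isn't stable, and a hostile input can still hit the O(N^2) case; introsort-style implementations add a depth limit and fall back to heapsort for exactly that reason.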
The Viterbi Algorithm (Score:2)
This algorithm is used for efficient decoding of convolutional error correction codes and for untangling signals corrupted by multipath dispersion. It was developed by Andrew J. Viterbi (founder of Qualcomm [qualcomm.com]) while at NASA for receiving the ultra-faint signals transmitted by deep space probes.
An introduction to the Viterbi Algorithm [netcom.com]
Technically speaking, it is an efficient implementation of the Maximum Likelihood Sequence (MLS) detector, just like the FFT is an efficient implementation of the discrete Fourier transform.
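For flavor, here's the dynamic-programming core of the Viterbi algorithm on a toy two-state HMM -- the classic textbook weather example, with the usual illustrative probabilities, nothing to do with deep-space telemetry:

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return the most probable hidden-state sequence for `obs`.
    Each trellis cell keeps (best probability, best path so far)."""
    trellis = {s: (start_p[s] * emit_p[s][obs[0]], [s]) for s in states}
    for o in obs[1:]:
        trellis = {
            s: max(((trellis[prev][0] * trans_p[prev][s] * emit_p[s][o],
                     trellis[prev][1] + [s]) for prev in states),
                   key=lambda t: t[0])
            for s in states}
    return max(trellis.values(), key=lambda t: t[0])[1]

# Toy HMM: hidden weather, observed activities.
states = ('Rainy', 'Sunny')
start_p = {'Rainy': 0.6, 'Sunny': 0.4}
trans_p = {'Rainy': {'Rainy': 0.7, 'Sunny': 0.3},
           'Sunny': {'Rainy': 0.4, 'Sunny': 0.6}}
emit_p = {'Rainy': {'walk': 0.1, 'shop': 0.4, 'clean': 0.5},
          'Sunny': {'walk': 0.6, 'shop': 0.3, 'clean': 0.1}}
```

On the observations walk, shop, clean the decoded path is Sunny, Rainy, Rainy. Real decoders work in log-space and store back-pointers instead of whole paths, but the recurrence is the same.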
----
That warm, daydreaming feeling (Score:2)
OTOH (OT): I am entertaining the idea of making a little FORTH compiler that would also be some sort of OS. Anyone with some good references/resources that could help? (OK, I know, read the Linux kernel source...)
This is the real link! (Score:4)
Rather use this link [computer.org] - it has the missing explanations.
And this link [sciencenews.org] explains integer relations.
bah! (Score:2)
FFT should be much higher, it is so very very useful and important.
The Fortran compiler??? Is this a joke?
Where are the compression algorithms? LZW, Wavelets, etc. Where are the error correction algorithms?!
Compression, error correction, and the FFT are like the holy trinity of modern computing. Everything from CDs and DVDs to spacecraft orbiting Mars and Jupiter uses compression and error correction. How could they deny the importance and sheer power of these algorithms?!
Re:Now that's interesting ... (Score:2)
"If you use a computer, some of these algorithms are no doubt crunching your data as you read this.
Really?? I do my share of fancy math, though I'm not in the Big Leagues, and I'd be surprised if any of these algorithms are crunching *my* numbers (except maybe in some NSA basement supercomputer, where they are just now learning that my SSN + selected other ID numbers translates into the punch line of a dirty joke in EBCDIC)
Re:Fortran Optimizing compiler? (Score:2)
I'm guessing that the algorithm for optimizing Fortran that was developed in 1957 paved the way for current optimizing compilers, and would bet that many of them use the same techniques that this one did.
Of course, I know less than nothing about compiler technology, so...
--
Now that's interesting ... (Score:2)
Huh? (Score:2)
Funny, I thought Perl was
(note to moderators: it's a joke, ok?)
nuclear cia fbi spy password code encrypt president bomb
An the oscar goes to . . (Score:2)
Turns network admins into Linux users.
___
Re:Strong Numerical computation bias... (Score:3)
Re:Interesting (Score:5)
Simple algorithms for common problems are much more widely used, and have far more impact and influence, but try telling *them* that!
I hope these links help. (Warning: many are technical) If anyone has personal favorites that are less dry than many of these, please post!.
10. 1987: Fast Multipole Method. A breakthrough in dealing with the complexity of n-body calculations, applied in problems ranging from celestial mechanics to protein folding. [Overview [washington.edu]] [A math/visual approach [duke.edu]]
9. 1977: Integer Relation Detection. A fast method for spotting simple equations satisfied by collections of seemingly unrelated numbers. [Nice article with links [maa.org]]
8. 1965: Fast Fourier Transform. Perhaps the most ubiquitous algorithm in use today, it breaks down waveforms (like sound) into periodic components. Everyone knows this one (or should) [Part II of my personal favorite FFT and wavelet tutorial [iastate.edu]]
7. 1962: Quicksort Algorithms for Sorting. For the efficient handling of large databases. [Definition [nist.gov]][Basic Method [princeton.edu]][Mathworld [wolfram.com]][More technical explanation [syr.edu]][A lecture with animations and simulations [sunysb.edu]]
6. 1959: QR Algorithm for Computing Eigenvalues. Another crucial matrix operation made swift and practical. [Math [unc.edu]] [Algorithm [berkeley.edu]]
5. 1957: The Fortran Optimizing Compiler. Turns high-level code into efficient computer-readable code. (pretty much self-explanatory) [History and lots of info [unc.edu]]
4. 1951: The Decompositional Approach to Matrix Computations. A suite of techniques for numerical linear algebra. [matrix decomposition theorem [wolfram.com]] [Strategies [syr.edu]]
3. 1950: Krylov Subspace Iteration Method. A technique for rapidly solving the linear equations that abound in scientific computation. [History [utk.edu]] [various Krylov subspace iterative methods]
2. 1947: Simplex Method for Linear Programming. An elegant solution to a common problem in planning and decision-making. [English [anl.gov]] [Explanation with Java simulator [anl.gov]] [An interactive teaching tool [hofstra.edu]]
1. 1946: The Metropolis Algorithm for Monte Carlo. Through the use of random processes, this algorithm offers an efficient way to stumble toward answers to problems that are too complicated to solve exactly. [English [ic.ac.uk]] [Code [syr.edu] and Math [syr.edu]] [Math explained [ic.ac.uk]]
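Of the ten, the FFT (#8 above) is the easiest to write down yourself. A recursive radix-2 Cooley-Tukey sketch in Python (power-of-two lengths only; real libraries use iterative in-place variants, but this shows where the N log N comes from):

```python
import cmath

def fft(x):
    """Radix-2 Cooley-Tukey FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])          # DFT of the even-indexed samples
    odd = fft(x[1::2])           # DFT of the odd-indexed samples
    # Combine with the twiddle factors e^{-2*pi*i*k/n}.
    tw = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
    return ([even[k] + tw[k] for k in range(n // 2)] +
            [even[k] - tw[k] for k in range(n // 2)])
```

Each level halves the problem and does O(n) combining work, hence O(n log n) overall versus O(n^2) for the direct DFT.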
Ahhh The Old Favourites (Score:2)
And it nicely adapts to implementation across Beowulf clusters.
Now you know why we asteroid Catastrophists talk in probabilities all the time - we just play games with dropping random asteroids onto the Earth and call it work.
Why Public-Key Crypto Isn't On The List (Score:2)
But I'm guessing it's not on the list because it's not an algorithm. It's a lack of an algorithm. It's simple enough to multiply two large primes together. The reason public-key crypto works is that there is no known algorithm to turn the product back into its constituent primes in reasonable time.
Maybe that missing algorithm should have been listed as #0...
Jamie McCarthy
Not a single search algorithm! (Score:2)
1. A^* search and variants. Simulated annealing, heuristic repair, etc. You know, GOFAI search algorithms.
2. Minimum Spanning Tree algorithms. Shortest path algorithms. Graph partitioning algorithms.
3. Data structures: the heap (the usual one, and variants such as binomial and Fibonacci heaps), tree structures (BST, red-black, AVL, B+, BSP, quad-, oct-, etc.), disjoint sets, hash tables.
4. I don't wanna admit it, but perhaps back propagation algorithm for ANN
5. LR(1) parsing algorithms, without which you can't write decent compilers!
6. Round-Robin scheduling algorithm. Where're you heading without pre-emptive multitasking?
7. Median-of-medians. Power of recursion!
8. LCS (Longest common subsequence), demonstrative of dynamic programming. And you used some diff program surely...
9. RSA and similar public key crypto-algos.
Hope you like this slightly alternative supplement
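Since LCS (#8 in the list above) is the poster child for dynamic programming, here's the standard O(mn) table plus backtracking in Python -- essentially what diff builds on, though real diffs use fancier variants:

```python
def lcs(a, b):
    """One longest common subsequence of strings a and b."""
    m, n = len(a), len(b)
    # dp[i][j] = LCS length of a[:i] and b[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m):
        for j in range(n):
            dp[i + 1][j + 1] = (dp[i][j] + 1 if a[i] == b[j]
                                else max(dp[i][j + 1], dp[i + 1][j]))
    # Backtrack through the table to recover one witness.
    out, i, j = [], m, n
    while i and j:
        if a[i - 1] == b[j - 1]:
            out.append(a[i - 1])
            i -= 1
            j -= 1
        elif dp[i - 1][j] >= dp[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return ''.join(reversed(out))
```

The table needs only the previous row if you just want the length, which is how memory-conscious diff implementations do it.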
Clearly there is room for improvement (Score:2)
Moreover, didn't all the "Top Ten" lists from other fields come and go last December?
So how good can the best computer algorithms be, when it still takes them 6 months to generate a "Top Ten" list, and the list is incorrect?
INFLUENTIAL SCIENCE and ENGINEERING algorithms (Score:2)
Also note the GREATEST INFLUENCE part; recent algorithms or clever algorithms don't necessarily make the cut (because influential algorithms usually need time to build up influence, and sometimes the most influential ones are the simplest and not always the most clever).
BSPing and Z-buffering not on the list. (Score:2)
Without them, Quake is impossible. :)
Re:Ridiculous (Score:2)
No, it isn't. Huffman generates optimal codes. Shannon-Fano does not. Yes, Huffman is essentially a "tweak" of Shannon-Fano (the code generation is done in the reverse order), but it's a very powerful tweak that turns a "pretty good" algorithm into an "optimal" one. (Given certain constraints -- I know arithmetic encoding is better in many ways.)
Bresenham's line drawing algorithm is nothing but solving the equation of a line, only with repeated subtractions instead of a division. (Faster on processors with slow floating point.) Including this would be ridiculous.
All algorithms are nothing but a bunch of simple computations performed to achieve some bigger result. Bresenham's algorithms have become "staples" of computer graphics, though.
Incidentally, Bresenham's algorithm is useful even on machines whose floating point is fast. Bresenham produces "nice looking" lines. Simple linear interpolation produces ugly lines. If you don't know what I mean, try drawing some diagonal lines in GIMP using the pencil and a 1x1 brush. See how ugly they are? That's because GIMP uses linear interpolation rather than Bresenham. I hate to say it, but even MS Paint produces nicer looking lines, simply because it uses Bresenham. (And if you think how it looks isn't important... sorry, but that's the way it is in graphics...)
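For reference, the whole integer-only trick fits in a dozen lines. Here's a Python version of Bresenham's line algorithm handling all octants (the error-variable formulation, which is one common way to write it):

```python
def bresenham(x0, y0, x1, y1):
    """Integer-only rasterization of the line from (x0,y0) to (x1,y1)."""
    dx, dy = abs(x1 - x0), abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx - dy                    # scaled distance from the ideal line
    points = []
    while True:
        points.append((x0, y0))
        if (x0, y0) == (x1, y1):
            break
        e2 = 2 * err
        if e2 > -dy:                 # stepping in x keeps us closest
            err -= dy
            x0 += sx
        if e2 < dx:                  # stepping in y keeps us closest
            err += dx
            y0 += sy
    return points
```

The error variable tracks how far the drawn pixel has drifted from the true line, so the right axis gets stepped each time with no floats or divisions anywhere.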
I think you mean Binary Splay Partition algorithms, which are indeed in widespread use. But in such a list it would be better to include some of the real raytracing algorithms.
No, I meant Binary Space Partitioning trees aka BSP trees, which are used for all sorts of things in graphics including polyhedral rendering systems, raytracing optimization, polyhedral set operations, and collision detection. Yes, there are other techniques for achieving all of these, but BSP trees show amazing versatility.
Incidentally, I've never heard of "Binary Splay Partition algorithms". Are you talking about binary splay trees?
KMP search is hardly in use except in very special cases. There are more efficient algorithms around.
Yes, I made a mistake. KMP and Boyer-Moore are very similar (in much the same way that Shannon-Fano and Huffman are similar, actually...), but Boyer-Moore is clearly the superior algorithm. I think Boyer-Moore and KMP have become intertwined in my head, because I learned both of them at the same time. (Perhaps also because "the BM algorithm" has nasty connotations...)
Okay, strike my vote for KMP, and add one more vote for Boyer-Moore...
In any case, the list I posted was not intended to be "the 9 best algorithms of the 20th century". It was more like "the 9 best non-numerical algorithms that came to me as I was posting to Slashdot". So take it with a grain of salt. I know I missed some great ones in there... I didn't have any tree balancing algorithms in the list, for example. Skip-lists are also pretty cool, and probably deserve a mention, despite the fact that almost no one seems to use them. Systems algorithms like process scheduling algorithms, caching algorithms, and even the elevator algorithm (used for scheduling requests to "seekable" devices like hard drives) probably deserve a mention too. And how about database algorithms, like the various join algorithms? Or the numerous AI algorithms? A "top 10" list is really too short. We really need a "top 100" list...
FORTRAN compiler?! (Score:2)
*sigh* I've come up with a reason why. It's because it's machine dependent: an i386 Fortran compiler and a PDP-11 Fortran compiler are completely different.
Re:Now that's interesting ... (Score:4)
Why are you surprised that the list is "heavily biased towards scientific and numerical applications"? The list comprises the algorithms that they believe have had 'the greatest influence on the development and practice of science and engineering in the 20th century.' I would _expect_ this list to be almost exclusively biased towards "scientific and numerical applications."
I'm not surprised that the Fortran compiler is included. Like it or not, Fortran is the grand-daddy of high-level languages for science and numerical apps.
Re:Not quite true anymore (Score:2)
Dijkstra shortest-path algorithm? (Score:2)
Translation (Score:3)
It'd be interesting to see how these relate to the algorithms that computer jockeys see as king (obviously, the list is biased towards science and numbers). Are any of these algorithms related to things like compression algorithms (MP3, Zip, LZW, etc.), encryption algorithms (DES, RSA, Blowfish, etc.) or search algorithms (like the one Google uses, for instance)?
What would a computer geek's top ten algorithms list look like? Would they be scientifically/technologically important or more socially important (like MP3, RSA, or Google)?
The Decompositional Approach to Matrix Computation (Score:4)
This algorithm decreases the amount of time needed to multiply two matrices from O(N^3) to around O(N^2.81) (or less, with fancier variants) by breaking each matrix up into little pieces that are multiplied by each other (or broken into smaller pieces with the same process) and then added together to get the result.
The speedup comes from replacing the eight block multiplications of the naive scheme with seven, at the cost of some extra additions; since additions are much faster for a computer to complete than (recursive) multiplications, this is a net win.
The other benefit of this algorithm is that it divides the work into subproblems that can be dispatched to alternate processors and then recombined centrally.
Unfortunately, due to the large overhead costs of this method it only works well on very large data sets, such as 1000x1000 or larger arrays. Once you get below this size it is best to use the standard matrix multiplication.
Here is a section of code from a matrix multiplication algorithm using this method, which I wrote for an algorithm analysis class:
if (this.length == 2) /* 2x2 base case: the seven Strassen products */
{
    int x1, x2, x3, x4, x5, x6, x7;
    int x1a, x1b, x6a, x6b, x7a, x7b, x2a, x3b, x4b, x5a;
    x1a = a11.add(a22).getElement(0,0);
    x1b = b11.add(b22).getElement(0,0);
    x1 = x1a * x1b;                          /* (A11+A22)(B11+B22) */
    x2a = a21.add(a22).getElement(0,0);
    x2 = x2a * b11.getElement(0,0);          /* (A21+A22)B11 */
    x3b = b12.sub(b22).getElement(0,0);
    x3 = a11.getElement(0,0) * x3b;          /* A11(B12-B22) */
    x4b = b21.sub(b11).getElement(0,0);
    x4 = a22.getElement(0,0) * x4b;          /* A22(B21-B11) */
    x5a = a11.add(a12).getElement(0,0);
    x5 = x5a * b22.getElement(0,0);          /* (A11+A12)B22 */
    x6a = a21.sub(a11).getElement(0,0);
    x6b = b11.add(b12).getElement(0,0);
    x6 = x6a * x6b;                          /* (A21-A11)(B11+B12) */
    x7a = a12.sub(a22).getElement(0,0);
    x7b = b21.add(b22).getElement(0,0);
    x7 = x7a * x7b;                          /* (A12-A22)(B21+B22) */
    /* recombine the seven products into the four quadrants of C */
    c11 = new Matrix(1);
    c12 = new Matrix(1);
    c21 = new Matrix(1);
    c22 = new Matrix(1);
    c11.setElement(0, 0, x1 + x4 - x5 + x7);
    c12.setElement(0, 0, x3 + x5);
    c21.setElement(0, 0, x2 + x4);
    c22.setElement(0, 0, x1 + x3 - x2 + x6);
}
--
Eric is chiseled like a Greek Goddess
Re:Translation (Score:2)
Interestingly enough, the so-called "butterfly" factorization of the transform matrix which is at the heart of the FFT was discovered by Gauss, but never used in the context of Fourier analysis. It was rediscovered independently.
--
Re:Most commercially valuable algorithm (Score:3)
void useProduct(int *productLeft) {
    if (!*productLeft) {
        buyMoreProduct(productLeft);
        return;
    }
    lather();
    rinse();
    useProduct(productLeft); /* repeat */
}
(c) 2000 AP, Protected Under US Patent#: (patent pending)
What? No Bresenham's Line or Circle Algorithm? (Score:2)
To make such a list, an algorithm must be elegant to implement and efficient to run on a computer.
But beyond that, there are algorithms that "mediate" between our world and the world of the computer. Chief among these is the most fundamental problem of Computer Graphics: What is the best way to draw a line or a curve onto a rectilinear array of pixels?
The problem is VERY non-trivial! "Good" lines must look the same if drawn from A to B as they do when drawn from B to A. Many early line drawing programs failed miserably at this task, forcing applications to sort the line endpoints before calling the drawing routine.
Bresenham's Line Algorithm was the first to meet these constraints (when properly implemented, of course) in a manner that was also EXTREMELY
computationally efficient. Not only did this algorithm give you "good" lines, it also gave you "fast" lines!
The problem gets worse with curves: In addition to directional symmetry, they must also exhibit reflective (not rotational) symmetry. That is, an arc drawn from 1 o'clock to 3 o'clock must be identical to all of its mirror reflections: The horizontal reflection from 9 o'clock to 11 o'clock, the vertical reflection from 3 o'clock to 5 o'clock, and the combined reflection from 7 o'clock to 9 o'clock. Deviations from such symmetry are extremely noticeable to the human eye.
For the generalized ellipse (which trivially includes the circle), an algorithm that meets the above requirements will have an extremely valuable
secondary effect: The algorithm need only be used, at most, to draw just one quadrant of any closed ellipse, since the other three quadrants can be drawn by reflecting the original arc. For a circle, only a single OCTANT (1/8 of a circle) is needed!
While other circle, arc and ellipse algorithms preceded Bresenham's, again none was simpler, more accurate or faster.
The underlying problem has to do with errors: when is it appropriate to turn on this pixel instead of that one? The "easy" approach is to work in floating point, with fractional pixels, then round to decide whether a specific pixel is on or off. And that, in turn, would seem to demand the use of trigonometry in the case of a circle! Clearly, doing floating point and trig just to draw a line or circle is impractical, yet that is exactly what many of the earliest hardware accelerators did!
Bresenham's algorithms managed to draw accurate and precise lines and circular arcs, accurate to within half a pixel, without floating point math.
Now that's what I call one hell of an algorithm.
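The integer error-term idea can be sketched in a few lines of C. This is the common all-octant variant with illustrative names (draw_line, plot are mine); as the post notes, full endpoint symmetry takes additional care beyond this sketch:

```c
#include <stdlib.h>

/* Minimal sketch of Bresenham-style line drawing, integer arithmetic only.
   plot() is a caller-supplied pixel callback. A single signed error term
   (err) tracks how far the ideal line has drifted from the pixel grid;
   each step compares the doubled error against dx and dy to decide
   whether to advance in x, in y, or both. */
void draw_line(int x0, int y0, int x1, int y1, void (*plot)(int, int)) {
    int dx = abs(x1 - x0), sx = x0 < x1 ? 1 : -1;
    int dy = -abs(y1 - y0), sy = y0 < y1 ? 1 : -1;
    int err = dx + dy;
    for (;;) {
        plot(x0, y0);
        if (x0 == x1 && y0 == y1)
            break;
        int e2 = 2 * err;
        if (e2 >= dy) { err += dy; x0 += sx; } /* step in x */
        if (e2 <= dx) { err += dx; y0 += sy; } /* step in y */
    }
}
```

Note that the loop body uses only comparisons, additions, and a doubling; there is no division or floating point anywhere.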
More complex curves, including arbitrary curves, lacked an "adequate" solution until the advent of the B-spline and NURBS techniques. While extremely efficient, they resemble elephants compared to the cheetah of Bresenham's.
And, of course, Bresenham's algorithms don't generalize well to meet the contemporary need for things such as alpha blending and dithering in the
color space. But the best of such algorithms still have a bit of Bresenham at their core.
A very good summary of the derivation of Bresenham's Line Algorithm may be found here [ohio-state.edu].
Bresenham's Algorithms (along with many other graphic algorithms such as flood and fill, and "XOR" cursor drawing) are fundamentally what made
"Personal Computing", that is, interactive computing with a GUI, possible with inexpensive hardware.
Bresenham's Algorithms (especially the Line Algorithm) are a glaring omission from the list.
There aren't (yet)... (Score:2)
Data compression and encryption (Score:4)
Re:That warm, daydreaming feeling (Score:2)
Thank you very much for the link!
Re:Sentimenal Favourite........ (Score:3)
Bubble sort beats a general-purpose sort on a mostly-sorted list, so if you care about performance on such input, it can outperform quicksort, mergesort, or radix sort. Actually, though, insertion sort is better again in this kind of situation, so use it instead of bubble sort.
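A minimal C sketch of insertion sort (illustrative, not from the post) shows why it wins on nearly-sorted input: the total work is O(n + d), where d is the number of out-of-order pairs (inversions), so an almost-sorted array costs almost nothing.

```c
#include <stddef.h>

/* Insertion sort: each element is shifted left past larger neighbors.
   On a mostly-sorted array most elements shift zero or one place,
   so the total work is O(n + d), where d = number of inversions. */
void insertion_sort(int *a, size_t n) {
    for (size_t i = 1; i < n; i++) {
        int key = a[i];
        size_t j = i;
        while (j > 0 && a[j - 1] > key) {
            a[j] = a[j - 1];
            j--;
        }
        a[j] = key;
    }
}
```

Bubble sort has the same O(n + d)-ish behavior on nearly-sorted data but does more comparisons per pass, which is why insertion sort is the usual choice.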
Strong Numerical computation bias... (Score:5)
And the Fortran optimization one is... well, that's not really an "algorithm", now is it? It's a whole bunch of algorithms, and presumably each Fortran compiler does different things to optimize. Again, the numerical/scientific bias is obvious though... who else would use Fortran today?
Some algorithms (and data structures) I'd nominate are (note that I'm purposely avoiding numerical algorithms here, because the article aleady lists more than enough good ones):
Sorting out sorting (Score:5)
Oh, for those who don't know: It's the winning entry in a competition held at IBM. The competition was about the slowest sorting algorithm.
--The knowledge that you are an idiot is what distinguishes you from one.
Re:Translation (Score:4)
Why is this important? Convolution, the primary operation involved in computing all kinds of neat filters, takes O(n^2) time in the time domain but only O(n) time in the frequency domain (i.e., it's a lot faster to perform image filtering in Fourier space). The problem is that the naive Fourier transform takes O(n^2) time, so you can't get into and out of Fourier space cheaply enough to justify performing the filtering there.
With the FFT, filtering can be performed in O(n*lg n) time, since you can hop into Fourier space quickly, perform the filter, and hop back (the constants work out to this being useful after n ~= 100,000, which is a reasonable point for audio and most images).
The FFT is a great example of an algorithm that suddenly tipped the asymptotic scales and revolutionized an entire discipline.
-m