
Astonishing Speedup In Solving Linear SDD Systems 157

eldavojohn writes "A new paper (PDF) out of Carnegie Mellon University shows how to solve symmetric diagonally dominant linear systems much faster than before. The technique employs graph theory, randomized algorithms, and linear algebra to achieve an astonishing advantage over current methods to solve such systems. From the article: 'The result is a significant decrease in computer run times. The Gaussian elimination algorithm runs in time proportional to s^3, where s is the size of the SDD system as measured by the number of terms in the system, even when s is not much bigger than the number of variables. The new algorithm, by comparison, has a run time of s*[log(s)]^2. That means, if s = 1 million, that the new algorithm run time would be about a billion times faster than Gaussian elimination.' Developers out there who maintain matrix packages and linear algebra tools might want to take a peek at the paper. Anyone who has modeled real-world systems will be able to tell you that speedups in linear algebra algorithms have a weighty effect on efficiency in simulations — especially as one approaches the theoretical limits of optimality. This research is currently being presented at the IEEE Symposium on Foundations of Computer Science."
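The quoted run times can be sanity-checked in a few lines of Python. This is only an order-of-magnitude sketch of the asymptotic counts in the summary; constant factors and the log base are ignored, so the exact ratio should be taken loosely:

```python
import math

def gaussian_ops(s):
    # classical Gaussian elimination: proportional to s^3
    return s ** 3

def new_algorithm_ops(s):
    # the new SDD solver's bound: proportional to s * log(s)^2
    return s * math.log(s) ** 2

s = 10 ** 6
speedup = gaussian_ops(s) / new_algorithm_ops(s)
print(f"estimated speedup at s = {s:,}: {speedup:.2e}")
```

With the natural log the ratio comes out around 5e9; with log base 2 it is closer to 2.5e9, either way consistent with the article's rough "about a billion times faster."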
  • by azaris ( 699901 ) on Friday October 22, 2010 @10:13AM (#33984906) Journal

    Multigrid [wikipedia.org] is theoretically O(s), so I don't immediately see how this is such a huge leap. Of course the actual complexity also depends on the problem and the implementation. Maybe their method is applicable to a wider variety of problems.

    Also, the "iterated sparsifying" sounds a lot like algebraic multigrid.

  • by HighFlyer ( 60002 ) on Friday October 22, 2010 @10:24AM (#33985000) Homepage

    I had the same thought, but I guess multigrid is limited to specific matrices while the new algorithm seems to be more generic. Not sure though; my last contact with multigrid was 14 years ago (when algebraic multigrid was a hot topic in science).

  • by nten ( 709128 ) on Friday October 22, 2010 @10:40AM (#33985130)

    If I had had Gilbert Strang as an instructor for linear algebra instead of who I did have, maybe I would understand what the article is talking about. Having watched those videos I repeatedly said to myself, "oh, that's what we were doing!" Are covariance matrices SDD? If so, could this be used to speed up principal component analysis?
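On the covariance question: a covariance matrix is always symmetric (and positive semidefinite), but it is not diagonally dominant in general, so an SDD solver would not automatically apply to PCA. A quick numerical check with made-up, strongly correlated data (NumPy) shows dominance failing:

```python
import numpy as np

rng = np.random.default_rng(0)
# four features that are all nearly copies of one latent signal,
# so off-diagonal covariances are almost as large as the variances
base = rng.normal(size=(500, 1))
data = np.hstack([base + 0.1 * rng.normal(size=(500, 1)) for _ in range(4)])
C = np.cov(data, rowvar=False)

assert np.allclose(C, C.T)  # covariance matrices are always symmetric
off = np.sum(np.abs(C), axis=1) - np.abs(np.diag(C))
print(np.all(np.diag(C) >= off))  # False here: not diagonally dominant
```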

  • by TrekkieGod ( 627867 ) on Friday October 22, 2010 @10:45AM (#33985164) Homepage Journal

    Multigrid [wikipedia.org] is theoretically O(s), so I don't immediately see how this is such a huge leap. Of course the actual complexity also depends on the problem and the implementation. Maybe their method is applicable to a wider variety of problems.

    Also, the "iterated sparsifying" sounds a lot like algebraic multigrid.

    I think the wider variety of problems is exactly the major advantage. A symmetric diagonally dominant matrix is not necessarily sparse, but it does come up a lot. I know from experience that nodal-analysis based circuit simulators will end up like that most of the time, since each matrix cell represents a node connection, and when you have a circuit element between node i and j, it's obviously also between j and i (exceptions made when modeling things like transistors, because you need to handle the gate differently). The diagonal value is the sum of the magnitudes of the off-diagonal entries in that row plus whatever's connected to ground, so it's always dominant.

    That said, circuit simulators also tend to come up with sparse matrices most of the time, since you rarely have every node being connected to every other node...
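The nodal-analysis structure described above can be sketched in a few lines; the circuit here (three nodes, hypothetical conductance values) is made up purely for illustration:

```python
import numpy as np

n = 3
# branches between node pairs: (i, j, g) with conductance g = 1/R
branches = [(0, 1, 2.0), (1, 2, 3.0), (0, 2, 1.0)]
ground = [0.5, 0.0, 4.0]  # conductance from each node to ground

G = np.zeros((n, n))
for i, j, g in branches:
    # a branch between i and j contributes symmetrically
    G[i, i] += g
    G[j, j] += g
    G[i, j] -= g
    G[j, i] -= g
for i, g in enumerate(ground):
    G[i, i] += g  # ground connections only add to the diagonal

assert np.allclose(G, G.T)  # symmetric by construction
off = np.sum(np.abs(G), axis=1) - np.abs(np.diag(G))
assert np.all(np.diag(G) >= off)  # diagonally dominant
```

Note that without any ground connection a row is only weakly dominant (diagonal equals the off-diagonal sum), which is exactly the Laplacian-like structure the paper's solver targets.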

  • Re:Grain of salt (Score:3, Interesting)

    by ObsessiveMathsFreak ( 773371 ) <obsessivemathsfreak.eircom@net> on Friday October 22, 2010 @12:24PM (#33986384) Homepage Journal

    For example, if your matrix has the sparsity pattern of a planar graph, it has been known for 30 years that you can do Gaussian elimination in O(n^1.5).

    This algorithm is supposed to run in O(n log(n)^2) time, which is actually better than O(n^1.5). Not a lot better, mind you, but if you make n very large and don't have a lot of overhead, you should see improvements.
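For a concrete feel of the gap between the two bounds (constants ignored, natural log assumed):

```python
import math

n = 10 ** 6
planar_ge = n ** 1.5                # nested-dissection elimination, O(n^1.5)
sdd_solver = n * math.log(n) ** 2   # the new bound, O(n log^2 n)
print(planar_ge / sdd_solver)       # roughly a factor of 5 at this size
```

So at a million unknowns the asymptotic win over the planar-graph special case is modest, in line with the comment above; the gap widens slowly as n grows.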

  • by sirrunsalot ( 1575073 ) on Friday October 22, 2010 @01:48PM (#33987766)

    Or in limerick form, as published in a paper by Roe, LeVeque, and van Leer [freeshell.org],

    So medium-term weather prediction
    turns out to be merely a fiction.
    It's just anyone's guess
    if you ask how to dress
    it will offer no useful restrictions.

"When the going gets tough, the tough get empirical." -- Jon Carroll

Working...