
gridMathematica Announced

simpl3x writes "Mathematica for grids was announced at Comdex. It offers support for the usual platforms--Windows, OS X, Linux, and Unix--and can run across heterogeneous OSes. I haven't used the product in years, but it's cool nonetheless. Does an off-the-shelf software package that scales like this provide competition to custom packages--is it easier to add machines than to develop custom programs?" And just when you thought Comdex was good and dead.
  • Yes, but... (Score:2, Funny)

    by ComaVN ( 325750 )
    Can it be used to emulate the universe using a cellular automaton?
  • by BadDoggie ( 145310 ) on Thursday November 21, 2002 @07:26AM (#4721864) Homepage Journal
    According to the Features [wolfram.com] section, it includes:
    • Support for multiprocessor machines, heterogeneous networks, LAN, and WAN
    • Support for scheduling of virtual processes or explicit distribution to available processors
    • Support for virtual shared memory
    • Support for synchronization, locking, and latency hiding
    It looks like they took a few pages from the various distributed projects (United Devices [ud.com], distributed.net [distributed.net]). I can see this being used by universities and some research and scientific institutions, but still, with the processor power available today, even Mathematica representations of 10-dimensional Calabi-Yau spaces [wolfram.com] can be rendered in minutes.

    woof.

    • by orcaaa ( 573643 ) on Thursday November 21, 2002 @07:44AM (#4721906)
      Actually, Mathematica in GUI mode has a lot of processing overhead, and performing complex calculations can take a while. I use Mathematica regularly, and a class assignment took me 6 hours to "compile" in Mathematica. (It was about modelling proteins and showing how they evolve over time.) If it indeed takes that long, then this kind of grid computing can be a boon. I can see this being used a lot at universities, where there are usually large numbers of computers sitting idle.
      • by f00zbll ( 526151 ) on Thursday November 21, 2002 @07:58AM (#4721938)
        Back in 95-96, these types of calculations for large complex proteins were typically sent to UCSD. In fact, most of the grad students working with these types of calculations sent their jobs to the UCSD supercomputing facility (if they were lucky). Some of the ones that I've seen took a week to finish. Not that I understood what I was looking at, but gridMathematica should make it easier for grad students to use resources on their own campus. Getting time or slots at UCSD's supercomputing facility took some serious butt kissing and wrangling. At least that's what I saw firsthand. It may have changed since the '90s.
      • by mbourgon ( 186257 ) on Thursday November 21, 2002 @08:38AM (#4722113) Homepage
        At Texas A&M, one of the local CS students was doing his thesis on some sort of large math problem in HRBB. He had two choices for writing the code to provide the solution. He could either write it in Fortran and use the Cray Y-MP we had (which, 10 years ago, was a fairly big deal), or he could use a high-level language and use Zilla and our NeXT lab to solve it. He chose the second.

        Amazing to see - you'd tell the Cubes to run Zilla in the background, feed it the problem, and away you go. We had 6 computers in the lab, and you could tell he was working on it when you first logged in - it would be a bit sluggish. IIRC, he later took over the NeXT lab downstairs, which had 30 NeXT "pizza boxes". Not bad, especially for 1991-1992.

        And this paper [mathsource.com] says:
        Parallelization: with a NeXTstep application called Zilla, multiple Mathematica sessions may be invoked on networks of NeXT computers to allow the simultaneous solution of different parts of a large problem.

        BTW: anyone happen to know if they're doing Zilla on OS X? I remember reading something about an easy way to cluster Macs for performance, but I forget the details. It just involved running a client on each workstation.
        • I don't know about Zilla, but as for an easy way to cluster Macs, check out Appleseed http://exodus.physics.ucla.edu/appleseed/appleseed.html
        • Distributed and grid computing is older than that. PVM itself, for example, came out in 1989, and people were using networks of workstations for large computations before that.

          Keep in mind that scientific computing, workstations, object-oriented programming, GUIs, GUI designers, and similar ideas predate NeXT and Apple by a decade, in some cases, many decades. While Jobs did an excellent job selecting and integrating good, pre-existing technologies into the NeXT, I can't think of a single case where they developed something really new (but if you can, please share it).
  • by f00zbll ( 526151 ) on Thursday November 21, 2002 @07:27AM (#4721870)
    I thought research labs already have their own cluster and grid systems? This should mean some small junior college or state college without tons of government research grants may be able to even the playing field. With the reduction in cost, it becomes easier for smaller research labs and schools to build grids. I remember helping graduate students prep jobs so that they could be sent to UCSD's supercomputer. Now more universities will have their own systems and be able to utilize their computer labs as grids at night. At least in theory.
    • I thought research labs already have their own cluster, and grid systems?

      Many do, but they are sometimes hard to utilize fully. For instance, the lab I am at now has a good-sized cluster which in its first incarnation used dual-CPU nodes. It turned out that usually only a single CPU on each node was used, because the application that people were mostly running didn't support using more than one CPU.

      In this case, it has probably (I am not a user myself) been a headache running Mathematica scripts on multiple nodes. I guess it means writing small Mathematica scripts that are then distributed across the cluster, with a perl script or something collecting the results and merging them. Being able to write the whole logic in one environment must be a big step forward.
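      The split/distribute/collect workflow described above can be sketched in plain Python (a hypothetical stand-in for the Mathematica-scripts-plus-perl pipeline; the chunking scheme and function names are purely illustrative):

```python
from concurrent.futures import ProcessPoolExecutor

def run_chunk(bounds):
    """Stand-in for one small distributed script: compute a partial result."""
    lo, hi = bounds
    return sum(n * n for n in range(lo, hi))

def run_job():
    # Split one large job into independent chunks, one per node/CPU...
    chunks = [(i * 1000, (i + 1) * 1000) for i in range(8)]
    with ProcessPoolExecutor() as pool:
        partials = list(pool.map(run_chunk, chunks))
    # ...then the collector step merges the partial results.
    return sum(partials)

if __name__ == "__main__":
    print(run_job())
```

      On a real cluster the chunks would be shipped to remote kernels rather than local processes, but the structure (partition, scatter, gather, merge) is the same.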

      One of the cooler things here, though, is the heterogeneity. In the organisations I have been in, there has been enormous computing power in administrative PCs running Windows that no one has been able to really take advantage of, at least without a big effort. gridMathematica may actually provide an easy way of tapping into that resource.

    • by capt.Hij ( 318203 ) on Thursday November 21, 2002 @07:57AM (#4721934) Homepage Journal
      It is not uncommon to find a cluster of networked computers acting as a distributed system. They are usually implemented using things like MPI or other message passing protocols. The programs tend to be written "in house" and are generally written for specific applications.

      gridMathematica is different because it is a commercial application, can be used for general problems, and can be used for symbolic computations. This makes it much easier to implement an algorithm on a distributed platform. The results may take longer than with a program written specifically for a particular application, but distributed environments can be exploited by many more people with this product.
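      The "in house" message-passing style mentioned above (MPI and friends) boils down to a master scattering work messages to workers and gathering their replies. A minimal sketch of that pattern using Python's standard-library queues in place of a real message-passing library (all names here are illustrative):

```python
import multiprocessing as mp

def worker(tasks, results):
    # Receive messages until the shutdown sentinel (None) arrives.
    for x in iter(tasks.get, None):
        results.put((x, x * x))  # application-specific computation

def run(njobs=20, nworkers=4):
    tasks, results = mp.Queue(), mp.Queue()
    procs = [mp.Process(target=worker, args=(tasks, results))
             for _ in range(nworkers)]
    for p in procs:
        p.start()
    for x in range(njobs):        # scatter: one message per work item
        tasks.put(x)
    out = dict(results.get() for _ in range(njobs))  # gather the replies
    for _ in procs:               # one shutdown message per worker
        tasks.put(None)
    for p in procs:
        p.join()
    return out

if __name__ == "__main__":
    print(run())
```

      Real MPI programs use explicit sends and receives between ranks rather than shared queues, but the master/worker shape is the same.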
  • by wfmcwalter ( 124904 ) on Thursday November 21, 2002 @07:28AM (#4721873) Homepage
    Is there an open-source or free-software product in the Mathematica / Matlab / Maple etc. space ?

    How do the free solutions, if they exist, compare with their (darned expensive) commercial brethren in general, and in particular, is there anything like grid support?

    • not in the distributed / grid...
      Instead of Matlab there is Octave, which is quite good. You can pretty much use your Matlab code directly. Some of the toolboxes aren't there.
    • by starseeker ( 141897 ) on Thursday November 21, 2002 @07:53AM (#4721927) Homepage
      Probably the most advanced open source project competing with Mathematica et al. is Maxima [sf.net]. It's a spinoff of Macsyma, which was the first symbolic integrator. Originally developed at MIT, it's got a lot of features the other programs have, and a few they don't. It's got some bugs, but is under very active development.

      Its major weaknesses currently are in the GUI and documentation departments. TeXmacs can do a decent job of providing a nice interface, but it still won't measure up to Mathematica, which can handle 2D input and output. The default interface is a Tcl/Tk program, which is OK but pretty basic. My preferred way to use Maxima is through emacs--it has a very good emacs mode called emaxima.

      As far as grid support goes, I'm not aware of anything. The project isn't really at that stage--it's currently working towards a stable 6.0 release which fixes all known mathematical bugs. Then come feature extensions and new GUI work. That would probably be the point to start thinking about grid support--basically, someone would have to want it enough to implement it. The usual open source thing.
      • Sorry to reply to my own post, but one thing I should add--while I don't know quite how this relates to grid support per se, there is a Lisp interface to the PVM library at http://www.symbolicnet.org/ftpsoftware/cl-pvm/ which helps with getting Maxima to do this stuff.
        It is free for noncommercial use - I imagine you could contact the authors to discuss other uses.

        Part of the readme:

        The CL-PVM package consists of a set of Common Lisp functions that interface Common Lisp (KCL, AKCL, or GCL) to the C-based library of PVM. Generally, there is one CL interface function for each PVM C library function. The CL function calls its C-based counterpart and relays data to and from the C function. This interface is complete and allows Lisp-based programs to take part in a PVM arrangement and thus facilitates the combination of symbolic, numeric, graphics, and other useful systems in a distributed fashion.
        CL-PVM also offers a set of tools to help use it effectively with Lisp and MAXIMA tasks. Documentation, on-line manual pages, and examples are also included.
      • Funny story [google.com] about Macsyma and Wolfram. In the end, Lisp was vindicated - not even Wolfram can escape Greenspun's tenth rule [utexas.edu] =]. I also hear that there is still some code written by him distributed with Maxima.

        Another scientific computation package that looks promising is Lush [sourceforge.net]. It's a custom Lisp dialect with a compiler in the spirit of KCL/ECL in that it doesn't need an FFI to interface with C (I'm not sure if the compiler is native-code or a C translator, the sourceforge page is down). It comes with a lot of numerical and multimedia libraries, and from the looks of it is a pretty damn cool package. I haven't had time to check it out yet.

        Although I can't get it to compile, Codemist [ntlworld.com] seems to have released [ntlworld.com] the source to their Standard Lisp compiler and (some, all?) of their symbolic algebra system Reduce.

    • by Cyclops ( 1852 ) <rms @ 1 4 0 7 . org> on Thursday November 21, 2002 @07:53AM (#4721928) Homepage
      Hi, you can always check out Octave [octave.org], which provides a convenient command line interface for solving linear and nonlinear problems numerically, and for performing other numerical experiments using a language that is mostly compatible with Matlab.
    • by bezza ( 590194 ) on Thursday November 21, 2002 @08:08AM (#4721965)
      There is no reason why a Matlab clone couldn't be made (it is just an interpreter with some built-in numerical functions that have already been developed). Anyone can write functions to evaluate expressions numerically.

      Maple, on the other hand, is the most amazing piece of software I have ever used because of its ability to deal with variables etc. exactly like a human can. I am studying for my finals right now and I use it to do some of the more tedious work so my study is more efficient. Calculating the exponential of a matrix is tedious at best, but Maple does it with ease. I don't believe a product like this could be made in the open-source environment... a massive amount of research would have to be undertaken, and this would require a heap of money, as no methodology could be taken from the Maple team itself.
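      For illustration, here is what "calculating the exponential of a matrix" involves if you do it by hand with a truncated power series--a toy sketch in plain Python; real CAS implementations use more robust algorithms such as scaling-and-squaring with Padé approximants:

```python
import math

def mat_mul(a, b):
    """Multiply two square matrices given as lists of rows."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_exp(a, terms=20):
    """exp(A) ~= I + A + A^2/2! + ... + A^(terms-1)/(terms-1)!"""
    n = len(a)
    result = [[float(i == j) for j in range(n)] for i in range(n)]  # I
    term = [row[:] for row in result]                               # A^0/0!
    for k in range(1, terms):
        term = mat_mul(term, a)                                     # numerator: A^k
        term = [[x / k for x in row] for row in term]               # divide by k!
        result = [[r + t for r, t in zip(rr, tt)]
                  for rr, tt in zip(result, term)]
    return result

# For a diagonal matrix the answer is easy to check: exp(diag(1, 2)) = diag(e, e^2)
expm = mat_exp([[1.0, 0.0], [0.0, 2.0]])
```

      The truncated series converges quickly for small matrices like this one, which is exactly the kind of tedium a CAS hides from you.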

      I am not trolling but the open-source community is much better at creating a (usually better) alternative to existing software with an obvious algorithm or method rather than investing money into computing theory like a clone of Maple would need.

      A similar example would be linux desktops...take KDE or Gnome...great desktops, but most ideas have been taken from either Windows or the Mac OS'es, who have sunk millions of dollars into market research.

      As I said, I am not trolling, and am open to be proven wrong. Prior similar examples etc.?

      • Maple on the other hand is the most amazing piece of software I have ever used because of its ability to deal with variables etc exactly like a human can.

        There are several open source symbolic math packages. Check out Scilab [scilab.org] for instance.

      • Gnu Octave [octave.org]. Easy as 'apt-get install octave'.
      • So you think that people working in their spare time can create a decent Matlab clone. Do you even know what Matlab is, or what it is truly capable of?

        There's a reason why Matlab costs a fortune. Can Octave simulate a complete end-to-end 802.11b system? Can it simulate an automotive drivetrain, or better yet make changes to a dSPACE ECU development module on the fly?

        With OSS, you get whatever some group of hippies decided to put down on paper. With commercial software, you get quality engineering software.
          Yes, I do know what Matlab is capable of. The toolboxes that come with the full package are amazing, but they are mostly based on prior mathematical knowledge that is evaluated numerically by the computer. In that respect, only mathematical knowledge is needed to code... no new computing theory has to be made up.

          This is how Matlab is different from Maple.

          • As an aside, did you know that Matlab 6.5 contains a Maple 5 package? Matlab isn't going symbolic, but it is there for you to use.

            I agree with your point about most of Matlab being based on prior mathematical knowledge. But how can you say there is incentive to develop something like Matlab for free? People are used to the exchange of money for goods and services of value. No money exchanged therefore means your "purchase" has no value. It must have no monetary value because no one will buy it.

            You can state the ideals of free software all you want, but at the end of the day someone wants to be paid to do the job.
    • sure: plenty (Score:5, Informative)

      by g4dget ( 579145 ) on Thursday November 21, 2002 @08:22AM (#4722018)
      For interactive symbolic manipulation, Maxima [utexas.edu] is an excellent open-source alternative. For numerical applications, Numerical Python [sourceforge.net] and its associated packages beat both Matlab and Mathematica in my opinion. For 3D visualization, you can get VTK [kitware.com], which also has Python bindings.

      Maxima is also used occasionally as a rapid prototyping language, but it's proprietary and it has a lot of rough edges. You are probably better off using one of a number of open languages with similar features, like Scheme [schemers.org], OCAML [ocaml.org], SML [bell-labs.com], Prolog [inria.fr], or Haskell [haskell.org].

      Don't forget about C++, however. In many ways, C++ nowadays allows you to write numerical code more naturally than any of these other languages (yes, better than Matlab and Mathematica), it has by far the best libraries available for it, and it gives you excellent performance. And you can even do symbolic mathematics in C++, with the right libraries (though it's not interactive, of course).
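      The claim above--that symbolic mathematics can be done in a general-purpose language with the right libraries--is easy to illustrate. A toy symbolic differentiator (sketched in Python rather than C++ for brevity; the tuple-based expression encoding is invented for this example):

```python
# Expressions as nested tuples: ('+', a, b), ('*', a, b), the symbol 'x', or a number.
def diff(e):
    """Symbolically differentiate expression e with respect to 'x'."""
    if e == 'x':
        return 1
    if isinstance(e, (int, float)):
        return 0
    op, a, b = e
    if op == '+':
        return ('+', diff(a), diff(b))        # sum rule
    if op == '*':
        return ('+', ('*', diff(a), b), ('*', a, diff(b)))  # product rule
    raise ValueError(op)

def evaluate(e, x):
    """Numerically evaluate expression e at a given value of x."""
    if e == 'x':
        return x
    if isinstance(e, (int, float)):
        return e
    op, a, b = e
    return evaluate(a, x) + evaluate(b, x) if op == '+' else evaluate(a, x) * evaluate(b, x)

# d/dx (x*x + 3) = 2x; checked numerically below at x = 5
expr = ('+', ('*', 'x', 'x'), 3)
```

      A real system adds simplification, more operators, and pattern matching on top of this core idea, but the derivative itself is just structural recursion over the expression tree.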
      • Maxima is also used occasionally as a rapid prototyping language, but it's proprietary and it has a lot of rough edges.

        Oops--I meant:

        Mathematica is also used occasionally as a rapid prototyping language, but it's proprietary and it has a lot of rough edges.

        Maxima is non-proprietary, but you probably wouldn't want to use it for programming either... you can drop into CommonLisp for programming when you have to in Maxima.

    • How do the free solutions, if they exist, compare with their (darned expensive) commercial brethren in general

      Anyone know how expensive "darned expensive" is in this case? Presumably it depends on the number of nodes you license, but I didn't see a price mentioned on the Purchasing page. I guess "if you have to ask..."

      • A full single-user licence for Maple in the UK costs well over a thousand pounds - getting on for two grand, IIRC. They've done the age-old "sneaky bastards" trick of swapping the £ with $, so it's cheaper in the US. Mathematica is similarly priced.

        Heck, the Student Version alone costs £130. These things, however, aren't just programs, they're ways of life. Being able to use the likes of Maple and Mathematica to investigate how mathematics works is truly awe-inspiring. Fuck it, Maple's going to get me my degree. Worth every single penny.
    • The most impressive open-source computer algebra system is Axiom, which has a homepage here [earthlink.net].


      The compiler Aldor [aldor.org] for Axiom is available for download. Axiom was developed at a number of places, including IBM, and it being released as open-source is something that has only been finalized in the last couple of months, so the distribution is just beginning to be more widely available.

      • I went to the page, and I didn't find any source. Reading the license, it looks like "yet another open source license". I am inclined to mistrust, especially with only a binary-only package available for my Linux platform.

        I teach mathematics at the college level. I'd love to see good free software tools (I use Octave, for example). What is to make me want to contribute to this project?

        For the moment, I am definitely not investing time into this. We'll see when there is actually source available.
    • I haven't seen this suggested here yet, but R for statistical computing (link [r-project.org]) (GNU 'S') is not only open-source but also used a lot in several scientific fields, such as statistics and machine learning (books have accompanying source code in R). It has loads of packages [r-project.org] which allow you to do all kinds of stuff.
    • OSS Matlab? Yup ... (Score:2, Informative)

      by fygment ( 444210 )
      ... besides Octave you might try the Euro-centric Scilab (http://www-rocq.inria.fr/scilab/). It is very close to Matlab in abilities, already has provision for parallel processing (via PVM), and has a _very_ supportive user group.

      Honestly though, I've tried just about everything that's out there in OSS. You can cobble together things with C++ libraries, Python/NumPy, etc. But you pay big bucks because the commercial software brings it together seamlessly and, generally, mindlessly.

      Case in point: I got some biochem students running FFTs and principal component analysis with graphical output in less than half an hour (using Matlab). I didn't have to explain about syntax, wrappers, bindings, etc.--just fft(...) and plot(...). That's worth $1000 because it got them interested. When their problems get more complex than Matlab can handle simply, they'll _want_ to learn the other stuff.
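      What that fft(...) call is doing under the hood can be sketched in a few lines of plain Python--a naive O(n^2) discrete Fourier transform (real FFT libraries use the O(n log n) Cooley-Tukey algorithm, of course; the signal here is invented for the example):

```python
import cmath, math

def dft(xs):
    """Naive discrete Fourier transform: X[k] = sum_i x[i] * e^(-2*pi*j*k*i/n)."""
    n = len(xs)
    return [sum(x * cmath.exp(-2j * math.pi * k * i / n)
                for i, x in enumerate(xs))
            for k in range(n)]

# A cosine with exactly 3 cycles per window puts all its energy in bins 3 and n-3.
n = 32
signal = [math.cos(2 * math.pi * 3 * i / n) for i in range(n)]
spectrum = dft(signal)
```

      The point of the Matlab anecdote stands: the students type one function call and see the spectrum, without ever writing this loop.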
    • Can you imagine if someone invented a computer language that was 100% proprietary, and there was only one vendor for the interpreter? How many Slashdotters would use it? Well, that's what Mathematica is.

      Wolfram recently wrote a book claiming that he could explain the whole universe using cellular automata. The problem is that it's all expressed in Mathematica's notation, so at least one reviewer I know of ended up saying essentially, "It looks like crap, but I don't feel like learning a proprietary computer language just so I can check out all the details."

      My personal experience with Wolfram has been a pretty good example of the abusive relationships you can get into with proprietary software vendors who have no morals. I paid for a copy of Mathematica, which stopped working when I upgraded to MacOS 8. Tech support's reply was that if I wanted to keep running Mathematica, I could pay for an upgrade to a newer version of Mathematica.

      Stuff like this makes me really grateful for Maxima.

    • Is there an open-source or free-software product in the Mathematica / Matlab / Maple etc. space?

      Fear not! Computers were made to crunch numbers - there's always a good chance someone's doing just that =)

      I'm not a mathematician... but I've tried a few math apps here and there. Octave + Gnuplot (people already mentioned this), and Euler [sourceforge.net]. (I particularly like this thing's "workbook" approach... save all notes in your session to a file, load it back, and fully edit everything in your command history! Way cool.)

  • Students (Score:5, Interesting)

    by Omkar ( 618823 ) on Thursday November 21, 2002 @07:28AM (#4721874) Homepage Journal
    Although this would be expensive, couldn't Wolfram set up a subscription service? Students who need temporary access to the power of Mathematica (I'm thinking of doctoral theses) could buy computing time.

    On an unrelated note, Integrals.com [integrals.com] is one of the most useful high school math sites ever (up there with Ask Dr. Math [mathforum.org]). It ended two weeks of misery by telling me that integral(sqrt(1+x^-4)dx) is not an elementary function.
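    Even without an elementary antiderivative, the integral is perfectly computable numerically. A sketch using composite Simpson's rule in plain Python (the interval [1, 2] is an arbitrary choice for illustration):

```python
import math

def f(x):
    # The integrand whose antiderivative is not elementary.
    return math.sqrt(1 + x ** -4)

def simpson(g, a, b, n=1000):
    """Composite Simpson's rule on [a, b] with n subintervals (n must be even)."""
    h = (b - a) / n
    s = g(a) + g(b)
    s += 4 * sum(g(a + i * h) for i in range(1, n, 2))  # odd sample points
    s += 2 * sum(g(a + i * h) for i in range(2, n, 2))  # even sample points
    return s * h / 3

value = simpson(f, 1, 2)
```

    This is what "not elementary" means in practice: you can always get a number for a definite integral, you just can't write the antiderivative in terms of elementary functions.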
    • Actually, integrals.com is broken. As a proof, check out the following screenshots
      http://www.columbia.edu/~bbb2004/pics/snapshot1.png
      http://www.columbia.edu/~bbb2004/pics/snapshot2.png
      http://www.columbia.edu/~bbb2004/pics/snapshot3.png
      http://www.columbia.edu/~bbb2004/pics/snapshot4.png
    • See, that's why you need Mathematica. It gives the really useful answer

      In[2]:= Integrate[ Sqrt[1 + x^(-4)], x]

      Out[2]= -(Sqrt[1 + x^(-4)] x) - (2 (-1)^(3/4) Sqrt[1 + x^(-4)] x^2 Sqrt[1 - I x^2] Sqrt[1 + I x^2] (EllipticE[I ArcSinh[(-1)^(1/4) x], -1] - EllipticF[I ArcSinh[(-1)^(1/4) x], -1])) / (1 + x^4)
  • It is quite annoying when people say "Linux and Unix". Usually they mean "Linux and Solaris", or maybe "Linux, Solaris, and HP-UX", or (increasingly often, as the business world funnels itself down to only one or two Unixlike systems... ahhh, how great monopolies are born!) "Red Hat Linux for x86 and Solaris for SPARC".

    It would be simpler-- and more accurate-- to just list the actual operating systems it's available for.

    And then there's that whole thing about 'Unix'. Technically, as most of us know, the name 'Unix' applies only to whoever owns the trademarked name 'Unix' at the moment (first AT&T, and then I seem to recall control passed to Novell, and to SCO, at various points in history? Can someone provide a full history?), but most Unix admins just consider any "Unixlike" system to be Unix--and that includes Linux.

    If you don't know the actual OS names, you could always use "...will be available for several Unixlike systems, including Mac OS X, Linux, and Solaris", or whatnot...
    • by BadDoggie ( 145310 ) on Thursday November 21, 2002 @07:38AM (#4721894) Homepage Journal
      From the Supported Platforms page [wolfram.com] (first link on the Specifications page):
      • Windows 95/98/Me/NT/2000/XP
      • Mac OS X
      • Mac OS 8.1 or later
      • Linux (Redhat 7.1 or equivalent) or later
      • PowerPC Linux (Yellow Dog 2.1 equivalent or later)
      • AlphaLinux (Redhat 6.2 or equivalent) or later
      • Solaris 8 or later
      • Compaq Tru64 Unix 5.1 or later
      • HP-RISC HP-UX 11.00 or later
      • IBM RS/6000 AIX 5.1 or later
      • SGI IRIX 6.2 or later
      If the call is for "Redhat 7.1 or equivalent", I can't think of any reason that a distro with kernel 2.4.2 or later wouldn't work.

      woof.

      • At least with Mathematica 4.0, the Linux install is simply a shell script. I haven't had any problems with it on a 2.4.x kernel. You really just need a relatively up-to-date distro and it'll work fine.
      • It's all about support, right? I'm sure any 2.4.2 kernel would be fine. Supporting every distro under the sun would probably be a cost issue on their end.
      • I have had the luxury of installing Mathematica on nearly all of these platforms and can say with confidence that it works equivalently on all of them. It does the same things on Windows, Mac, *nix, etc. in the same ways. And all versions have the same features, so you know your Linux box isn't getting shortchanged compared to Windows.
  • I've used this.. (Score:1, Insightful)

    by Anonymous Coward
    I've used this. Unfortunately it only has around 70% of the functionality of other packages such as matlab and of those functions it has, only around 50% of them are of a decent usable quality. The performance of the Unix version is also rather poor, being around 60% that of the Windows version.
    • Re:I've used this.. (Score:2, Interesting)

      by orcaaa ( 573643 )
      First of all, what are you referring to when you say that you have used "this"? Is this Mathematica or gridMathematica? I am assuming it's Mathematica. In that case, I would say that you are highly misled or have not used the product in a long time. I have used all three, and have found Mathematica most suitable for the kind of work I do, namely, symbolic manipulation. Matlab, on the other hand, is excellent for number crunching. Claiming that one is better than the other is a statement similar to saying Red Hat is better than Debian... or something similar.
      While Matlab is more efficient, Mathematica certainly has a more usable interface (more eye candy--which is responsible for slowing it down a bit).
      As for functionality, I have found that Mathematica is, for me, the most functional of all, with a very large number of built-in functions that do the job extremely well. (Here come the flames.) Also, the Unix version actually works more efficiently on my system than the Windows one.
      I don't know where you are pulling those numbers from, but it seems you just made them up.
  • bah (Score:5, Funny)

    by grub ( 11606 ) <slashdot@grub.net> on Thursday November 21, 2002 @07:51AM (#4721921) Homepage Journal

    No matter how much horsepower I put behind Mathematica, it still gives me errors when I divide by zero. My employer didn't spend zillions of dollars on SGI Origins just to get errors. Can't Wolfram include some sort of Clippy helper? /sarcasm

  • discount (Score:5, Funny)

    by sstory ( 538486 ) on Thursday November 21, 2002 @07:58AM (#4721939) Homepage
    wonder if i'll be able to get a $130 gridMathematica for Students version. :-)
    • per processor!
    • There is a reason they are charging you only $130 for the student version. It's like a drug: after you have spent many hours getting familiar with Mathematica, you will then buy the full version for $1880 [wolfram.com] rather than spend more time learning another system. Matlab is even worse: last I checked, the full version costs upwards of $3000 in any reasonable configuration. And you'll end up paying that every time you change jobs. Matlab and Mathematica packaging may be convenient, but they are not technically that much better than the free alternatives to justify that kind of hassle and expense (in fact, I would argue that they are technically worse than the free alternatives, but that's a different argument).

      Do yourself a favor and don't invest much time in "student versions". View the use of Mathematica and Matlab in the classroom for what it is: a carefully orchestrated marketing program designed to get students hooked. Spend your time learning something that is open and that you can add to your personal toolbox without having to pay some company large amounts of money again and again.
      • Yours is largely a good post, but I question the comment that I will have to pay the full fee every time I change jobs. If I pay the fee, the license is mine, regardless of where I work.
  • free alternative (Score:4, Informative)

    by g4dget ( 579145 ) on Thursday November 21, 2002 @08:03AM (#4721951)
    You can get a copy of Numerical Python [pfdubois.com] and run PyPVM [sourceforge.net] or PyMPI [sourceforge.net] with it for distributed computing.

    I think that for numerical computation, it's technically actually a better environment than Mathematica. And while I'm not usually one to harp on the fact that free software also doesn't cost money, given the steep price of Mathematica, in this case the money-saving aspect really does matter as well.
  • Distributed Solver (Score:2, Informative)

    by Knacklappen ( 526643 )
    Cool. What I really want is a distributed solver for dynamic simulations. The dream scenario would be to do the setup, pre- and postprocessing in any simulation program (like ADAMS [adams.com], LMS [lmsintl.com], or even block-scheme-based ones like EASY5 [adams.com], AMESim [amesim.com], or good old Simulink [mathworks.com]). For the solving part, however, I'd like to just export the equation sets (implicit or explicit) and let a distributed solver take care of them. As I understand it, it could be possible to use the Mathematica solver as it exists today. Maybe not very efficiently, but this could be compensated for by quantity. I would love to install such a baby in our company...
  • by jki ( 624756 ) on Thursday November 21, 2002 @08:25AM (#4722036) Homepage
    The company I work for has some competence in grid computing, and we have a platform that we have tailored for some customers. There would be a gazillion END-USER companies interested in utilizing grid/meta computing, but a high percentage of them face the same problem: LICENSING methods which do not take the need for grid computing into account. Even if a computer only works as a processing slave providing computational resources, for many types of software from many vendors the end-user company still needs to purchase the same licenses as when all the features of the software are used. This makes integrating existing software with a grid solution just to enhance performance less favorable than buying a huge amount of memory and CPU power for a single node, because then they survive with lower license costs. A license for such software can today cost $20,000, for example. In my opinion, this practice should be changed, and maybe the end-users should combine forces to make these changes happen and put some pressure on the software vendors. There are cases where companies could do their simulations hundreds of times quicker if they could just afford the licenses.

    It is good that at least Mathematica has altered their licensing methods a bit. Maybe this licensing scheme could also be used when utilizing Mathematica over a 3rd-party grid architecture. If someone from Wolfram is listening, I don't mind you contacting me. :)

  • Quick Poll (Score:4, Funny)

    by ackthpt ( 218170 ) on Thursday November 21, 2002 @08:38AM (#4722114) Homepage Journal
    For mathematical modeling I use:

    Mathematica

    LPL

    AMPL

    I code it all myself in assembler, thank you very much!

    Fingers and toes

    Another method

    CowboyNeal works it out for me on his abacus

  • Don't get me wrong, I absolutely love Mathematica, but if your numerical problem is so huge that it benefits greatly from running on a cluster, then you shouldn't be doing it in Mathematica in the first place. Nothing can match custom-written Fortran, C, or C++ code for performance and efficient use of memory (Mathematica is a huge memory hog). Non-numerical problems, on the other hand, might benefit greatly from this.
    • Wolfram are relying (quite sensibly) on the fact that those technical (but not computer-technical) people who've invested significant effort in mastering the appropriate program for their profession (e.g. AutoCAD, 3DS, Photoshop, Word, Excel, Mathematica) henceforth use their one tool as if it were a universal tool, a digital swiss-army-knife.

      I've seen a menu done in AutoCAD and a 3D mechanical drawing done in Photoshop -- both with quite hysterical results.

      Wolfram know that there are _lots_ of users who'd much rather persuade their employer/institution to buy lots of gridMathematica licences than switch to working in Fortran themselves.

      • I agree here, and more generally that Mathematica is very inefficient at doing a lot of basic things.

        I would break down problems into two types: Those that Mathematica can solve quickly and easily, and those that it can't. For the latter category, I would think that 99% of the time the proper solution is to use something other than Mathematica, not distributed computing.

        I find that when (in Mathematica) something takes more than a few seconds, it's because I'm doing it wrong or I should code the problem in C or something else. After the first several seconds have elapsed, I either abort the calculation and figure out how I screwed up, or, if I'm desperate and it's late, I let it run overnight. Then when I return, it will either still be running or will have produced a result that I didn't want anyway. So in my experience, throwing more CPU time at Mathematica is almost never helpful. On the computing cluster I used during grad school, every week or two I would find my analysis jobs running slowly because someone had a runaway Mathematica kernel going at interactive priority. If it's using significant CPU, kill it! Don't give it more CPUs to burn!

        I think Mathematica is great, and worth the cost if you do a lot of mathematics. But like any program, you have to know when to use it. Too many people have been trained on Mathematica and little else, and they can't understand why it takes Mathematica an hour to integrate a wave packet (and still get it wrong), while a C program can get the right answer in microseconds. You can actually do this with Mathematica if you're bent on it, but you can't use Integrate or NIntegrate -- you have to do it as a sum, and it still takes several minutes. That's just one example of many I've experienced. The same principle applies when doing symbolic math, too.
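The sum-based approach the parent describes can be sketched outside Mathematica too. Here is a rough Python illustration (the packet, grid size, and integration limits are all made up for the example): a plain midpoint-rule sum over |psi|^2 for a Gaussian wave packet, which should come out near 1 for a normalized packet.

```python
import math

def wave_packet_norm(sigma=1.0, n=10_000, half_width=10.0):
    """Integrate |psi(x)|^2 for a Gaussian wave packet by a plain Riemann sum.

    For psi(x) = (pi * sigma^2)^(-1/4) * exp(-x^2 / (2 sigma^2)) * exp(i k0 x),
    |psi|^2 is a normalized Gaussian, so the integral should be ~1.
    """
    dx = 2 * half_width / n
    prefactor = (math.pi * sigma ** 2) ** -0.5  # |psi|^2 normalization
    total = 0.0
    for i in range(n):
        x = -half_width + (i + 0.5) * dx        # midpoint rule
        total += prefactor * math.exp(-x * x / (sigma * sigma)) * dx
    return total

result = wave_packet_norm()  # ~ 1.0 for a normalized packet
```

The point isn't that Python is fast (it isn't); it's that a direct sum over a grid is a few lines in any language, which is the comparison the parent is drawing against NIntegrate.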

    • There's a tradeoff, and in most cases, hardware is cheaper than labor. Let's say that someone earns $60,000 a year as a coder. After all of the costs to the employer, it will cost around $90,000 to $100,000 to keep them employed for a year.

      That means that if the coder spends two weeks writing his program to do the math more quickly, it cost the employer $4,000 - in addition to the opportunity costs.

      Now, given that you can pick up a 2 GHz-class machine for $500, you're presented with another option: You could just use Mathematica, and let it run across 8 machines in parallel.

      Which way makes more sense? It depends on the situation. Let's say your coder is in great demand for other projects - the hardware route might make more sense. If he's sitting around anyway, you might as well put him to work. There are a lot of factors which can tip the scales one way or another.

      steve
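The back-of-the-envelope above is easy to check. A quick sketch (the salary, overhead, and machine prices are the figures from the comment, not real data):

```python
def hardware_vs_labor(loaded_cost=95_000, weeks_of_coding=2,
                      machine_cost=500, machines=8):
    """Break-even sketch using the figures from the comment above.

    loaded_cost: yearly cost of employing the coder (salary + overhead).
    Returns (cost of the coding time, cost of the extra hardware).
    """
    labor = loaded_cost / 52 * weeks_of_coding   # ~ $3,650 for two weeks
    hardware = machine_cost * machines           # $4,000 for eight boxes
    return labor, hardware

labor, hardware = hardware_vs_labor()
```

So with these numbers the two options really do land within a few hundred dollars of each other, which is why the deciding factor is opportunity cost, not the sticker price.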
  • Now I can get my SGI, Sun, and other UNIX boxes in my house to do top secret nuclear computations!!

    Seriously though, alone those machines can only do so much ...
  • Could someone write a Mathematica program to do DiVX encoding and get free clustering from this product? mmmmm.... movies...
    • That's not the optimal method of proceeding.

      A video codec, such as DivX, typically utilises some form of transformation, be it FFT, IDCT or some form of wavelet transform, then quantisation. Whilst you could split that into separate sections, it makes more sense to operate at a higher level - i.e. to split your video stream into chunks and then send each chunk off to a node, to compress in the usual fashion.

      The advantage of this approach is that the codec need only be written once, and given that that's the hard part, this is a good thing. It also means that the cluster interface is codec-neutral, so no particular work need be done on the part of the codec developers.

      transcode [uni-goettingen.de] has such a mode.
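The chunk-level split described above can be sketched in a few lines. This is only an illustration -- zlib stands in for a real codec, and the chunk size and worker count are arbitrary -- but it shows why the approach is codec-neutral: each worker just calls the existing encoder on its chunk, and the compressed chunks are reassembled in order.

```python
import zlib
from concurrent.futures import ThreadPoolExecutor

def encode_chunk(frames):
    """Stand-in for a real codec: compress one chunk of raw frames."""
    return zlib.compress(b"".join(frames))

def parallel_encode(frames, chunk_size=25, workers=4):
    """Split a frame stream into chunks, encode each on a worker,
    and return the compressed chunks in their original order."""
    chunks = [frames[i:i + chunk_size]
              for i in range(0, len(frames), chunk_size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(encode_chunk, chunks))  # map preserves order

# A fake "video": 100 identical 16-byte frames.
frames = [bytes(16) for _ in range(100)]
encoded = parallel_encode(frames)
decoded = b"".join(zlib.decompress(c) for c in encoded)
assert decoded == b"".join(frames)  # round-trips losslessly
```

In a real cluster the workers would be separate nodes rather than threads, but the structure (split, encode independently, reassemble in order) is the same.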

  • random thoughts (Score:1, Interesting)

    by Anonymous Coward

    It is a shame that Mathematica is so expensive, and so slow for numerical calculations -- it is great for prototyping and tinkering, but lousy for serious computations compared to code written in C(++) or Fortran.

    I have a love/hate relationship with Mathematica -- I have used it for years, and have written lots of purpose-built programs in it to support my research, but this "legacy" code makes it hard for me to switch to anything else.

    Consequently, I am forced to pay Wolfram's industry-leading prices for the program and put up with their slow-as-molasses approach to bug fixing. For instance, a set of hypergeometric integrals were broken for at least 18 months, and it seriously screwed up the work of at least one colleague of mine -- and if the source were available we could have fixed it ourselves.
    Consequently, I would love to see (and would be keen to contribute to) an open source clone of Mathematica, or at least something that could parse mathematica code into a form understandable by an open source package like Maxima.

    However, given Wolfram's trigger-happy approach to lawsuits, I rate the chances of this succeeding as fairly small -- which is ironic, since Wolfram's recent book fails to acknowledge a lot of prior art, and he was successfully sued by his co-investors for essentially asset-stripping his own company in the early nineties.

    Ah well. Back to work.

    • I just read "A New Kind of Science". It really is funny to observe Wolfram's attitude toward prior art. He's so bad at acknowledging other people's work that it's almost touching. It's like he's crying out for recognition.

      He claims to have invented a new kind of prose for the book, and one of the symptoms of this claim is that he constantly refers to the discoveries of his book or his book's discoveries.

      My impression of the book: If he were better at collaborating he'd be much more productive. The same is no doubt true of Mathematica.


      Maybe we should all write him an email: "Hey dude, you're a genius!" Maybe then he'd open source Mathematica. He definitely wants Mathematica to be the lingua franca of math, but I don't think money is the reason.

  • by pridkett ( 2666 ) on Thursday November 21, 2002 @11:11AM (#4723310) Homepage Journal
    It seems like this attempt to market something as "gridMathematica" is really a little deceiving. In reality it is more distributed Mathematica. Grids involve virtual organizations, authentication, etc. For more information see Ian Foster, Carl Kesselman, and Steve Tuecke's paper The Anatomy of the Grid [globus.org].

    There are other packages which do very similar things, and have for a long time, such as NetSolve [utk.edu] and Ninf [apgrid.org], which allow you to do cool stuff with most any application that needs computational power.

    There is also a Commodity Grid Kit (standard interface to Globus [globus.org] services) for Matlab that should be out soon, more info can be found here [globus.org].

    So for now, I'll consider this more an attempt to capitalize on the hype around Grids at SC2002 [sc2002.org] than anything else. Unless I'm missing something obvious.
    • Problem is, they don't do Mathematica, do they? I only checked NetSolve, and they do support Matlab, which is handy, but it's not Mathematica.

      As for Ninf, it seems like a lot of programming is required to tailor it to your application.

      So they don't natively support Mathematica. And trust me, when you're a student or modeling something professionally (i.e., getting paid for it), you're already too busy with your own maths to want to do extra work.
  • ...are a lot like the problems and advantages of Microsoft's products. Mathematica is slow and bloated, but it's also easy and trusted. There are a ton of undergrads who know Mathematica and like it because of the symbolic interface. Any open source competitor needs a symbolic interface on par with Mathematica's to compete. One of the reasons gridMathematica may take off is precisely that there will be a lot of undergrads coming to grad school who won't want to program in C++ or use a numerical Python module, and who just want to click sigma and do an infinite sum. Or, in this case, come up with something complicated, click "Solve on the grid", and have it Mathemagically done, albeit slowly. As someone else said, gridMathematica could reduce a lot of the trouble for smaller universities in getting a high-powered computing setup, even if there is a ding in speed.

    I think, pedagogically, it is actually better to start making your students program, say with VPython (which is easy as pie) and looking through numerical methods books to find a needed numerical method because it really dissects the guts of a math or physics problem better. You have to think about the error term, step sizes, singularities, processing time, etc, rather than just clicking solve in Mathematica.

    But that doesn't matter: the more people are used to the Mathematica interface, the more this will catch on.

    Fight the national One-Strike law! [viewfromtheground.com]

  • How is this different from standard Mathematica (which already supports multiple kernels) and the pre-existing parallelization add-on?
  • Whole classes (indeed, whole doctorates) in CS departments center around the creation of algorithms, making O(n^2) problems into O(log n) problems and so forth. There's a real art to making your formulae efficient, and the pressure for that art is processor time.

    When you only have one processor, the difference between a 20 minute calculation and a 2 day calculation matters.

    My worry here is that if whole university clusters were opened up, so that even an inefficient O(n^3) solution ran in reasonable time, it would suck up an inordinately large amount of processing power and slow down the entire organization, while the student running that problem has no idea that with some better-written mathematicode they could solve it in O(n) time. And even if they did realize it, they'd have little reason to care, since the problem gets solved reasonably fast anyhow, regardless of the cumulative effect on all the other processes...

    I'd be fascinated to know if there's any kind of accountability, or better yet, optimizations in Mathematica that could spot common inefficiencies and suggest either a more efficient way to handle a problem, or at least postulate a theoretical O() level compared to a function's actual level.

    A lot of computing power is a good thing in principle, yet I wonder how much faster Word on my G4 would be if it was designed to work on my old 68000.

    In summary: extra cycles are a currency that is far too often traded in for lazy programming instead of increased performance.
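To make the point concrete, here's a hypothetical example of the kind of rewrite at stake: the same question answered by an O(n^2) pairwise scan and by an O(n) single pass. Both are correct; only one of them deserves a cluster.

```python
def has_duplicate_quadratic(xs):
    """O(n^2): compare every pair of elements."""
    for i in range(len(xs)):
        for j in range(i + 1, len(xs)):
            if xs[i] == xs[j]:
                return True
    return False

def has_duplicate_linear(xs):
    """O(n): one pass, remembering what we've already seen."""
    seen = set()
    for x in xs:
        if x in seen:
            return True
        seen.add(x)
    return False

data = list(range(10_000)) + [42]  # contains exactly one duplicate
assert has_duplicate_quadratic(data) == has_duplicate_linear(data) == True
```

Throwing a grid at the first version "works", which is exactly why nobody ever rewrites it into the second.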
    • My worry here is that if whole university clusters were opened up, so that even an inefficient O(n^3) solution ran in reasonable time, it would suck up an inordinately large amount of processing power, slowing down the entire organization, while the student running that problem has no idea that with some better-written mathematicode they could solve it in O(n) time,
      [...]
      In summary: extra cycles are a currency that is far too often traded in for lazy programming instead of increased performance.

      Worst case complexity (which is what "an O(n^3) problem" usually refers to) has little to do with how long programs take to run. What matters is average case complexity on representative distributions of problem instances.

      Now, what about complexity? Well, a lazily implemented algorithm with average case complexity O(n) will easily beat a highly optimized algorithm with average case complexity O(n^3) when problems get larger. And you need to keep implementations of algorithms with better complexity simple, because the algorithms themselves tend to be more complex. The irony is that a lot of software today is inefficient and bloated because the programmers spent a lot of time optimizing tiny little pieces, creating software so complex and difficult to maintain that they can't fix the big picture (Microsoft Office, KDE, Gnome, etc. come to mind).

      Furthermore, for many problems, programmer time is much more valuable than computer time. Even if it takes 10 times as long to run or 10 times as many CPUs, an inefficient implementation is often preferable if it takes 1/2 as long to implement. And the simpler and lazier you keep the implementation, the less likely you are to introduce bugs. Students find that out when they miss the problem set deadline trying to hand-optimize and chase down bugs they introduced during optimization.

      In short, your attitude is probably at the root of a lot of bugs in today's software. Programmers need to get lazier for software to improve, and, as a bonus, lazily written software will often run faster, too.
      • "In short, your attitude is probably at the root of a lot of bugs in today's software. Programmers need to get lazier for software to improve, and, as a bonus, lazily written software will often run faster, too."

        I think in your composition you forgot my actual point: that code written when the coder has constraints on processing power and memory often performs much better when those constraints are removed, because the coder writes cleaner code.

        Though it's not an obvious distinction, this is not the same thing as a coder trying to squeeze every possible optimization into a bloated codebase to get that little bit of extra speed.

        The former results in simpler programs that tend to be more elegant and faster for their simplicity. The latter results in fragile bloatware that has to be re-engineered to take advantage of every minor enhancement for each chip.
  • Some of the main strengths of Mathematica are its symbolic algebra and pattern matching capabilities. Unfortunately, these also work rather slowly.

    Here is where gridMathematica could come as a real boon. At least the kind of problems I have worked on (Feynman diagram calculations), and I suspect many others, are trivially parallelizable; what previously took days to complete might now be done in hours, depending on the number of machines at your disposal.
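For trivially parallelizable work like this, the structure is simple enough to sketch in a few lines of Python (an illustration only -- threads stand in for what would really be separate Mathematica kernels or machines, and a convergent series term stands in for a diagram evaluation): each worker evaluates a batch of independent terms, and the partial results are combined at the end.

```python
import math
from concurrent.futures import ThreadPoolExecutor

def partial_sum(bounds):
    """Evaluate one worker's batch of independent terms: sum of 1/k^2."""
    lo, hi = bounds
    return sum(1.0 / (k * k) for k in range(lo, hi))

def parallel_sum(n_terms, workers=4):
    """Farm independent term batches out to workers and combine the results.

    Threads are used only for brevity; CPU-bound work would use processes
    (or, with gridMathematica, separate kernels on separate machines).
    """
    step = n_terms // workers
    batches = [(i * step + 1, (i + 1) * step + 1) for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, batches))

approx = parallel_sum(100_000)  # converges toward pi^2 / 6
```

Because the batches share nothing, splitting across four workers (or four hundred) changes only the wall-clock time, not the answer -- which is exactly the property that makes this class of problem a good fit for a grid.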
