Canada to Launch Countrywide Virtual SuperComputer 195
LadyCatra writes "A serious shortage of world-class computing power in Canada prompted University of Alberta scientists to create the next best thing -- a countrywide, virtual supercomputer.
On Nov. 4, thousands of computers from research centres across the country will be strung together by a U of A effort to create the most powerful computer in this country.
The full story is here"
Distributed computing? (Score:3, Interesting)
Re:Distributed computing? (Score:1)
I'm sure many others in the world would too.
Re:Distributed computing? (Score:2, Informative)
The computers will be linked by the Internet, but involve a simple networking system, Lu said. Keeping the linkage as simple as possible was the goal.
Read the article the next time, will you?
Re:Distributed computing? (Score:2)
Re:Distributed computing? (Score:1)
the implementation, other than that it is internet based [please correct me if I'm wrong]. There's no reference to this project on the UoA web pages, either. The UoA dept of Physics seems to have a Beowulf cluster, but this "Virtual Supercomputer" sounds more like Globus (http://www.globus.org) to me.
Re:Distributed computing? (Score:5, Insightful)
Because there will always be creeps who won't play fair. Much of the work that SETI@home does is security, combating those who would submit false or abbreviated results in order to get higher stats. UofA wants to do real computing on a variety of applications. They've concluded that it is more efficient (for their purposes) to go for a small pool whose results they can trust than for a large pool whose results they have to check and double-check.
Each approach has significant advantages and disadvantages. It depends on the type of work you are interested in performing.
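One common way to cope with an untrusted pool is to send the same work unit to several hosts and only accept a result the replicas agree on. A minimal sketch of that idea (illustrative only; this is not necessarily how SETI@home's actual checks work):

```python
from collections import Counter

def verified_result(replica_results):
    """Accept a work unit's result only when a strict majority of
    the untrusted hosts that computed it agree; otherwise signal
    that the unit needs to be recomputed."""
    value, votes = Counter(replica_results).most_common(1)[0]
    if votes > len(replica_results) // 2:
        return value
    return None  # no majority: reissue the work unit

# Three untrusted hosts returned results for the same unit:
print(verified_result([42, 42, 41]))  # majority agrees: 42
print(verified_result([42, 41, 40]))  # no majority: None
```

The tradeoff the comment describes falls straight out of this: every unit costs two or three times the compute, which is why a small trusted pool can be the more efficient choice.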
Re:Distributed computing? (Score:1)
We used to call it Napster, but now it's called deadmeat something.
CISS FAQ (was Re:Distributed computing?) (Score:2, Informative)
And.... (Score:3, Funny)
Wow (Score:4, Interesting)
I think what this really needs is to be made easier for the mainstream, so anyone could do it. Perhaps bundle the tools (programming and deployment) with mainstream operating systems?
It's just an idea - my NeXT had Zilla (its version of this) years ago, and it seems a shame that this hasn't caught on more widely. So come on Apple - let's see it: put it in the Darwin project and put a nice UI on it in Mac OS X.
Re:Wow (Score:2)
And before that, other people did the same thing. And there are at least a dozen projects worldwide already doing this on a wide scale.
So come on Apple - let's see it, put it in the Darwin project and put a nice UI on it in Mac OS X.
And what, pray tell, should that "nice UI" actually do that current software isn't already doing?
Re:Wow (Score:3, Insightful)
Of course Apple have some good tools here - perhaps Rendezvous (Apple's dynamic discovery of services over IP) could help. Such tools could make it much easier to provide "community supercomputers". This would be especially useful in higher education, a place where Apple has been traditionally strong.
Re:Wow (Score:5, Informative)
Check out the Google Compute Faq [google.com] and the Kuro5hin discussion [kuro5hin.org] on the subject.
Re:Wow (Score:5, Informative)
Sun have Grid Engine [sun.com] and I believe Intel have something similar. The issue is that this kind of distributed processing is only useful for problems that can be divided into many discrete subtasks which do not need to interact with other nodes while they are running; otherwise the work you need to do to communicate between nodes slaughters performance (that's why clustering hasn't taken over the world - vertical scaling on an active backplane is still the best solution for most jobs). The typical corporate large-compute job is data mining or decision support, neither of which scales particularly well horizontally.
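The "many discrete subtasks" shape described above can be sketched in a few lines. The names are illustrative (not from Grid Engine or any product mentioned here), and a local thread pool stands in for remote nodes:

```python
from multiprocessing.pool import ThreadPool

def subtask(chunk):
    # A pure function of its own input: no communication with
    # other workers while running, so it could execute on any
    # node, at any time, in any order.
    return sum(x * x for x in chunk)

def run_distributed(data, n_workers=4):
    # Scatter: carve the input into independent chunks.
    chunks = [data[i::n_workers] for i in range(n_workers)]
    with ThreadPool(n_workers) as pool:
        partials = pool.map(subtask, chunks)
    # Gather: the only inter-task communication is this one
    # cheap reduction at the very end.
    return sum(partials)

print(run_distributed(list(range(1000))))  # 332833500
```

Jobs that fit this scatter/compute/gather shape scale almost linearly; jobs whose subtasks must exchange data mid-run do not, which is the comment's point about data mining and decision support.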
Re:Wow (Score:2)
Re:Wow (Score:2)
On the NeXT there wasn't any power saving - such things hadn't been thought of, so this wasn't an issue. But as the display and the hard disk aren't needed for this kind of application, they can shut down as normal. I guess once the "community supercomputer" had finished doing whatever it was asked to do, it should restore all the power-saving features (and be able to suspend them again if it had a new problem issued to it).
Thinking ecologically about it, remember how much energy was used in the manufacture and delivery of those computers - we should use them as much as we can to make the best use of the resources already invested in them.
SETI@Home, Canada's Fastest Supercomputer (Score:2)
There are real benefits for Canadian research that can come from this project - certainly there are a number of problems that are numerical and parallelizable, so it could have quite a future if they do enough coordination - but most of Canada's academic supercomputing is currently driven by SETI. Besides scientific research, the other traditional users of supercomputers are weather prediction, oil exploration, and sometimes financial modelling - Canada may have more total supercomputing capacity than anybody realizes, in addition to SETI. However, the June 2002 top500.org list [top500.org] doesn't show anything in Canada above #227.
Other results from the Top500.org list: SETI@Home is still about 7 times as large as the largest single machine on the list, Japan's NEC Earth Simulator, which is in turn about 5 times as large as the #2 machine, LLNL's ASCI White.
What if it gains conciousness? (Score:5, Funny)
Re:What if it gains conciousness? (Score:5, Funny)
Re:What if it gains conciousness? (Score:4, Funny)
Re:What if it gains conciousness? (Score:1)
Sun is Right (Score:5, Insightful)
It would be nice to see a worldwide system. If this is going to work there must be some CPU-time quota system, perhaps a quota that can be bought and sold. This could make it interesting for ordinary home users to join (earn quota, sell quota, make $$$). There are many projects in the academic world that could never manage a SETI@home-style launch, since the research is too boring. Still, we need to use all that idle time buring away across the world.
Make $$$ (Score:3, Informative)
You mean like Popular Power [popularpower.com] tried (and failed) to do? Check their old site [popularpower.com] to see what they used to propose.
Looks like selling CPU cycles is not a lucrative business...
Re:Make $$$ (Score:2)
Re:Sun is Right (Score:4, Interesting)
Do we? Idle time means the CPU is using less power and generating less heat. I suppose that theoretically you are also making your processor's transistor lifetime slightly shorter, although there are probably arguments that a constant 50% CPU utilisation is not a bad thing, because it is more likely to maintain a constant temperature...
In any case, multiplied up across many millions of installed PCs, using that idle time means increasing energy consumption by a not-insignificant amount. We need to use less energy, not more! Indeed, saying that idle time is "buring(sic)" away is quite the opposite of the truth.
Re:Sun is Right (Score:5, Interesting)
When usage is 50%, the CPU is probably not being turned off at all, since turning clock trees on and off (and getting the PLLs to re-sync) takes time.
Since most home computers will not power down anyway, we can use that potential compute power to save energy by not running supercomputers elsewhere.
Re:Sun is Right (Score:2, Interesting)
Re:Sun is Right (Score:2)
But I don't think that the difference in power use is big. Still, no need for supercomputers at the universities, i.e. less power consumption there.
As the problem of energy conservation in these kinds of situations is very complex, we'd better count the economic benefits for the users (i.e. you don't need to buy a supercomputer to do a few minutes' worth of number crunching).
Re:Sun is Right (Score:3, Informative)
Re:Sun is Right (Score:1)
Re:Sun is Right (Score:2)
Re:Sun is Right (Score:2)
I guess Linux did this well before Windows did. Even though it's a different chip, Macs also do the equivalent of a HLT.
I think the PowerPC chips have several speeds they can throttle down to. I think. Too lazy to check.
Heating Canada Through Supercomputing (Score:2)
Actually, climate change has been a real problem for the ecosystem of the north coast, with a lot of ice melt and more open water than usual. One of the effects is that seals have more of the year - in some places, year-round - when they can find open water instead of making breathing holes in the ice. Polar bears and the traditional Inuit hunting methods both depend on catching seals at their breathing holes, so their hunting is much less effective.
On computer-related topics - laptop batteries really don't like background CPU-burners. I used to run GIMPS, the Great Internet Mersenne Prime Search, and I used to commute by train, with about an hour of battery time each way. NiMH batteries don't have the same failure behavior as NiCd, and they're nowhere near as nasty a toxic waste disposal problem, but they really don't like this kind of treatment. To compound matters, for some of that time period I was running Windows NT 3.51, which was much more stable than Win95, but it insisted on being a *server* operating system that didn't need laptop power management drivers, so when it got a hardware low-power shutdown signal, instead of going into hibernation mode (see, the polar bears *were* relevant), it would blue-screen and die. I had to stop running the prime search.
Re:Sun is Right (Score:3, Informative)
Yeah. There's actually quite a lot of research going into this currently. It's called the Grid (think "power grid", ubiquitous, simple to use), and I predict it will be the next big buzzword.
See Global Grid Forum [gridforum.org], Grid Today [gridtoday.com] and the Globus project [globus.org] for starters.
The problem of buying and selling computation power on some sort of broker basis is a quite interesting problem in itself. Exactly what are you selling? Hardly CPU hours, since the value of those depends on the hardware.
Re:Sun is Right (Score:5, Interesting)
I'd like to suggest something like the JavaVM, i.e. a standard virtual machine, from which you buy and sell basic ops, i.e. byte-code instructions.
The biggest problem will probably be that you won't make any real money from letting your CPU be used. Perhaps a good idea would be to let a university supply you with internet access in exchange for CPU time. They usually have quite a lot of bandwidth.
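The byte-code-as-currency idea could be metered roughly like this. Everything here is invented for illustration - the three-op stack machine, the price, and the function names are not from the JavaVM or any real broker:

```python
PRICE_PER_OP = 0.000001  # dollars per byte-code op (made-up figure)

def run_metered(program):
    """Execute a toy stack-machine program while counting every
    instruction, so the broker can bill in standard VM ops
    instead of hardware-dependent CPU hours."""
    stack, ops = [], 0
    for instr, *args in program:
        ops += 1
        if instr == "PUSH":
            stack.append(args[0])
        elif instr == "ADD":
            stack.append(stack.pop() + stack.pop())
        elif instr == "MUL":
            stack.append(stack.pop() * stack.pop())
    return stack[-1], ops, ops * PRICE_PER_OP

result, ops, cost = run_metered([("PUSH", 6), ("PUSH", 7), ("MUL",)])
print(result, ops)  # 42 3
```

Billing per standard VM op sidesteps the "what is a CPU hour worth" problem mentioned upthread: the unit of account is defined by the virtual machine, not by whatever hardware happens to run it.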
unrestricted high speed (Score:2)
Let's call it.. (Score:2, Funny)
Re:Let's call it.. (Score:3, Funny)
Re:Let's call it.. (Score:3, Funny)
Re:Let's call it.. (Score:1)
What a great idea ! (Score:3, Funny)
Please file... (Score:3, Funny)
- standard Beowulf trolls mixed with standard Canadian accent lexicon ("eh?", "aboot")
- posts about how a Beowulf cluster could perhaps help Canada out with a stereotypical Canadian "problem" (lousy beer, socialized medicine)
- jokes combining the word Beowulf with the name of the mentioned U of A chemist Wolfgang Jaeger
Thank you.
Re:Please file... (Score:2, Funny)
Dude, Canadian beer and socialized medicine are the solution, not the problem.
Re:Please file... (Score:2)
Re:lousy beer my ass... (Score:1)
Custom solution for a specific task? (Score:5, Insightful)
"The computers will be linked by the Internet, but involve a simple networking system, Lu said. Keeping the linkage as simple as possible was the goal."
Based on the article I would assume that they have made a custom-tailored system (if not a kludge) for one specific purpose ("for calculating energy shifts as two molecules are manipulated around 3-D space") - and not a platform which could be easily adapted and managed to solve different kinds of tasks with different kinds of relationships between them.
Ohh, I could also link my grid computing links [cyberian.org].
Re:Custom solution for a specific task? (Score:5, Informative)
I was visiting the Vancouver site a couple of months ago when they were assembling it. It looks sweet. A nice big array of dual Athlons. The system is being linked together over CA*net 3, a nationwide OC-192 fibre network.
They're also experimenting with distributing different parts of the system in different locales. Like disk storage in one part of the country, heavy number crunchers in the other, to see how distributed a system can really be and still function well.
CA*net is still looking for applications; the network is being severely underutilized. http://www.canarie.ca/advnet/canet3.html
Re:Custom solution for a specific task? (Score:1)
http://www.canet3.net/stats/CAnet4map/CAnet
Re:Custom solution for a specific task? (Score:2)
Ohh, that explains it. Thanks for the info!
Re:Custom solution for a specific task? (Score:1)
Quake!
SHARCNET (Score:3, Informative)
SHARCNET has been up and running for a while and last year accounted for about 27% of supercomputing power in Canada (half of all supercomputing power in Canadian universities), with three sites on the Top 500 list and total power exceeding institutions like Cambridge, Princeton, Cornell and Caltech. There's loads of information available about the hardware and software [sharcnet.ca] used at each facility, as well as CPU load and usage statistics at member sites, like these status charts [sharcnet.ca] from the most powerful individual site, at the University of Western Ontario. As for applications, a number of researchers are already using the system for a variety of projects across science, engineering, and economics.
Big Iron. (Score:3, Insightful)
And the innovation is where? (Score:2, Insightful)
So how is this different from DC or SETI?
Processing power is not everything (Score:4, Informative)
Re:Processing power is not everything (Score:2, Funny)
I wondered why.... (Score:1, Funny)
(I live in the Great White North, so I'm allowed to say this!)
-psyco
The Always Classic... (Score:3, Funny)
step 2: ???
step 3: global domination!
More info (Score:4, Informative)
From the article Gerald Oakham and his fellow physicists have a problem. In the hunt for the most elusive speck of matter known to science, they are about to generate more data than any computer on the planet can analyse.
This is a cool idea, but actually.. (Score:4, Interesting)
Why build a new, separate system? (Score:3, Insightful)
They talk about how they feel that Canada should be pursuing its own supercomputing, but why not join up with other universities that have been pursuing similar projects and give Canada access to the computing power of other countries as well? Isn't the goal here for people to work together for mutual benefit? I don't understand why they feel the need to isolate their Canadian initiative, rather than giving Canada access to computing power far greater than they can achieve on their own.
Check out photos of UVA's branch of Legion: http://legion.virginia.edu/centurion/Photos.html
This room has big glass walls, and every time I walk by it I wish I had a room like it.
Re:Why build a new, separate system? (Score:5, Funny)
Probably because the Canadian researchers got tired of hearing things like, "so, ye'all are from Keaynada, huh? We was just sittin up on the ruff and be drinkin sum pap."
Re:Why build a new, separate system? (Score:2)
Have you ever noticed that there's never anybody in there?
Evil plans are perhaps already afoot (Score:3, Funny)
Re:Evil plans are perhaps already afoot (Score:1)
And the news is what? (Score:3, Interesting)
So, why is this news? Is there some new technology they are using?
Out of Hours (Score:1)
Between 7pm and 7am we have 50 PCs doing nothing at all in my office. I'm sure they could be doing some useful math, especially if there's a national emergency.
Re:Out of Hours (Score:1, Funny)
Raw Power (Score:1, Funny)
Those cards will FLY to their place, thereby improving productivity ten-fold.
Reuters News Report (Score:4, Funny)
Today, the Canadian Ministry for computing announced their initial tests of the Canada-wide massive computer project..
Computer scientist Thom Serveaux had this to say: "When we switched it on, every command was answered with the word 'eh?', it kept calling us 'knobs', and it was asking for 'back bacon'. We are trying to see if there are any problems in the northern nodes like those in the Quebec nodes, which started a fight with the other nodes, demanding every command be repeated in French."
Updates will be posted on their progress..
Shatner (Score:4, Funny)
A supercomputer capable of creating more convincing commercials, perhaps?
Supercomputing problems (Score:4, Interesting)
When the University of Toronto purchased a Cray in the mid-eighties, there was a massive fight. Many felt that the resources needed to support the Cray were sucking money desperately needed everywhere else. (Although, boy, we in meteorology were a happy bunch...)
While lower profile and somewhat more painful to use, this is a far more practical solution for the realities of academic computing today.
Re:Supercomputing problems (Score:2)
Re:Supercomputing problems (Score:2)
Not saying it was the case here, but remember that it's not a question of if it's elegant, or quick, but if it's right.
I can easily see them giving hours and days of computing time in exchange for KNOWING that they're doing big plodding calculations over and over again, but they'll get a damn right answer, instead of using a fast, sexy little thing that might, nevertheless, throw a calculation off by .000001 percent.
Physicist programming skills vary widely (Score:2)
On the other hand, their code basically jumped around in a large matrix, without much locality, because that matched what the real system they were modelling would do. They needed 12-14 MB of table space for the system they were modelling, and our VAX 11/780 only had 4MB of RAM, so I played with a number of virtual memory operating systems (4.1BSD, SVR2.0p, various tunings) to get something that would survive being thrashed to death, and helped them do a lot of work on checkpointing their code, because their standard run took a week, and even if something else didn't cause the machine to crash, we'd get power hits during summer thunderstorms. After about two years of this, the price of RAM dropped to the point that we could afford to upgrade the machine to 16MB, which made our run time drop to about an hour....
Meanwhile, that Cray-1 of yours was mostly similar in performance to a Pentium 133, and some of the recent graphics chips have really immense memory bandwidth, though they're mostly running fixed-point or at best single-precision floating point rather than double-precision or quadruple-precision, so they're not *quite* a Cray-replacement even though they're faster at many things.
Very cool project (Score:1)
Our, ehrm... Big Iron... (It'll get bigger!! Really!!)
http://www.cs.unb.ca/acrl/
Grid Computing (Score:5, Interesting)
There are other tools out there which do this - Legion, Avaki, Sun Grid Engine, Globus, to name a few - but the goal is to create a network of (mostly) supercomputers which doesn't require a lot of reconfiguration at each site. What differentiates this work from many other approaches is that it is transparent to the system administrator.
For those who ask "why can't you just do something like seti@home", the answer is that not all problems in science and business can be easily decomposed into small chunks. Bandwidth requirements and latency may also be a problem. A lot of scientific programmers have to worry about communications much more than about processing power (although this tradeoff has been seesawing back and forth with new advances in both technologies).
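The compute-versus-communication tradeoff described above can be sketched with a toy cost model. The formula and all the numbers below are illustrative, not measurements from any real grid:

```python
def run_time(total_work, nodes, per_msg_latency, msgs_per_node):
    """Toy model: per-node compute shrinks as nodes are added,
    while total coordination cost grows with the node count."""
    compute = total_work / nodes
    communicate = per_msg_latency * msgs_per_node * nodes
    return compute + communicate

# A loosely coupled, seti@home-style job keeps getting faster
# as nodes are added...
loose = [run_time(1e6, n, 0.5, 1) for n in (10, 100, 1000)]
# ...while a chatty, tightly coupled job eventually gets
# *slower* with more nodes, as communication dominates.
chatty = [run_time(1e6, n, 0.5, 100) for n in (10, 100, 1000)]
print(loose)
print(chatty)
```

In this model the loose job's run time falls monotonically, while the chatty job bottoms out and then climbs again, which is exactly why problems that don't decompose cleanly gain little from a countrywide pool of machines.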
There's a worldwide effort through both business and academia to create a number of good, interoperating frameworks for doing this sort of transient, virtualised supercomputer.
Have a look at the Global Grid Forum [ggf.org] (which is becoming the focus for Grid computing standards) for more information.
similar to the Indian Grid Computer ? (Score:3, Informative)
They won't break Japan (Score:1)
I think that a year ago or so, the Japanese supercomputer for earthquake simulations had more power than the other top 499 supercomputers combined.
Sure, they'll be able to build a large, loose network of computers, but the access-speed will hardly compare to a single-site computer.
CISS (Score:3, Informative)
It seems that November 4th they will be doing a full 'production' test. Cool.
Based on SSH (Score:2)
No MS Passport or
Re:Based on SSH (Score:3, Informative)
Heh heh, the U of Alberta hosts the web and ftp space for OpenSSH and OpenBSD. Also, Bob Beck [ualberta.ca] works at U of A. Bob helped develop the first OpenSSH release [openssh.com], not sure how active he is these days.
For U of A, that all adds up to "premium class" tech support for anything to do with SSH.
First Task? (Score:4, Funny)
Use of Supercomputer (Score:3, Funny)
Re:Use of Supercomputer (Score:1, Flamebait)
"Cold again, eh?"
Reminds me of the weather forecaster character that I think was played by Steve Martin in "The Single Guy," who pre-recorded his LA weather forecasts.
There are already some Super Computers in Canada (Score:5, Informative)
Virtual Laboratory of Eastern Ontario.
The High Performance Computing Virtual Laboratory (HPCVL) was formed by a consortium of four universities located in Eastern Ontario (Carleton University, Queen's University, The Royal Military College of Canada, and the University of Ottawa).
http://www.hpcvl.org/
It's also in the Top 500 supercomputer list, so it must be half-decent. So if four universities can have a decent computer in Canada, others probably do too.
U of A Computational Resources (Score:3, Interesting)
The U of A (U of Eh?) also participates in MACI (www.maci.ca) and houses three SGI Origin computers, and is involved with the WestGrid project (www.westgrid.ca).
Prof. Schaeffer's point isn't that we don't have "computrons", but that research is increasingly using simulations (see Jaeger's work) and other computational methods, and computational resources are becoming increasingly overloaded as budgets are not growing as quickly as research advances.
Canadian Internetworked Scientific Supercomputer (Score:2)
shared memory virtual supercomputer (Score:2)
</joke>
Programmed, of course in C eh eh (Score:2)
However, the government is concerned that the supercomputer doesn't properly reflect Canadian values. For example, the color of the cases is a light beige, which is racist. So all computers as part of this net will be painted multi colors.
In addition, they must be used to do CANADIAN computation from a CANADIAN programmer at least one run time out of three.
And, to further show off Canadian innovation, they've developed a new language for this cluster, C eh eh.
Here's an example.
#define EH EH
#include "political_correctness.eh"
#include "Liberal_Party_Donation.eh"
#include "Kickback_via_golf_course.eh"
#include "Payoff_to_Bombardier.eh"
eh main (eh)
eh
printefe(FONT_DOUBLE_SIZE, "Bonjour, monde")eh
printf(FONT_HALF_SIZE, "Hello, world")eh
eh
Re:Programmed, of course in C eh eh (Score:2)
You enjoy your taxes being hiked? Oh, it's for health care, sure. The fact that Jean Chretien wants to lay $3B worth of high speed rail to help out Bombardier (who developed a high speed train even though there's NO high speed rail in North America) has NOTHING to do with it, oh no no no nooooooooooo
WestGrid (Score:2, Informative)
NEWSFLASH (Score:2)
Today, University of Alberta computer scientists announced an experiment involving hundreds of computers around the country in an attempt to build what may be the largest supercomputer in Canada.
"Mindnumbing" was the word Dr. Al Koholic used to describe the system devised by himself and his graduate students.
The system will include hundreds of 286s and 386s strung together with dental floss and toothpicks. "With recent budget cutbacks and a lack of funding we had to head back to the drawing board and devise this cost-saving, efficient design."
Dr. Koholic asks Canadians to "... donate any computer hardware they may have that is collecting dust
Once the supercomputer is complete it will be used to sort through the megabytes of data generated by the atom-smashing lab also located at the UofA. Physicists there are in search of the elusive 'Hangoverless-drunk' state obtained by smashing together beer molecules.
Most powerful computer yet! (Score:2)
"On Nov. 4, thousands of computers from research centres across the country will be strung together by a U of A effort to create the most powerful computer in this country."
Thereby surmounting the previous record holder [apple-history.com].
Canada to Launch Countrywide Virtual SuperComputer (Score:2)
Re:Beo..wait. (Score:1)
Re:Beo..wait. (Score:1, Offtopic)
Go get some sleep.
Re:Not hard... (Score:1)
Re:Imagine... (Score:2)
Re:Security (Score:2, Informative)
Re:Most powerfull computer in Canada? (Score:2, Funny)
Easy - just keep putting the Liberals into office. Their buddies make out like bandits. The rest of us will be back to an abacus and tin cans & string in no time.