
World's Oldest Working Computer On Display

riflemann writes: "The Sydney Morning Herald has an article today about the world's oldest working computer finally having a permanent place in the Melbourne Museum. It's good to see such a historical computer, over 50 years old, being put on display permanently." Seeing this reminds me of reading Cryptonomicon - of course, the definition of "oldest" and "working" is up for grabs, but as a BA in History, it's cool to see stuff like this put on display for all to see.
  • by supersnail ( 106701 ) on Friday January 05, 2001 @02:47AM (#528999)

    I don't think this phenomenon is peculiar to computers. The computer (that is, the stored-program electronic computer) has been around for about forty years.

    In 1940 aircraft had been around for about forty years, and in 1940 an aircraft from 1935 would have looked positively antique. I mean, two wings! Canvas covering! Wooden frame! No supercharger! Compared with an aluminium monocoque monoplane with four valves per cylinder, supercharged and pressurised, of the '40s.

    I really think computer development is in for a big slowdown in the next ten years. The main reason being that in two years' time we will have more computing power than we know what to do with. We would have reached this stage already if "modern" computers didn't spend most of their time running a bloated operating system rather than doing useful work.

  • How long were "words" then?

    I don't know anything about this machine (I'm a Brit) but machinery of that era often used long words, that would still be a respectable length today. Computers then were mathematical number crunchers, not text processors, so the data word was usually long enough to hold a floating point number in a single word. In the '70s, mini computers started to be dedicated to handling real-time data from A/D converters (often 10 bit) and so they in turn used words of 10 or 12 bits; tailored closely to the size of their most significant external data, not their internal chippery.

    Secondly, another poster mentioned mercury being used for memory. This would have been an acoustic delay line, and some of the architectures with those were wholly serial machines - effectively single bit parallel. Only one bit at a time was represented electronically, the rest were being stored as acoustic signals travelling down a pipe full of mercury.

    As a complete guess, I'd expect the CSIRAC to be a serial machine with one bit words.

    Bytes only became significant when eight-bit memory ICs became available as commodity products. The natural word size of the valve and the soldered joint is a single bit, so '50s-generation kit simply didn't have the same fondness for standardised word lengths that we know today.
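
    The serial delay-line scheme described above can be sketched as a toy model. This is only an illustration of the principle (bits circulating through the mercury column, with exactly one bit accessible electronically per bit-time), not how CSIRAC actually organised its store; the class name and word size are made up for the example.

```python
from collections import deque

class DelayLineMemory:
    """Toy model of a mercury acoustic delay line: the stored bits
    circulate in order, and only the bit emerging at the transducer
    is represented electronically at any instant."""

    def __init__(self, bits):
        self.line = deque(bits)  # bits "in flight" in the mercury column

    def step(self):
        # One bit-time: the emerging bit is read and re-injected
        # at the far end of the line, so the pattern keeps circulating.
        bit = self.line.popleft()
        self.line.append(bit)
        return bit

    def read_word(self, length):
        # Access is inherently serial: reading a word means waiting
        # one bit-time per bit for it to come around.
        return [self.step() for _ in range(length)]

# A 16-bit line holding the pattern 1010...
line = DelayLineMemory([1, 0] * 8)
print(line.read_word(16))
```

    Note that after a full read the line is back in its original state - recirculation is what makes the storage persistent at all.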

  • Why can't we dig up Turing and put him on display

    if you want to see something disgusting, just wait a little longer - the goatse.cx guy will be along shortly.
  • This is on display at the Manchester Museum of Science and Industry and is worth a visit.
  • by acb ( 2797 ) on Friday January 05, 2001 @02:57AM (#529003) Homepage
    I once heard a speech from a (now retired) academic who worked with CSIRAC. He said that the CSIRAC project was scrapped in 1964 because the British Foreign Office had a word with the authorities in Australia and sternly reminded them that Australia had no business doing research not related to mining or agriculture, and such projects belonged in the UK.
  • Even more fascinating would be if Babbage had played with Faraday's solenoids - it would have been a more interesting bridge from the mechanical to the electro-mechanical to the electronic world. I suspect that he would have had greater success storing values in solenoid-state RAM.

    Punch cards could still be used, but there would be a drive to make the solenoids smaller, and perhaps to explore magnetic storage à la core memory at an accelerated pace.

    I keep pictures of Charles Babbage and Lady Ada on my wall in my cube, and it's amazing how many people don't realize who they are.

  • I've still got a working Sinclair ZX-81 with 2K of RAM -- also had the 16K RAM expansion pack. 'Course now I guess I've dated myself....
  • First of all, no I do not want to see anything associated with goatse.cx. I think the next version of SlashCode should have filtering abilities... anybody posting text or a link about goat* should be instead nuked and suffer ping flooding... ;) Maybe work in something to crash a Windows box if anyone is silly enough to be using one...
  • I'm not sure about that... I'd assume they mean electronic computers, as abacuses have been in use for much, much longer than 50 years, and the slide rule was invented in 1895, I believe. However, I believe that even before this "computer" people were using potentiometers to modulate voltage, thereby multiplying and dividing.
  • dude...wrong link! [photopoint.com]
  • What he said, but I believe this is more of a replica than a restoration? I seem to remember something about the original being destroyed after the war (to keep it secret).


    Hacker: A criminal who breaks into computer systems
  • I really think computer development is in for a big slowdown in the next ten years. The main reason being that in two years time we will have more computing power than we know what to do with.

    Dunno - I think there's plenty of room for wanting more powerful processors. Simulations & virtual environments will pretty much eat any amount of power you can throw at them (the more you throw, the better they get). I can also think of some uses for personal-level data mining which will require heavy horsepower.

  • No, PPL Utilities, formerly known as Pennsylvania Power [pplweb.com], has got you covered with 2 nuke plants. *sucks 300A from his 100A service panel and uses his ion gun to explode the neighbor's transformer*
  • They never buried Lenin. He was embalmed, and he's in a case in the Kremlin. You can go right up and look at him. Stalin was in there, but controversy had him moved into another building.
  • There's actually a whole genre of SF devoted to this...do some Googling on "steampunk".
  • Because the Smithsonian Institution has ENIAC and it still works. They rotate parts of it out into display every so often.

  • Also check out Bruce Sterling's "The Difference Engine". Drags in Lady Ada Lovelace as well.

  • The Difference Engine by, uh, Stirling and Gibson was it? Good book, it explores this idea.
  • You have to remember also that the Difference Engine was never actually built - that which sets the CSIRAC computer apart from other systems of its time is that it is still maintained in whole. As the article on the SMH rightly points out, other systems of that time have been cannibalised as part of their ongoing development.

    Rob
  • Turing was Jewish?

    I suppose we could dig up his corpse, they did it with Lenin and he's still there...
  • by EABinGA ( 253382 ) on Friday January 05, 2001 @04:51AM (#529019)
    During 1936 to 1938 Konrad Zuse developed and built the first binary digital computer in the world (Z1). A copy of this computer has been on display in the Museum for Transport and Technology ("Museum für Verkehr und Technik") in Berlin since 1989. Its construction was personally supervised by Zuse himself.

    The first fully functional program-controlled electromechanical digital computer in the world (the Z3) was completed by Zuse in 1941, but was destroyed in 1944 during the war. Because of its historical importance, a copy was made in 1960 and put on display in the German Museum ("Deutsches Museum") in Munich.

    Next came the more sophisticated Z4, which was the only Zuse Z-machine to survive the war. The Z4 was almost complete when, due to continued air raids, it was moved from Berlin to Gottingen where it was installed in the laboratory of the Aerodynamische Versuchanstalt (DVL/Experimental Aerodynamics Institute). It was only there for a few weeks before Gottingen was in danger of being captured and the machine was once again moved to a small village "Hinterstein" in the Allgau/Bavaria. Finally it was taken to Switzerland where it was installed in the ETH (Federal Polytechnical Institute/"Eidgenossisch Technische Hochschule") in Zurich in 1950. It was used in the Institute of Applied Mathematics at the ETH until 1955.

  • Cryptonomicon was great. Has Stephenson written anything since then that is any good? My bookshelf is looking a little dry atm and I need some new books. Will be getting all the Ender series after reading the first one [which blew me away, great book]. Can anyone suggest some other good books? -sol btw, asimov owns all ;P

  • Uh.. a Beowulf cluster of these (or any similar antiques!) would probably bring down the power grid in the entire state!
    Imagine the scene at the power company -
    "Oh *shit* the computer museum's doing a demo again - quick! bring Three Mile Island back online - NOW!"
  • Do you suppose that Turing would have wanted to have his rotting corpse displayed?

    Alan Turing naked and petrified?

    No, thanks.
    __
  • Hey Eniac:

    CSIRAC's in position.
    Soon we begin the endgame.

    Your pal, Hal 9000
  • Now that's what I'd call imaginative naming. ;-)

    --
  • Konrad Zuse worked in Nazi Germany. That could mean something.
    __
  • You'd have a hard time finding anyone today who could surpass the intricacy of some clockwork and steam powered mechanisms they made.

    Not at all true. There is no human who can possibly match the precision of a seven-axis laser-guided computer-controlled milling machine. With these amazing (and huge!) devices you basically create a 3D mathematical model (using curves, not polygons), provide a set of hints as to the sequence in which it should shape the metal and press GO. It will machine any shape you'd like to tolerances of billionths of an inch and do it in minutes.

    Sure there were clever engineers and highly-skilled craftsmen, but computers can manipulate far more complex shapes than any human and no matter how light your touch is on the micrometer you'll never match laser interferometry for precise measurements.

    SciAm published a report a few years ago on the scientists and engineers who constructed a working replica of Babbage's difference engine. They had a difficult time doing it and by the time they were done they'd had to machine many parts to tolerances that were probably unachievable without modern machining technology. It's *possible* that with enough money and enough determination Babbage could have got the thing to go, but the conclusion of the SciAm article was that it's not very likely.
    --

  • I read somewhere (sorry, I've forgotten where) that Moore's law would not work with mechanical devices like Babbage's engines.

    Doesn't Moore's law partly depend on stuff getting smaller and smaller? Aside from nano wouldn't you have trouble making gear wheels small enough to pack enough of them into a reasonable space?
  • Tough decision to make, but I feel it is reasonable and valid to try to make the thing work (I'm steering clear of "practical").

    Sure, it could be done, but getting it going again would be like taking the Wright Brothers Flyer out for a test run. It'd be great while it worked, but what happens when you break something - and with a tube computer you *would* get many failures.

    If you really wanted a working tube computer, building a replica would be a far more responsible thing to do. At least that way the original would be left intact.

    Anyway, what would be the point? We know exactly how the machine worked, we've got an emulator for it, it's preserved so if anyone wants to study the physical layout and engineering techniques it's all there, and most importantly, real efforts to record the history of the machine have been made while many of the people that were involved in its use are still alive.

  • The main reason being that in two years time we will have more computing power than we know what to do with.

    640 K should be enough for anyone.

    The market for computers is probably 5 or 6 in the whole world.
    __
  • Digital video editing is going to be the next great PC application, and can chew up as much CPU, memory and disk as we can feasibly throw at it for at least ten years. Remember, HDTV is coming, and filming in it quadruples (roughly) hardware requirements there and then.

    And no, dedicated hardware isn't the be-all and end-all. Compression and decompression might be handled by special-purpose hardware, but special effects (fades, wipes, and the myriad effects that are used routinely on still images with programs like the GIMP) are going to be performed on general-purpose CPUs.

  • Lenin is actually not in the Kremlin per se but is on display in a mausoleum in Red Square, adjacent to the Kremlin. The room is only open for brief periods each weekday and they restrict the number of people who can go in there simultaneously so as to stop Lenin from falling to bits. Stalin was never in there.
  • No, there were bytes before ICs came into common use. Mid-60s IBM and Burroughs machines were byte-based, and I don't think that they were the first.

    My guess is that the adoption of transistor technology (rather than ICs) was the turning point.

  • Oughtred invented the slide rule in 1621, based on an earlier but less convenient gadget called Napier's Bones. Napier was the same clever fellow who figured out much of how logarithms work.

    There were a number of analog computers (both electronic and mechanical) in use prior to digital computers. But they weren't general-purpose programmable machines. They were built to embody one or a small set of algorithms, and that's what they did. Examples include astrolabes, artillery calculators, bomb sights, etc. Most seemed to be connected with surveying, navigation and blowing things up...

  • Those were the Colossus machines, the high-speed, electronic but not-quite-stored-program code-breaking devices that Turing helped develop at Bletchley Park. Churchill ordered that these (there were several) were to be broken up into pieces 'no larger than a man's fist'. The precise reason for vandalising these ugly but historic beasts has never been satisfactorily explained.
  • my grampa's older than 50!
  • by Jester998 ( 156179 ) on Thursday January 04, 2001 @10:01PM (#529036) Homepage
    This is definitely cool... I hate to see any sort of technology go to waste. Hell... I have a Commodore 64 and a Commodore PET sitting downstairs.... at least I'll be able to tell my kids "See what WE had to put up with when we were 8 years old? REAL hackers don't need 4GB of RAM..." ;)
  • by Pseudonym ( 62607 ) on Thursday January 04, 2001 @10:10PM (#529037)

    It doesn't work, it's just intact. There's actually an archival issue here. Do we keep CSIRAC "as it was", or do we restore it to working order (and keep replacing valves as they burn out, at the rate of at least one per day)?

    When I was at the University of Melbourne, we were lucky enough to get a guided tour by one of the original members of the computational machinery laboratory. It's quite easy to see how the meme of the computer as ominous "electronic brain" took hold when you can literally walk through it.

    CSIRAC not only had a hard disk (one platter, with a motor which delivered such low torque that you needed to put some pressure on the drive belt with a screwdriver to get it to spin up; I believe one of the engineers still has the screwdriver), but it also had a high level language, the interpreter for which fit in its (off the top of my head) 768 words of memory.

    Oh, and another anecdote: When CSIRAC lived in The University of Melbourne, it was first housed next to the particle physics laboratory, which caused some scheduling problems, because CSIRAC wouldn't work when the cyclotron was firing. They also had difficulty with the mercury memory in hot weather, but I suspect all the early computers had that problem.

  • Not that it's directly relevant, but I just have to suggest Neal Stephenson's The Diamond Age, as it deals with (neo) Victorians and (micro) mechanical computers, both of which you mentioned. Hemos also mentioned Cryptonomicon in the /. story, which is one of Stephenson's other books... Damn, he's a good writer.
  • by Lover's Arrival, The ( 267435 ) on Thursday January 04, 2001 @10:05PM (#529039) Homepage
    It is funny how in the computer industry things age so much quicker than in any other field of human endeavour. Computers that are 5 years old are regarded as antique, so I don't know what this would be termed ;)

    One thing I have always wondered about historical computing is the "what if" question. In this case, what if Babbage had achieved commercial success with his difference engine? I have wondered just how advanced a purely mechanical computer could be. What if the Victorians had thrown boundless cash at mechanical computers? Just how advanced could we reasonably hope these computers to be? I am most interested ;)

  • The Difference Engine was not a programmable computer in the modern sense of the word. This biography [vt.edu] explains that the Analytical engine, which Babbage designed but never built, would have been a real programmable machine.
  • by Chuck Flynn ( 265247 ) on Thursday January 04, 2001 @10:05PM (#529041)
    I realize "the world's oldest computer" makes for a good showpiece in a museum, being inanimate and all, but I personally would rather see more recognition given to the people who forged the revolution and not just the objects the people built. As someone who got started with IBM not too long after they did the translation hardware for the Nuremberg trials (back before "business machines" meant "computers"), I still get some weird looks in this industry dominated by young upstarts. Why can't we dig up Turing and put him on display? That'd be a lot more informative than a hunk of springs and diodes, if you ask me.

    If you ask me... People don't seem to ask me much anymore. Please?
  • by tbo ( 35008 ) on Thursday January 04, 2001 @10:06PM (#529042) Journal
    ...electronic digital computer... Let's not be platform-biased.

    Don't forget the work of Charles Babbage, such as his Difference Engine [reading.ac.uk]. I'm sure there were other computers before this one that still work (I think one of Babbage's still does).
  • Can it run Linux? If it don't run Linux, I don't give a crap.
  • "Downloading"?? Whoa, sonny, are you from the future or sumthin'??? Back in the day.... ;)
  • I have a working Vic-20 with the whopping 4k ram expansion still floating around somewhere ;) Talk about being l33t at the time.
  • Thanks very much. I have wanted to read Cryptonomicon for some time now. I have read some of his essays, but never any of his books. It's just one of those things I have been meaning to get around to.

    Bye :o)

  • Check out the Universal Machine exhibition at the Powerhouse Museum [phm.gov.au]. They've got lots of cool stuff on display, like a specimen piece from Babbage's Difference Engine, an analogue synthesiser, an Enigma cipher machine, and a robot cow.

    BTW, where was CSIRAC when I was in Melbourne for CALU [linux.org.au]? :-(

  • by jonfromspace ( 179394 ) <jonwilkins@NOSpaM.gmail.com> on Thursday January 04, 2001 @10:21PM (#529048)
    A beowulf cluster of these... would still be slow. However, it would be bigger than my house!
    um.. I'm done, you can stop reading...
  • I take it you mean never built by Charles Babbage. London's Science museum built one some time ago.

    A working component was completed though. This was in a sense programmable (it could add numbers together and be instructed to add a different number on the 100th iteration, for example).
  • Ye gods, that's sick. Do you suppose that Turing would have wanted to have his rotting corpse displayed?

    I'm all for honouring the people who made the computer age possible, but there's got to be a better way... wax replicas, maybe.

    Besides, let's face it, outside of Golden Age science fiction, most scientists are not broad-shouldered, square-jawed, manly explorers of the unknown. Who the Hell(tm) wants to look at a bunch of lab-coated geeks? (If you find any, especially if they're cute women, let me know, so I can dig up a lab coat.)

    And you're going to get weird looks if you're older than 30 and are good with computers. In this day of script kiddies and 733t h4x0rs, they can't comprehend adults being able to do anything other than answer e-mail.

    Just my 2 shekels.

    Kierthos
  • Yes ... my bad ...

    The original engine that was built by Babbage's engineer, Joseph Clement, consisted of about 2000 parts but was only a small portion of the envisaged difference engine. The engine was never completed and most of the parts produced were later melted for scrap. The engine later built by the London Science Museum was completed in 1991. Information on this system can be found at the following sites:

    http://www.museums.reading.ac.uk/vmoc/babbage/ [reading.ac.uk]
    http://www.nmsi.ac.uk/on-line/treasure/objects/1862-89.html [nmsi.ac.uk]

  • Chuck Flynn: Why can't we dig up Turing and put him on display?

    Because Turing was gay, and The Establishment won't allow gay people to be perceived as heroes.

    Any homophobe who uses a computer is a hypocrite.

    --

  • I assume that they might crank it up from time to time but why not leave the thing going? Perhaps it could be the machine that cracks RC5 or, even better, Seti@home. Irvu.
  • by Anonymous Coward
    At http://www.cs.mu.oz.au/csirac/csirac.html there's a slightly more complete history than I've managed to find anywhere else.
    I mean two wings! canvas covering! wooden framed! no supercharger! compared with an aluminium monocoque monoplane with a four valves per cylinder supercharged pressurized plane of the 40s.

    Hmmm, I wonder if anyone ever told Geoffrey de Havilland he was behind the times with the Mosquito (aka The Flying Sofa, because it was made of wood & canvas by furniture crafters) during WWII. And I wonder if anyone from 418 Squadron (or any of the other Mosquito squadrons) ever realized it, either. Anything I've ever read about it praises the plane for its adaptability (everything from "night intruder" to "light bomber"), handling, and general airworthiness, although one author does relate that the Mosquito did have an occasional tendency to come unglued. (Ooops... Well, on the other hand, so do we all...)

    Interrobang, callously disregarding Karma as usual
  • Actually the Victorians were really good with mechanical tech. They had to be; it was pretty much all they had. You'd have a hard time finding anyone today who could surpass the intricacy of some clockwork and steam-powered mechanisms they made. If the necessary investment of money and expertise could have been made I think they could have realized Babbage's Engines on a large scale. Not easily and not overnight, but I think they'd have managed.
  • I am going to disagree. Joe User is not going to be doing any real sort of video editing. The average user can't even properly edit still images, much less do anything with video. Case in point: my aunt. She spends hours drawing things in Microsoft Paint. I installed Ulead ImagePals for her - an older but very capable photo/art program. I picked that one because I had an extra install CD. (I don't pirate programs.) She was back to using MS Paint within 2 days. ImagePals was too complicated for her. Do you really think she could handle video editing?
    No, in my opinion Joe User will never be able to handle any video editing more complicated than cropping parts of clips out, and maybe putting different clips together. If Joe User can't do it then there is no reason for him to buy that 4 GHz Athlon Thunderbird 7.
    Now the other big thing that drives the PC industry is video games.
    There are several problems slowing advancement there.
    One of the biggest is that PC makers still don't put 3D cards in computers, and use shared memory when they do. (A few exceptions exist.)
    Joe User has a Winmodem and AOL. Try playing a real multiplayer online game with either one of those (much less both).
    Video game graphics have matured and will slow down in their advancement. 2D graphics basically maxed out in the early '90s. (New 2D games really don't look a whole lot better than Super Nintendo games did back then.) The same is beginning to happen to 3D graphics also. Compare the quality jump from Doom to Quake to the difference between Quake and Quake 2. The difference in quality is even more minor between Quake 2 and Quake 3.
    The Pentium 166 that I am posting from now (work computer) serves perfectly for the Internet and document-editing purposes that it is used for.
    So where is the continued market in making computers faster and faster?
    I would love to see clock speeds frozen in place for the next ten years while everything else catches up.
  • Just how advanced could we reasonably hope these (mechanical) computers to be?

    Not very. Major efforts were made in mechanical computing up until about 1950. Here's a list of the fastest computers in the world in 1956 [gapcon.com], from an era when some mechanical machines were still in use. You'll see "IBM Tabulator" in that list, rated at about 0.4, while the UNIVAC I comes in at 8755 and Whirlwind at 500000. The unit of measure appears to be IPS (instructions per second), i.e. 10^-6 MIPS.

    There were some neat electromechanical machines. The IBM 602A - an electromechanical, punch-card-using, plugboard-programmed machine that could add, subtract, multiply and divide - probably was the most powerful ever to be produced in quantity. Those machines did much of the scientific number-crunching of the early post-WWII era. Some of them were still grinding away into the 1970s. The 602A took several seconds for a multiply. That's about as good as it ever got. Inertia and wear set rather low limits on what could be done mechanically.

  • Curators of the Science Museum in London measured Babbage's surviving components and found them to be well within tolerances. However, he had to pay an awful lot for them. Their conclusion was that he could have built the engine if he'd had less determination, and compromised on the specifications (e.g. by using 10-digit denary numbers instead of 30). But Charles Babbage was a very stubborn man.

    The Science Museum has successfully built part of Difference Engine No. 2 and is now working on its printer.


  • Uhhhhhhm... Okayyyyy...

    1. If you're referring to the Baby, it wasn't built until a couple of years after the end of WWII.

    2. I find the idea that a university would destroy a computer in order to keep it secret a bit far-fetched, I'm afraid.

    Y'see, the problem with destroying an invention like the computer in order to keep it secret from one's enemies (e.g. the Commies), hence preventing them from benefiting from the knowledge, is that you kind of lose out on benefiting from that knowledge yourself...

    So, apologetic skepticism is the main impression you should receive from me here. :-)

    D.

  • Hey! that could be put to use somewhere as a server or something! Why store it in a museum? Tis perfectly useful! I could hollow it out and use it as a bedroom!

    Also.....i still have an old Tandy 1000 that I still use every now and then...no reason to kill old computers..there is always some use for them. (even if it is for a coffee table to sit on)

  • More reasons not to restore it to working order:
    • It used approximately 30 kilowatts of electricity.
    • It wouldn't exactly meet current electrical safety standards
    • Restoring it to meet those standards would remove its authenticity
    • In any case, there is no need to physically restore it. Provided the machine's instruction set and timing details are kept, an emulator can be (and I believe has been) written, so the programs that ran on it can be preserved far more effectively than, say, old films or recordings.

    In any case, if you're interested in the history of computers, Australia isn't such a bad place to visit these days. As well as CSIRAC in Melbourne, the Powerhouse Museum in Sydney has a piece of a Difference Engine. There's more to see than kangaroos, the Opera House, and the Barrier Reef :)

  • ...this is providing, of course, that it doesn't run Windows 1949. (That version took five minutes just to go to the Blue Screen of Death.)
  • Ahhh, cassette tape drives. Gotta love 'em. Weren't those the coolest? I wish today's boxes could tap right into the TV. I'd finally have a 19" color monitor!


    .
  • "In its day CSIRAC was a marvel, but today any decent desktop computer can do more work in 30 seconds than CSIRAC did in its entire 15-year service."

    If we take decent to mean any computer that can get about 20 FPS (that is to say, consistently playable performance) in Half-Life, and this sentence to NOT mean a LOT more work... then this machine would get about 40 frames per year, or about .00000127 frames per second. Enough maybe to play against someone traveling near the speed of light, I guess.
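
    The back-of-envelope arithmetic above is easy to check. This just replays the comment's own assumptions (20 FPS, 30 seconds, 15 years - none of these figures come from anywhere else):

```python
# If 30 seconds of a "decent" desktop equals CSIRAC's entire
# 15-year output, and "decent" means 20 frames/sec in Half-Life:
frames_total = 20 * 30                # 600 frames over 15 years
frames_per_year = frames_total / 15   # 40 frames per year
seconds_per_year = 365.25 * 24 * 3600
frames_per_second = frames_per_year / seconds_per_year
print(frames_per_year, frames_per_second)  # 40.0, roughly 1.27e-06
```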

  • Pseudonym wrote:

    > It doesn't work, it's just intact.

    Good point. What would be (IMHO) much more interesting are oldies which still do their stupid jobs.

    If someone had a webcam pointing at a computer that has been printing bills since 1980 - THAT would make a great story! Or if a boy called B1FF still hab his C64 online - well, that could be a myth.

    What about another uptime contest, minimum uptime ten years (no, I can't contribute)?

    [x] ulf
  • [...] an emulator can be (and I believe has been) written [...]

    It has, and I believe it ran something like three orders of magnitude faster than the original machine on 1995 hardware.

  • Powerhouse has a TRS-80 Model One display that's running an Apple program. I wonder how that happened.

    The new butt ugly [abnormal.com] Melbourne Museum [vic.gov.au] isn't fully open yet. Their sci/tech section is still closed but admission is 1/2 price for the 1/3 that is open.

    If you have a need to be close to the oldest surviving computer, I've got a room for rent across the street from the butt ugly [abnormal.com] building.
  • Ahhh, but how long does it take to compile 2.4.0 on it? :-)

  • Alan Turing went to work at Manchester University in the UK after the war, and a lot of work on early computers was done there. Check out the History of the Computer Science Department [man.ac.uk] page for some interesting history, including the world's first stored-program computer, which was restored and made operational again a few years back (simulator [man.ac.uk] available!), and the prototype for the world's first manufactured production computer. Manchester was also the birthplace of virtual memory.


    D.

  • Yeah, but he blue-balled us at the end of Diamond Age. Cryptonomicon blows away his other books. You're a lucky bastard, by the way, getting a score of 2 for something so far off topic (fully aware I'm even more off topic...)
    I wish today's boxes could tap right into the TV. I'd finally have a 19" color monitor!

    There are adapters that convert VGA to TV, and video cards with TV-out. But the 19" TV color monitor isn't so cool with its lousy resolution. Only 625 lines (PAL) or 5xx (NTSC)...

  • From the article:
    How did he manage with a computer whose memory could hold only 2,000 bits of information - about as much as a couple of e-mails?

    Does "bit" here mean "binary digit" as we know it today, or does it mean a discrete piece of information, such as a character or opcode? How long were "words" then? 2000 bits won't even contain the headers of a "couple of e-mails". Did they mean "bytes", or did they mean this kind of e-mail:

    Subject: [no subject]
    Date: Sat, Dec 23 2000 22:30:01 -0600 (CST)
    From: 1337h4xx0r15840924@aol.com
    To: subscriptions@hotsluts4u.com

    Mee to!

  • What if the Victorians had thrown boundless cash at mechanical computers? Just how advanced could we reasonably hope these computers to be?

    They had a long way to go. First, they weren't that good at mechanical technology. That is one of the reasons Babbage didn't get his computer to work: it needed lots of parts made with very high precision.

    Take the trouble of advancing the mechanical tech, and what do you get? You could perhaps get a few hundred Hz with today's mechanics. If you want more power you'd need to go parallel instead. We still aren't good at that for general-purpose machines, though of course you can do it for special-purpose machines. Oh, and a 100 Hz mechanical computer would wear down fast; working parts would have to be replaced at regular intervals.

  • A "bit" meant the same then as it does now -- a binary digit. Bytes hadn't been invented yet, and computers dealt purely in numbers.

    I am not sure what the word length was, or if it did floating point, but I think it must have, as fixed-point calculations are not much good for scientific problems!

    So even with very short floats, say 10 bits for the mantissa and 6 bits for the exponent, you could only store at most 125 numbers, and you would have to fit the program in there somewhere as well.

    As far as programming goes, it was probably about the same as one of those very early Texas Instruments programmable calculators, which could store a whole 16 numbers!
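
    A quick sanity check of that figure. The 10-bit mantissa / 6-bit exponent split is the poster's guess, not the machine's documented number format:

    ```python
    # Back-of-the-envelope capacity check: how many short floats fit in
    # a 2,000-bit store? (10+6-bit format is a hypothetical assumption.)
    TOTAL_BITS = 2000      # the machine's whole memory, per the article
    WORD_BITS = 10 + 6     # assumed mantissa + exponent per number
    capacity = TOTAL_BITS // WORD_BITS
    print(capacity)  # -> 125, before leaving any room for the program
    ```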

  • Surely that counts as the 'world's oldest working computer'.

    Go on, admit it: if you were living a few thousand years ago, you'd look pretty geeky with an abacus.

    Even better, the abacus was patent free ;)
  • The Difference Engine - William Gibson and Bruce Sterling
  • There's a lot of truth in what you say, but don't neglect the theatre. Derek Jacobi's role in "The Enigma of Intelligence" was a superb homage to Turing, and at the time (late '80s) Turing was almost unknown.

    I don't know if there's any film or video version of this play, but catch it if you can.

    Besides, I don't believe it's just homophobia. Alan Blumlein (One of several people with a good claim to be "The Inventor of Television") was a straight contemporary of Turing, yet is even less well known today. There's a recent Blumlein biography (Amazon [amazon.co.uk]), but I was less than impressed with it.

  • And one system still a few years older is Zuse's computer, which ran in the early 40s (and is thus not very well known, as Germany wasn't well behaved, to say the least). The first machine was programmable and based on relays -- not very fast, as you can imagine, but still computing. The machine is on display at the Deutsches Museum in Munich (Bavaria, Germany) and (last time I checked) is still operated on request/schedule.
  • One thing I have always wondered about historical computing is the "what if" question. In this case, what if Babbage had had commercial success with his difference engine? I have wondered just how advanced a purely mechanical computer could be. What if the Victorians had thrown boundless cash at mechanical computers? Just how advanced could we reasonably hope these computers to be? I am most interested.

    Check out "Of Tangible Ghosts" by L.E. Modesitt, Jr. It's set in an alternate past where Babbage's Engine sparked a "technology explosion" - but with mechanical computers. I believe he's got some other novels set in the same universe, but I haven't had time to read them (so many books, so little time).
