Xerox Alto Source Code Released To Public

zonker writes: In 1970, the Xerox Corporation established the Palo Alto Research Center (PARC) with the goal of developing an "architecture of information" and laying the groundwork for future electronic office products. The pioneering Alto project that began in 1972 invented or refined many of the fundamental hardware and software ideas upon which our modern devices are based, including raster displays, mouse pointing devices, direct-manipulation user interfaces, windows and menus, the first WYSIWYG word processor, and Ethernet.

The first Altos were built as research prototypes. By the fall of 1976 PARC's research was far enough along that a Xerox product group started to design products based on their prototypes. Ultimately, ~1,500 were built and deployed throughout the Xerox Corporation, as well as at universities and other sites. The Alto was never sold as a product but its legacy served as inspiration for the future.

With the permission of the Palo Alto Research Center, the Computer History Museum is pleased to make available, for non-commercial use only, snapshots of Alto source code, executables, documentation, font files, and other files from 1975 to 1987. The files are organized by the original PARC server on which they resided, as restored from archive tapes. An interesting look at the retro-future.
  • by Anonymous Coward

    osage writes: Several colleagues and I have worked on an open source project for over 20 years under a corporate aegis. Though nothing like Apache, we have a sizable user community and the software is considered one of the de facto standards for what it does. The problem is that we have never been able to attract new, younger programmers, and members of the original set have been forced to find jobs elsewhere or are close to retirement. The corporation has no interest in supporting the software. Thus, in th

    • maybe paying a competitive wage would help.

    • Upload everything you have to an open repository (github, sourceforge?) and create a torrent on TPB. Once you have a few seeders, it could become immortal...

      ... or not.

      You can learn a lot from how people used to do things. And why they stopped.

  • even back then.... (Score:2, Interesting)

    by Anonymous Coward

    http://www.computerhistory.org... [computerhistory.org] (from tfa)

    they knew the best display aspect ratio for getting work done

    • by rioki ( 1328185 )

      What you are referring to is not aspect ratio, but screen orientation. But yes, you are correct: for many applications, portrait is the favorable orientation.

      • by Urkki ( 668283 )

        What you are referring to is not aspect ratio

        I believe he's referring to the specific 0.75 (606:808) aspect ratio. Whether that is the best, I don't know, but it is an aspect ratio.

      • by wed128 ( 722152 )

        you're technically wrong (the worst kind of wrong)

        the screen is always oriented so English text goes from left-to-right (not top-to-bottom). The fact that the monitor in the image happens to have a near-1:1 aspect ratio does not mean it's 'oriented' differently; it's actually built this way.

        • by rioki ( 1328185 )

          Have you actually looked at the picture?! The monitor is in portrait orientation... That looks like a 4:3 ratio, or rather a 3:4.

          But don't take my word for the meaning of the word "orientation" when it comes to displays or paper, take Wikipedia's article: Page orientation [wikipedia.org].

          • So you are saying that the aspect ratio seems to you to be a more productive 3:4 rather than the older standard of 4:3?


  • Is there an emulator this would run under?

  • by msobkow ( 48369 ) on Wednesday October 22, 2014 @08:41AM (#48202615) Homepage Journal

    If possible, it would be interesting to cross compile the code to a modern processor and see how fast it would fly, given the limited capabilities of hardware at the time. Remember, we're talking about 1MHz 16-20 bit processors at the time the project started, if that.

    • by CastrTroy ( 595695 ) on Wednesday October 22, 2014 @08:50AM (#48202671) Homepage
      It looks like there's some assembly code there, from my browsing, which might be difficult to cross-compile. I would guess from the age of the code that a fair amount of it is assembly. It would be possible to run it on an emulator, and even that could yield some serious speed gains.
      • Might still be possible to convert the assembly into C. An inefficient way of doing things compared with a proper conversion, but it should be faster than an emulator.
        • by tibit ( 1762298 )

          I don't think it'd be inefficient at all if you use a modern compiler. If the compiler happens to notice some loops and vectorize stuff, it may be actually way more efficient per clock cycle than the original machine was.

        • Use an assembler emulator instead, or a machine-language emulator, which is common. This was a very custom-built CPU, with custom features.

          I suspect there may be stuff there that is binary-only, so that machine emulation is the way to go. For example, later on Smalltalk would come as images, and while you had the source code what you did not have was a way to bootstrap that code into an image easily.

      • by Greyfox ( 87712 )
        It's probably fairly simple assembly. It wouldn't have been more than an 8-bit architecture. Anywhere it calls into any firmware might be problematic, but it'd probably be possible to read the asm and port it to newer asm or to C (either one probably about as easy).
        • by Anonymous Coward

          Both of your assumptions (simplicity and 8-bit architecture) are wrong. While the Alto's processor itself was microcoded, this microcode itself could be re-written to suit the task at hand, so even the assembly instructions themselves vary depending on what microcode was in use for a given task. The Alto's hardware was ridiculously baroque by today's standards.

        • by frank_adrian314159 ( 469671 ) on Wednesday October 22, 2014 @10:13AM (#48203253) Homepage

          It was a 16-bit architecture. Use the Wiki [wikipedia.org]:

          Alto was a microcoded design but, unlike many computers, the microcode engine was not hidden from the programmer in a layered design. Applications such as Pinball took advantage of this to accelerate performance. The Alto had a bit-slice arithmetic logic unit (ALU) based on the Texas Instruments' 74181 chip, a ROM control store with a writable control store extension and had 128 (expandable to 512) kB of main memory organized in 16-bit words. Mass storage was provided by a hard disk drive that used a removable 2.5 MB single-platter cartridge (Diablo Systems, a company Xerox later bought) similar to those used by the IBM 2310. The base machine and one disk were housed in a cabinet about the size of a small refrigerator; one additional disk could be added in daisy-chain fashion.

          It would be relatively simple to come up with an emulator that could run well. Although I'd rather see a Dandelion clone, anyway - I knew all about the AMD 2900 series, back in the day.

        • It's microcode, not assembly. I also wonder how the physical HW would get emulated. The Alto had some pretty nifty barrel processing inside, with the microcode acting as both device drivers and CPU in a round-robin fashion. I wonder if you wouldn't need to actually physically simulate a rotating disk...
        • 8-bit? Back then there were 12, 16, 32, 36, and other bit counts for native word sizes. 8-bit didn't really take off until the home microcomputer market, a boon for home hobbyists but not important to serious computing.

    • I think you mean "micro" processors because things like the IBM 360 were far more powerful than that.

    • by tibit ( 1762298 )

      On a modern cellphone, the whole thing could probably run in the baseband processor :)

    • by ledow ( 319597 )

      The chances of the code even compiling any more are slim, let alone the required hardware and devices being present in a PC.

      You're looking at a full emulation environment, which would kill most of the performance anyway. It'd still fly on a modern PC, even so, but I can remember entire games fitting in 16Kb of RAM, and Windows graphical interfaces that needed an upgrade to 2Mb of RAM in order to run.

      Of course they'd be fast on modern architecture. But they won't run directly. And by the time you get them

      • It's worth asking ourselves why Windows needs a gig of RAM in order to even boot properly, when the user "experience" of such is a desktop background bitmap and a clicky button in the corner. Windows 3.1 could do that in 2Mb of RAM.

        If you are truly asking yourself that, you just don't understand the differences between hardware of the differing eras and the magnitude of what modern Windows does compared to Windows 3.1.
        • Here is an example of what you're talking about.

          In Windows 3.1, there was no networking. Period. Windows since 95 has had a built-in networking stack. Now think about all the networking applications built into Windows now (not just the stack): file sharing, remote control, hell, as crappy as IE is today, there was nothing like it back then (yes, still built into the OS).

          I remember how much of an advance Windows 3.11 for Workgroups was with its networking additions. In fact, looking back, 3.11 is what killed off DOS apps, because

          • by ledow ( 319597 )

            Er... windows 3.11 had the same minimum spec as Windows 3.1... 2Mb RAM. And a 15Mb hard disk. So the point still stands.

            And I have personally contributed to a project that brought Linux networking and TONS of extra features that we'd have died for in the 3.11 era to a single, bootable, 1.44Mb floppy disk.

            Sure, Windows 95 upped the ante, but in terms of what you were given was it really that much of an advance? That's where things started to go downhill if anything... networking stack, yes. Firewalling o

    • by tlhIngan ( 30335 )

      It would be fairly fast, but the graphics part was a bit overstated. The Alto didn't support overlapping windows (Wozniak, who did the overlapping-windows implementation on MacOS, found that out later, after he did (and patented) regions, and after his plane accident).

      Given the Alto was the inspiration for MacOS (and Apple did license the idea from Xerox by giving them stock), I wonder how many other things we thought the Alto had, but it really lacked.

      • Smalltalk, running on the Alto, definitely had overlapping windows. See for example http://www.vpri.org/pdf/m19770... [vpri.org] .
        • True. The Alto was a moving target. You can't point to it like you could the original Mac and detail all the features, because every month it would change yet again. Certainly the software changed all the time, but even the hardware changed. This was primarily a research project and not a commercial one.

    • Cross-compiling to a modern machine would definitely be interesting. As others have noted, many applications were written in BCPL with bits of assembly language (very similar to Nova assembly language) plus microcode for "tight loops" such as BITBLT. There is also a simulator called Salto, written by Juergen Buchmueller, that works well enough to give a feel for the Alto but still has bugs. This page has .zip file with executables and disk images ready to run on Windows, plus links to the source code: http: [toastytech.com]
      • Bitblt was a great invention. Because it was bit-based (as opposed to later things calling themselves bitblt) and had support in hardware/microcode, there were a lot of possibilities it could be used for. The Amiga had something similar, since even though it was color, it used bit planes instead of packed pixels.

  • Ten or twenty years ago, when enthusiasts still had this hardware, this would have been very interesting. I remember David Case having a big pile of the stuff that he could do nothing with because software was too hard to come by. Today, virtually all of that stuff has been landfilled or recycled.

  • by Charliemopps ( 1157495 ) on Wednesday October 22, 2014 @08:55AM (#48202699)

    Oh wow... it's like you spend your whole life understanding your childhood.

    When I saw that image of the Sol-20, it immediately took me back to being 6yrs old. I'd go with my father to work in a manufacturing plant. He ran "The lab" and up until the late 70s, they'd program their machines with an infrared laser onto a chip... and it was a nightmare because it took hours and if anyone turned on a light it would ruin the etch. Then these computers started showing up with floppy drives, and the first one I remember seeing looked exactly like that Sol-20. I'm assuming that's what it was. I got to type on it for fun a couple of times. Later they swapped to Commodores, Apple IIs, IBM clones, etc... whatever was cheap.

    This was probably the first computer I ever touched. Wow!

  • Is it just me? (Score:3, Insightful)

    by Anonymous Coward on Wednesday October 22, 2014 @09:34AM (#48202961)

    Am I the only one here that is impressed that they were able to restore the archives from tape from 40 years ago just fine? :)

  • by wonkey_monkey ( 2592601 ) on Wednesday October 22, 2014 @09:52AM (#48203077) Homepage

    mouse pointing devices

    You went with that because you didn't know whether to put mouses or mice, right?

    It is, of course, mieces.

  • by OmniGeek ( 72743 ) on Wednesday October 22, 2014 @10:00AM (#48203141)

    In 1977 or thereabouts, I was a co-op student at Xerox' Webster, NY Research Center. At lunchtime, I had access to an Alto, and spent far too much time playing MazeWar, a networked multi-player real-time 3D-perspective game wherein the players navigated a maze (displayed as wireframe 3D with an overhead map at the side), finding other players (who appeared as giant floating eyeballs) and zapping them. Once zapped, you respawned elsewhere in the maze and attempted to sneak up on your opponent and return the favor.

    The graphics were extremely simple; there was no detail in the walls, just lines showing the edges, and player positions were limited to the center of each grid square; player movement was in discrete jumps. All of this was done to reduce the computational load for the graphics, of course. As a result, the system was very responsive, and the experience was quite immersive.

    • I also worked for Xerox at Webster, NY from '80-'90. The Altos were utterly amazing, as well as the software that ran on them...Pilot OS, Mesa, SIL for creating schematics, Swat for debugging. It spoiled me. Even programming on the Sun hardware and OS years later was a step down.

      My favorite game was Polish Pong, but I loved Star Trek too.

      Randy Stegbauer -- thosewerethedays

    • So basically, Wizardry I, Death Maze 5000 kind of movement/wireframes.. (to list some personal computer games with that look).

  • What on earth is "An interesting look at retro-future"?
  • There really should be an august monument to the noble accomplishments Xerox achieved.

    Stylish, classic, with a simple inscription:

    "On this spot, Steve Jobs stole all his good ideas."
    • Re:PARC monument (Score:4, Informative)

      by hackertourist ( 2202674 ) <hackertourist@NOsPAM.xmsnet.nl> on Wednesday October 22, 2014 @12:18PM (#48204373)

      "On this spot, Steve Jobs bought all his good ideas."


      • I would add "And then refined them."

        There is no doubt that Xerox was instrumental in GUI development. However, Apple will be remembered because they brought it to the masses. While Xerox had great ideas about the GUI, it lacked some refinement. They might have managed it if the company had backed the researchers and fully embraced the idea of computers. Instead, management was stuck on being a copier company.

      • Re: (Score:2, Troll)

        by Namarrgon ( 105036 )

        Funny how Xerox didn't see it that way [nytimes.com].

    • by dbc ( 135354 )

      Except that he didn't steal enough. He took what he could see: graphical display, windows, menus, pointing device. But he didn't understand what was under the hood, and missed a huge opportunity. The Smalltalk language was a huge part of the Alto system, and Jobs ignored it completely. If the original Macintosh had shipped with a Smalltalk interpreter in ROM, the world would be a hugely different place. Turning the world's hackers loose with Smalltalk on an original Mac would have made the Mac and Apple

      • by mbkennel ( 97636 )
        | If the original Macintosh had shipped with a Smalltalk interpreter in ROM, the world would be a hugely different place.

        It would have been 2x as expensive and 5x as slow, and a flop.

        All the original Mac programs were exceptionally hand-optimized 68000 assembly.

        On his NeXT project, Jobs had Objective-C built in, whose object model is nearly Smalltalk, at a time when C++ was the overwhelmingly dominant object-oriented language. And so NeXT had the first major commercial operating system with a serious objec
        • by dbc ( 135354 )

          There was a commercially available Smalltalk that ran just fine on the original Mac. Adding it to the Mac ROM image would have added less than 64K bytes to the image (originally 128K), so no way would it have doubled the cost. And it would not have precluded 68000 C/assembly programs -- it would have provided the same hacker-friendly extension environment that BASIC provided on the Apple II, as an addition. For a small incremental cost they could have enabled a huge eco-system of community-created appli

    • http://en.wikipedia.org/wiki/A... [wikipedia.org].

      Jobs and several Apple employees, including Jef Raskin, visited Xerox PARC in December 1979 to see the Xerox Alto. Xerox granted Apple engineers three days of access to the PARC facilities in return for the option to buy 100,000 shares (800,000 split-adjusted shares) of Apple at the pre-IPO price of $10 a share.[39]

  • I wonder how many software patents these revelations will bust?

  • They've been copying the design for years, now you can copy the source code too!

    Jokes aside, these were groundbreaking machines that determined the next 30 years or so of UI design. It had to be polished a bit to work on personal computers of the day (by Messrs Gates and Jobs) and unfortunately somewhat cut down. The Alto screens were meant to replace paper, and only now has the price come down enough that we are getting screens with the resolution to rival paper.

  • thank god Xerox didn't do open source back then, or free software might never have seen the light of day:

    In 1980, Stallman and some other hackers at the AI Lab were refused access to the source code for the software of a newly installed laser printer, the Xerox 9700. Stallman had modified the software for the Lab's previous laser printer (the XGP, Xerographic Printer), so it electronically messaged a user when the person's job was printed, and would message all logged-in users waiting for print jobs if the printer was jammed. Not being able to add these features to the new printer was a major inconvenience, as the printer was on a different floor from most of the users. This experience convinced Stallman of people's need to be able to freely modify the software they use.

    (from: http://en.wikipedia.org/wiki/R... [wikipedia.org] )
