Xerox Alto Source Code Released To Public 121
zonker writes: In 1970, the Xerox Corporation established the Palo Alto Research Center (PARC) with the goal of developing an "architecture of information" and laying the groundwork for future electronic office products. The pioneering Alto project that began in 1972 invented or refined many of the fundamental hardware and software ideas upon which our modern devices are based, including raster displays, mouse pointing devices, direct-manipulation user interfaces, windows and menus, the first WYSIWYG word processor, and Ethernet.
The first Altos were built as research prototypes. By the fall of 1976 PARC's research was far enough along that a Xerox product group started to design products based on their prototypes. Ultimately, ~1,500 were built and deployed throughout the Xerox Corporation, as well as at universities and other sites. The Alto was never sold as a product but its legacy served as inspiration for the future.
With the permission of the Palo Alto Research Center, the Computer History Museum is pleased to make available, for non-commercial use only, snapshots of Alto source code, executables, documentation, font files, and other files from 1975 to 1987. The files, restored from archive tapes, are organized by the original PARC server on which they resided. An interesting look at retro-future.
Aging and Orphan Open Source Projects? (Score:1, Funny)
osage writes: Several colleagues and I have worked on an open source project for over 20 years under a corporate aegis. Though nothing like Apache, we have a sizable user community and the software is considered one of the de facto standards for what it does. The problem is that we have never been able to attract new, younger programmers, and members of the original set have been forced to find jobs elsewhere or are close to retirement. The corporation has no interest in supporting the software. Thus, in th
Re: (Score:1)
Maybe paying a competitive wage would help.
Re: (Score:2)
Upload everything you have to an open repository (github, sourceforge?) and create a torrent on TPB. Once you have a few seeders, it could become immortal...
... or not.
--
You can learn a lot from how people used to do things. And why they stopped.
even back then.... (Score:2, Interesting)
http://www.computerhistory.org... [computerhistory.org] (from tfa)
they knew the best display aspect ratio for getting work done
Re: (Score:2)
What you are referring to is not aspect ratio but screen orientation. But yes, you are correct: for many applications, portrait is the favorable orientation.
Re: (Score:2)
What you are referring to is not aspect ratio
I believe he's referring to the specific 0.75 (606:808) aspect ratio. Whether that is the best, I don't know, but it is an aspect ratio.
Re: (Score:2)
you're technically wrong (the worst kind of wrong)
The screen is always oriented so English text goes from left to right (not top to bottom). The fact that the monitor in the image happens to have an aspect ratio below 1 does not mean it has been rotated; it's actually built this way.
Re: (Score:2)
Have you actually looked at the picture?! The monitor is in portrait orientation... That looks like 4:3 ratio; or rather a 3:4.
But don't take my word for the meaning of the word "orientation" when it comes to displays or paper, take Wikipedia's article: Page orientation [wikipedia.org].
Re: (Score:2)
:)
Re: (Score:2)
Or ..
Have 2 monitors side by side.
Re: (Score:3)
The space age rocked.
The Apollo program and the military (Minuteman missile) pushed integrated circuit technology. Remember that 1969 was the culmination of Apollo, not the start. PARC was founded in 1970, the Alto started in 1972 and they had a working system by '76.
There were a lot of things pushing computing in the 60's and 70's. The space program was a big part but business was using computers as well. The national laboratories were pushing for faster and faster computers. The Cray I supercomputer
Re: (Score:2)
True, but in the '60s and '70s you were more likely to find a general-purpose computer at a university or research facility than at a business.
Re: (Score:3)
That is completely wrong. The IBM 360 was introduced in 1964. The POP (Principles of Operation) even calls the instructions the commercial instruction set. Before that there were the IBM 702 (1952), the IBM 650 (1953), and the IBM 1401 (1959). All were general purpose machines used by businesses.
Re: (Score:2)
Who said otherwise? I was responding to the incredibly inaccurate statement that general purpose computers were not used by businesses until the late 70s and 80s. Regardless of whether or not older equipment was still in use, general purpose computers WERE used by businesses long before that. The 1401 was a general purpose computer, and they had orders for more than 5000 of them the first month, in 1959. The 360 was a general purpose computer and they had orders for 2000 of those in the first couple of
Re: (Score:2)
What odd definition of 'general purpose computer' are you using? A 'general purpose computer' is a computer that can be programmed. 'General purpose' has NOTHING to do with binary or decimal modes, or scientific vs commercial applications. Earlier models of 'computers' were NOT general purpose; they had specific built-in functions, such as adding machines or trajectory calculators, and that is all they did. The 1401 is certainly a general purpose computer. The fact that you list RPG and COBOL proves th
Re: (Score:2)
What does being able to efficiently perform scientific calculations have to do with something being defined as a general purpose computer? NOTHING. General purpose says NOTHING about the suitability of a processor for a given task.
The processor in your cell phone is a general purpose computer. Is it particularly good at performing high-precision scientific calculations? No. Is it particularly good at performing decimal operations? No. Does that mean it is not a general purpose computer? NO!
Many busi
"general market" computers (Score:2)
A small embedded CPU in a radiation-hardened box is a 'general purpose computer' by the theoretical definition but nobody would buy one to play games and do the wide variety of tasks a PC does today.
Re: (Score:2)
Exactly where in that post do you see anything at all about 'converging mainframe architectures'? He talks about 'tabulators, time clocks, and other specialized machines', then starts talking about general purpose computers. And, in fact, that is pretty much what the division was - there were specialized machines for things, and then there were general purpose computers and the specialized machines died off. The problem with his post is not in the definition of general purpose, it is that he is about 2 de
Re: (Score:2)
I agree that the comment that sparked this was talking about special purpose machines (tabulators, etc.) vs computers. I suspect that he went googling for computer history, though, and found the rather specialized definition of "general purpose computer" that the mainframe people created.
For those of us with a computer science, rather than an IT background, general purpose computer means Turing complete. And while doing scientific computing on a BCD machine may be like going to LeMans with your turnip tru
Re: (Score:2)
I agree that the comment that sparked this was talking about special purpose machines (tabulators, etc.) vs computers. I suspect that he went googling for computer history, though, and found the rather specialized definition of "general purpose computer" that the mainframe people created.
I didn't "google" for it, I have known this since I was nine or ten, a quarter century ago, when I got interested in the history of computing. Although I admit that in my native tongue, we called them "universal computers", not "general-purpose computers", which is obviously the same meaning in English, as per the IBM page. The English form of the same term is the only thing I found recently. (Obviously, all the historical publications I was reading as a kid were in my native tongue [aha-antikvariat.sk], not in English.)
Also, s
Re: (Score:2)
Well, when I was referring to the original comment, that was the one written by rioki, not you.
I got interested in computers at about the same age as you, but for me that was around 1978 in the US. At that time things like tabulators were ancient history.
We did have a test scoring machine that was semi-standalone when I was in high school but I think it had a microprocessor in it. You could program it with an answer key and then it would mark the Scantron (fill-in-the-bubble) forms based on the answer key
Re: (Score:2)
" there were ever tube computers in space"
Not that I'm aware of in the West, no, but tubes certainly were used in space for radios: pencil triodes, traveling-wave tubes, and probably some planar triodes here and there.
I have heard of miniature "integrated" tubes from the East and something called a thermionic integrated micro module from the West.
Re: (Score:2)
You're off by at least a decade, maybe two.
IBM mainframes were never really used that much in scientific applications. They cost too much. That's why minicomputers were so popular. Big business used IBM mainframes starting from the early 50's. COBOL (COmmon Business-Oriented Language) was introduced in '59.
Computing follows the money. The money was initially in the military market, then the business market.
Re: (Score:3)
The difference between NASA and PARC is that NASA just followed the narrow goal of space exploration, while PARC (being Xerox) tried to get the computer into the office. The use of computers for space exploration does not bear much fruit for ordinary people. On the other hand, almost everybody needs word processing and spreadsheets, including people who do not work in offices.
Re: (Score:1)
No, not space.
Codebreaking and the Bomb were the reason for computers.
The Minuteman missile was the reason for integrated circuits.
Re: (Score:1)
"Codebreaking and the Bomb were the reason for computers."
Not even. It was curiosity and mathematics... Then came the codebreaking.
"I wish these tables had been calculated by steam!" or some such... Babbage? 19th century? People needed tables for navigation, calculation, artillery tables, etc?
emulator? (Score:1)
Is there an emulator this would run under?
Re: (Score:2)
You would have to emulate the custom bitslice processor. I think that all of the schematics exist and it would be possible but it would take a lot of work.
Great! Somebody should build it in Minecraft.
It would be interesting (Score:4, Interesting)
If possible, it would be interesting to cross compile the code to a modern processor and see how fast it would fly, given the limited capabilities of hardware at the time. Remember, we're talking about 1MHz 16-20 bit processors at the time the project started, if that.
Re:It would be interesting (Score:4, Interesting)
Re: (Score:2)
I don't think it'd be inefficient at all if you use a modern compiler. If the compiler happens to notice some loops and vectorize stuff, it may actually be way more efficient per clock cycle than the original machine was.
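For what it's worth, this is the kind of loop shape I mean. The function below is entirely made up (nothing to do with the actual Alto sources), but gcc or clang at -O3 will typically auto-vectorize it, which is exactly the free per-clock win the original hardware never had:

#include <stddef.h>
#include <stdio.h>

/* Hypothetical example, not Alto code: independent per-element work is the
 * pattern modern compilers routinely turn into SIMD instructions. */
static void add_scanlines(unsigned char *dst, const unsigned char *a,
                          const unsigned char *b, size_t n)
{
    for (size_t i = 0; i < n; i++)
        dst[i] = (unsigned char)(a[i] + b[i]);   /* no loop-carried dependency */
}

int main(void)
{
    unsigned char a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    unsigned char b[8] = {10, 10, 10, 10, 10, 10, 10, 10};
    unsigned char out[8];

    add_scanlines(out, a, b, 8);
    for (int i = 0; i < 8; i++)
        printf("%d ", out[i]);
    printf("\n");
    return 0;
}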
Re: (Score:2)
Use an assembler-level emulator instead, or a machine-language emulator, which is the common approach. This was a very custom-built CPU, with custom features.
I suspect there may be stuff there that is binary-only, so that machine emulation is the way to go. For example, later on Smalltalk would come as images, and while you had the source code what you did not have was a way to bootstrap that code into an image easily.
Re: (Score:1)
Both of your assumptions (simplicity and 8-bit architecture) are wrong. While the Alto's processor was microcoded, the microcode itself could be rewritten to suit the task at hand, so even the assembly instructions themselves vary depending on what microcode was in use for a given task. The Alto's hardware was ridiculously baroque by today's standards.
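To illustrate the point, here is a toy sketch of my own (not the real Alto control store; the names, opcodes, and micro-ops are invented): with a loadable micro-program, the same binary opcodes can mean different things, which is what makes a straight assembly-level cross-compile ambiguous.

#include <stdint.h>
#include <stdio.h>

/* Toy illustration only -- not real Alto microcode.  The point: the visible
 * "instruction set" is whatever the currently loaded micro-program says it
 * is, so translating the assembly without the microcode is meaningless. */
typedef struct { uint16_t acc; uint16_t mem[256]; } Machine;
typedef void (*MicroOp)(Machine *m, uint8_t a);

static void uop_load (Machine *m, uint8_t a) { m->acc  = m->mem[a]; }
static void uop_add  (Machine *m, uint8_t a) { m->acc += m->mem[a]; }
static void uop_shift(Machine *m, uint8_t a) { m->acc <<= (m->mem[a] & 15); }
static void uop_store(Machine *m, uint8_t a) { m->mem[a] = m->acc; }

/* Two different "control stores": the same opcodes 0, 1, 2 mean different things. */
static const MicroOp standard_ucode[3] = { uop_load, uop_add,   uop_store };
static const MicroOp altered_ucode [3] = { uop_load, uop_shift, uop_store };

static void run(Machine *m, const MicroOp *ucode, const uint8_t *prog, int len)
{
    for (int pc = 0; pc + 1 < len; pc += 2)
        if (prog[pc] < 3)
            ucode[prog[pc]](m, prog[pc + 1]);   /* opcode indexes the micro-program */
}

int main(void)
{
    const uint8_t prog[] = { 0, 10, 1, 11, 2, 12 };   /* three two-byte "instructions" */
    Machine m = {0};
    m.mem[10] = 3; m.mem[11] = 4;

    run(&m, standard_ucode, prog, 6);
    printf("standard microcode: mem[12] = %u\n", (unsigned)m.mem[12]);  /* 3 + 4 = 7 */

    m.acc = 0;
    run(&m, altered_ucode, prog, 6);
    printf("altered  microcode: mem[12] = %u\n", (unsigned)m.mem[12]);  /* 3 << 4 = 48 */
    return 0;
}

Run it and the same six-byte "program" computes 3 + 4 = 7 under one control store and 3 << 4 = 48 under the other.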
Re: (Score:1)
I never said it was impossible, it's just going to be a pain in the ass to cross-compile the assembly code given the dynamic nature of the microcode. As one of those "MAME/MESS guys" I'd say we're in perfectly good shape to emulate the Alto given that the author of one of the only Alto emulators out there, Jürgen Buchmueller, is one of the current developers on MESS.
Re:It would be interesting (Score:4, Informative)
It was a 16-bit architecture. Use the Wiki [wikipedia.org].
It would be relatively simple to come up with an emulator that could run well. Although I'd rather see a Dandelion clone, anyway - I knew all about the AMD 2900 series, back in the day.
Re: (Score:2)
8-bit? Back then there were 12, 16, 32, 36, and other bit counts for native word sizes. 8-bit didn't really take off until the home microcomputer market, a boon for home hobbyists but not important to serious computing.
Re: (Score:3)
I think you mean "micro" processors because things like the IBM 360 were far more powerful than that.
Re: (Score:2)
On a modern cellphone, the whole thing could probably run in the baseband processor :)
Re: (Score:2)
The chances of the code even compiling any more are slim. Let alone the required hardware and devices being present in a PC.
You're looking at a full emulation environment, which would kill all the performance anyway. It'd still fly on a modern PC, even so, but I can remember entire games fitting in 16 KB of RAM and Windows graphical interfaces that you needed to upgrade to 2 MB of RAM in order to run.
Of course they'd be fast on modern architecture. But they won't run directly. And by the time you get them
Re: (Score:2)
If you are truly asking yourself that, you just don't understand the differences between hardware of the differing eras and how much more modern Windows does compared to Windows 3.1.
Re: (Score:2)
Here is an example of what you're talking about.
In Windows 3.1, there was no networking. Period. Windows since 95 has had a built-in networking stack. Now think about all the networking applications built into Windows now (not just the stack): file sharing, remote control; hell, as crappy as IE is today, there was nothing like it back then (yes, still built into the OS).
I remember how much of an advance Windows for Workgroups 3.11 was with its networking additions. In fact, looking back, 3.11 is what killed off DOS apps, because
Re: (Score:2)
Er... Windows 3.11 had the same minimum spec as Windows 3.1: 2 MB of RAM and a 15 MB hard disk. So the point still stands.
And I have personally contributed to a project that brought Linux networking and TONS of extra features that we'd have died for in the 3.11 era to a single, bootable, 1.44 MB floppy disk.
Sure, Windows 95 upped the ante, but in terms of what you were given was it really that much of an advance? That's where things started to go downhill if anything... networking stack, yes. Firewalling o
Re: (Score:2)
Why should a Bluetooth icon take more RAM than an entire former OS?
It should not.
Re: (Score:3)
True. The Alto was a moving target. You can't point to it like you could the original Mac and detail all the features, because every month it would change yet again. Certainly the software changed all the time, but even the hardware changed. This was primarily a research project, not a commercial product.
Re: (Score:2)
Bitblt was a great invention. Because it was bit-based (as opposed to later things calling themselves bitblt) and had support in hardware/microcode, there were a lot of possibilities for what it could be used for. The Amiga had something similar, since even though it was color, it used bit planes instead of packed pixels.
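As a rough illustration of what a one-bit-per-pixel blit actually does, here's a toy C sketch (the bitmap layout and raster-op set are my own assumptions; the real Alto BitBLT moved whole 16-bit words with shifts and masks in microcode rather than looping bit by bit):

#include <stdint.h>
#include <stdio.h>

/* Minimal 1-bit-per-pixel BitBlt sketch, not the Alto's actual code.
 * Bitmaps are rows of 16-bit words, most significant bit = leftmost pixel. */
typedef struct {
    uint16_t *bits;     /* word-aligned pixel data, 1 bit per pixel */
    int words_per_row;  /* row pitch in 16-bit words */
} Bitmap;

static int get_pixel(const Bitmap *bm, int x, int y) {
    uint16_t w = bm->bits[y * bm->words_per_row + x / 16];
    return (w >> (15 - (x % 16))) & 1;
}

static void set_pixel(Bitmap *bm, int x, int y, int v) {
    uint16_t *w = &bm->bits[y * bm->words_per_row + x / 16];
    uint16_t mask = (uint16_t)(1u << (15 - (x % 16)));
    if (v) *w |= mask; else *w &= (uint16_t)~mask;
}

/* op: 0 = copy, 1 = OR (paint), 2 = XOR (invert), 3 = AND */
static void bitblt(Bitmap *dst, int dx, int dy,
                   const Bitmap *src, int sx, int sy,
                   int w, int h, int op) {
    for (int y = 0; y < h; y++) {
        for (int x = 0; x < w; x++) {
            int s = get_pixel(src, sx + x, sy + y);
            int d = get_pixel(dst, dx + x, dy + y);
            int r;
            switch (op) {
                case 1:  r = d | s; break;
                case 2:  r = d ^ s; break;
                case 3:  r = d & s; break;
                default: r = s;     break;
            }
            set_pixel(dst, dx + x, dy + y, r);
        }
    }
}

int main(void) {
    uint16_t src_bits[4] = {0xF0F0, 0xF0F0, 0xF0F0, 0xF0F0};
    uint16_t dst_bits[4] = {0};
    Bitmap src = {src_bits, 1}, dst = {dst_bits, 1};
    bitblt(&dst, 4, 0, &src, 0, 0, 8, 4, 0);   /* copy an 8x4 patch */
    for (int y = 0; y < 4; y++)
        printf("%04X\n", (unsigned)dst_bits[y]);   /* prints 0F00 four times */
    return 0;
}

The same loop with op = 2 gives you XOR painting, which is how a lot of early cursor and rubber-band drawing worked.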
Talk about late (Score:2)
Ten or twenty years ago, when enthusiasts still had this hardware, this would have been very interesting. I remember David Case having a big pile of the stuff that he could do nothing with because software was too hard to come by. Today, virtually all of that stuff has been landfilled or recycled.
Oh wow (Score:3)
Oh wow... it's like you spend your whole life understanding your childhood.
When I saw that image of the Sol-20, it immediately took me back to being 6 years old. I'd go with my father to work in a manufacturing plant. He ran "the lab," and up until the late '70s, they'd program their machines with an infrared laser onto a chip... and it was a nightmare because it took hours and if anyone turned on a light it would ruin the etch. Then these computers started showing up with floppy drives, and the first one I remember seeing looked exactly like that Sol-20. I'm assuming that's what it was. I got to type on it for fun a couple of times. Later they swapped to Commodores, Apple IIs, IBM clones, etc.... whatever was cheap.
This was probably the first computer I ever touched. Wow!
Is it just me? (Score:3, Insightful)
Am I the only one here who is impressed that they were able to restore the archives from 40-year-old tapes just fine? :)
Re: (Score:2)
Hmm, I think I have a couple 9-track tapes up in a box in the rafters of my mother's garage. Was hoping to be able to read them back someday... I knew I should have stuck to punch cards.
"mouse pointing devices"? (Score:3)
mouse pointing devices
You went with that because you didn't know whether to put mouses or mice, right?
It is, of course, mieces.
Re: (Score:2)
mouse pointing devices
A device which slows mice down enough to be pointed at is indeed a technical marvel to be venerated.
Re: (Score:2)
Snap!
Re: (Score:2)
mouse pointing devices
You went with that because you didn't know whether to put mouses or mice, right?
It is, of course, mieces.
I, for one, can't stand a dull mouse. I need only the sharpest, pointiest mice.
Re: (Score:2)
Mices, but pronounced "MY-sees" like index -> indices.
My favorite Alto application: Mazewar (Score:4, Interesting)
In 1977 or thereabouts, I was a co-op student at Xerox' Webster, NY Research Center. At lunchtime, I had access to an Alto, and spent far too much time playing MazeWar, a networked multi-player real-time 3D-perspective game wherein the players navigated a maze (displayed as wireframe 3D with an overhead map at the side), finding other players (who appeared as giant floating eyeballs) and zapping them. Once zapped, you respawned elsewhere in the maze and attempted to sneak up on your opponent and return the favor.
The graphics were extremely simple; there was no detail in the walls, just lines showing the edges, and player positions were limited to the center of each grid square; player movement was in discrete jumps. All of this was done to reduce the computational load for the graphics, of course. As a result, the system was very responsive, and the experience was quite immersive.
Re: (Score:1)
I also worked for Xerox at Webster, NY from '80-'90. The Altos were utterly amazing, as well as the software that ran on them...Pilot OS, Mesa, SIL for creating schematics, Swat for debugging. It spoiled me. Even programming on the Sun hardware and OS years later was a step down.
My favorite game was Polish Pong, but I loved Star Trek too.
--
Randy Stegbauer -- thosewerethedays
Re: (Score:2)
So basically, Wizardry I, Death Maze 5000 kind of movement/wireframes.. (to list some personal computer games with that look).
Okay, I'll bite (Score:2)
Re: (Score:1)
Maybe he wanted to say "the past", but had a stroke at the last moment.
PARC monument (Score:1, Flamebait)
Stylish, classic, with a simple inscription:
"On this spot, Steve Jobs stole all his good ideas."
Re:PARC monument (Score:4, Informative)
"On this spot, Steve Jobs bought all his good ideas."
FTFY.
Re: (Score:2)
I would add "And then refined them."
There is no doubt that Xerox was instrumental in GUI development. However, Apple will be remembered because they brought it to the masses. While Xerox had great ideas about the GUI, it lacked some refinement. They might have done it if the company had backed the researchers and fully embraced the idea of computers. Instead, management was stuck on being a copier company.
Re: (Score:2, Troll)
Funny how Xerox didn't see it that way [nytimes.com].
Re: (Score:2)
Except that he didn't steal enough. He took what he could see: graphical display, windows, menus, pointing device. But he didn't understand what was under the hood, and missed a huge opportunity. The Smalltalk language was a huge part of the Alto system, and Jobs ignored it completely. If the original Macintosh had shipped with a Smalltalk interpreter in ROM, the world would be a hugely different place. Turning the world's hackers loose with Smalltalk on an original Mac would have made the Mac and Apple
Re: (Score:2)
It would have been 2x as expensive and 5x as slow, and a flop.
All the original Mac programs were exceptionally hand-optimized 68000 assembly.
On his NeXT project, Jobs had Objective-C built in, whose object model is nearly Smalltalk, at a time when C++ was the overwhelmingly dominant object-oriented language. And so NeXT had the first major commercial operating system with a serious objec
Re: (Score:2)
There was a commercially available Smalltalk that ran just fine on the original Mac. Adding it to the Mac ROM image would have added less than 64K bytes to the image (originally 128K) so no way would it have doubled the costs. And it would not have precluded the 68000 C/assembly programs -- it would have provided the same hacker-friendly extension environment provided by BASIC on the Apple II as an addition. For a small incremental cost they could have enabled a huge eco-system of community-created appli
Re: (Score:3)
http://en.wikipedia.org/wiki/A... [wikipedia.org].
Patent Buster? (Score:1)
I wonder how many software patents these revelations will bust?
Sponsored by Jobs and Gates (Score:2)
They've been copying the design for years, now you can copy the source code too!
Jokes aside, these were groundbreaking machines that determined the next 30 years or so of UI design. It had to be polished a bit to work on the personal computers of the day (by Messrs. Gates and Jobs) and unfortunately somewhat cut down. The Alto screens were meant to replace paper, and only now has the price come down enough that we are getting screens with the resolution to rival paper.
Good, but... (Score:1)
In 1980, Stallman and some other hackers at the AI Lab were refused access to the source code for the software of a newly installed laser printer, the Xerox 9700. Stallman had modified the software for the Lab's previous laser printer (the XGP, Xerographic Printer), so it electronically messaged a user when the person's job was printed, and would message all logged-in users waiting for print jobs if the printer was jammed. Not being able to add these features to the new printer was a major inconvenience, as the printer was on a different floor from most of the users. This experience convinced Stallman of people's need to be able to freely modify the software they use.
(from: http://en.wikipedia.org/wiki/R... [wikipedia.org] )
Re: (Score:1)
You weren't noticing. Apple stole it from Xerox!
Re: (Score:3, Funny)
Apple invented the desktop. Xerox, Microsoft and Linux are just faggots who've stolen the idea.
The first caveman who propped up a flat rock invented the desktop. Xerox just virtualized it first.