RIP: Betty Holberton, Original ENIAC Programmer
DecoDragon writes "Betty Holberton, one of the original ENIAC programmers, died on December 8th. An obituary describing her many achievements as well as her work on the ENIAC can be found in the Washington Post. Her accomplishments included contributing to the development of COBOL and FORTRAN, and coming up with the use of mnemonic characters for commands (e.g. "a" for add). She was awarded the Lovelace Award for extraordinary accomplishments in computing from the Association for Women in Computing, and the Computer Pioneer Award from the IEEE Computer Society for "development of the first sort-merge generator for the Univac which inspired the first ideas about compilation.""
Big ass memorial (Score:1, Funny)
Re:Big ass memorial (Score:1)
Re:Big ass memorial (Score:1)
FYI: If you want to visit ENIAC (well, a piece of it), the Museum is on the ground floor of the Moore bldg. at 33rd & Walnut in Philadelphia, PA.
Re:Big ass memorial (Score:1)
Pieces of ENIAC are on display at the Smithsonian American History Museum.
Loss and Gain (Score:3, Insightful)
Re:Loss and Gain (Score:1)
nobody bloody reads anything anymore. if you can't be bothered to do more than skim the summaries, how can you be bothered to post (or moderate. yes, i mean you, who modded this up insightful)? if you had read it, you would have seen the bit about the work she did that was pivotal in the creation of these things called compilers. you know, things that turn code into stuff computers understand.
granted, it was at the end of the article, and "generator" and "compilation" have four syllables apiece, so i should probably cut you some slack...
Re:Loss and Gain (Score:1)
Re:Loss and Gain (Score:2)
The average John Q. Public neither knows nor cares about people like that, but doesn't think twice when he sorts a column on his spreadsheet. Perhaps he should.
Re:Loss and Gain (Score:4, Insightful)
How could her accomplishments possibly be minor compared with today's programmers? Today we may code operating systems or apps, but she helped to invent programming. She did "change the way we think about computers."
Read the obit first, it's very interesting and you might actually learn something.
Re:Loss and Gain (Score:3, Insightful)
I doubt it; it would probably just be attributed to another person. There are very few ideas that are not part of the society they spring from. It just depends on who is recognised as being first.
Re:Loss and Gain (Score:3, Interesting)
Anyway, the point is yeah, it would definitely have been invented within a few years (months?) of when it was.
Re:Loss and Gain (Score:4, Interesting)
Actually she did. We know that software has not progressed as far as hardware; most of its relative progress was made by the original ENIAC team. And Betty, more than anybody else on that team, wanted something most modern-day programmers are also hoping for: make computers fun, user friendly, and a good part of our daily life.
Re:Loss and Gain (Score:3, Funny)
what exactly are you smoking? (Score:2)
Actually her accomplishments seem a hell of a lot more important than those of any contemporary programmers. I know slashdotters tend to have messianic complexes, but come on, show some humility; she represents the generation that created the computer revolution.
Re:what exactly are you smoking? (Score:1)
Re:what exactly are you smoking? (Score:1)
Bug (Score:1, Informative)
Re:Bug (Score:1)
Re:Bug (Score:3, Informative)
Picture of the bug (Score:2, Interesting)
Re:Picture of the bug (Score:2, Interesting)
Re:Picture of the bug (Score:1)
Pfft, picture. I saw the real thing. For a while they had it on display at the Smithsonian.
Full info... (Score:3, Interesting)
"Grace Murray Hopper, working in a temporary World War I building at Harvard University on the Mark II computer, found the first computer bug beaten to death in the jaws of a relay. She glued it into the logbook of the computer and thereafter when the machine stops (frequently) they tell Howard Aiken that they are "debugging" the computer. The very first bug still exists in the National Museum of American History of the Smithsonian Institution. The word bug and the concept of debugging had been used previously, perhaps by Edison, but this was probably the first verification that the concept applied to computers."
No, that would be Grace Hopper (Score:2)
Admiral Grace Hopper to you.
Re:No, that would be Grace Hopper (Score:1)
When she retired in 1986 at the age of 79, she was the oldest commissioned Navy officer on active duty. The retirement ceremony was held on the USS Constitution [navy.mil], which is also still on active duty.
I guess any woman tough enough to do what they did when they did (not sit at home) doesn't like to quit.
[OT] (Score:2)
Yeah, I remembered the "Rear" as I pressed "Submit," but I wasn't certain how much of a difference that makes to the official rank. I guess it's just like the subdivisions in a general's rank. (I haven't done as much work with the Navy as with the other branches.)
She also won a Turing Award, didn't she? In '74 or '78?
Re:[OT] (Score:1)
She didn't actually win the Turing Award [acm.org]. They made the Grace Murray Hopper Award [acm.org] in 1971. On the whole I think I'd prefer having an award in my name too.
Nope.... (Score:2, Interesting)
(New Hacker's Dictionary)
Something to think about. (Score:5, Insightful)
From the article "By the completion of the ENIAC project in 1946, work that once took 30 hours to compute instead took 15 seconds."
Since most of us were born after the advent of computers, we take it for granted that mundane computation tasks can be automated at fairly low cost and with great time savings. However, for all that technological progress has been hailed in the last 20 years, is there any task on which we have seen this kind of efficiency improvement?
Are we becoming too focused on the day-to-day improvements in computing, each one of ever-decreasing relevance to the people who actually use the computer?
How can we focus more in the future on finding the areas where our efforts can be best utilized to produce efficiency gains of this sort, rather than Microsofting everything by putting 74 new features into a product just so a new product can be sold?
These kinds of questions are the ones best answered by open source, where we are not constrained by profit. This should be what we think about in the future, rather than what features we can copy from someone else's software just because they have it and we don't.
Re:Something to think about. (Score:4, Interesting)
Having said that, for OSS to foster the giant leap forward that you suggest would require a large shift in the way people look at and create OSS. The simple truth is that 99.99% of all OSS is just reinvention of closed source software to scratch an itch or for political reasons. This is not the type of environment in which such a leap springs forth.
While Open Source has many benefits, it would take an awful lot for me to agree with your premise that it's better suited than closed source to the type of efficiency gain you're looking for. Such leaps are often made by one or very few people, with everyone else following later. Given that, such a leap is just as likely to occur with plain old closed source as with OSS.
Re:Something to think about. (Score:2)
Of course, there are exceptions, but they are only exceptions.
Wheelier wheel (Score:2, Insightful)
We're talking about a basic shift in the way things are done, from humans adding columns of numbers to an industrial number-adding machine.
You don't get the next big thing from microsoft, or from open source, or from programming at all.
You get it from inventing the next widget that automates, streamlines, accelerates some human activity.
What is it? A better word processor? Nope. Who knows. An automated intuiter? An enlarged and sped-up memory core for the human brain? Something that turns dioxin into peanut butter?
Ginger?
Damned if I know, kemosabe. But when you're making those kinds of calls, you're in the high country....
Re:Wheelier wheel (Score:1)
Frankly, I don't know either, but here are my guesses:
Re:Wheelier wheel (Score:1)
Re:Something to think about. (Score:2)
(Somebody wanna help me with who said that?)
Google (Score:1, Informative)
Consider the FFT. (Score:4, Informative)
One should realize that the most fundamental numerical algorithms do not change very rapidly. The most common numerical algorithms (sorting, linear algebra, differential equations, etc., both in serial and parallel) have been the subject of intense research by an army of applied mathematicians over the last half-century. All you have to do to take advantage of that work is to call your friendly local numerical library [netlib.org].
Of course, sophisticated 3D graphics methods are still the subject of intense research.
So in sum, I would argue that as far as "serious" numerical methods go, excellent solutions usually exist. (These methods are "open source", indeed open source before the term existed! They are usually published in the scientific literature.) The main gains that remain are in "entertainment" applications.
Bob
Re:Something to think about. (Score:2)
The next BIG thing will be actually putting that processing power to use, building machines more intelligent than ourselves. I can't see how that cannot happen; maybe it will take much longer than 30 years, but I'm pretty sure the number is closer to 30 than 300.
Everything else that happens in that period is just details, minor details.
Re:Something to think about. (Score:1)
Even though the actual AI developed (and GP evolved) systems, and the AI itself, would grow too complex for any human to understand, that doesn't mean that the slave-owner/author wouldn't still want the "neuralnet source" (or whatever) to be made available free (as in speech & beer).
e.g. The Open Learned Common Knowledge Base vs. Microsoft StreetSmart(TM). With MS, if they still existed, they would want to hold you hostage with a subscription to a proprietary central AI slavefarm, while the "OSS community" would network their cheap "brains" to greater effect for the common good.
Re:Something to think about. (Score:1)
Since most of us were born after the advent of computers, we take it for granted that mundane computation tasks can be automated at fairly low cost and with great time savings. However, for all that technological progress has been hailed in the last 20 years, is there any task on which we have seen this kind of efficiency improvement?
Yes, the WWW and Google achieve a similar speedup for looking up information. Only for more advanced topics in non-computer-related areas do we have to resort to libraries or phone calls these days.
I guess she finally decided to ... (Score:2)
Another one for the bit bucket
while my compiler gently weeps
The Origin of Pale Grey Boxes, etc. (Score:3, Interesting)
Now we've got folks who want their cases midnight black.
But given all of the design issues we have seen, it is interesting to note that the human interface problem was being considered from the very beginning.
[Insert your Microsoft insult joke here]
Re:The Origin of Pale Grey Boxes, etc. (Score:2, Funny)
Just goes to show, that even great minds make cockups from time to time!
Betty Picture (Score:4, Informative)
Re:Betty Picture (Score:1)
The most irritating part of it... (Score:4, Redundant)
I got my master's in Comp Sci at UPenn in '89 (I used to walk past some of the remnants of ENIAC on display there, every day). And I can't help but be saddened by this:
She hoped to major in the field [mathematics] at the University of Pennsylvania but was discouraged by a professor who thought that women belonged at home.
I'm glad she finally got her chance to shine during the war, but who knows what else she might have accomplished, had someone's idiotic prejudices not pushed her into working for the Farm Journal?
Stupid git.
Then again, maybe he just meant /home...
it gets better (Score:1)
Re:The most irritating part of it... (Score:2)
Don't dis serendipity. Had she not been discouraged, her life would not have taken the course it did. She's little more than a footnote in history, and computer history at that, but how many of the billions who have lived can claim even that? Had she gone on to major in mathematics, she might have become...a math teacher. Or perhaps she would have gotten her PhD and disappeared into research, occasionally publishing obscure monographs on semihemidemigroup homologies.
Parallel processing from the start. (Score:5, Interesting)
"There were no manuals," one of the women, Kay McNulty Mauchley Antonelli, later told Kathleen Melymuka for an interview in Computer World. "They gave us all the blueprints, and we could ask the engineers anything. We had to learn how the machine was built, what each tube did. We had to study how the machine worked and figure out how to do a job on it. So we went right ahead and taught ourselves how to program."
Mrs. Holberton took responsibility for the central unit that directed program sequences. Because the ENIAC was a parallel processor that could execute multiple program sections at once, programming the master unit was the toughest challenge of her 50-year career, she later told Kleiman.
Now that is a programming challenge.
Imagine that: the first programs were parallel processing problems from the start, with no manuals or instructions in programming, because they had to invent it all first. And the pressure of being in wartime as well.
very impressive indeed. one of those things that get done because no one knows it is impossible yet.
More like microcode (Score:2)
Similar tricks were used in machines that had drum and delay-line memories, arranging the instructions such that the next one needed would emerge from the magnetic head or piezo reader just as the previous one was finished executing ... often with other instructions belonging to a different "thread" filling the gaps.
In those days you programmed to the bare metal because it was the only way to get anything useful done.
Female Programmers (Score:3, Insightful)
As for those who are belittling her use of mnemonics, you shouldn't take it for granted. Imagine having to type out 'file system consistency checker' instead of fsck among other commands.
Re:Female Programmers (Score:2)
The first computer I used was something like an 8086 (I think). The way you booted it was to load up a paper tape, manually insert the boot sector at a specific address, and then manually press the go sequence (by loading in something like 377).
This loaded a driver for the teletype machine, through which you could enter assembler codes (as numbers). Mnemonics like "a" for add and "b" for bring would have been an achievement worth speaking of.
Re:Female Programmers (Score:2)
Except, JMP is a mnemonic, so you'd just have to know the code for "jump" (which was 303 for the 8080) as well as the address you wanted to jump to (which would be 377 in your example).
My first coding class was assembly on 8080 workstations which had an octal keypad for input and 3 rows of 8 LEDs for output. It sucked. When we finally got to use Z80 workstations with keyboards and monitors we had a true appreciation for the blessing of mnemonics! Stored memory didn't seem so cool, though. It was less painful to retype in my program every day than to have to deal with those stupid cassette drives.
Re:Female Programmers (Score:1)
In the end, with the BASIC RPN calculator, I used magic numbers to control the flow. This allowed for fewer labels and robust code. The magic cookies were used to control the fan-out, with pre-testing of values (e.g. divide by zero) before the operation was attempted.
When the subroutine was done, it passed back magic cookies to indicate error codes, and its result in PA(0). It was up to the display module to announce the error message and deal with PA(0).
The message stack was used for other things as well: you could peek at any register of the calculator.
But whatever it was, the codes were laid out so that all functions that would crash on DIV0 were together, all functions that required the range -1 to +1 were together, and so forth.
For the record, I used six magic cookies, in two sets of three. During half the cycle, set A controlled the flow and set B was scratch data; during the other half, B controlled the flow and A was the scratch data.
This greatly reduced the number of GOTO and GOSUB lines. I think in the end there were 10 different labels for major jumps.
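For anyone who never met this style of flow control, here is a minimal GW-BASIC-flavoured sketch of the pattern (my own reconstruction; the names C1, N, D, PA and the cookie values are invented for illustration, not taken from the original calculator):

100 REM caller: pre-test the operands, then dispatch on the cookie
110 N = 10: D = 0
120 IF D = 0 THEN C1 = 3 ELSE C1 = 1   ' route DIV0 to the error path
130 ON C1 GOSUB 1000, 1000, 2000
140 PRINT PA(0); " (cookie"; C1; ")"
150 END
1000 PA(0) = N / D: C1 = 0: RETURN     ' cookie 0 = OK, result in PA(0)
2000 PA(0) = 0: C1 = 99: RETURN        ' cookie 99 = DIV0 trapped before it crashes

One GOSUB target serves a whole family of functions, and the cookie that comes back doubles as the error code, which is roughly what the fan-out described above buys you.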
As for tweaking code: I still do it to great effect. I can still pull an 8x increase in speed by selecting the correct algorithms.
Re:Female Programmers (Score:5, Insightful)
No, it's a question of perceived status. At that time, being a computer -- recall that 'computer' was the title of the person doing the math, not the noisy room-sized thing you did the math on -- was considered something of a drudge job. The men discovered the algorithms, the women did the computing.
Later, as the idea of working with a (machine) computer as a career became more fashionable, more and more men moved into the field, as it was no longer considered "merely" women's work.
Remember Lady Ada Lovelace, the first programmer? Babbage couldn't be bothered to do the menial work of actually designing algorithms. Then the act of designing algorithms lost some of its stigma, and men took over. Finally the act of actually coding the algorithms has lost its stigma, and so I (a male) can sit here making a fabulous living as a coder, while my equally-talented coder girlfriend doesn't make as much money.
The glass ceiling is still there. It just shifts up and down to include/exclude different professions as culture changes. :-(
Re:Female Programmers (Score:2)
Women's lib (Score:1)
Unfortunately, it kind of hit a dead end with many unresolved issues. Now a great many more women work full-time jobs, but so do men. In fact, the average work week has grown steadily from 40 hours to 45 or 50. Meanwhile it has grown more and more difficult to support a family financially on one income, but the housework still needs doing and the kids need taking care of.
It's true that women tend to make less for the same work as men, but even if that gets equalized, that doesn't solve the bigger problems. Being forced to work 50+ hours a week is hardly "liberation," and certainly not what the feminist movement was hoping to achieve.
Ultimately, both men and women need to work together to liberate ourselves (at least those of us in the bottom 99% economically).
Re:Female Programmers (Score:2, Interesting)
Re:Female Programmers (Score:2)
To clarify: the idea was that women were better at tedious arithmetical calculation, while men were obviously better at math(ematics), because there were almost no women mathematicians.
At least, that was the misconception of the time...
Re:Female Programmers (Score:2, Interesting)
Programming ENIAC must have been a challenge, because what you had was basically a set of vacuum tube shift registers and whatnot that had to be wired together for a given purpose using patchboards.
Re:Female Programmers (Score:1)
Actually, Eve was the first programmer. She had an Apple in one hand and a Wang in the other.
To a great geek, from a proud one, I salute you (Score:3, Insightful)
This woman rocked in many aspects... (Score:2, Funny)
Second, she is probably the programmer with the longest active career: she started before the war, and retired in 1983.
Third, hey, she had a husband 33 years younger than her!
I think she had a few things worth envying, huh?
Re:This woman rocked in many aspects... (Score:1, Informative)
Re:This woman rocked in many aspects... (Score:1)
Old time computing (Score:4, Informative)
My computer at college in 1981 was something nearing the end of its life. It was an 8086 with 4K of RAM and a paper tape drive. To boot it, you loaded up the tape and loaded three values into RAM (via a series of eight switches and a "set" switch), then sent the command 377 to the processor. This would jump it to a location in memory and run the commands you had loaded there (effectively JMP address), which would then run the KEX program. KEX was a driver for a teletype. After that, you input assembler codes (as numbers) through the keyboard.
Compared to that, mnemonics like a for add and b for bring would have been a godsend.
Of FORTRAN, BASIC and COBOL: in the days of wire-wound core, each bit of the byte made the machine more expensive, and there was some compromise on the size of the byte. FORTRAN was designed to run on a six-bit machine. Even Knuth's MIX is underpowered by modern standards.
BASIC is intended to run in small memory. MS made their packet by bumming it into 4K of RAM, with a point-and-shoot interface.
In effect, you moved a cursor around the FAT and pressed Enter on the file you wanted to run or edit, at least on the Tandy 1000. Still, I built an RPN multibase hackable calculator in 6K of code.
Where BASIC comes off the rails is that people start using it as a general programming language. Its inability to pass parameters to subroutines is easily overcome.
Thus var1 = fn3130{x, v, z} can be written as:
A1=x:A2=v:A3=z:GOSUB 3130:var1=A1
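A minimal runnable sketch of that convention (the body of subroutine 3130 is invented purely for illustration; only the A1/A2/A3-and-GOSUB calling style comes from the snippet above):

100 A1 = 3: A2 = 4: A3 = 5    ' pass the arguments in the shared A-variables
110 GOSUB 3130: V1 = A1       ' the result comes back in A1
120 PRINT V1
130 END
3130 A1 = A1 + A2 * A3        ' hypothetical body: any work on A1..A3
3140 RETURN

The obvious cost is that every caller shares the same globals, so nested or recursive calls need care.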
In fact, once the kernel is written and documented, you can turn a generic RPN calculator script into specific special purpose code. I had mine so that all variables in the calculator start with O, P and Q. The idea was that you could write messy code outside these letters, and use the calculator as an input device.
And they say girls can't program. Ha. We just do it differently.
Re:Old time computing (Score:1)
Hmmm. Sounds more like an 8080...
Re:Old time computing (Score:2)
Re:Old time computing (Score:1)
Re:Old time computing (Score:1)
My mistake :(
This was a laptop that supported a FAT of 32 files; the interface was to move the highlight around the screen and select the file you wanted.
I never suggested that there was a disk involved. The files were saved and edited in RAM, as the Tandy 100 did not have a hard disk.
I find it amusing .... (Score:3, Insightful)
I wonder how many IT people suggest technologies that are not computer-related: e.g. how many suggest paper cards as a solution. I know I have.
You see, once you start fiddling around with the hardware like Betty H did, you start using it wisely. It is one of the reasons that Unix works so well.
Re:I find it amusing .... (Score:1)
Mainframes still rule.
Re:I find it amusing .... (Score:1)
Now, with Beowulf clusters, distributed computing, and IBM running Linux in mainframe VM machines, one is seeing lots of little PCs competing on mainframe turf. PCs have largely displaced mainframe terminals. A PC running a 3270 emulator is cheaper than a 3270 terminal!
Re:I find it amusing .... (Score:1)
Still, of the four you quote, one bemoans the change of data and information, another just agrees with someone else's correction, a third is a hi to some troll, and the fourth is the story. That makes five.
I can find a lot to do in the evenings, thank you. It's just morning over here: nearly lunch time, actually. And yes, I am currently bored, apart from passing on my experiences in the old times.
I'd be more likely to while the hours away on Civ3 or REXX than read Anne Rice novels.
info: goodbye bettie (Score:1)
Right down the hall (Score:3, Interesting)
good book (Score:1)
Go Betty! (Score:1)
Re:dance on her grave (Score:4, Insightful)
What languages have YOU designed?
Re:dance on her grave (Score:1)
you invented Visual Basic?
Actually, the guy that invented Visual Basic [mailto] happens to be one of the richest men in the world.
Visual Basic [microsoft.com] is used by thousands of organisations around the world, and thousands of programmers make a lot of money by providing real-world solutions, written in Visual Basic, to real-world problems.
There are many computing/business problems out there for which C/C++/Linux is simply overkill.
I'm glad that your ability to program in C/C++/whatever gives you high self-esteem, but why don't you take your delusions of grandeur and shove them up your ass.
You give the /. and Linux communities a bad name.
BTW: no, I'm not a VB programmer.
Just a correction (Score:1)
At least (Score:2)
In OS/2 it is easy to do that, because of EXTPROC and REXX: you can write a REXX script as the processor and feed the language script in as data. Doing it like that has saved me many hundreds of hours' work.
Maybe you can do that in Linux too - don't know enough about that.
Most good authorities on computer languages say you should learn several languages. This allows you to think more creatively. I know.
Sure Linux can do that. (Score:2)
Of course you can do that under linux. IBM Object REXX for Linux [ibm.com] or other REXX for Unix [rexxla.org].
Re:Sure Linux can do that. (Score:1)
EXTPROC works like this. Suppose you have a batch file ADD.CMD that goes like this:
EXTPROC RPN
b %1
b %2
a
s
The command ADD 1 2 would execute the command RPN ADD.CMD 1 2, which would read each line of ADD.CMD through RPN and give the result of 3.
That's how I read it.
Whether you let REXX or CEnvi or Perl process the RPN script is a moot point. What I was getting at was that you could create commands for different processors AND have them heard correctly.
As I recall, Linux shells do support it. And yes, I have Object REXX and Regina REXX for Win/OS2/Linux :)
[OT]Re:Sure Linux can do that. (Score:2)
In general you can do this under *nix with #! at the top of the file. So if you had a *nix command /usr/bin/rpn, your example would translate into:
#!/usr/bin/rpn
b %1
b %2
a
s
Just about every *nix command that accepts a file containing commands as an argument will work this way.
I never did like REXX back when I was using OS/2, and I don't know why. It looks like REXX is an easy language to program with. And really, I should have jumped at the chance to use a "real" scripting language versus BASIC and those hideous DOS batch files.
REXX (Score:1)
What makes it useful is that you can write a filter or batch processor quickly.
As far as batch processors go, you can write one that has just one command, e.g. "$#$". Anything that follows $#$ is processed as a sub-command, and anything else just passes through unprocessed. You can then use this to produce RTF output (since most lines are unprocessed, and only the values to be substituted get processed).
You can do the same in Perl, but in the DOS/Win/OS/2 world, REXX is the most widely known.
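A rough sketch of such a processor in REXX (my own illustration, not the poster's code; only the $#$ marker comes from the description above):

/* one-command batch processor: $#$ lines run, everything else echoes */
do while lines() > 0
  line = linein()
  if left(line, 3) = "$#$" then
    interpret substr(line, 4)    /* execute the embedded sub-command */
  else
    say line                     /* pass raw text straight through */
end

Feed it an RTF template in which only the $#$ lines compute values, and the rest of the file arrives in the output untouched.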
Re:At least (Score:1)
But if all you are after is REXX, it's better to check out Quercus REXX [quercus-sys.com], as their REXX multitasks better. It is also available in a Windows version, where you can experiment with the bugs of the Windows command line: e.g. passing parameters to a script, where "REXX frog a b" does not work.
OS/2 is strong where Windows is weak, and weak where Windows is strong. It's an interesting contrast. The OS/2 community, like the Linux community, attracts people who want to be there, and the latest version out this year is not dissimilar to a Linux distro in what you get (except the price tag is higher). The next version will support a VMware-like project that runs Windows and Linux in a virtual computer.
The trouble I find with MS scripting languages is that they're unusually hard and restrictive. VBA, for example, makes Excel calls through rather lengthy invocations, and it's not much use for the casual programmer (compared to, say, 1-2-3's menu-keystroke commands). [Yes, I used both.]
So while I try to avoid MS stuff, just seeing how the other guys do it is an eye-opener. You'd know yourself that from your Unix/Linux experience you make Windows work harder, and from your Windows experience you make Linux work harder.
Re:dance on her grave (Score:1)
Re:What other famous woman programmers?!@! (Score:3, Informative)
Re:'a for add' ?!? (Score:1)
Eh? Smalltalk isn't THAT old, it's only 20 years old (the original spec was Smalltalk-80, after all). That makes it newer than Lisp, C, Fortran, Cobol, most Algol-derived languages, etc.
Common Lisp dates back to around the same time as Smalltalk ('81-'82ish), but original Lisp dates back to the fairly early days of computing ('58-ish).
Ada dates back to around '79-'80, though the original DoD study goes back a couple of years earlier.
The fact is, very few successful languages have appeared in the last 20 years. Most of the post-1980 languages are extensions/redesigns of existing languages (C->C++, Pascal->Modula/Oberon series, Fortran->F90). About the only successful 'language' that is totally original and has appeared in the last 20 years, that I can think of, is Perl.
Re:women and computers (Score:1)
With the secrecy that surrounded Bletchley Park after the war, these women represent the best surviving knowledge of Colossus' operation (except for the blueprints and notes Turing kindly sent von Neumann...)