RIP: Betty Holberton, Original Eniac Programmer 154

DecoDragon writes "Betty Holberton, one of the original ENIAC programmers, died on December 8th. An obituary describing her many achievements as well as her work on the ENIAC can be found in the Washington Post. Her accomplishments included contributing to the development of Cobol and Fortran, and coming up with the use of mnemonic characters for commands (e.g. "a" for add). She was awarded the Lovelace Award for extraordinary accomplishments in computing from the Association for Women in Computing, and the Computer Pioneer Award from the IEEE Computer Society for "development of the first sort-merge generator for the Univac which inspired the first ideas about compilation.""
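
The sort-merge generator mentioned in the Computer Pioneer citation produced programs that combined pre-sorted runs of records from tape into one ordered file. The sketch below is purely illustrative, a modern Python rendering of just the merge step such a generator would emit, and nothing like her actual UNIVAC code.

    # Illustrative only: the merge step at the heart of a tape sort-merge.
    # Each run stands in for a pre-sorted block of records coming off tape.
    import heapq

    def merge_runs(runs):
        """Merge already-sorted runs into a single sorted output stream."""
        return list(heapq.merge(*runs))

    run_a = [3, 17, 42]
    run_b = [1, 8, 99]
    run_c = [5, 6, 7]
    print(merge_runs([run_a, run_b, run_c]))  # [1, 3, 5, 6, 7, 8, 17, 42, 99]
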
  • by Anonymous Coward
    They should erect the ENIAC like the Vietnam Wall somewhere and scrawl all the dudes' names on the back of it... ;)
    • It seems only fair; after all, the not-so-glam pioneers of computing still need to be remembered, and in my opinion more so than their glam counterparts. I'm glad Slashdot posted this; it's good to know that interesting things that aren't Microsoft-related (at all) are still being posted. It occurred to me that Slashdot is inadvertently giving Microsoft free advertising by putting them up on a (what seems like) weekly basis. I don't know about the rest of you, but I'm a bit tired of hearing Microsoft does this and Microsoft is doing that in court (does anyone care anymore? I don't). Heh, well, keep up the good articles, guys.
      peace love and free thinking
      VAX

      • Well, they still have a piece of ENIAC here at Penn in the ENIAC museum. Someone could probably suggest it to the dude who runs the museum. When I was working there it was Paul Shaffer [upenn.edu], and it probably still is.

        FYI: If you want to visit ENIAC (well, a piece of it), the Museum is on the ground floor of the Moore bldg. at 33rd & Walnut in Philadelphia, PA.

    • They should erect the ENIAC like the Vietnam Wall somewhere


      Pieces of ENIAC are on display at the Smithsonian American History Museum.

  • Loss and Gain (Score:3, Insightful)

    by The Great Wakka ( 319389 ) on Tuesday December 11, 2001 @07:06PM (#2689903) Homepage Journal
    Sometimes, the world loses a great person. While her accomplishments may seem minor compared to those of the modern-day programmers, she laid an important stone in the foundation of modern computer science. Can you imagine life without her? One whole section of a computer's logic would be eliminated. Perhaps she made some obscure discovery that tomorrow will change the way we think about computers.
    • Can you imagine life without her?
      What's sad is that she's one of the pioneers who was underappreciated ("semi-professional"... WTF is that?) in her time, and virtually unknown today -- yet she's directly/indirectly responsible for many of the things we take for granted.

      The average John Q. Public neither knows nor cares about people like that, but doesn't think twice when he sorts a column on his spreadsheet. Perhaps he should.

    • Re:Loss and Gain (Score:4, Insightful)

      by selan ( 234261 ) on Tuesday December 11, 2001 @07:15PM (#2689963) Journal
      I realize that your comment was meant as praise, but it really belittles what she achieved.

      How could her accomplishments possibly be minor compared with today's programmers? Today we may code operating systems or apps, but she helped to invent programming. She did "change the way we think about computers."

      Read the obit first; it's very interesting and you might actually learn something.

    • Re:Loss and Gain (Score:3, Insightful)

      by Usquebaugh ( 230216 )
      "One whole section of a computer's logic would be eliminated."

      I doubt it, probably just attributed to another person. There are very few ideas that are not part of the society they spring from. It just depends on who is recognised as being first.
      • Re:Loss and Gain (Score:3, Interesting)

        by Omerna ( 241397 )
        One thing I've noticed is that almost always another person is noted as having invented the exact same thing at the same time, but wasn't officially recognized as "first" and so gets no credit. The first example that comes to mind is the periodic table: Mendeleev was credited, but another guy (anyone remember his name? I can't) did the same thing very slightly differently at the same time (I mean he invented a table with the elements arranged like this, not an improvement upon it) and gets no credit because Mendeleev was recognized as first!

        Anyway, the point is yeah, it would definitely have been invented within a few years (months?) of when it was.
    • Re:Loss and Gain (Score:4, Interesting)

      by andres32a ( 448314 ) on Tuesday December 11, 2001 @07:35PM (#2690069) Homepage
      Perhaps she made some obscure discovery that tomorrow will change the way we think about computers.

      Actually, she did. We know that software has not progressed as far as hardware. Most of its relative progress was made by the original ENIAC team. And Betty, more than anybody else on that team, wanted something that most modern-day programmers are also hoping for: to make computers fun, user friendly, and a good part of our daily life.
    • While her accomplishments may seem minor compared to those of the modern-day programmers,

      Actually her accomplishments seem a hell of a lot more important than those of any contemporary programmers. I know slashdotters tend to have messianic complexes, but come on, show some humility; she represents the generation that created the computer revolution.
  • Bug (Score:1, Informative)

    by genkael ( 102983 )
    Isn't she also the person who coined the term "bug" after finding a moth in the system that was shorting it out?
    • Nope, that was Grace Hopper.
    • Re:Bug (Score:3, Informative)

      by blair1q ( 305137 )
      That's more attributable to Grace Hopper, but she didn't coin it; she just made a joke of it, pasting the moth into her lab notebook and annotating it "first real bug".

      --Blair
    • Picture of the bug (Score:2, Interesting)

      by bstadil ( 7110 )
      It was Grace Hopper. Look at this site [vt.edu]; they have a picture of the bug.
      • by Teratogen ( 86708 )
        I actually met Grace Hopper when I was a Multics programmer for the Air Force at the Pentagon, back in 1982 or 1983. She was shopping at the Walgreens in the Pentagon Concourse. I introduced myself to her and we talked for a bit. I kick myself repeatedly for not getting her to autograph a Cobol printout for me. =)
      • Look at this site [vt.edu]; they have a picture of the bug.


        Pfft, picture. I saw the real thing. For a while they had it on display at the Smithsonian. :)

    • Full info... (Score:3, Interesting)

      by kikta ( 200092 )
      See the page on 1945 [computer.org], where it says:

      "Grace Murray Hopper, working in a temporary World War I building at Harvard University on the Mark II computer, found the first computer bug beaten to death in the jaws of a relay. She glued it into the logbook of the computer and thereafter when the machine stops (frequently) they tell Howard Aiken that they are "debugging" the computer. The very first bug still exists in the National Museum of American History of the Smithsonian Institution. The word bug and the concept of debugging had been used previously, perhaps by Edison, but this was probably the first verification that the concept applied to computers."

    • Admiral Grace Hopper to you. :-)

      • Rear Admiral Grace Murray Hopper.

        When she retired in 1986 at the age of 79, she was the oldest commissioned Navy officer on active duty. The retirement ceremony was held on the U.S.S. Constitution [navy.mil], which is also still on active duty.

        I guess any woman tough enough to do what they did, when they did it (rather than sit at home), doesn't like to quit.
        • by devphil ( 51341 )


          Yeah, I remembered the "Rear" as I pressed "Submit," but I wasn't certain how much of a difference that makes to the official rank. I guess it's just like the subdivisions in a general's rank. (I haven't done as much work with the Navy as with the other branches.)

          She also won a Turing Award, didn't she? In '74 or '78?

          • As I understand it the Rear makes a lot of difference. It's like the rank distinction between a General and Four Star General. Admirals as far as I understand command bases and carrier groups. Rear Admirals tell them what to do with those bases and carrier groups.

            She didn't actually win the Turing Award [acm.org]. They made the Grace Murray Hopper Award [acm.org] in 1971. On the whole I think I'd prefer having an award in my name too.
    • Nope.... (Score:2, Interesting)

      by NickFusion ( 456530 )
      The folk etymology of "bug" is that, in the early days of electronic computing, an actual insect flew into the innards of the Harvard Mark II, and caused a malfunction (this did happen), and that is where we get the word bug (in the sense of a flaw in the process). It seems however that the word was already in use in that sense in industrial manufacturing circles at the end of the 19th century.

      (New Hacker's Dictionary)
  • by Matt2000 ( 29624 ) on Tuesday December 11, 2001 @07:12PM (#2689942) Homepage

    From the article "By the completion of the ENIAC project in 1946, work that once took 30 hours to compute instead took 15 seconds."

    Since most of us were born after the advent of computers we take for granted that mundane computation tasks can be automated for fairly low cost and at great time savings. However, for all that technological progress has been hailed in the last 20 years, is there any task that we have received this kind of improvement in efficiency on?

    Are we becoming too focused on the day to day improvements in computing, each one of ever decreasing relevance to people who actually use the computer?

    How can we focus more in the future on finding the areas where our efforts can be best utilized to produce efficiency gains of this sort, rather than Microsofting everything by putting 74 new features into a product just so a new product can be sold?

    These kinds of questions stand as the things that can best be answered by open source, where we are not constrained by profit. This should be what we think about in the future, rather than what features we can copy from someone else's software just because they have it and we don't.
    • by MisterBlister ( 539957 ) on Tuesday December 11, 2001 @07:24PM (#2690015) Homepage
      This may come off sounding like a flame, but I don't intend it to -- I fully support the notion of Open Source software and have released various bits of OSS myself.

      Having said that, for OSS to foster the giant leap forward that you suggest would require a large shift in the way people look at and create OSS. The simple truth is that 99.99% of all OSS is just reinvention of closed source software to scratch an itch or for political reasons. This is not the type of environment in which such a leap springs forth.

      While Open Source has many benefits, it would take an awful lot for me to agree with your premise that it's better suited than closed source for the type of efficiency gain you're looking for. Such leaps are often made by one or very few people, with everyone else following later. Given that, such a leap is just as likely to occur with plain-old closed source as with OSS.

    • Open Source is not some kind of new paradigm of computing, it is simply a development and distribution model, and not a new one either. There's nothing "special" about open source. In fact, considering how open source, by definition, results in lower revenue for the developers, I would expect more innovations to occur in closed source code. After all, the more money that a company makes, the more it can invest into R&D.

      Of course, there are exceptions, but they are only exceptions.

    • Wheelier wheel (Score:2, Insightful)

      by NickFusion ( 456530 )
      So what are you suggesting, that we invent a wheel that is an order of magnitude...wheelier?

      We're talking about a basic shift in the way things are done, from humans adding columns of numbers to an industrial number-adding machine.

      You don't get the next big thing from microsoft, or from open source, or from programming at all.

      You get it from inventing the next widget that automates, streamlines, accelerates some human activity.

      What is it? A better word processor? Nope. Who knows. An automated intuiter? An enlarged and sped-up memory core for the human brain? Something that turns dioxin into peanut butter?

      Ginger?

      Damned if I know, kemosabe. But when you're making those kinds of calls, you're in the high country....
    • "Computers make it easier to do a lot of things. Trouble is, most of those things don't need to be done."

      (Somebody wanna help me with who said that?)

      • Google (Score:1, Informative)

        by Anonymous Coward
        I found this [brainyquote.com] on Google. It attributes the quote to Andy Rooney.
    • Consider the FFT. (Score:4, Informative)

      by RobertFisher ( 21116 ) on Tuesday December 11, 2001 @08:30PM (#2690339) Journal
      I've heard Cooley & Tukey's original 1965 paper "An Algorithm for the Machine Calculation of Complex Fourier Series" on the FFT algorithm cited as such a vast improvement. (Indeed, it has been called [siam.org] "the most valuable numerical algorithm in our lifetime" by the applied mathematician Gilbert Strang.) When you consider it is an N log N algorithm, as opposed to previous N^2 methods (amounting to a factor of ~ 100 in computational efficiency for N ~ 1000, and even bigger gains for larger N), and just how often Fourier methods are used in all branches of computational science, you begin to appreciate how significant their achievement was.

      One should realize that the most fundamental numerical algorithms do not change very rapidly. The most common numerical algorithms (sorting, linear algebra, differential equations, etc., both in serial and parallel) have been the subject of intense research by an army of applied mathematicians over the last half-century. All you have to do to take advantage of that work is to call your friendly local numerical library [netlib.org].

      Of course, sophisticated 3D graphics methods are still the subject of intense research.

      So in sum, I would argue that as far as "serious" numerical methods go, excellent solutions usually exist. (These methods are "open source", indeed open source before the term existed! They are usually published in the scientific literature.) The main gains that remain are in "entertainment" applications.

      Bob
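
      To make the N log N versus N^2 point above concrete, here is a minimal sketch comparing a naive direct DFT against a fast Fourier transform on the same input. It assumes NumPy (which the thread never mentions) and illustrates only the complexity gap, not Cooley and Tukey's actual code.

        # A naive O(N^2) DFT versus NumPy's O(N log N) FFT on the same signal.
        import numpy as np

        def naive_dft(x):
            """Direct evaluation of the DFT definition: roughly N^2 multiplies."""
            N = len(x)
            n = np.arange(N)
            k = n.reshape((N, 1))
            return np.exp(-2j * np.pi * k * n / N) @ x

        x = np.random.rand(1024)
        print(np.allclose(naive_dft(x), np.fft.fft(x)))  # True: same answer
        # For N = 1024: N^2 is about 10^6 operations while N*log2(N) is about
        # 10^4, which is the factor of ~100 mentioned above.
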

    • Forget open source for a revolution; the next big thing will be affordable computers surpassing human brains in computational power, which could happen in thirty years if Moore's law continues to hold and our estimates of the brain's processing capacity are not too misguided. Building one today out of half a million top-end processors and billions of dollars will do no good; AI researchers must be able to access such machines for long periods of time.

      The next BIG thing will be actually putting that processing power to use, building machines more intelligent than ourselves. I can't see how that could fail to happen; maybe it will take much longer than 30 years, but I'm pretty sure the number is closer to 30 than 300.

      Anything else happening in that period is just details, minor details.

      • Open Source concepts still apply in an AI future.

        Even though the actual AI developed (and GP evolved) systems, and the AI itself, would grow too complex for any human to understand, that doesn't mean that the slave-owner/author wouldn't still want the "neuralnet source" (or whatever) to be made available free (as in speech & beer).

        e.g. The Open Learned Common Knowledge Base vs. Microsoft StreetSmart(TM). With MS, if they still existed, they would want to hold you hostage with a subscription to a proprietary central AI slavefarm, while the "OSS community" would network their cheap "brains" to greater effect for the common good.

    • From the article "By the completion of the ENIAC project in 1946, work that once took 30 hours to compute instead took 15 seconds."

      Since most of us were born after the advent of computers we take for granted that mundane computation tasks can be automated for fairly low cost and at great time savings. However, for all that technological progress has been hailed in the last 20 years, is there any task that we have received this kind of improvement in efficiency on?

      Yes, the WWW and Google achieve a similar speedup for looking up information. Only for more advanced topics in non-computer-related areas do we have to resort to libraries or phone calls these days.

  • decompile her code.

    Another one for the bit bucket ...

    while my compiler gently weeps ...

    -
  • by Alien54 ( 180860 ) on Tuesday December 11, 2001 @07:22PM (#2689998) Journal
    While engineers focused on the technology of computing, Mrs. Holberton lay awake nights thinking about human thought processes, she later told interviewers. - - - She came up with language using mnemonic characters that appealed to logic, such as "a" for add and "b" for bring. She designed control panels that put the numeric keypad next to the keyboard and persuaded engineers to replace the UNIVAC's black exterior with the gray-beige tone that came to be the universal color of computers.

    Now we've got folks who want their case midnight black.

    But given all of the design issues we have seen, it is interesting to note that the human interface problem was being considered from the very beginning.

    [Insert your Microsoft insult joke here]

  • Betty Picture (Score:4, Informative)

    by andres32a ( 448314 ) on Tuesday December 11, 2001 @07:23PM (#2690008) Homepage
    There is a nice picture of her here [awc-hq.org], just in case anyone is interested...
  • by Eryq ( 313869 ) on Tuesday December 11, 2001 @07:26PM (#2690028) Homepage

    I got my master's in Comp Sci at UPenn in '89 (I used to walk past some of the remnants of ENIAC on display there, every day). And I can't help but be saddened by this:

    She hoped to major in the field [mathematics] at the University of Pennsylvania but was discouraged by a professor who thought that women belonged at home.

    I'm glad she finally got her chance to shine during the war, but who knows what else she might have accomplished, had someone's idiotic prejudices not dissuaded her into working for the Farm Journal?

    Stupid git.

    Then again, maybe he just meant /home...

    • January 2002 Dr. Dobb's Journal, page 18: basically, some unidentifiable chick could have figured out the secret of the Enigma machine much earlier than everyone else, but she wasn't allowed to follow up on her insights. She's called Mrs. BB because they can't think of anything better.
    • I'm glad she finally got her chance to shine during the war, but who knows what else she might have accomplished, had someone's idiotic prejudices not dissuaded her into working for the Farm Journal?

      Don't dis serendipity. Had she not been discouraged, her life would not have taken the course it had. She's little more than a footnote in history, and computer history at that, but how many of the billions who have lived can even claim that? Had she gone on to major in mathematics, she might have become...a math teacher. Or perhaps she would have gotten her PhD and disappeared into research, occasionally publishing obscure monographs on semihemidemigroup homologies.
  • by Alien54 ( 180860 ) on Tuesday December 11, 2001 @07:27PM (#2690032) Journal
    The Army chose six women, including Mrs. Holberton, to program the ENIAC, which weighed 30 tons and filled a room. The women had to route data and electronic pulses through 3,000 switches, 18,000 vacuum tubes and dozens of cables.

    "There were no manuals," one of the women, Kay McNulty Mauchley Antonelli, later told Kathleen Melymuka for an interview in Computer World. "They gave us all the blueprints, and we could ask the engineers anything. We had to learn how the machine was built, what each tube did. We had to study how the machine worked and figure out how to do a job on it. So we went right ahead and taught ourselves how to program."

    Mrs. Holberton took responsibility for the central unit that directed program sequences. Because the ENIAC was a parallel processor that could execute multiple program sections at once, programming the master unit was the toughest challenge of her 50-year career, she later told Kleiman.

    Now that is a programming challenge.

    Imagine that the first programs were parallel processing problems from the start, with no manuals or instructions in programming, because they had to invent it all first. And the pressure of being in wartime as well.

    Very impressive indeed. One of those things that gets done because no one knows it is impossible yet.

    • Very early computers like ENIAC were essentially programmed in microcode, so each element of what today would be the ALU could be used simultaneously with all the others. This isn't "multitasking" in the way we think of it today so much as hand-tuned "pipelining."

      Similar tricks were used in machines that had drum and delay-line memories, arranging the instructions such that the next one needed would emerge from the magnetic head or piezo reader just as the previous one was finished executing ... often with other instructions belonging to a different "thread" filling the gaps.

      In those days you programmed to the bare metal because it was the only way to get anything useful done.
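
      A toy illustration of the instruction-placement trick described above: on a rotating store, where you put the next instruction decides how long you wait for it. The drum size, execution time, and addresses here are hypothetical, chosen only to show the idea, not taken from any historical machine.

        # Hypothetical drum with 32 words per track; an instruction takes 3
        # ticks to execute while the drum rotates one word per tick.
        DRUM_WORDS = 32
        EXEC_TICKS = 3

        def wait_ticks(head_pos, next_addr):
            """Ticks until next_addr comes around under the read head."""
            return (next_addr - head_pos) % DRUM_WORDS

        addr = 5
        head = (addr + EXEC_TICKS) % DRUM_WORDS  # head position after execution

        # Consecutive layout: word 6 has already slipped past the head, so
        # nearly a whole revolution is wasted on every fetch.
        print(wait_ticks(head, addr + 1))            # 30 ticks of waiting

        # Skewed ("optimum") layout: store the next instruction EXEC_TICKS
        # words further along, so it arrives just as the current one finishes.
        print(wait_ticks(head, addr + EXEC_TICKS))   # 0 ticks of waiting
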

  • Female Programmers (Score:3, Insightful)

    by Lunastorm ( 471804 ) <lunastorm@NospaM.myrealbox.com> on Tuesday December 11, 2001 @07:28PM (#2690035) Homepage
    And I've always thought that the first programmers were all men. I do wonder: is there a higher percentage of female programmers today, or has it fallen over time?
    As for those who are belittling her use of mnemonics, you shouldn't take them for granted. Imagine having to type out 'file system consistency checker' instead of fsck, among other commands.
    • Regarding mnemonics, imagine having to type something like JMP 377 with the right tape loaded instead of typing in fsck.

      The first computer I used was something like an 8086 (I think). The way you booted it was to load up a paper tape, manually insert the boot sector at a specific address, and then manually start the go sequence (by loading in something like 377).

      This loaded a driver for the teletype machine, through which you could enter assembler codes (as numbers). Mnemonics like "a" for add and "b" for bring would have been an achievement worth speaking of.

      • Regarding mnemonics, imagine having to type something like JMP 377 with the right tape loaded instead of typing in fsck.

        Except, JMP is a mnemonic, so you'd just have to know the code for "jump" (which was 303 for the 8080) as well as the address you wanted to jump to (which would be 377 in your example).

        My first coding class was assembly on 8080 workstations which had an octal keypad for input and 3 rows of 8 LEDs for output. It sucked. When we finally got to use Z80 workstations with keyboards and monitors we had a true appreciation for the blessing of mnemonics! Stored memory didn't seem so cool, though. It was less painful to retype in my program every day than to have to deal with those stupid cassette drives.
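
        For what the mnemonic actually buys you, here is a minimal sketch of the lookup an assembler performs. JMP really is octal 303 (0xC3) on the 8080, as noted above; the rest of this tiny table and the helper function are just illustrative, not any real assembler.

          # Map a few 8080 mnemonics to opcodes and emit the bytes for one
          # instruction; a 16-bit operand goes out low byte first.
          OPCODES = {"JMP": 0o303, "NOP": 0o000, "HLT": 0o166}

          def assemble(mnemonic, operand=None):
              """Return the byte sequence for a single instruction."""
              code = [OPCODES[mnemonic]]
              if operand is not None:
                  code += [operand & 0xFF, (operand >> 8) & 0xFF]
              return code

          print([oct(b) for b in assemble("JMP", 0o377)])  # ['0o303', '0o377', '0o0']
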

        • The point I was making is that in BASIC and assembler, you go to a particular address, not a named function. Had I written 303 377, you might not have appreciated the point.

          In the end, with the Basic RPN calculator, I used magic numbers to control the flow. This allowed for fewer labels, and robust code. The magic cookies were used to control the fan-out, with pre-testing of values (eg divide by 0) before it was attempted.

          When the subroutine was done, it passed back magic cookies to indicate error codes, and its result in PA(0). It was up to the display module to announce the error message and deal with PA(0).

          The message stack was also used for other things as well: you could peek at any register of the calculator.

          But whatever it was, the codes were laid out so that all functions that would crash on DIV0 were together, all functions that required the range -1 to +1 were together, and so forth.

          For the record, I used six magic cookies, in two sets of three. In half the cycle, set A controlled the flow and set B was scratch data, while in the other half B controlled the flow and A was the scratch data.

          This greatly reduced the number of GOTO and GOSUB lines. I think in the end, there were 10 different labels of major jumps.

          As for tweaking code: I still do it to great effect. I can still pull an 8x increase in speed by selecting the correct algorithms.

    • by devphil ( 51341 ) on Tuesday December 11, 2001 @08:22PM (#2690291) Homepage


      No, it's a question of perceived status. At that time, being a computer -- recall that 'computer' was the title of the person doing the math, not the noisy room-sized thing you did the math on -- was considered something of a drudge job. The men discovered the algorithms, the women did the computing.

      Later, as the idea of working with a (machine) computer as a career became more fashionable, more and more men moved into the field, as it was no longer considered "merely" women's work.

      Remember Lady Ada Lovelace, the first programmer? Babbage couldn't be bothered to do the menial work of actually designing algorithms. Then the act of designing algorithms lost some of its stigma, and men took over. Finally the act of actually coding the algorithms has lost its stigma, and so I (a male) can sit here making a fabulous living as a coder, while my equally-talented coder girlfriend doesn't make as much money.

      The glass ceiling is still there. It just shifts up and down to include/exclude different professions as culture changes. :-(

      • And Grace Hopper, one of the pioneers of high level languages and compilers.
      • The reason the first programmers on the ENIAC were women was that most of the men had gone off to fight WWII. The interesting thing, though, is that all the government propaganda aimed at women in the 40's influenced how mothers raised their daughters -- even if women weren't treated equally, women on a large scale came to believe that they *should* be treated equally. So even though the powers that be put women "back in their place", their daughters grew up with some ideas. That's at least part of what led to the women's lib movement in the late 60's and 70's.

        Unfortunately, it kind of hit a dead end with many unresolved issues. Now, a great many more women work full-time jobs, but so do men. In fact, the average work week has grown steadily from 40 hours to 45 or 50. Meanwhile it has grown more and more difficult to financially support a family on one income, but the housework still needs doing and the kids need taking care of.

        It's true that women tend to make less for the same work as men, but even if that gets equalized, that doesn't solve the bigger problems. Being forced to work 50+ hours a week is hardly "liberation," and certainly not what the feminist movement was hoping to achieve.

        Ultimately, both men and women need to work together to liberate ourselves (at least those of us in the bottom 99% economically).

    • by Suppafly ( 179830 )
      The original computer programmers were all women, because there was a notion at the time that they would be better at working with computers than men, since they would be transitioning over from typing pools and from jobs as telephone operators, all of which were seen as women's work. There was also the notion that women were better at math than a lot of men, which is why women who couldn't get accepted into mathematics programs went into the budding computer field, where they were more readily accepted.
      • women were better at math
        To clarify that: women were better at tedious arithmetical calculation. Men were obviously better than women at math(ematics), because there were almost no women mathematicians.

        At least, that was the misconception of the time...
    • I don't think it's realistic to attempt to pin the development of mnemonic programming concepts on one person--the first mnemonic assembler language was probably developed for the EDSAC at Cambridge, England, some time before UNIVAC ran its first program.
      Programming ENIAC must have been a challenge, because what you had was basically a set of vacuum tube shift registers and whatnot that had to be wired together for a given purpose using patchboards.
    • And I've always thought that the first programmers were all men. I do wonder: is there a higher percentage of female programmers today, or has it fallen over time?

      Actually, Eve was the first programmer. She had an Apple in one hand and a Wang in the other.
  • by Anonymous Coward on Tuesday December 11, 2001 @08:01PM (#2690199)
    Ms. Holberton, this Jolt's for you. You are one of the few early computer geek veterans of war, an honor that few can claim. Thank you for what you have done for my country, and my profession.
  • First of all, she is one of the first programmers in the history of computing.
    Second, she is probably the programmer with the longest active career: she started before the war, and retired in 1983.
    Third, hey, she had a husband 33 years younger than her!

    I think she had a few things worth envying, huh?
  • Old time computing (Score:4, Informative)

    by os2fan ( 254461 ) on Tuesday December 11, 2001 @08:28PM (#2690332) Homepage
    Before the computer revolution, computers were expensive and frail.

    My computer at college in 1981 was something nearing the end of its life. It was an 8086 with 4K of RAM and a paper tape drive. To boot it, you loaded up the tape, loaded three values into RAM (by a series of eight switches and a "set" switch), and then sent the command 377 to the processor. This would jump it to a location in memory and then run the commands you loaded there (effectively JMP address), which would then run the KEX program. KEX was a driver for a teletype. After that, you input assembler codes through the keyboard.

    Compared to that, mnemonics like a for add and b for bring would have been a godsend.

    As for Fortran, BASIC and COBOL: in the days of wire-wound core, each bit of the byte made the machine more expensive, and there was some compromise on the size of the byte. Fortran was designed to run on a six-bit machine. Even Knuth's MIX is underpowered compared to modern computers.

    BASIC is intended to run in small memory. MS made their packet by bumming it into 4K of RAM, with a point-and-shoot interface.

    In effect, you moved a cursor around the FAT and pressed Enter on the file you wanted to run or edit, at least on the Tandy 1000. Still, I built an RPN multibase hackable calculator in 6K of code.

    Where BASIC comes off the rails is that people start using it as a general programming language. Its inability to pass parameters to subroutines is easily overcome (see the sketch after this comment).

    Thus var1 = fn3130(x, v, z) can be written as:

    A1=x:A2=v:A3=z:GOSUB 3130:var1=A1

    In fact, once the kernel is written and documented, you can turn a generic RPN calculator script into specific special purpose code. I had mine so that all variables in the calculator start with O, P and Q. The idea was that you could write messy code outside these letters, and use the calculator as an input device.

    And they say girls can't program. Ha. We just do it differently.
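
    The GOSUB convention above, rendered as a hedged Python sketch: with no real parameters, the caller and the subroutine agree on shared variables. The names A1..A3 and sub_3130 simply mirror the BASIC line; what the routine at line 3130 computes is made up here, and none of this is the original calculator code.

      # Shared "registers" standing in for BASIC's global A1..A3.
      A1 = A2 = A3 = 0.0

      def sub_3130():
          """Stand-in for the routine at line 3130: leaves its result in A1."""
          global A1
          A1 = A1 * A2 + A3

      def fn3130(x, v, z):
          """What var1 = fn3130(x, v, z) has to expand to in that dialect."""
          global A1, A2, A3
          A1, A2, A3 = x, v, z   # A1=x : A2=v : A3=z
          sub_3130()             # GOSUB 3130
          return A1              # var1 = A1

      print(fn3130(2.0, 3.0, 4.0))   # 10.0
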

    • >It was an 8086 with 4K of RAM,

      Hmmm. Sounds more like an 8080...
      • It was years ago. Haven't seen the docos for 15 years. You may be right...
        • I started out, in high school, on an Altair 8080. It did not have the "front panel switches", but the model before it did. The machine booted from an 8 inch floppy and we had several teletypes. One was equipped with a tape reader which I used to back up my (rather simple) programs. We also had a DEC printing terminal. (MUCH faster than the teletypes!) And even a couple of CRT's which a friend of mine kicked up to 9600 baud one day and blew us all away! Ahhh. Those were the days!
  • by os2fan ( 254461 ) on Tuesday December 11, 2001 @08:35PM (#2690372) Homepage
    When the mainframes ruled, computing was associated with DATA (i.e. bits, bytes, fields and records), as in Automatic Data Processing, Datamation, &c. But now that data is easy to come by, it's INFORMATION (e.g. Information Technology).

    I wonder how many IT people suggest technologies that are not computer-related: e.g., how many would suggest paper cards as a solution? I know I have.

    You see, once you start fiddling around with the hardware like Betty H did, you start using it wisely. It is one of the reasons that Unix works so well.

    • >> When the mainframes ruled

      Mainframes still rule.
      • When most people think of "computer", it's the desktop box, not some dinosaur in its air-conditioned pen with a side order of halon and a number of white-coats preening it. [Jargon File quote] This is what I meant.

        Now, with Beowulf clusters, distributed computing, and IBM running Linux in mainframe VM machines, one is seeing lots of little PCs competing on mainframe turf. PCs have largely displaced mainframe terminals. A PC running a 3270 emulator is cheaper than a 3270 terminal!

  • see ya in programmer's heaven...
  • Right down the hall (Score:3, Interesting)

    by r3volve ( 460422 ) on Tuesday December 11, 2001 @10:59PM (#2691160)
    I'm typing this from a Penn CSE lab which is right down the hall from the room housing some of the remnants of ENIAC. There is a large picture on the wall in which Betty is featured prominently, and she's been mentioned in almost every undergrad CSE class I've taken here so far. It's nice to know that her efforts won't go unnoticed by kids like me these days, who grew up in the PC (non-mainframe) age.
  • I'm reading a good book right now about the attempt at the first computer, called "The Difference Engine". It's about Charles Babbage and is pretty good.
  • I never heard of "Betty" before today. Having read her obituary, I can only think to say thanks for all that she has left us that was uniquely her. How many of us will be able to look back at our lives and say we have contributed as much?
