Electronic Abacus

yoey writes: "Blast from the past in an article at the Economist: There are those who do not believe in the desirability of introducing anything as esoteric as electronics into business routine at all. Others believe that there is a limited field for electronic methods, provided that they fit into, and do not disrupt, established business systems. But there is a third group ... who consider that a major revolution in office methods may be possible. This revolution would involve scrapping the greater part of the established punch card calculating routine and substituting a single 'electronic office' where the giant computor [sic] would perform internally all the calculations needed for a whole series of book-keeping operations, printing the final answer in and on whatever form was required."
This discussion has been archived. No new comments can be posted.

Comments Filter:
  • LEO (Score:3, Informative)

    by trash eighty ( 457611 ) on Thursday November 29, 2001 @08:51AM (#2630382) Homepage
    I've often heard about the LEO (Lyons Electronic Office) being the first business application of computing, but have yet to see anything about the tech they used and how they developed the system. Can anyone point me to some online info?
  • this "computor" will never come close to the slide rule in efficiency, simplicity, and elegance. The slide rule will never be replaced by such a monstrous contraption. Besides, it really impresses the babes when displayed prominently in my breast pocket. ;-P
  • They'll have to run phone lines all over the country to one big computer that can do all the calculations and print them out as needed. Besides, can you imagine having to train all those people? With the money you'd save having all your technical people in one place you could hire some people to do windows ...
  • 'Although there are no moving parts to go wrong, the law of averages seems to dictate that some of the myriad components will occasionally fail, and scientists who work regularly on computors rely on the machine being available for at best only 80 per cent of its theoretical working time. Repairs are usually quick and easy, but it becomes obvious that should there be a major breakdown, business routine could be thrown into chaos'

    well, they got that bit right ....

    but why spell it 'computor' ?
    • their computor has no speall chekor.
    • This is an English article, and "computor" is probably correct per the Chambers English Dictionary. I thought somebody from Dallas had written it in the last few weeks until I saw the reference to valves. Hollerith punch cards are still used for mounting film negatives. The 1954 date on the article explains a great deal. 1024 byte memories were expensive and 16K was a huge investment. Tape held stored data and Hollerith cards were punched with batch programs. Users and programmers were kept separate. Speed was measured in milliseconds, and cooling made computer engineers wish they had Niagara Falls available. Today pocket calculators have more computing power. Don't get carried away with nostalgia remembering this expensive equipment, considered very reliable only by 1954 standards. These are the good old days.
    • afaik a 'computor' used to refer to a human who had the specialised knowledge to use a 'computer,' from a time when computers were mechanical and analogue devices ... it's a slightly archaic term for someone/something that does computing that has been dropped as the technical knowledge needed to operate computers has diminished ...
    • Having experienced full factory downtimes where I work, I can say they are very expensive. However, the level of automation used is a vast improvement over ANYTHING manual. To properly manufacture a part, over 500 individual steps must be performed. Data collection is automated and used in statistical process control. This means that at measurement steps, if the range is trending large, then the tool needs service before any manufactured parts are out of tolerance, causing expensive scrapped parts. If measurements trend low or high, a tool adjustment may be needed instead of tool downtime. Data is associated with each tool and each product. With this data we were even able to find, in one case, that our tools were fine but a batch of raw material was not. In high volume manufacturing, the slide rule people simply could not keep up. There are too many variables. Automation also keeps track of each part so nothing gets double processed or misses a step. The tool will not start a wrong part because the lotfile will be at the wrong step and therefore will not provide the parameters to the tool to process it. This alone prevents millions in losses from human mistakes. Grabbing the wrong part and starting it no longer happens: its lot tag barcode is read as part of the start process. No match = no go. There is nothing in the dead tree media that uses a person with this accuracy. I have made the mistake before of misreading I's, L's, 1's, O's, 0's, Q's, etc. The barcode is never fooled. (A rough sketch of that kind of range check appears below.)
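      A minimal sketch of that kind of range check in Python. This is illustrative only; the names, subgroup size, and limit are made up and not taken from the poster's actual SPC system:

      # Hypothetical range check: flag a tool for service when the spread of
      # recent measurements widens, before parts drift out of tolerance.

      def subgroup_ranges(measurements, subgroup_size=5):
          """Split measurements into consecutive subgroups; return each group's range."""
          ranges = []
          for i in range(0, len(measurements) - subgroup_size + 1, subgroup_size):
              group = measurements[i:i + subgroup_size]
              ranges.append(max(group) - min(group))
          return ranges

      def needs_service(measurements, range_limit, trend_length=3):
          """True if the latest range exceeds the control limit, or the last few
          ranges are strictly increasing (a widening-spread trend)."""
          ranges = subgroup_ranges(measurements)
          if not ranges:
              return False
          if ranges[-1] > range_limit:
              return True
          recent = ranges[-trend_length:]
          return len(recent) == trend_length and all(
              a < b for a, b in zip(recent, recent[1:]))

      # Example: the spread creeps up over the last few subgroups.
      readings = [10.0, 10.1, 9.9, 10.0, 10.1,   # tight
                  10.0, 10.2, 9.9, 10.1, 9.8,    # wider
                  10.3, 9.7, 10.2, 9.8, 10.4]    # wider still
      print(needs_service(readings, range_limit=0.5))  # True: time to service the tool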
  • In the end, some assistant accountant accidentally puts in a 20' cat5 cable as $200.17.

    One less set of eyes to notice the mistake, 45 more minutes on the phone for me.
  • by slamb ( 119285 ) on Thursday November 29, 2001 @08:55AM (#2630405) Homepage

    where the giant computor [sic]

    Has "[sic]" ever appeared in a Slashdot article before? That amazed me. Granted, it was put there by the submitter, not an editor, but still, that's pretty amazing.

    • Re:First "[sic]"? (Score:2, Informative)

      by Anonymous Coward
      In the good old days, "computer" was a person who computed, so another word had to be used for a machine that calculated things. I had a professor who was in charge of a WW2 lab of "computers" who were mostly attractive young women. He was of the opinion that things had gone downhill in the computing field since then.
  • Electronic Abyss (Score:2, Interesting)

    by Mentifex ( 187202 )

    We may smile complacently at how inscrutable the future was a few decades ago, but we ourselves are incapable of seeing beyond the Technological Singularity [caltech.edu] that will make our A.D. 2001 era seem even quainter than the time period of the referenced article.

    Tempora mutantur, nos et mutamur in illis.

    • Note, however, that the article was remarkably perceptive about the potential for business efficiency improvements, although it sounds as if there were sufficient existing examples to make such predictions straightforward.

      The Economist [economist.com] contains some of the finest reporting and analysis anywhere. I've wanted to subscribe to it for years; with luck, I'll bite the bullet soon.

    • What singularity? We're seeing right now what happens when computer technology begins to reach the point where it overwhelms the users.

      People abandon it.
    • A super-intelligent, omnipotent being in the future? Bah, we created that millennia ago. We called him God.

      Incidentally, while Vinge mentions a few 20th century minds who have envisioned the singularity which he describes, his super-intelligent being is not unlike Nietzsche's idea of the Übermensch, or Superman.
    • When I read the people who think nanotechnology and/or computers are going to lead to artificial life and/or super-human intelligence, I can't help but feel that the authors don't live on the same planet as me. I mean, 30 years is a long time, but it's not that long.

      Here are a few things to think about.

      We have never designed machines that reproduce themselves, and as far as I can see, we are not even moving in that direction. Thirty years to self-reproducing machines that can either survive without humans or at least compel humans to support them? I don't buy it.

      For the most part we don't even have machines that can take care of themselves for periods of months! Cars can't drive themselves, and even if they could, they certainly can't change their own oil or anything of that sort. Factories can't send out machines to forage for raw materials, electronics don't have the foggiest notion of reproduction.

      The closest things we have to autonomous machines are probably space probes, and they certainly cannot self-repair or reproduce.

      What about a super-human intelligence? Think about this: we can't even come close to simulating a human being. We don't even know what it would take to simulate a human being. We barely understand how human beings work. And much of what we do isn't really conscious thinking. For example, a great deal of processing power goes into playing sports. We can't even design autonomous robots to play a decent soccer game.
      Sure computers can play good chess, but can they also walk up a flight of stairs? Do they have the sense to run from a burning building? Could they do anything to stop me from unplugging them?

      When it comes to making thinking machines, our current capabilities are not even as advanced as a dog. (Chess-playing computers aren't thinking machines, because they are really just ordinary computers running a program.) If the "thinking being" also had to be mechanically resilient and reproduce and have useful self-preservation instincts, I doubt we could do as well as a nematode.

      To design is human, but to design a human is another matter altogether. While I believe the singularity could happen some day, I don't buy 30 years. No way.

      MM
      --
  • That's all very well and good, but what happens when this megalomaniac computer encounters technical difficulties? Or needs upgrading? Or just rebooting?

    What OS could it possibly run? Anything by Microsoft is out for obvious reasons, and even UNIX based systems aren't up and fully running 100% of the time...

    "Had a terrible day at the office, hunny, the computer went down and we all had to *think*!"

    - Phil
  • HAL (Score:5, Funny)

    by jodonn ( 516010 ) on Thursday November 29, 2001 @08:57AM (#2630414)
    Good morning, Dave. What are you doing?

    I have to go to the bathroom.

    You've already been twice this morning, Dave. Perhaps you should cut down on the coffee.

    Hal, let me in. I really have to go!

    I'm sorry, Dave, but I can't let you do that. Please go back to your desk.

    etc.
    • HAL, I have access to the computer room. And last time I checked, liquids and computers don't mix. I'm thinking about killing two birds with one stone!
      • Dave, please don't aim for my power supply. Any introduction there of your electrolytic micturations will surely cause me to malfunction.
  • This is probably the business revolution that we've all been waiting for - specialized machines that are built to do one task and handle it with frightening speed. Why waste valuable disk space and memory on a bloated operating system when all you need is a ridiculously fast scientific calculator?

    -Evan
    • Although specialised machines to do a single task - your "scientific calculator" - sound like a great idea, there is one major problem: people.

      A pen is a specialised machine to do a single task.
      A ruler is a specialised machine to do a single task.
      Even a calculator is a specialised machine to do a single task.

      My folks can operate those; anybody can - even the new trainee office assistant fresh out of school. Embedded systems, even "simple" embedded systems, are going to be more complex than these time-honoured tools.

      Sometimes you just have to take a step back and realise not everyone is as technically au fait as the Slashdot populace.

      - Phil
  • It will never happen! What do they think this place is? NASA?
  • by bluGill ( 862 ) on Thursday November 29, 2001 @09:04AM (#2630442)

    This is excellent for those highly trained people needed to keep those things running. Even if someone invented a valve that was 100% reliable, tax laws change often enough that many operators will be needed to keep this computer up to date.

    • Actually, computers have a downright evil role here: They made it possible for our tax code to become so complex that most companies need outside firms to compute payroll on their behalf. Without computers, the computations now needed to produce a paycheck would be impossible.

      Almost makes me want to be a luddite :-).

      D
  • But there is this drawback: a full-sized computor carries 4,000 to 5,000 valves and thousands of other small electronic components. Although there are no moving parts to go wrong, the law of averages seems to dictate that some of the myriad components will occasionally fail, and scientists who work regularly on computors rely on the machine being available for at best only 80 per cent of its theoretical working time.

    There were several big problems for all such devices before the invention of solid state components. One of these was reliability: tubes (valves) burning out, etc. The other was the hand-built nature of these devices. The end result was that there was an effective limit to the size that you could build these things before they were down for repair and maintenance most of the time. This was ultimately true even for transistor devices.

    Solid state devices, by combining all of these things onto a chip, solved the problem.

    • Transistors *are* solid-state devices, and there is no reason for discrete transistors to fail any more often than does a single transistor out of the many that make up an integrated circuit.
      • by Alien54 ( 180860 ) on Thursday November 29, 2001 @10:46AM (#2630934) Journal
        Transistors *are* solid-state devices, and there is no reason for discrete transistors to fail any more often than does a single transistor out of the many that make up an integrated circuit.

        This is all documented in the book "The Chip" by T. R. Reid, which I literally have on my desk as I write this. It is briefly summarized here [dotpoint.com]:

        In those days, electrical engineers were aware of the potential of digital electronics; however, they faced a big limitation known as the "Tyranny of Numbers." This was the metaphor that described the exponentially increasing number of components required to design improved circuits, against the physical limitations on the number of components that could be assembled together. Both Kilby at Texas Instruments and Noyce at Fairchild Semiconductor were working on a solution to this problem during 1958 and 1959.

        [First Integrated Circuit] The solution was found in the monolithic (meaning formed from a single crystal) integrated circuit. Instead of designing smaller components, they found the way to fabricate entire networks of discrete components in a single sequence by laying them into a single crystal (chip) of semiconductor material. Kilby used germanium and Noyce used silicon.

        In other words, it wasn't just solid state as in a single transistor, but solid state as in entire circuits - the integrated circuit - that was the solution.

        The problem was that transistors still had to be interconnected to form electronic circuits, and hand-soldering thousands of components to thousands of bits of wire was expensive and time-consuming. It was also unreliable; every soldered joint was a potential source of trouble. The challenge was to find cost-effective, reliable ways of producing these components and interconnecting them.

        The Tyranny of Numbers was quite real, and occupied minds for most of the 50s. The solution of this basic and fundamental problem made possible the computer age. They are probably as important as the binary logic that runs on them.

        You can also read more about this [ti.com] on the Texas Instruments website.

        • Right, if an ancient computer takes twenty thousand vacuum tubes, and a vacuum tube lasts two years on average, how long will the computer run between breakdowns? You do the math.
          • Right, if an ancient computer takes twenty thousand vacuum tubes, and a vacuum tube lasts two years on average, how long will the computer run between breakdowns? You do the math.

            Remember, especially if you look at the TI site, they are describing the engineering problem they had. You can argue whether this is sensible or a fraud or whatever, but these engineers were describing the problem they actually had.

            Your reluctance to accept it is not their problem.

            Here is a Google search link to help you out:

            http://www.google.com/search?q=Kilby+tyranny+numbers+engineering [google.com]

            Simply put, your objections are not all that relevant because they are contradicted by facts. It is nothing personal. It is just some sort of a blind spot that needs to be sorted out. The facts of the matter are as real as the millions of 1950's dollars that were spent searching for a solution.

            • I must be missing something here.

              2 years/tube / 20,000 tubes = less than an hour between repairs. 1 year/tube = less than half an hour between repairs on average. This is an example of the tyranny of numbers, especially w.r.t. the proliferation of ever more vacuum tubes in "new" models of computers. (A quick sanity check of that arithmetic is sketched below.)

              What were you thinking I was saying? Your reluctance to be on the same planet as me is not my problem.
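              A quick sanity check of that arithmetic in Python, assuming independent tube failures so the machine's mean time between failures is roughly the per-tube MTBF divided by the tube count (an approximation for illustration, not a figure from the article):

              # System MTBF for many independent components is approximately
              # the per-component MTBF divided by the component count.
              HOURS_PER_YEAR = 365 * 24

              tube_mtbf_hours = 2 * HOURS_PER_YEAR   # ~2 years per tube, as assumed above
              tube_count = 20_000

              system_mtbf_hours = tube_mtbf_hours / tube_count
              print(f"{system_mtbf_hours:.2f} hours between failures")  # ~0.88 h, under an hour

              # At 1 year per tube the figure halves, to roughly 26 minutes.
              print(f"{system_mtbf_hours / 2 * 60:.0f} minutes at a 1-year tube life")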
              • The implication I got from your comments was that the Tyranny of Numbers was nonsense - a sort of "nah, couldn't be possible" type of thing. What you said implied to me that the tyranny of numbers was not relevant because the MTBF was about 2 years, implying that it was not an issue because if you started out with all new tubes, it all should run for a year or so - which is of course wrong, as you have since demonstrated.

                Which is why I pointed you to the source materials, etc. I was getting frustrated.

                ;-)

                That said, of course the basics of this issue were not tubes so much, but the exponential increase in hand-soldered connections, increasing wire length with attendant signal lag and interference problems, and the exponential complexity of design in the smallest box possible. These issues continued to exist even with discrete standalone transistors.

                So tubes were not a core part of the issue at hand, even though they were still widely used. Looking over the previous posts, you seemed to miss things like this paragraph:

                The problem was that transistors still had to be interconnected to form electronic circuits, and hand-soldering thousands of components to thousands of bits of wire was expensive and time-consuming. It was also unreliable; every soldered joint was a potential source of trouble. The challenge was to find cost-effective, reliable ways of producing these components and interconnecting them.

                which got ungodly when dealing with thousands and thousands of components (vs. the many dozens in a radio or stereo).

                So basically, I said RTFM (i.e., the article).

        • Actually "monolithic" means "single, or one, stone or rock". In this particular case it's a single silicon or germanium crystal. Solid state means not a liquid state or a gaseous state or etc. Hand soldered connections are external to both discrete semiconductors and integrated circuits and if they fail it isn't the transistor's fault. I'm not saying that integrated circuits aren't a great idea, but problems with early discrete transistors (as opposed to the circuits of which they were a part) were due mostly to semiconductor fabrication and packaging (your average TO-5 can being much, much larger than the actual transistor inside) being an infant technology at the time and not to their not being part of an integrated circuit.

      There were several big problems for all such devices before the invention of solid state components. One of these was reliability: tubes (valves) burning out, etc.

      Running the tube heaters (filaments) at low power increased their reliability enormously, but this type of computing still required external checks for errors.
    • a full-sized computor carries 4,000 to 5,000 valves

      My how times change...

      I'd like to see any /.'er get their work done on a machine with only 5000 transistors!
  • For the sake of irony, I hope that the Electronic Abacus has calculators for the "beads".

    Having an electronic abacus sounds as about as useful as an "electronic sliderule". Anyone else see the irony there?
  • ...printing the final answer in and on whatever form was required.

    Forty-two.
  • by bodland ( 522967 ) on Thursday November 29, 2001 @09:12AM (#2630484) Homepage
    The intention of business to increase productivity and reduce costs by utilizing electronic devices was paved with good intentions in 1954. People were still agog at the value of computers for tackling boring tasks during the war (artillery trajectories). It only seemed natural to extend that to tasks in the business place that were always considered a royal pain in the ass... payroll...

    What no one figured was the effect of personal computers on business. People still believe they increase productivity and decrease costs. This is the biggest lie out there. The use of the PC in business has reached and passed the point of diminishing returns, and really many people could better serve companies by shoving the PC aside and getting out a good old pad of paper. We have so lost touch with reality. How many of you do nothing when you can't log in or access the network?

    Man was doing business for thousands of years before computers, and in reality much of business is still done without them. We (us folks with a PC in our face) have experience in business without computing... shame on us.
      What no one figured was the effect of personal computers on business. People still believe they increase productivity and decrease costs. This is the biggest lie out there. The use of the PC in business has reached and passed the point of diminishing returns, and really many people could better serve companies by shoving the PC aside and getting out a good old pad of paper.

      And thus is Vinge refuted. Seriously.
    • Excuse me?

      If anything we are LACKING sufficient computing to allow for efficient operation of our businesses.

      I recently went to refinance my car (at a bank that will remain nameless). This bank held my original car loan. I spent an hour filling out paperwork (all of which had been filled out with my original loan), having the loan officer call my insurance company to get my insurance information (even though they had all of my insurance info on file), etc. All of this totally redundant. I had to come back the NEXT day because the guy the loan officer calls to do credit checks and fax them to him was busy.

      All of this should have been accomplishable over the web. There is NO reason that it had to be that hard.

      Oh yes, and then they proceeded to automagically debit my old loan payment (several days after the old loan had been paid off in full by the new loan) because it takes about a week for the PAPER to work its way through channels.

      It took almost six weeks for the bank to restore order to my account (I will not recount the full ins and outs of that, but it was bad).

      All of this was unnecessary. If they had proper computer systems handling the back end of the bank, I should have been able to go to the web page for my account and arrange refinancing there in under 10 minutes.

      Note that every one of the bank employees I dealt with had a computer on their desk. What made this experience so inefficient (and frustrating) was not their lack of computers, but the lack of competent back end systems for them to access with those computers. That is where the efficiency comes from.
      • I agree. Banks must have the most expensive, yet most poorly designed, computer systems imaginable. Either that, or it's the misconceptions of technophobic management. Example: I deposit cash at my bank after 3 PM on a Friday. Well, I suppose the human element there precludes having it deposited that day, so I expect it not to be available via ATM until Saturday afternoon (their office is open). Nope, I don't see it until Monday EVENING! It's cash, dammit! You didn't even need to deal with another bank's check! Imagine if I had a loan with them and paid them two days late every time... let's see how long I get away with that.
        • Well, it would depend on their schedule for opening and matching up all the ATM envelopes, which might very well take a day or two (especially if one of the days is Saturday.)

          As far as checks go, they do indeed immediately deposit your check in their own internal accounts (with other central banks) and don't wait for it to clear. Of course, some checks bounce, and they have to pay the penalty to the central bank, but the interest they get from depositing everyone's checks (but not crediting your account) for those couple of days outweighs the loss by a factor of a thousand.
        • Banks are funny like that. I was working at one doing some consulting, and these little ladies would shuffle down to the IS department and sit in the conference room with the ATM and after-hours deposits and open each envelope and record each transaction on paper. They took that paper upstairs to process.

          Banks have to have some "official" person to check the teller's daily transactions, and as usual deposits made after 2 p.m. are posted on the NEXT business day. Because that "official" is gone golfing after 2 p.m. After all, they don't call them banker's hours for nothing....
    • People were also doing business for thousands of years before the slide-rule and the abacus. Or the internal-reservoir ink pen. Heck, the pencil, for that matter. People in the fertile crescent back in the days of Sumeria and Akkadia did business without the use of paper, and before that there were people doing business without even a form of written language. Does the fact that it was possible make it better? Do you want to physically perform search queries on the reams and reams of paper that would encompass Wal-Mart's database were it in physical format? Or perhaps you'd like to handle all the people who want instantaneous order-tracking? Oh, I know, you want to sit down in the basement and operate the switchboard for the company's 1200 internal telephones, right?

      People also sucked the marrow from bones and ate raw meat for thousands and thousands of years before fire was put to use in food preparation. There're still places where people do it. Does that mean we should all rush out and chase a buffalo off a cliff?

      The use of the PC in business has reached and passed the point of diminishing returns, and really many people could better serve companies by shoving the PC aside and getting out a good old pad of paper. We have so lost touch with reality. How many of you do nothing when you can't log in or access the network?

      You are forgetting the orders of magnitude change in scale of the amount of business being done. Shut down all PC's and sit around for three days a week and you will still get twice as much work done in two days as you did 50 years ago w/o computers.

      The computer is the favorite "busy toy" of someone who doesn't really want to work at work. That, combined with the efficiency gains, has led to a lot of people doing a lot of nothing aside from maintaining the appearance that they shouldn't be fired. The pointless meetings, aimless memos, unnecessary and ineffective initiatives, etc. Thus the gains of computers are deliberately concealed.

      The second part of the problem is computers being untrusted and used as a parallel system. There is no such thing as a paperless office. Paper records are kept parallel to computer records, sometimes requiring all of the old work, and all of the computer work. This is more often an apparent problem than a real one, as it moves a smaller amount of administrative work to an earlier time. While it's less efficient than it could be, it's better than not using computers.

      The third part, aside from being a problem in itself, is half a cause and half an effect of the second; they grew up intertwined. As home computer systems were adapted to office use, and the first generation of programmers raised with computers from a young age appeared, reliability was sacrificed in favor of attractiveness, apparent ease of learning, and flashy feature checklists. Software is replaced every few years, and old data is lost or damaged. Many users stopped considering computers infallible and started considering them unreliable, dramatically limiting their use.

      This last one is indisputably a real, serious problem in software design. Software is rushed to market, then thrown away just as the bugs are worked out. But this is more directly a problem of poor software purchasing; the market demands features, useful or not, and the software industry can only comply with that demand, or be unprofitable and unappreciated.
  • Old joke (Score:3, Funny)

    by ch-chuck ( 9622 ) on Thursday November 29, 2001 @09:24AM (#2630520) Homepage
    IBM has come out with a machine that can do the work of 20 office clerks. The only problem is it takes 50 technicians to operate it.
  • Looking forward (Score:2, Insightful)

    by rootmonkey ( 457887 )
    Kinda makes you wonder where we'll be in another 50 years. They couldn't imagine a small computer; what is it that we can't imagine now?
    • How about an OS that doesn't drive its admins absolutely nuts, users of said OS who realize that things like hitting the print button repeatedly will not clear the printer jam, a call to tech support that is moderately less painful than oral surgery without anaesthetic, a web page without banner ads, cheap broadband access, and an EULA that finally manages to slip past the legal department with the header of: "By opening this package, you transfer ownership of both your soul and first-born child to us, as well as acknowledge that all your base are belong to us."

      How about a MS press release that states that they have had a change of heart, and now will be concentrating on producing quality Open Source software, because they've finally run out of room to put the money?

      Any one of these would do quite nicely. Although personally, I'm still holding out for a little liberty, justice, and free beer for all...

    • What can't I imagine (Score:2, Interesting)

      by Nf1nk ( 443791 )
      A short list of things that won't work right. (Please prove me wrong.)
      Voice recognition software (that works)
      Good search engines
      High speed internet access at home (no, really)
      Flying cars (it's 2001, where the heck is my flying car?)
      Cheap household robots
      Wet-wired computer hardware
      Traveling by car faster than by bicycle (traffic issues)
      and many, many more wonderful items that I am too whacked out on caffeine to think of
    • Somebody throws you an open line like this and all you can do is ask for bug fixes, new versions of existing stuff, and BillG's head on a platter?

      What happened to your imagination?

      Aristoi Walter Jon Williams.
      The Diamond Age Neal Stephenson.
      Dreaming Metal Melissa Scott.

      And fer fuck sake! Neuromancer!!!! (William Gibson, of course)

      Can't we just wish a little?

      How 'bout a completely new programming language? Not just different syntax, but completely redesigned from the basic theories on communication on up.

      How 'bout the next generation OS built on this new language? Not Linux, not WinXP, not Java, not "some new flavor of" but the next step in OS evolution? Think cyberpunk, guys. (And, no, not the RPG, but the real stuff, Gibson, Sterling, Williams, Scott, Delany, etc). Only MS could make a 3D interface boring. A Hallway? Give me a break.

      How 'bout computer circuitry embedded into clothing? Hell, how 'bout computer circuitry embedded in your head?

      How 'bout Salma Hayek?

      How 'bout pulling your thinking cap out of the closet, dusting it off and running wild with it?

      Moekandu

      "The object is not to bring your enemy to his knees, but his senses." - Ghandi.

      "That man has a mind like a steel trap. It tends to slam shut at the slightest quiver, and things tend to come out mangled." - Me.

      Of course, if your enemy has no senses, a proper use of WhoopAss will do the trick!

  • by trb ( 8509 ) on Thursday November 29, 2001 @09:41AM (#2630592)
    Another blast from the past is As We May Think [theatlantic.com] by Vannevar Bush, in Atlantic Magazine, July 1945. They have a web page, Prophets of the Computer Age [theatlantic.com], with more interesting flashbacks.
  • I remember buying some goods at a local catalog sales store in the 1970s and seeing a collection of ancient looking IBM mechanical punched card equipment in their back office, still in active use. No computers were evident, just specialized electromechanical devices processing Hollerith cards.

    I've always wondered what tasks were performed by these predecessors to the modern computer. I assume that the catalog sales store was using them to keep track of inventory and/or sales.

    Sometimes you can see these machines in 1960s spy or science fiction movies. Look for scenes where "the answer" to a question is delivered on a Hollerith card.

    • Sometimes you can see these machines in 1960s spy or science fiction movies. Look for scenes where "the answer" to a question is delivered on a Hollerith card.

      Holy 1960's, Batman! Who needs spy or sci-fi movies when the BAT-COMPUTER(tm) did that?
  • by Anonymous Coward
    it's called Microsoft Excel

    e.g. MSExcel is the glue which holds together the banking sector's myriad specialist systems
  • by markmoss ( 301064 ) on Thursday November 29, 2001 @10:28AM (#2630845)
    Where this article talks about "scrapping the greater part of the established punch card calculating routine", it isn't about scrapping the punch card keyboarding machines, but about using the cards as input instead of as the databases themselves. From the 1880 census until the 1950's large databases consisted of boxes (and sometimes cabinets or rooms) full of punch cards. To process the data, there were a number of specialized machines:

    Keypunches: a keyboard that punched holes into cards. Good ones also typed the data along the top of the card so it was human-readable, and could copy part or all of a card. The ones I worked with could be programmed by typing control codes onto a card and wrapping it around a spool inside the machine -- this gave you tab stops and let you set it to automatically copy headers from each card to the next one, until you hit an escape key to let you change the headers...

    Sorters & mergers: Sorting and grouping was accomplished by machines that would physically shuffle the cards into order. The operation was counter-intuitive; to alphabetize a 20 letter name field, you'd start by sorting into 26 bins on the _last_ (rightmost) letter, stack them up and sort on the next letter (which left cards differing only in the last letter in order), and repeat for 20 times through. Searches were done by setting the sorter to set aside cards matching the criteria and running the whole set through. And after doing a search or other operation that split the deck into two categories, it was nice to have a machine to merge the two decks back together into order without requiring a full-scale sort.

    Tabulators: Would read the sorted, grouped, cards and add up the columns. Also could perform calculations on a card (like hoursworked * payrate = grosspay) and punch the answer into the card, or onto a new card. Tabulators generally did not type human-readable text on the cards, so...

    Printers: One kind would read cards and type text along the top. Some of these were still in use in the 1980's, because mainframes still could output to card punches, and those punches did not type text... The other kind read cards and printed the report on paper.

    When I started hanging out at the college computer center (1971), the databases were kept on removable hard disk packs, and punch cards were mainly for data input. However, even though they'd keep 3 copies of each database on different disks, the reliability was low enough that for really important stuff they'd also store the punch cards as a backup, or sometimes have the computer punch a backup into cards. The machine that printed on those cards was kept running, just in case. At least a half-dozen keypunches were in continuous use (and the card reader on the computer had to be overhauled once a week so it could continue reading all those cards). The tabulator was just gathering dust, but the sorter was used frequently -- batch database updates run faster if the input is in the same order as the disk file.
    • So in this punch-card world, a hacker tool is an icepick?

      "Oh this guy is good!, see here? where he introduced the virus? see how round and aligned those holes are? this guy is a first class hacker man!"

      Thank god they didn't write Swordfish back then...
      • Viruses weren't a big concern in those days. Maybe the trojanned computer could punch multiple copies of the program onto cards, but tricking the staffers into mailing them to other computer centers was difficult... 8-)
    • by epepke ( 462220 )

      The operation was counter-intuitive; to alphabetize a 20-letter name field, you'd start by sorting into 26 bins on the _last_ (rightmost) letter, stack them up and sort on the next letter (which left cards differing only in the last letter in order), and repeat 20 times through.

      This is the radix sort, which all hackers should learn while teething. When you apply certain rules that ensure that only columns that might be significant are compared, it is an extremely efficient sorting algorithm.

      Worst-case, the complexity is O(l n), where l is the length of the longest string and n is the number of strings. With completely random data, l is effectively log n, so overall it goes to O(n log n). The extra rules substantially reduce the effect of lack of randomness in strings, so it's likely that the algorithm will almost always run in O(n log n)

      Compare to a merge sort, which is O(n log n) worst-case, the best you can get, but that assumes that the comparison step is constant. With a string, worst-case comparison is O(l), resulting in overall performance of O(l n log n) or, with random data, O(n log^2 n). QuickSort is even worse, with a worst-case performance of O(n^2 log n), though still an average performance like the merge sort. (Too bad I can't use superscript on this board.)
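      To make the card-sorter procedure and the O(l n) bound concrete, here is a minimal least-significant-column-first radix sort for fixed-width names in Python - an illustrative sketch only, without the extra column-skipping rules alluded to above:

      # LSD radix sort over fixed-width uppercase strings, mirroring the card
      # sorter: bin on the rightmost column first, re-stack, move one column left.
      def radix_sort(names, width):
          names = [n.ljust(width) for n in names]        # pad short names with blanks
          alphabet = " ABCDEFGHIJKLMNOPQRSTUVWXYZ"       # blank sorts before letters
          for col in range(width - 1, -1, -1):           # last column first
              bins = {ch: [] for ch in alphabet}
              for name in names:                         # deal the cards into bins
                  bins[name[col]].append(name)
              names = [name for ch in alphabet for name in bins[ch]]  # re-stack in order
          return [n.rstrip() for n in names]

      print(radix_sort(["SMITH", "JONES", "SMYTH", "JONAS", "NG"], width=5))
      # ['JONAS', 'JONES', 'NG', 'SMITH', 'SMYTH']

      Each pass is O(n), and there are l passes, giving the O(l n) worst case discussed above.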

  • Sure, but I still remember, as a sysop back in the '70s, dropping stacks of punch cards (on a number of occasions) and having to get them back into sequence by eyeball.

    Have we come that far?

  • The idiosyncrasies of British currency...

    Note: this was written before decimalisation came along in 1971. Before then, instead of the current system of 100 pence to the pound, we had pounds, shillings and pence, commonly known as LSD (from the Latin librae, solidi, denarii).
    There were 12 pence to every shilling, and 20 shillings to the pound (a quick conversion sketch follows below).
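    A tiny Python sketch of the conversion, purely for illustration (the function names are made up):

    # Pre-decimal British currency: 12 pence per shilling, 20 shillings per pound,
    # so 240 old pence to the pound.
    PENCE_PER_SHILLING = 12
    SHILLINGS_PER_POUND = 20

    def lsd_to_pence(pounds, shillings, pence):
        """Total old pence for an amount given in pounds, shillings and pence."""
        return (pounds * SHILLINGS_PER_POUND + shillings) * PENCE_PER_SHILLING + pence

    def pence_to_lsd(total_pence):
        """Break a number of old pence back into (pounds, shillings, pence)."""
        shillings, pence = divmod(total_pence, PENCE_PER_SHILLING)
        pounds, shillings = divmod(shillings, SHILLINGS_PER_POUND)
        return pounds, shillings, pence

    print(lsd_to_pence(2, 3, 6))    # £2 3s 6d = 522 old pence
    print(pence_to_lsd(522))        # (2, 3, 6)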
    • Don't forget about the halfpenny and farthing (1/4 penny) coins as well.
      • Re:LSD (Score:2, Interesting)

        by tubs ( 143128 )
        Could you imagine if we had kept pounds, shillings and pence and tried to change it now?

        I wonder what the Sun's headline would be? "Decirubbish", followed up by "We don't need no fangled eurocratic decimalisation here!"

        Strange to think, one- and two-shilling coins were still legal tender about 10 or 12 years ago - as 10 and 5 pence pieces.
  • They have a point (Score:3, Insightful)

    by joshv ( 13017 ) on Thursday November 29, 2001 @10:44AM (#2630921)
    Overall I think that the application of computers has led to a remarkable increase in efficiency. But I think you have to keep your business processes simple to realize true efficiencies. Computers do not handle exceptions well, and they do not make judgements well. If your business processes are riddled with judgement calls and exceptions, you might want to think about replacing your back end system with good old-fashioned pen and paper (or a spreadsheet these days).

    I recently interviewed at a company that had 400 employees. They had a terribly complicated year-end bonus structure. They spent millions of dollars and many man-years automating the bonus calculation process. For 400 people. Think about that for a minute. You could hire temps to do the calculations for the next 50 years for what it cost to automate the process. To top it off, the rules change every year, forcing a recode of the calculation engine.

    But the root cause is needless complexity. Whether you do it with computers or with people, complexity adds cost, usually with very little benefit. I have seen executive bonus systems that jump through torturous calculations that end up in a net difference of $50 compared to a simple flat percentage scheme. People just don't think about why they are making rules, and what the cost of those rules will be to the business.

    But anyway, my point was: past a certain level of complexity you are better off doing it with people, instead of building fragile and intricate rule-based automated systems in an attempt to handle every eventuality.

    -josh
  • by JJ ( 29711 ) on Thursday November 29, 2001 @10:44AM (#2630924) Homepage Journal
    The article mentions that when the computer broke down the employees might get very upset. The fastest way I know to push employees into panic mode is to screw up payroll. Thus, the employees would be slaves to the machine much more than in any conventional bakery. Is this a wise direction for society to be heading?
  • by evilandi ( 2800 ) <andrew@aoakley.com> on Thursday November 29, 2001 @10:45AM (#2630927) Homepage

    For a wealth of information on the computer mentioned in the article, the LEO, see:

    www.leo-computers.org.uk [leo-computers.org.uk] [i.hate.square.brackets] [probably.already.slashdotted.to.hell]

    What you have to realise is at the time, my Dad and other people working on the LEOs genuinely believed that these were the world's first computers ever, not just the world's first business computers as they later became known.

    You see, at the time, all the World War Two computer developments were covered under the military Official Secrets Act.

    When these secrets broke to the general public in the 1970s, needless to say my Dad was somewhat disappointed to discover he was not a great computing pioneer after all!

    My Dad fondly recalls being able to boil a kettle and fry bacon & eggs on these monsters.

    • At the time most people including those in power, really had no clue about the likely impact of Information Technology. My Dad (a retired diplomat) has a copy of a memo he found from the mid 1950's in which the then foreign secretary was considering the subject of demolishing the Foreign and Commonwealth Office buildings in Whitehall to make way for larger "modern" offices better able to accommodate the numbers of typists, clerks and archives that had been increasing steadily over the past fifty years. The plan was postponed on the grounds of expense - they completely failed to foresee the IT revolution, yet were only about a decade away from it. The buildings are now, if anything, too large.
    • What you have to realise is at the time, my Dad and other people working on the LEOs genuinely believed that these were the world's first computers ever, not just the world's first business computers as they later became known.

      Your Dad mustn't have hung around the University of Manchester, then; they had a stored program computer running in 1948 [computer50.org], but the leo-computers.org.uk site [leo-computers.org.uk] says that, although the directors of Lyons "decided to take an active role in promoting the commercial development of computers" in October 1947, the LEO wasn't operational until 1951. Were the Manchester SSEM or Mark I military secrets?

  • Forward thinking (Score:3, Insightful)

    by JimPooley ( 150814 ) on Thursday November 29, 2001 @10:54AM (#2630992) Homepage
    Lyons must have had remarkably forward thinking management to go to all that trouble. Design and build their own 'computor' and basically invent business computing from the ground up.

    Some achievement for a bakery and chain of tea shops!
    • "... have had remarkably forward thinking management to go to all ..."

      And then remarkably daft thinking for selling it off. Imagine - we could all be running LyonsOS Sugar Coated Edition, or NiceStep.
  • ObJoke (Score:1, Offtopic)

    by ucblockhead ( 63650 )
    Although there are no moving parts to go wrong, the law of averages seems to dictate that some of the myriad components will occasionally fail, and scientists who work regularly on computors rely on the machine being available for at best only 80 per cent of its theoretical working time.

    If only my WinMe box had that kind of uptime!

  • Desk Set (Score:2, Interesting)

    by flufffy ( 192294 )
    If you like this sort of stuff, you should go and rent "Desk Set", with Spencer Tracy and Katharine Hepburn, 1957. It's normally classified as a "zany romantic comedy"; Hepburn is head librarian for a reference library at a large TV station, and Tracy is the efficiency expert brought in who threatens to replace Hepburn and her librarians with a "new computer" (read: large, room-sized box with flashing lights and whirling wheels ...).
  • There's no way these newfangled 'computors' will ever get my business! Just try to imagine an economy fifty years from now where all business is handled by these here 'computors'. Why, it would be possible to run an entire business without stock or inventory; all you'd need is one of these 'computors' to calculate as if you had them. If people found out about the lack of real, material products, the economy would collapse!
  • Let me state that all of you are absolutely wrong.

    Doing a quick search on Microsoft's web site for "giant computor" yielded no results.
    Hence, in light of the axiom "Nothing exists 'till Microsoft invents it," I'm forced to draw the conclusion that there is no such thing as a giant computor.
    Damn y'all geeks - for a second you almost fooled me into believing all those blinking computer lights I've seen in the movies are for real.
  • I am definitely tired of supporting a whole bunch of apps that the accounting department uses. ADP eTime for swipe in/out, ADP PC/Payroll for actually running the payroll, Salomon for general financials, and Crystal Reports for running reports off of the MS SQL backend of Salomon. No fun at all. They NEVER play nice together.
