Generations

Generations no longer last a generation. Whether software applications or people, the length of a generation is decreasing, making communication across different platforms and languages difficult. "Sometimes I feel like a legacy system," goes the old blues song ...

Back in the old days, it was exciting when new software came out. Every day, we hurried to Computerland, hoping it was there. I remember a new version of WordStar with a million control-everything commands. I remember new interactive fiction games like Hitchhiker's Guide to the Galaxy from Infocom.

I don't remember the first time I skipped an upgrade on a software application, but now I skip them all the time. I seldom need the less-than-essential new features that require close perusal of an eight hundred page manual to master.

Same with life. Living life at different speeds, we inhabit different temporal niches. Generations no longer last a generation.

I wrote an article - "In Search of the Grail" - in 1993, describing the impact that playing that Infocom game with my oldest son on an Apple II had on my understanding of what would happen to the world as the world played games with distributed networks.

I believed that interacting with the different world of symbol-manipulation in a context of distributed computing would change how we thought in fundamental ways. In retrospect, my intuition was correct. But six years later, it is also dated, at least three (digital) generations removed from the present.

A generation now in its teens or twenties has been so thoroughly socialized by interaction with the digital world that it doesn't see the lenses through which it sees. What was revolutionary a few years ago is ho-hum, the stuff of wild-eyed speculation now the platform on which that generation stands.

Last week I delivered a keynote speech for a web-based training conference. I said that the symbiotic relationship between networked computers and networked humans had spawned a large number of people who think they're working for the human side but in fact are working for the electronic network. "You're working for HAL," I said, "teaching people how to speak HAL's language."

A woman approached me after the speech.

"Many people in the audience," she said, "don't know what you mean by HAL."

Or what I mean by an Apple II. Or interactive fiction. Or Infocom.

No narrative chronicles the social history of popular computing. The way it came to us like an unexpected birthday present. And nobody seems to want one.

My wife came upon an "ice box" yesterday as we toured a Victorian house. She told a guard that she remembered a real "ice man," how she waited as a child until he had hacked ice into blocks for delivery, then picked up the shattered splinters to eat as a treat.

The guard listened politely and looked away, checking his watch for closing time.

They said it would happen, but they didn't say it would happen again and again, faster and faster. But it does. The points of reference that define the shared experience of a generation are changing more rapidly than ever.

"The Big Picture changes," a friend once said, "about every ten years." I discovered that, indeed, every decade or so, I transitioned into a new developmental stage which re-contextualized everything that had come before.

Now, I am finding that I must reinvent myself, that is, revise the points of reference of how I think, every eighteen months to two years. The leisurely pace of an evolutionary life cycle that changes by the decade is a vanished luxury.

The fact of history itself as a shared point of reference has morphed into an indifference to the historical perspective entirely. History as a discipline, threaded through textual narratives and how text defines time and causality, has morphed into a world of hyper-textual images, in which our personal interests determine the path we travel through images of meaningful events. The patterns of our explorations either connect at intersections or they don't. A shared vision is less important than the machinery which enables us to search in the first place.

I can hear a dissenting voice, pointing out that people ALWAYS did that. We ALWAYS chose which books to read and created a unique pattern from our study. But - and this is a huge "but" - readers in a universe of printed text did not know that's what they did because they shared a vocabulary with which to discuss their experience. That vocabulary imposed what felt like a shared perspective. Only in retrospect - only after images and words had been reorganized in digital space - did we see our former experience as computers have taught us to see it.

The singular prism that bent all light in a print text world has been shattered by a hyper-text world that perceives that prism as a prison.

The excitement of my vision in 1993 is gone. Merchants, circumspect and wary, prowl the digital world. They have taken the gold from the pioneer miners who had to use it to buy food, shovels, and hovels. Merchants are always the pragmatic parents of the next generation, defining the real possibilities of their offspring. They even sell their children uniforms sewn with symbols of rebelliousness, the symbols each generation needs to pretend to break new ground.

So what is the value of experience? A broader perspective? Patience, as Yoda suggested ... what? Who, you ask, is Yoda?

Yoda is a puppet invented many years ago by a film-maker to represent purveyors of ancient wisdom. Yoda articulates wisdom in sound bites that we can snatch on the fly.

I remember diving on the reef, chasing the quick fish and never catching any. One day I swam out over the reef and sank in thirty feet of water. Then I just sat there, waiting, and all sorts of fish, wondrous and strange, came to me.

The digital world can be exploited or pursued, dreams of stock options feeding our greed. But it can also simply be observed. We can just sit there, under the ascending bubbles of our deep breathing, listening to the subterranean clicking. Not even learning the wisdom of not doing. No. Not even that.

Comments Filter:
  • by Anonymous Coward
    From Kevin Marks.
    This showed up in a few places on the net last week:

    Annually, Beloit College in Wisconsin puts together an information sheet to remind faculty of the mindset of its incoming freshman class. Here are some points of reference on the next generation:

    1. Students starting college this fall were born in 1980.
    2. They have no meaningful recollection of the Reagan era.
    3. They were prepubescent when the Persian Gulf War was waged.
    4. Black Monday 1987 is as significant to them as the Great Depression.
    5. There has only been one Pope.
    6. They were 11 when the Soviet Union broke apart, and do not remember the Cold War.
    7. They have never feared a nuclear war.
    8. They're too young to remember the space shuttle Challenger blowing up.
    9. Their lifetime has always included AIDS.
    10. They never had a polio shot, and likely do not know what it is.
    11. The expression "you sound like a broken record" means nothing to them.
    12. The compact disc was introduced when they were 1 year old.
    13. They likely have never played Pac-Man, and have never heard of Pong.
    14. Star Wars looks very fake to them, and the special effects are pathetic.
    15. Blue M&M's are not new.
    16. They have always had an answering machine.
    17. Most have never seen a black & white TV.
    18. They have always had cable.
    19. They cannot fathom not having a remote control.
    20. Roller-skating has always meant inline for them.
    21. The Tonight Show has always been with Jay Leno.
    22. Popcorn has always been cooked in the microwave.
    23. The Vietnam War is as ancient history to them as World War I and World War II, or even the Civil War.
    24. Kansas, Boston, Chicago, America, and Alabama are places, not musical groups.

    I did one for my son's generation (born 1995):

    1. Students starting college this fall were born in 1995.
    2. They have no meaningful recollection of Queen Elizabeth II.
    3. They were toddlers when the Balkan war was waged.
    4. The net stock crash of 1999 is as insignificant to them as the 1973 Oil Shock.
    5. There has only been one Dalai Lama.
    6. They were 9 when the EU broke apart, and do not remember Europe at peace.
    7. They have never feared AIDS.
    8. They're too young to remember Concorde.
    9. Their lifetime has always included computer viruses.
    10. Wearing spectacles as anything other than dressing up makes as much sense as wearing bowler hats to work every day.
    11. The expression "CD quality audio" means nothing to them.
    12. The iMac was introduced when they were 3 years old.
    13. They likely have never played Quake, and have never heard of Tomb Raider.
    14. Titanic looks very fake to them, and the special effects are pathetic.
    15. Blue skin tinting is not new.
    16. IP addresses and telephone numbers seem equally arcane to them.
    17. They don't understand the distinctions their parents make between computers, televisions, radios and newspapers.
    18. Most have never seen a postage stamp.
    19. They cannot fathom everyone watching the News at 10 - it is as odd as the concept of newsreels.
    20. Commuting to work daily by car for 2 hours is as alien as accounting using paper and ink.
    21. Any notion of scarcity of memory, processor speed, storage or bandwidth is on a par with 12 pennies to a shilling.
    22. Chips have always been cooked in the microwave.
    23. The Gulf War is as ancient history to them as World War II.
    24. Cranberries, Fish, Cookies, and Meatloaf are foodstuffs, not musical groups.
    25. Owning physical media to play music from has to be explained to them very carefully, but they still don't get it.
  • Computing generations move faster and faster, but that is actually a RESULT of an American generational trend -- a trend away from common experiences and towards diversification.

    In the 1950s, the trend was towards commonality of experience. The nation was forced to come together, in a way, to get through the crises of the depression and second world war. As a result, differences between people were extremely de-emphasized. Consider that during WW2, Japanese Americans were systematically rounded up and put in camps. This would not be tolerated today - thank goodness.

    That same attitude towards emphasizing common experiences, bringing the country together, etc. helps explain why there were only three broadcast networks. We all watched the same TV shows, ate the same foods, wore uniforms to work (or traditional black suits with black hats and black umbrellas). Movies such as "It's a Wonderful Life" emphasized the value in community thinking.

    During the 60s, a generation rebelled against this sort of conformist attitude, and slowly the subculture became the culture, permitting diversity. Today, having only three broadcast channels would be unthinkable. For many of us it would be an unimaginable horror.

    It is this celebration of the individual and the diverse that has permitted computing to move boldly ahead. It is a sort of "glorious anarchy" that permits, for example, the publishing of individual websites without the okay of some sort of certifying board.

    Going back to "Pleasantville" would be a horror, but there is an odd danger in our reductio ad individual. There will surely be a time when society has to come together again -- to save itself and hopefully improve itself. Strauss and Howe, in The Fourth Turning, suggest that every four generations there is an inevitable crisis period that requires common ground and brings people together to weather the storm. This crisis could be anything: international war, disease, monetary crisis, social/governmental system breakdown. In such times, once again diversity might be seen as a natural enemy.

    These generational effects are quite evident if you look back in history. The same feelings you mention in your column were seen in the 1920s. Here's what Strauss and Howe say about that period: "Up until the fall of 1929, America still inhabited a decade then known as 'an era of wonderful nonsense.' Through the 1920s, America felt increasingly wild, its daily life propelled ever faster by a stream of thrilling and innovative technologies."

    Wow. So: the "speeding up" of technologies and lack of common experience ARE our common experience. Enjoy them now, and hope that the economic improvements and innovations come fast enough to provide us all with a good place to be once the crisis hits.

  • by Skyshadow ( 508 ) on Wednesday April 14, 1999 @09:53AM (#1934464) Homepage
    This addresses an underlying fear a lot of us entering the tech industry just now have.

    I sat at my desk during my internship at Cray last summer and watched a few of my coworkers become (or realize they had already become) seriously outmoded. I watched, just as intently, as a round of layoffs came around, mandated by SGI. I saw people who, once their contributions were laid out, had done some awfully exciting things 10-20 years back, when Seymour and Friends were making the fastest computers anyone had ever dreamed of. They retired before they could get axed -- I ate a lot of retirement party cake.

    It made me seriously wonder about my future. It's easy to start out at the cutting edge and at high salaries, but it can apparently be near impossible to stay there. The pitfalls are everywhere -- starting a family and not devoting 90% of your time to computers anymore seems to be a rather significant one. Others had just gotten stuck in a rut, missing one too many upgrades.

    Those few whom I looked on as having led a worthwhile life (and this scares the shit out of me) were the pointy-hairs. They all had families, good vacations under their belts (backpacking in Europe, not visiting Comdex), secure retirements. Many of them had started out as tech people and moved "up" the ladder.

    So, is it possible to be a technical person all your life and still live? In the case of the first generation of computer techs, I'd have to say almost certainly not. That frightens me.

    ----

  • "Those who forget the past are condemned to repeat it." -- Santayana (I think)


    ...phil
  • Posted by The Mongolian Barbecue:

    We should make a buzzword bingo site somewhere for these guys. The article basically said the same thing 3 or 4 times, in each instance constructed from a different linear combination of buzzwords. This proves at the same time that the set of all buzzwords is not independent, and that Thieme's writing skills are not that great.
  • I don't agree. Although you may be right about Computer Science degrees, which seem way too specific or just plain "far out"... but as I'm just a jaded engineer, that's probably just prejudice ;)

    The thing with a good degree is not the specifics but the basic techniques and methods of working -- the things that don't change. What it takes to be a good engineer now, 50 years ago, or 50 years from now will still be the same: the names will have changed and the boxes will do different things, but basically it will be the same job.

    My BEng(Hons) degree was in Mechanical Engineering. When I graduated I couldn't find a sufficiently interesting job as a mech, so I decided to become a softy. Apart from the narrow-minded nature of some "trained" software people, it's proved to be pretty similar...
  • I could care less about buzzwords; it's the message that's important. I liked this much better than *any* of Katz's articles, buzzwords aside.

    This is someone with a sense of history, a perspective that I don't feel Katz has in the same way I do. I remember playing the HitchHiker's Infocom game on a Commodore 64, and learning BASIC on the Apple II's, and all the different little variations that no one needs anymore.

    How do you clear the screen?
    Is it CLS, or HOME, or ? CHR$(147), or clrscr();, or clear, or xlock -nolock -mode blank, or do you choose the blank screen in the screensavers and test it?

    I have a personal fondness for many of the old ways of doing things, and as long as something still supports it, I will try them along with the new ways. People still write in FORTH, because sometimes they need to... I've written scripts in both BASIC and Perl, and benchmarked them.

    (last night, I did a simple test of everything from shell scripts to compiled languages. C beat C++, Perl beat out BASIC, bash was faster than tcsh, ash lost because expr wasn't compiled in... Some benchmarks are useless, but boy are they fun.)

    We may learn the "new ways" of doing things, but that doesn't stop us from using the old ways. Especially when they work better. :)
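    The throwaway benchmark described above can be sketched in a few lines of shell. This is a minimal illustration of the method, not the poster's actual test: the loop body and iteration count are arbitrary stand-ins, and absolute timings are machine-dependent -- only the relative numbers are of any interest.

```shell
#!/bin/sh
# Time the same busy loop under two different interpreters.
# The loop counts to 10000 and echoes the final value so you can
# see that both shells did the same work before comparing times.
loop='i=0; while [ "$i" -lt 10000 ]; do i=$((i+1)); done; echo "$i"'

time bash -c "$loop"   # each run prints 10000, plus its timing
time sh   -c "$loop"
```

    The same shape works for comparing any pair of interpreters (Perl vs. a BASIC, bash vs. tcsh): keep the workload identical and let `time` report the difference.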
  • Testify!
  • One of my points was, of course, that it didn't have any "style" or "flair" in the first place. (Oops, sorry, I corrected your spelling. Heh.)
    The reason that I didn't take up a counter position was that I assumed that the counter was obvious to everyone who read the essay. Apparently not. Rather than the "help, slow down, I'm having a crisis!" opinions stated by the author, I would suggest that life has ALWAYS seemed this way. In fact, I already feel this way about "youth today" and I'm only 20 years old. I will readily admit that we have some new ways of acquiring and processing information that we've developed in recent years. The same has been true of every generation, however. Now we have PCs everywhere, before that we had new musical styles, before that we had color TVs, before that we had black and white TVs, before that we had radio, etc, off into the mists of time and the first legbone being beaten against the first drum.
    The majority of each generation fails to understand its children. This isn't new.
    I definitely feel out of touch with people who are younger than me. However, I attribute this mostly to my own superiority, rather than anything new occurring in the world. My intellect and income are both in the top nothingth of a percentile. That statement is a fact, and nothing other than a head wound and a demotion will change that.
    I feel that this essay is just restating the endless lament of everyone who has woken up to their 40th birthday: "I just don't understand THESE DAMN KIDS ANYMORE!!"
    Admittedly, most people my age still recognize the characters of HAL and Yoda, but how many of us know who Abbott and Costello were, much less are able to say what they are remembered for? People forget the culture of their parents. Get used to it.

    I think it's great that you are dedicated to improving your education, and I applaud you for it. You should know, however, that just by thinking the phrase "Gee, I wonder how that works." you set yourself far beyond the norm. The average American goes to work, flips burgers, and admires the boss's new Trans Am. Make your money, retire early, and spend the last 50 years of your life learning.
  • Wherever this person went to high-school/college, please do not send your children there. Let me quote something from the essay:

    "History as a discipline, threaded through textual narratives and how text defines time and causality, has morphed into a world of hyper-textual images, in which our personal interests determine the path we travel through images of meaningful events."

    Let's deconstruct it. First we remove the useless comma-delimited buzzwords:

    "History as a discipline has morphed into a world of hyper-textual images, in which our personal interests determine the path we travel through images of meaningful events."

    Still doesn't really mean much, now does it? We'll try again:

    "The study of history has morphed into a world in which our personal interests determine the path we travel through images of meaningful events."

    Now it makes some kind of sense, just barely. Basically he's saying that since we don't read about history in books anymore (his premise, not mine), we only learn about what we want to learn about? Gee.. That's a pretty bold statement.. It's not every day that we hear some inane truism shrouded in "new media" buzzwords.. Or is it?
    Maybe Katz should sue for damages.. Obviously this guy is out to steal bread from the mouth of our resident Master of Buzzwords.
  • This sounds similar to an article I remember reading in IEEE magazine around 1993 or 1994. The author postulated four ages of mankind. The first age was the hunter-gatherer age and lasted about 10,000 years. The second was the agrarian age and lasted about 1,000 years. The third was the industrial age and lasted about 100 years. The fourth is the information age and will only last 10 years. After this, society begins living in "real time," when the rate of change has accelerated so rapidly that no one can possibly keep up. Intellectual property becomes irrelevant because by the time another company can copy and manufacture what you are selling, you don't care, as you are no longer using it. I believe we are close to this point, as I remember reading in Fortune magazine that 90% of Intel's 1998 revenue came from products that were not even in the marketplace in 1997. The author of the article (I believe it is the same person) now has a web site to sell a book [harpercollins.com] he wrote about this, The Friction Free Economy [friction-f...conomy.com] by Ted G. Lewis. Some of this stuff is hinted at in the preface, which is available online [friction-f...conomy.com].
  • > How soon will it be until Linux is considered a legacy system?

    Right now? After all, it carries around lots of UNIX legacy, like those dreaded tty drivers, X, strange configurations etc.

    But at least it works.
  • You've really hit the nail on the head. What's so great about history? History is littered with instances of one stupid little group persecuting another one for no better reason than their existing as distinct groups. I've studied history and seen enough prejudice, subjection, and genocide to make me sick for life.

    Strangely, all these petty differences don't seem to matter to people on the 'Net. Is it because we are more educated, younger, and more idealistic than society at large--in other words, because of simple demographics? --or is it because of the very nature of the medium, which makes you smart and critical by letting you select what you want to read, and allows you to communicate with individuals instead of groups, removed from a cultural context?

    I'm betting it's both. At any rate, I like the present better than any other time in history.

    PS: Read Kurt Vonnegut's latest novel, Timequake, for a more detailed and eloquent deconstruction of the idea of learning from history than I can hope to give. It's a fast read--only about an hour long. It's not the best Vonnegut, but it's worth the time.
    Beer recipe: free! #Source
    Cold pints: $2 #Product

  • Of course, you've taken all the style and flare out of his comment by breaking it down. You knew what it meant, so did other people, deal with it. And another matter (yes! I'm starting a sentence with "and"): If you're going to belittle a statement you might consider taking up a reasonable position counter to it rather than just being rude.

    Buzzwords or not, the statement has merit. I'm 16 years old and living in the US. I've grown up in a public school system that could care less about me or anyone else. The result of this is that until I take initiative in my own education I (a) don't learn anything of academic value and (b) focus instead on what I find interesting. I've lived this for years, and it takes work to break out of it.

    I'm still a student, and will be (with luck) for another 10 years. I now live and work in an environment that requires that I learn something extra about the forces that created the world I live in today. There is great value in this; history is important. But what of people who aren't "forced" into this? Unless one has a great sense of curiosity (not a totally uncommon thing) and then takes an interest in how things got to be the way they are (not that common, in my experience), how will people learn?

    Not that this means the loss of knowledge and education in our society, but it is something to think about if we are working towards a high level of universal education.


    --Michael Esveldt
    "Here I stand with all my lore -
    poor fool, no wiser than before"
  • Ok. But I don't see much of a problem there, except that communication on a higher level is harder to pursue, as it's more likely that the one you're talking to doesn't understand what you mean (I think you went into that in your first editorial). But that is more a matter of people wanting to learn/understand than of a slight increase in what actually has to be learned.

    I'm of the opinion that being dumb must probably feel pretty good. You don't have that much fun, but you can't worry about anything. And there have always been dumb people. And that's not because they wouldn't be able to learn or wouldn't have access to information, but because it seems to be easier. You can definitely survive without knowing Yoda.

    The only problem is that you start to feel pretty lonesome if you're in a room full of people with no one "knowing Yoda". And it gets more frustrating if you try to get people to teach themselves so you might someday have the possibility of a decent conversation with them. That's not "it's lonely at the top" but more "it's sad that so few actually want to get higher" (no puns intended).

    All in all, the rules don't seem to change that fast, though. It's still not cool to kill people, and sex is still the first thing I think of when seeing an appealing woman.
  • No, no; this is much more along the lines of Vernor Vinge's Singularity: that as humanity gets more and more connected, and the things they use to interact get smarter and faster, generations will begin to pass in minutes. A good example of his thoughts on this is the reminiscences of the people who lived closer to the Singularity in Marooned in Realtime - a *great* book, if you can find it.
  • Adapting to new environments is the only way to stay anywhere close to the current digital generation. But being a little familiar with some of the older technology helps you successfully use the new technology -- assembly programming, for example, can help you port kernels to new architectures.
    The best use you can get out of studying history is not making the same mistakes that others did. How many people hate those BIOS limitations on PCs? And you can learn from what was done right, because some of the new stuff still can't come close to the grace and elegance of the past.
  • [With all due respect to the generation that coined that phrase -- the one that begat we X.]

    In the beginning there was Babbage, and many the years his ideas and others lay dormant. But then lo, there was ENIAC, and it was good! But the goddess Insectidae saw this and was angered by the light of this ENIAC, and all its children, produced without air. And they were attracted to the light, and there was a great inward Popping sound, and the minions of ENIAC and her successors ran diligently around her hulken skin replacing her airless lights incessantly.


    Then there was BellLabs, and thus was born the 3-Legged Spider that could kill the minions of the goddess Insectidae! Slowly but surely the airless lights disappeared, and the Children of ENIAC grew smaller and sleeker. And they consumed the 3-Legged Spiders, and it was good. But still, the Children of ENIAC were large and slothy, and still required many, many 3-Legged Spiders, which they then called Transistor, by which to live.


    Then the Transistors started to form many-legged collectives. The Transistors were packed tighter and tighter, with only a few legs exposed, Integrated into colonies which they called Circuits. And these black Integrated Circuits further reduced the Children of ENIAC, until some underdeveloped offspring formed small, hand-sized entities, which could do the ubiquitous functions of Add, Subtract, Multiply and Divide with inhuman speed. And these small Children begat others and begat others and begat others and so on until they called themselves such as Palm, but that was yet many years to come, and we have not yet reached the Generation of X.


    Then it was 1969, and Man walked on the Moon and HAL 9000, a descendant of ENIAC, was foreseen in a Vision by the late Kubrick, and a Great Bird reminded us to live long and prosper and the world rejoiced! [The King William V version of this document has the additional line: "And thus they knew the Cybernetic Men would roam the cities and the Pepperpots cried an extinguishing tone, first with a mission to EX-TER-MIN-ATE, and then more horrific still, with mass purchases of Piston Engines all across the land."] And thus was begat the Generation of X. And those of X learned of what had come, and saw their parents and their Holy Cards and their ancient tongue of Fortran, and were thankful it was not them. For X had avoided the lurching progress of the last quarter century and were grateful they had not suffered it. For they were now on the vanguard, with a perspective like no other! Young enough to have escaped the slow years of ancient progress, yet old enough to see the coming changes in perspective.


    And so it was that X had its War of Stars, its Striking of Empires and its Jedihical Return. And it understood when the Reagan spoke of the War of Stars and again when the Clinton did, and it knew the nature of the Phantom Menace like no one since. And it knew Apple; it knew XT, TRS-80 and C64, all the Children of ENIAC, now small enough to live in our human homes. And we invited them, slowly at first, but in ever increasing numbers, watching, as they became as ubiquitous as the ancient Telephone and Microwave. They knew Atari, they knew Intellivision [nod to a certain moderator there...] and they knew the Connecticut Leather Company -- and they saw it all disappear.


    "I knew that company." Spoke one of X of those days of yore. "I remembered passing it many times from my home in West Hartford. Coleco was what we called it. So lively a place, we watched its cabbage fields rot as we drove by in those bygone years. Then it was just a husk. Gone forever, and all we had left was our Commodore and our GWBasic for DOS. Little did we know what a virus the latter would spread..."


    So it was that the phoenix rose again, festering in the Park of Xerox for many years. Thus came Macintosh and Nintendo and Sega, and another generation we watched go by. And it was good. But the virus did not lay dormant, for the goddess Insectidae grew restless at her near defeat and helped the virus grow. It gathered strength and saw the success of Macintosh and said, "I must have that!" And so it plotted and conspired to develop the Window of Oblivion, as the Children of ENIAC continued to contract.


    "I had seen pictures of old hand-held Children of ENIAC, what we used to call Calculators. What a thrill it was to own my own! Not only did it Add, Subtract, Multiply and Divide, it took logarithms, cosines, and plotted graphs and was even programmable using Reverse Polish Notation. I also had a Wizard to help me remember things, the Original. It even had expansion cards, which could do anything from add more memory to translate Happy Birthday into Chinese. Little did I know what I was holding was the ancestor of a being many generations removed." So it was that the seeds were being sown.


    But while the Window of Oblivion was closing in on germination, a secret was wrought, known only to kings and their minions. And they called it ARPA. And its discoverers, like the breeders at Xerox, knew not what they had created, and had not the tools to allow it to grow. So it lay dormant, even as the Window of Oblivion finally took hold!


    "I remember the Third Generation of the Window. It was all shiny and new and fit so nicely on my Third Generation XT, which we called 386. Finally I had escaped the confines of the DOS. Little did I know the menace lurking within." The tables had turned and the pace accelerated.


    3.0, 3.1, 3.11, 32, 95, OSR1, OSR2, 98: faster and faster they came, wave upon wave, killing our OS/2 and our Macintoshes like a worm to the core. And so too those ancient Integrated Circuits got denser and faster, and we had 386, 486, Pentium, MMX, PentiumPro, Pentium II, Celeron, Xeon, Pentium III and K6-3DNow swarming, each generation trying to extinguish all others. The legions were afoot; the stage was set, and there seemed to be no one left to fight.


    "I remember installing Linux pre-1.0 on my 486. I loved it! It was great to finally escape from the gaze of the Window of Oblivion. Finally, there was a third way. We all thanked the generations that came before, the AUX, SunOS, BSD, but now there was a generation for us." And X knew, slowly but surely, there was a light in this generation.


    Then came the Genf-people, who formed an enclave called CERN. And they plotted and planned and knew that information needed to be transmitted, indexed and cross-indexed again, like the net of a spider. And they saw ARPA, which developed far away and was now called Internet, and they knew it was good. And thus they developed a new generation across the Whole Wide World, and they called it Web.


    "I was living not far from Genf, as the locals called it -- Genève or Geneva as I have sometimes heard it called. Little did I know that in a Canton not far from mine, a new Generation was in the making! I returned to the Royal Mountain of La Belle Province where I was studying, and knew that I needed to have my own slice of this Web. It took me a year to get access, but in 1993, I finally premiered The Original George Harrison and Tomorrow People home pages. Now, if there was only a better browser than Mosaic...." The stage was set.


    But the virus returned, this time with a Spry Air about it! And it saw Mosaic and Mozilla and was again jealous. So it created an Explorer and integrated it into its 95. And the world at once cheered and feared. And finally the ARPA, now the Internet, was free to prosper. No longer did the people want their machine-telephone connections to go to a simple BBS. And it was that everyone wanted access to this Internet. And the generations again rejoiced. [Until pornography took over and made everything seem smutty, but that's another story...] And soon the secret of kings was as ubiquitous as the Children of ENIAC were, and it was good.


    Yet as another generation was born, but a few years later, a new war was beginning, a war between the Palmists and the Sea E of Oblivion. "Look, I played Phantasy Star, PS II, PS III, Gaiden and all the rest, and I can tell you, they are not Palmists, Parma-people or the lost race of Palm. They're Palmans, from planet Palma, like Alis, got it!? Jeez, kids these days with their Play Stations and their PDAs and their iMacs! What's the big deal; it's been done! And they're called Computers, ya yoink!" And so X became more outspoken and scoffed at being left behind as Generation after Generation built upon her [After all, this is Generation X we're talking about, not Generation Y or XY.] work. And X joined the ranks of the old fogey generations before, and could but watch as the world changed around her. But one thing is for sure, X knew her Windows were right!
  • This is one reason to get a degree (to connect this with an earlier post). While it certainly is true that you can land a great job in the industry without it, the base knowledge gained at a CompSci department or similar will last a lot longer than proficiency in whatever the latest buzzword technology happens to be, and it also makes it easier to live through a technology transition. While this is no panacea for obsolescence, it will make you employable longer than you would be without it.

    Then, of course, there are those of us skipping that race altogether by entering academia instead :)
  • Wow, fascinating meme. Really.

    In case it wasn't clear, I'm being sarcastic. This is the time-honored practice of old coots the world over - to bemoan the fact that the icons of their lives are forgotten by most everyone they meet.

    Yes we age, our favorite shows go off the air, our favorite appliances rust, our bellies/boobs sag, etc.

    So what of it? Live for today. Nostalgic sentimentality exists for those who have otherwise no reason to wake up.
  • "Pull the footstool a little closer child,
    and settle the shawl around my back!"

    Reasons why I didn't like this article:

    1) Longwinded
    2) In love with itself and its spongy prose
    3) It's not a new subject
    4) Its point is invalid - I know plenty of old IT types who've kept abreast of technological development over the past 30-odd years.

    One more article like this and it's into the filter with you!

    K.
    -

    --
    To the extent that I wear skirts and cheap nylon slips, I've gone native.
  • Ok, I am a tech and I am 31. I remember first using a computer when I was 13, when my dad got us a Sinclair ZX81 (a Z80-based machine with 4K ROM and 1K RAM), so like I have been around for a while. :)

    What I and the others who are 'old' and still do tech have found is that the most important skill to have is the ability to learn. The ability to find things out, to know where to look. That is it.

    Another thing is you build experience in getting into the heads of the programmers. If you are trying to integrate systems and so forth, there are certain ways that things get done, so you can again make good guesses about where to look to find solutions.

    So I don't spend ages learning the ins and outs of stuff I don't need; I can always look it up. If I need to do something, I can find out or suss it out.

    Also, experience is worth more to me when looking to employ someone than knowledge of a certain tech, as the latter can generally be learned in short order, and a good general tech head is better than a dumb expert.

    Ok I'll shut up now

    IceTiger
  • I got it when I was 4. I loved it. I remember playing Boulder Dash (the first one). Those stupid flashing square monsters scared me more than all of the FPS games combined (well, I was only four, after all). Now I have my 366 with 64MB of RAM and 15GB of drive space. I remember being happy when I got my super-fast 8086.

  • I think you're being a little condescending there. Who's to say that future generations of computing will even be based on what is taught in computer science? Will quantum computer code be anything like C or C++? Probably not.


    But then again, it's possible you've not studied enough CS. Math is the basis, and while quantum computing may change some of our processes, what Dijkstra proved will not be any less important. That's the nice thing about fields involving science and math: you can't start anywhere but the beginning. Sure, the roads branch here and there, but it doesn't change the facts.

    If anything, what has been presented in the past will only be expanded.
  • by Saint ( 12232 ) on Wednesday April 14, 1999 @09:41AM (#1934486) Homepage
    I would agree with you. The world is moving faster and there is no time to enjoy it. We are lured on by more ... ever searching for something that is slightly faster, more exotic or more powerful. I think the "cyberpunk" authors of the eighties were prophetic in spirit about the attitude and direction of the world today. How far are we from jacking in? How much time will we have then to sit and watch the fish?
  • For an excellent example of this in science fiction, read Vernor Vinge's Marooned in Realtime. The ever-accelerating development of mankind eventually leads to a vertical asymptote, a "technological singularity".
    Interested readers might also try Vinge's A Deepness in the Sky, where technological development has a horizontal asymptote.
  • I don't think he's saying what you think he is.
    I think he's saying that we don't read hypertext the way we read a book, following the thread of an author's meaning from the start of the book through to the end, but that we read it by zig-zagging from document to document, picking up snippets here and there, like eavesdropping on a thousand educated conversations.

    It's a different style of learning, and one that includes a variety of contradictory 'facts'. Unlike book learning, where my school books never disagreed with each other because they were all chosen by the same people.
  • I think that far more prophetic than the cyberpunk authors was John Brunner (The Shockwave Rider, The Sheep Look Up, Stand on Zanzibar). Writing fifteen or twenty years earlier, he much better captured the social, as opposed to the technological, changes we're talking about here.

    cjs

  • I took a few minutes to respond to this article, and all of a sudden, life is passing me by. Isn't this always the case? Has it not been the point of some of the greatest literature of all time?

    Carpe Diem! Go live out by a Massachusetts pond and watch ants battle it out for dear life.

    This is nothing new at all, and we've always known that computers serve to accelerate things. Hell, that's what they're for.

    The invention of fire made life faster. Suddenly Groag couldn't understand why his children liked antelope so much..
    The printing press increased literacy - I'm sure the clergy needed some time to figure that one out.
    The industrial revolution is the single event most responsible for the growth of the global economy, the development of cities, and the pollution explosion. Suddenly the old folks just didn't get why you lived in a one-room apartment, fed a machine 16 hours a day, and got by without milking a cow.

    Especially notable is that the author writes from a perspective inside the generational box, and bemoans how things have changed. "I have to reinvent myself" says he. Right. And his folks had to reinvent themselves when rock&roll first appeared, when man walked on the moon, and when we nuked Japan.

    Hypertext is not a big deal. Technological advancement is a part of human life, and the accelerated rate thereof is a natural consequence. So rather than piss and moan, let's go and learn something new.
  • I agree with what you said, in that people are bound together less and less by their common history, though I'm not sure that's a Bad Thing. What I see happening instead is people being bound together more by their present instead, and their history taking a back seat.

    The Internet is a perfect example of this - I've met people from all over the world, young and old, and from a perspective of history, we have nothing in common. However, we do have something in common, and that is what we are interested in. Comparing my life now to what it was 5 years ago before I was on the Internet, I think my life and the friends I've made in it have changed for the better, as rather than being confined to those in my physical area who may have a common history but dissimilar interests, I find myself interacting with people who are interested in the same things that I am.



    I'll shut up now. :-\

  • I found that article to be very 'dumbed down' on the technical end of things, but what he's essentially saying rings true: 'Generations' are no longer defined by who you are, but by how you think. Now more than ever, because of the long time (To me, 10 years is a long time) I've spent communicating online and through simply having the resources to work through interpersonal relationships more quickly (telephones, cellular phones, pagers, eMail, as well as computers), I feel less and less at home with people my own age. I can personally attest to the effect he notes in his article. People who try to guess my age often guess as much as ten years too old, but never less than five. I'll be nineteen Sunday. A girl I'm talking to now is 9 years older than I, but people ask me how much older I am. Advancements in technology have really allowed us all to grow at our own pace, and it's going to turn our society inside out before we can make sense of it.
  • So. I'm not the only one of our generation (17 personally) who's noticed this.

    I've seen some real life examples of this. A high schooler asking in history class, "Who won World War II?" A different person asking who we fought during it. And these are the people who actually CARE enough to ask! Forget the fact that any literate person should know these answers. Just think of how many high schoolers are secure in their ignorance: "I wasn't alive, so it doesn't matter". Bah humbug. Hell, how many teenagers read the newspaper beyond maybe the sports or comics sections? TV news is no substitute, and an understanding of current events is necessary for anyone.

    On the other hand, this 'the younger generation is doomed!' sentiment's been around for a while. I read a reference to a text condemning the teenagers of the period as 'lazy and shiftless, each generation knowing less than the previous generation' along with dire prophecies of the doom of society. Only thing was, the author was Babylonian. Our generation will pull it together sooner or later. Those who do will grow up to make money and/or live fulfilling lives. Those who don't will sit on their ass and watch TV. And the cycle will repeat itself for our children..

    OK, let me tell you about myself. I started my career as a network engineer from my CNE training school (no, I don't have a degree :) ). And my Netware 3.x skills are already outdated by at least 3-4 years :). But now I know enough about networking in general to quickly learn whatever is going to be the next buzzword technology in my field, whether it is Linux, NT or any other. When ethernet switches appeared, I took a manual and now I know this technology. If some strange and mysterious new shit comes out tomorrow, I will figure it out just the same. I always do; why should anything that can be expected be different? This is what is nice about being a geek :)))
  • The high-level summary, if I understood correctly -- "The rate of technological (and cultural) change is fast and ever increasing. This makes communication between people ever more difficult."

    I have to agree with this. It seems so obvious to me I don't know how it could be argued. Not that this is a new observation -- Alvin Toffler made exactly this case decades ago with Future Shock [amazon.com]. It's practically a truism -- witness the current buzzwords of "web time" and "internet time."

    But Thieme stops there. The question that desperately needs to be asked, and thought about, and debated, is "Is this a good thing?"

    Otherwise, we're stuck in the grip of a boosteristic technological determinism, where our "options" with each wave of change are simply Sink or Swim. I hear lots of rah-rah about how "enabling" the Internet is, but it rings kind of hollow if I am not "enabled" to Just Say No.

    (From this perspective, the Amish are some of the most technologically enabled people around. As a group, they have a certain set of values and priorities, which do not include being current and hip and up-to-the-minute. Instead, when a technology presents itself, they ask "Is this good for the community? Does this help or hinder accomplishing our highest priorities?" If the answer is "no," then the technology is rejected. They are masters of the art of Just Saying No.)

    So, is this shortening of "generations" a good or a bad thing? I'm inclined to think it's not good. What good is it to email around the world if the price is that we are cut off from our parents and our children, and even from our own "generational" peers who are outside of our particular circles of interest?

    Are these drawbacks inherent in the technology of global internets, or can they be mitigated or eliminated? I don't know. Thieme either doesn't know or isn't saying. It would not bother me if he doesn't know the answer. What bothers me is that he isn't even asking the question.


    Zach
  • According to Tanenbaum, Linux is ALREADY a legacy system.
  • So the late-twentieth-century $200K+ technology generation is facing its first burnout. Wow, kinda dates me, I guess. I've forgotten more burnouts I've had than some people I know remember.

    I guess eventually you learn to pace yourself and get more out of life. There really is longevity, stability, and personal life in this biz, you just have to know where to find it and you have to really want it once you do find it. Most people I know right now don't want it.

    It was a good read, but in the immortal words of Roger Daltrey, "This is no social crisis, just another tricky day for you..."
  • Okay... so maybe in many decades it will be outdated, but Linux has an advantage over all the early PC OSes... unlike Apple DOS and ProDOS, unlike the Amiga OSes, unlike OS 9: Linux is platform independent. I prefer the Caldera distro, but I admire Red Hat for its support for non-Intel platforms.

    And yes, before you reply, I know it was originally keyed to the 80386 hardware, but that's no longer true.

    The key is that new hardware and software concepts can be either integrated into or run on the OS. This gives far better flexibility than the Win CE/9x/NT family - they are all separate code projects, as opposed to general-purpose Linux/X Windows and routers and embedded palm systems and MPEG car stereos. The Linux family are all based on the same actual OS, not just a user interface.

    --
    Evan E.

  • I think you're being a little condescending there. Who's to say that future generations of computing will even be based on what is taught in computer science? Will quantum computer code be anything like C or C++? Probably not.

    I'm in an engineering program now at university, and given the chance to decide again, I might have taken a community college electrical engineering course instead. It is always up to you to keep yourself updated, whether you have a BSc. or not.
  • I'm an anachronism.

    One of those odd creatures that pops out of time and lands in a place it ought not to be.

    You see, I am only 18, and I have noticed this trend from the beginning of my days.

    I was a voracious reader, and I got my first computer in a box. Or rather, my father did. It was a good box, a nice box, a box filled with little things and bigger things, all pretty colours. I watched and read out the instructions as my dad soldered the components together and built an Apple ][. An Apple ][ is a comforting thing; it exudes this sort of coziness and warm familiarity. I pounded out stories on WordStar (in mostly gibberish, of course; I was only three at the time). Even my little sister likes to play with it from time to time; we haul it out and stick a television on top whenever we get the urge to play "Zaxxon" or "Mystery House".

    I remember nearly drooling (okay, I guess that's normal for a kid of five) when I got to play with my first IBM PC (look ma, no "XT"). Eventually I got my hands on an AT, and I am typing on it right this moment.

    What saddens me is the fact that few of my generation remember these things. They seem to drift dreamily through life, noticing only flashes and spurts of the images they pass. The present flutters past them and the past is never known, much less remembered. Their only exposure to the Bard is in school; they have never heard of Homer, and the Iliad is foreign to them. Even more recent media is unknown: Hitchcock was a nobody, Humphrey Bogart is not a name they know, references to Audrey Hepburn draw a blank. Even quips from television shows but ten or twenty years old draw blank stares. Quoting poets like Tennyson or Dylan Thomas, or sages like Lao Tsu, draws shocked looks from passers-by.

    We are nearing another Dark Age, where past knowledge will disappear just as it did when the Goths sacked Rome, because those that care about the old will be few and far between. People are inundated with messages spouting rhetoric such as, "Forget the old! Come with us, we're new and we're fashionable! Don't do what your parents did!"

    This seems to be an unstoppable trend, the media juggernauts racing towards the goal. But we will recover. Historians note that we regained the ancient Greek and Roman epics in the fifteenth century, when old scrolls and sheets of papyrus were found hidden in monasteries and libraries of ancient and old cities.

    We must preserve the past, so that the people of the future will be able to find it again, and listen once more to the words of the ancients.

  • Five or six years ago (before the big Star Wars resurgence) shortly after Halloween, I was talking with a friend. He had dressed up as Darth Vader for Halloween that year, and a little kid came up to him and asked:

    "Are you the Shredder?"

    I was appalled.
  • I think that the idea of a "generation" as put forward by this posting is perhaps a little hazy. At least it seems that way to me. I count myself at twenty-eight as having seen a huge raft of products and platforms come and go and occasionally see myself slipping into nostalgic old-timer mode, wasn't it great when "blah blah blah ... "


    Software upgrades aren't generational improvements by their very nature. They're a way for software companies to regenerate fading streams of revenue for legacy products.


    Hardware platforms and architectures are evolving and being discarded at what seems like a frightening rate from here, close up by the trees, but if you step back a little and look at the whole forest, we're not all careering madly along on a technological roller-coaster at all. (What a mixed metaphor :-) ) People are using information processing appliances and networking facilities in a pretty much gradual, linear evolution dating back to before the telegraph, the calculator and the printing press.


    Sure, the techier aspects of the "Information Revolution" seem to be exponentially shooting out everywhere and at a faster and faster rate, but I suspect that this is just because we're slap bang in the middle of the earliest growth stages of a new-ish science / technology offshoot. I imagine the "crazy" proliferation of factories and then railroads around Europe during the Industrial revolution looked very similar to industry observers of the time.


    Myself, I expect this particular curve to flatten out eventually. From my vantage point, evolution seems to go in stop-start binges. Overall you can view social/technological innovation as a smooth growth curve. Close up you can see the wild spikes and line noise. It's a fractal thing.


    Everything changes, everything stays the same


    Whilst it is true that many things about your own particular "good old days" of computing were great fun, it is equally true that it's great fun now.

  • I agree somewhat with what people have said here, in that it's not impossible to stay in one field for the rest of your life. But I disagree that your career has ANYTHING to do with the degree you received. As some people say, it's just a piece of paper. FAR more important than which degree you have is how well you learn. I think that it is much more important to learn how to learn something quickly, efficiently and effectively. Of course, you have to have a somewhat good foundation in the basics, but even Mathematics is not really a huge necessity.

    If you're still reading this comment after that last sentence, let me clarify. I think that you must not be afraid of Mathematics, and must be competent in algebra/trig, but that's about all you need for programming. Much more necessary for general programming is a solid foundation in logic. Which I guess is a field of mathematics, but also philosophy.

    Anyway, I think that mostly what you need to succeed as an IT professional is what you need to succeed in an office environment anyways: the right attitude. You need to be disciplined and open to new suggestions while remaining skeptical. At least that is my experience so far (in the two years that I've been in Tech Support and now Software Development).

    This has been long, rambling, and far too general, but just wanted to give my opinion.
  • I am only 25 and have already lived through more generations of computing than my father has calculators! With the rapidly expanding knowledge base of the tech industry, generations of computing can now be legacy before they ever hit the consumer market... a frightening thought indeed.

    So where does this leave us? The people who can't afford the new systems, but yet need to keep up with the rapidly changing climate among the revolutionary new advances in order to keep some semblance of job security? We are caught in the middle. The lucky few actually get their companies to pay for the hardware and brainware (training), but the majority are lost in the ebb of the tide...

    How soon will it be until Linux is considered a legacy system?
  • by Utoxin ( 26011 ) <utoxin@gmail.com> on Wednesday April 14, 1999 @09:36AM (#1934505) Homepage Journal
    A fascinating article. Kudos!

    I especially liked the section talking about how software updates have been speeding up. I know that I run software that is almost all at least a year old. I even run a few programs that are more like 4 or 5 years old, simply because I like where they were at that point, and felt that I didn't need anything else. They do everything I need, and I get lost in the new programs.

    As for that joke about not knowing who Yoda was... Please tell me that was just a joke? If it wasn't, then I'm very worried about society.
    --
    Matthew Walker
    My DNA is Y2K compliant
  • Well, at least hunting for food would be more interesting than
    "You go hunting...You found food!"

    But you wouldn't care for the hours of sitting on the wagon_bench location, staring at your oxen_weapons killing the trail_miles.

  • >- Saint said:
    I think the "cyberpunk" authors of the eighties were prophetic in spirit about the attitude and direction of the world today.


    I think they were prophetic not only in that manner, but also physically. Already down at Emory University in Atlanta, GA, researchers are plugging quadriplegics into a computer, using probes to allow them to manipulate a mouse and keyboard (a fair amount of success with the mouse, not so much with the keyboard). If they can do this now, then we might physically end up immersing ourselves in a digital world not unlike that of Gibson's (or, worse, the Matrix). A simple electromagnetic device like that would change the entire social structure as we know it; something to ponder...
  • >- Skyshadow said:
    [big snip]
    So, is it possible to be a technical person all your life and still live? In the case of the first generation of computer techs, I'd have to say almost certainly not. That frightens me.


    Well, I am not sure of the current trends; I only personally know of three people who have been around that long. One is indeed in management, but the other two aren't. One, my boss, is in title a VP of the corporation I work for, but he functions as the senior system/network administrator, something he has been doing his entire career. The other man, my father, has been programming in various languages and on various operating systems since 1971. Both of these men are willing to learn the new technologies and "fads" that permeate any evolving technology, and then filter out the trash. If you are willing to constantly learn, and then (most importantly) constantly do your best, a long career (with a huge paycheck every two weeks) can be had; come to think of it, that is what you should do for any job.

    -G.
  • Didn't Doonesbury have a couple strips on this? I recall some computer guy who falls way behind the competition because he went on vacation.
  • If you base your life on the technology, you will lose. The technology has a short half-life. If you base a career on a technology (i.e. "I am a Unix sysadmin") and live by that, you will become obsolete as soon as the technology does.

    So what is a poor geek to do? First off, don't base your life on the technology. There is a lot more to life than ones and zeroes, no matter what you can do with them. You can make technology your life and possibly make a lot of money, but you will have missed out on the good stuff.

    Basing a career on technology is certainly doable; it's basing a career on a particular technology that is problematic. If you base your career on a particular technology, you are building your house on a fault line. It is better to base your career on the unchanging, and then to add the changing technology skills.

    You certainly need to know the current tech, and you need to learn the future tech. But companies look for things beyond the technology. Things like a can-do attitude, a commitment to quality, an understanding of business communications (aka talking to people in your company), and the ability to juggle multiple products. These aren't pointy-haired mumbo-jumbo; these are skills that make companies money, and thus skills that they want you to have.

    Those skills don't go out of style, and companies will pay big bucks for them. If you have those permanent skills, some companies will even hire you and train you to get the techno skills that you need. If you have the techno skills and not the permanent business skills, you may be tolerated--or you may not be. If you have both the permanent skills and today's hot tech skills, you can basically write your own ticket.
  • wow, I remember those days... strangely enough, those days were not that long ago... My high school had Apple IIe's till 5 years ago... hmm, ok, maybe some people think 5 years is a long time...
  • I am a 27-year-old and own a Wang LVP that still works. I could count the changes since then, but I won't bother; I am too busy trying to keep up with this ever-changing technological world we live in. I am currently a project manager for a large multi-national company. We are implementing an enterprise-wide backup standard using DLT. I occasionally look at my old Phoenix platters and 8" floppies and get a little teary-eyed.

    The social implications of the fast advances in technology enforce a survival of the fittest arena. Only the fast survive, one day all of us will be phased-out, down-sized, or out-sourced, probably by someone younger and faster, and we will be the ones that sound old when we talk about things we know and love.
    ________________________________________________________
    Can We trust the future - Flesh99
  • Great article! Like many others, I've worried about the issues raised therein. But, more recently, I've become preoccupied by a related malady: the rewriting of history.

    I read computer history books. I love it. I like reading about the stuff I was there for, but it's also great to read about the true pioneers from the 19th century, the WWII years, and so on. I read a lot of those books, so I have at least some idea of what happened and when it happened, especially when it comes to who invented/discovered X, who first put it into production, and when it first became widely accepted.

    Time after time, though, I'll read an article in a popular magazine or newspaper, or see a segment on a news show, and I'll see that Microsoft, the IBM PC, or some other relative latecomer to the game has been given credit for something developed long before by someone else. It drives me nuts.

    And when I try to correct these folks, I rarely get anywhere. As far as they're concerned, their sources are gospel. If the sources say that Microsoft or Apple invented the GUI, then that's what must have happened. If they say that high-res graphics started with the first VGA card, then that must be right.

    Which isn't too bad, until you start thinking about all the people who read those articles or watch those shows. Those people don't know any better, so they believe what they're told. And eventually it becomes a form of truth, because it's generally accepted as such.

    And it gets worse. How many articles on other subjects about which I am less knowledgeable have I read and believed, when in fact they are grossly inaccurate? I must end up believing dozens of inaccuracies a day because I simply don't know any better than what the 'journalists' tell me. And you do, too. We all do it. There's just not enough time to learn enough to make sure that you're not being misled. That's why journalists are supposed to do fact-checking.

    I don't know what the solution is to this problem, but I thought I'd bring it up, since it's at least tangentially related to the concerns raised in the article.

    -Joe

  • Actually, I think that most (intelligent) people do not look at a degree as "proof of skills" but rather as "proof of dedication and/or discipline". I have no problem getting jobs in IT, and I have a six-year quantum chemistry degree. My degree doesn't say a thing about my IT skills, but it does show that I am able to suffer short-term hell-type situations to attain a higher goal.

  • I suddenly find myself reminiscing about lunch breaks (junior high school) spent sitting in front of the Apple IIe with the cool green monochrome monitor, playing Oregon Trail... heh

    I wonder if anyone ever thought of a hybrid of Quake and Oregon Trail... instead of your oxen dying from lack of food, you get pissed off and launch rockets at them and turn them into dinner!

    :D
  • Yes, there are (arguably) rapid social changes going on, and this will cause problems for some people.
    However, isn't the important thing not having data and knowing information, but what you can do with the data and whether you can use the information?

    A teacher friend of mine likes this quote:

    "In a time of drastic change it is the learners who
    survive; the 'learned' find themselves fully equipped
    to live in a world that no longer exists."

    Eric Hoffer


    If we can learn how to learn, we have no reason to fear.

    --
  • I find this discussion slightly strange. People here keep saying that they have lived through many generations, yet no one has taken the time to define what a generation is. It would seem that if you wish to hold this discussion, a definition would be a vital piece; after all, that /is/ what is being debated. Computers evolve, but their generations do not necessarily have clear endpoints.

    A generation usually refers to a group of individuals of approximately the same age or holding the same values. Generations generally change about every decade or so in real time. Computers, however, evolve at a more constant rate. We do more and more of our thinking on computers, and as such we don't generally think about how they are affecting our lives. We are now in a generation of hackers and crackers and programmers.

    Software becomes outdated because, instead of continuing the development of one program, the software industry now pushes out products before they are completely developed. Case in point: Microsoft Windows 98. The industry now rushes a program out the door, then creates updates so that it runs more efficiently.

    To say that many generations have passed is not quite adequate. The Pentium-class computers are really of mixed generations: the Pentium and Pentium II fall within one time span, mainly increasing in speed. The Pentium III, however, might be said to be of a different generation, because it seems to be a different computer.
    It is also not right to assume that the present generation does not know who Yoda is. Star Wars transcends many generations in its appeal. ;-) Especially after the re-release of the original trilogy and the new trilogy that is to be released during this generation.

    However, when exploring this particular subject, one has to keep in mind that the human mind is in a constant state of development, and so is everything it makes. So it would be appropriate if computer systems also developed at the same rate.

    Thank you for listening.

  • Yes, but what _you_ said was nothing more than the usual attitude of the young, who think that the "icons of life" don't matter at all.
    But they do. We are human beings, and not digital computers, yet. And we need other human beings around us, because we were made that way. Community, and the symbols of that community, are very important for everyone.
    For instance, I am Hungarian, but currently in Germany. I felt very uncomfortable, even at home, until I spoke with a German who said that he had read the Hitchhiker's Guide to the Galaxy as well. And then I felt a little more comfortable. As if I were a little bit at home. I had someone around me with the same "icons".
    Yes, we age. Yes, we live. And do you think that the simple fact that you can say the sentence "Yes, we age" makes our aging any easier?
    You ask: What of it?
    It hurts. At least it hurts me, when I cannot follow the fast advance of life and cannot smile at others knowing that we share the same background.
    In the time of the C-64, it was enough to be a computer fan to feel at home among other computer fans. Now you have to be a Quake fan, or even specifically a Quake II fan, to feel the same.
    I liked playing Ace-II, but I don't have enough time to be a Quake II fan. I've tried and failed. And Quake fans wouldn't accept me as company, because I am not a full-time Quake fan.
    So what I think this article was about is that you have fewer and fewer people around you with whom you can speak comfortably. The others are just chatting about things that don't matter a bit.
    Live for today, yes, true, I agree. I really agree. But I can live for today only WITH people, and if I meet someone who cares only about Quake and thinks that Quake is what matters, what can I do with that?
    Replace Quake with anything you want.
    Now I feel disappointed, because I have written down one of my most important beliefs about life, and you will just skip it. Because I wrote it in electronic form.
    Ah...

"When the going gets tough, the tough get empirical." -- Jon Carroll

Working...