Philosophies of IT

Lion Templin writes "Despite the extremely dry topic, "Philosophical Changes in 1990's Information Technology" covers a broad history of IT and makes a strong suggestion as to why the industry is what it is today. A main focus is the differing philosophies of the traditional, "academia"-related 'mainframers' versus the fast-paced 'micros' that dominate most of the IT industry today. Written as university research, it's been read by several people in the IT field and received good responses."
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Wednesday March 24, 1999 @08:40AM (#1964841)
    What's the bottom line: greed has crept into the computer industry; if more people would just be "idealistic" computer science folks instead of "mercenary" software engineers then the world would be a better place.

    The one big mistake of Lion's premise: Software Engineering is not new. It's as old as the computer, and engineering methodologies are as old as the stone tools of the Neanderthals.

    Sorry, "Lion", science and engineering are not incompatible and both will drive developments in the computer industry for a long time.

    -ac

    P.S. Hey, lamerators: Moderate This!

  • Posted by nokaos:

    The author treats 'software engineering' as something that has happened in the past or may be happening now. I do wonder what definition of the term he is using; it is not in the footnotes.
  • Well, at least it still responds to pings. :^)

    BTW, I see a lot of "What a lamer! I could write a better article than this Lion guy!" I'll say bullshit. If you *can* write a better paper, I want to see it. Otherwise, quit talking out of your asses.

    Ah... feel better.
  • ROTFLMAO
  • I think the paper is pretty darn good, _except_ that the author does not appear to understand the distinction between large UNIX "minicomputer" systems and mainframes, nor to recognize that the cultures surrounding the two systems are different.

    For instance, in the section "Software Engineering: The Death of the Programmer" he states "Contrast this with the mainframer's GUI, X Windows". Working at a shop where all the major software still runs on a set of mainframes under MVS, TSO, CICS, and their ilk, I can tell you for sure that the mainframer's "GUI" is a 3270 terminal emulator 8-). This and other mistakes lead me to believe that the author wouldn't know "Big Iron" if it were dropped on him.

    The difference between mainframes and UNIX-based minicomputers is vast, as is the difference between the cultures of the two systems. The true "mainframer" culture exists almost entirely in the corporate world, and does not have a large academic component. The UNIX-based minicomputer culture is highly academic (although that has been changing due to the free micro-based UNIX implementations).

    - Ken
  • Hey, if you have the testicular fortitude to tout your writings in a forum like /. then you had better be prepared to take your lumps like an adult.

    Submitting one's own paper with a 'glowing' review that does not make it clear one is the author is a smidge over the top, IMHO.

    - Ken
  • While the image is still vague, my impression (based on my own observation, and the observation of some "clueful" industry consultants I know) of the "new and improved" software developer that companies want embodies the following, in no particular order:

    - Confidence, communication and humility.

    These are important in teamwork - don't let your views get lost because you're shy - but have the humility to concede when you're proven wrong.

    - Being able to see the forest AND the trees.

    To elaborate on what I mean by that:

    - A specialization in an area of technology such that they can perform "high leverage" activities. I.e. instead of "plowing through" a problem that would take a generalist 3 days, it takes them 6 hours, because of their specialized knowledge. An example would be someone who is intimately familiar with distributed systems development vs. GUI interface design.

    - On top of the specialization, a general *understanding* of the issues of management, economics, sociology, psychology, organizational behavior, and entrepreneurship.

    I.e. they don't have to KNOW how to run a business, they don't have to LIKE running a business, but they should UNDERSTAND why a business does what it does, and whether it makes sense in the larger picture of the world economy.

    Note that the above trait is probably the hardest one to find, mainly because there are so few people - even managers - in employment today who understand even the basic principles of economics and innovation/entrepreneurship.

    - A genuine belief that software development is first & foremost about *PEOPLE*, not technology.

    That's what I see as the desired software developer of the future. Companies don't want "geeks" who boast of their own greatness while ignorantly criticizing business. They want "ubergeeks" who have workable SOLUTIONS to business problems instead of incessant complaining, and who can communicate their solutions in terms of RESULTS, not technical details - but who, when pressed, can express themselves at ANY level of technical detail.

    What companies want is not the fakery of IT-analysts of the past, who had little technical expertise. They want someone who systematically tries to match technology with reality, instead of someone who lives in a bubble - be it "business" or "programming". They want *flexibility*.

    This is what the IT shortage is all about: a lack of flexibility in the workforce.

    (end rant)
  • The distinctions between the two types of systems are becoming harder and harder to see in some ways, at least on the hardware side. But the difference in general attitude and programmer "culture" between the academic Unix community and the old-school corporate mainframe community is still pretty large, at least in my experience.

    The Unix folks (at least around here) sometimes don't seem to take certain types of support issues very seriously, and support response times can be pitiful. And while C is a good language and all, other languages actually do exist. Really. :-) The environment I work in doesn't even HAVE a C compiler, and our application predates the C language anyway (I think the first implementation was on a Univac 1108 in 1966).

    In our case (at least on the Unisys 2200 side of the fence where I work, since we have IBM big iron as well) it's OS2200, TIP, and UTS20-compatible terminals instead of the mythical mainframe GUI mentioned in the article, but I suspect the same general principles apply: Transaction-driven systems with text-based screens and very fast response times.
    --
    -Rich (OS/2, Linux, Mac, NT, Solaris, FreeBSD, BeOS, and OS2200 user in Bloomington MN)
  • I've got my threshold set at -10000, and it still isn't displaying everything. I think real censorship is taking over this bastion of freedom :(

    asinus sum et eo superbio

  • "Why do you think that the US Missle Cruiser that
    uses Windows NT to control it's engines had to
    be towed to port 3 times (the captain typed 0rpm
    and the engines crashed with a divide by zero). "

    I don't understand why Linux users keep latching onto this story. Your interpretation is obviously incorrect.

    Attributing an application error to the OS? Come on, you should know better.

    Damn, trn crashed on me. I guess Linux must suck then!
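
    To make that point concrete, here's a minimal sketch in C - invented names, obviously not the actual shipboard code - of how an unvalidated zero input takes down the application, no matter what OS is underneath:

      #include <stdio.h>
      #include <stdlib.h>

      /* Hypothetical illustration: derive a timing interval from an
         operator-entered RPM value. */
      int main(void)
      {
          char buf[32];
          long rpm;

          printf("enter rpm: ");
          if (fgets(buf, sizeof buf, stdin) == NULL)
              return 1;
          rpm = strtol(buf, NULL, 10);

          /* This is the check the application has to make. Leave it
             out, and rpm == 0 divides by zero below - an application
             bug, not an OS bug. */
          if (rpm == 0) {
              fprintf(stderr, "rpm must be nonzero\n");
              return 1;
          }

          printf("ms per revolution: %ld\n", 60000 / rpm);
          return 0;
      }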
  • Ok, the guy makes a few good points about software, but I think he confuses them within the historical context.... Let me try to simplify.

    If you look at the history, things come in waves.

    Mainframes from IBM certainly came early on.

    But then DEC entered the scene with the PDP line and later the VAX running VMS. They billed their systems as offering performance for less money. Of course, at first the mainframe guys complained that they didn't scale and were unreliable.

    Then later on Sun (and others) entered the scene selling Unix solutions. They billed their systems as offering performance for less money. Of course, at first the mainframe guys and the VMS guys complained that they didn't scale and were unreliable. Actually, as I recall, the president of Digital stated that he never expected the Unix market to outgrow the VMS market.

    Then a little later on, the PC entered the scene. It was intended for a different purpose. But guess what? Eventually it grew up, and now they're claiming (well, Microsoft is) performance at a reasonable cost. And the mainframe, VMS, and Unix guys are all saying it doesn't scale and it's unreliable.

    Do you see a pattern here?

    I fully expect in 5 years, PalmPilots will be offered as the next great solution, offering performance for a low cost! :)
  • > Many of these companies depend on their mainframe and AS400 databases for survival!
    > The mainframe AS/400 world is nowhere near dead.
    Yes, many of these companies do depend on their iron and midranges for survival, but as you have partially said, an AS/400 is not a mainframe. It is a mid-range business solution (and I emphasize `business') and does wonders with `client-server,' X Window, Java, integrated x86 hardware for NT, and 5250; it absolutely sucks for any intense interactive (non-batch) processing.
    It is my belief that an AS/400 just can't be considered old-school iron anymore.
  • From my POV, he's getting flamed left and right for grammar and some factual errors, but it's still a good read.

    To those of you who may be lucky enough to have the status to choose your job: you definitely have a different perspective than some of us "youngsters." Where I work, it's all about who can pull what strings and who has what certs. It's very painful and frustrating to know that no matter how good your idea is, you might as well toss it into the shitter before you speak it. Actually telling someone just damages your credibility.

    This is the future I see... quickie solutions running on bad software running on cheap hardware. There are some of us who actually have to cope with the shit put in place. And work a few weekends to do it.

    Some of us still love the machine more than the money. But in the place I work, that's treated as a Bad Thing. Are mainframes/UNIX still the norm in the rest of the world? How would I know... here, if it can't be done on NT, they wonder why you'd want or need to do it.

    Hopefully, there's a much bigger world out there than I've seen, because what I've seen bothers me.
  • Actually, I'm seeing more and more spelling, grammar, and usage errors in many different print sources these days that appear to be caused by the software's "advanced features," as they are of an entirely different character than the typos of the '50s, '60s, and '70s. Microsoft Office components seem to be especially laden with "features" that have their own ideas about how your work should be re-written and re-formatted.
    That's not to say that the authors don't contribute their own share of goofs along with the dreck and psychobabble.
    Maybe what Slashdot needs is fewer moderators and more editors. :-)
  • I was assuming this person with the somewhat uncommon name was a guy, but leaving myself an out just in case that was an incredibly sexist and out-of-date assumption on my part.
    I am not judging the paper on the basis of the author's name. I was pointing out that instead of the usual "Person A submitted this story written by Person B about...", it appeared that the story submitter and the author of the subject of the story were one and the same, which isn't how it's usually done (except for a certain resident gasbag).

  • >>...employees working long hours, often with little or no overtime. Many are even salaried.

    >Ummm yes, I like technology, but I like getting paid too.

    Then why are you working for a salary???
    Up until about six months ago I was working for a salary. I worked an average of 60 hours a week and got paid for 40. Then I was offered the chance to change over to an hourly rate at a lesser amount per hour. I jumped on it.

    And even though I am making about the same amount of money now as I was before, the quality of my work has improved greatly. Now when I look at a job I ask myself how long it will take, not how I will finish it in 40 hours. I have found that my entire creative process has changed because of this.
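
    To put illustrative numbers on that (assumed figures, not my actual pay): a $50,000 salary worked at 60 hours a week for 50 weeks comes to $50,000 / 3,000 hours, or about $16.67 per hour actually worked. Switch to a "lesser" hourly rate of $16.50 and all 3,000 hours get paid: $16.50 x 3,000 = $49,500 - roughly the same money, except that every extra hour now shows up on the timesheet, which is exactly the incentive change I'm describing.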

  • I think the fortune file says it best:

    "I don't know what this is, but an 'F' would only dignify it."

    I found this paper full of blatant grammatical, historical, and conceptual errors. IT personnel that don't (didn't) listen to end users? Yeah, they're just going to turn this multi-million-dollar machine over to a bunch of techies and trust that they're going to "do the right thing" and get those paychecks/invoices/quarterly reports generated. A little common sense, please!

  • My real problem with this article is that he paints microcomputer engineers and programmers as idiots for using inferior solutions - while completely missing that the more complex solutions just wouldn't have worked on the limited hardware available.

    Yes, X has a lot of neat features. But it always has been, and always will be, slower than direct-to-hardware systems, especially for graphics-intensive applications like VR and games (thus svgalib's niche - see the sketch at the end of this post). It wasn't until the early '90s that X became even remotely acceptable on most micro hardware.

    Yes, Novell and other LANs are inferior to other networking solutions. That's why TCP/IP has won out. But at the time, they were the fastest, most reliable, and cheapest way to get a network.

    Over and over, he shows a preference for blue-sky solutions rather than what actually works, and works at a fraction of the cost.

    He even goes off on local application execution, which is a FAR superior solution in most cases - a big expensive server and xterms are more expensive than a cheap file and database server and a bunch of PCs for the same horsepower.

    He needs to get out and get a real job.
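
    For the curious, here's a minimal sketch of the direct-to-hardware path svgalib gives you (written from memory, so check the headers before trusting the details):

      #include <stdlib.h>
      #include <vga.h>   /* svgalib */

      int main(void)
      {
          int x;

          /* Take over the console video hardware directly:
             no X server, no protocol round trips. */
          if (vga_init() != 0)
              return EXIT_FAILURE;
          vga_setmode(G320x200x256);

          /* Draw a horizontal line by writing pixels straight
             at the hardware. */
          vga_setcolor(15);
          for (x = 0; x < 320; x++)
              vga_drawpixel(x, 100);

          vga_getch();          /* wait for a keypress */
          vga_setmode(TEXT);    /* restore the text console */
          return EXIT_SUCCESS;
      }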

  • "A" paper in 10th grade? Maybe at the 'Tard school. Someone ought to ask Lion if he rides the short bus to school...


  • What is wrong with you people? Since when do we judge a paper based on the author's name? Yeah, some guy(?) named "unitron"... for Pete's sake! Look at our logins here!?

    How about reading the paper before you go shooting your mouth off?

    I've read it now (leo is doing a surprisingly fine job keeping up with /. so far), so I think I at least am entitled to comment on it....
    Sure, there are grammatical and spelling errors and clichés, but this isn't a piece for print that has been edited by anyone... Content is what is important.

    Working for a company that makes millions of $$ selling emulation software, I'd have to say that the GUI of choice for "mainframers" is a WinXX solution. Many of these companies depend on their mainframe and AS400 databases for survival! Training thousands of users how to use a dumb terminal just to look up somebody's insurance info is just crazy these days! They use Windows, have their mainframe apps brought to the web, or have custom front ends and macros/etc. to make them easier to use.

    The mainframe AS/400 world is nowhere near dead.

    As someone who was in the same classes as Lion: we made fun of mainframes and COBOL and all that when we were freshmen in the CS program... How many of us now work, or have worked, for places that use mainframes every day? Of my close friends that I graduated from UMD with?
    ALL OF US


    This post was not checked for correct grammar, punctuation, spelling, or content. Deal with it.
  • > I can't believe he wrote this:
    > ...employees working long hours, often with little or no overtime.
    > Many are even salaried.

    I believe that Lion's intention with that statement was:
    ...employees working long hours, often with little or no overtime pay.

    Some of these people are working incredibly long hours, or "on-call" hours, for a salary, not for overtime. If I were expected to work 60+ hrs per week, I'd want to be paid for the extra time/work too.

    I don't know where you came up with the idea that people shouldn't get paid...

    This post was not checked for correct grammar, punctuation, spelling, or content. Deal with it.
  • Yep. Maybe Rob should provide mirrors for sites which are to be mentioned here ;-)

    belbo
  • I'm not saying they shouldn't leave him black and blue. I was just tired of reading "here are the facts, get it straight!" from people who made no attempt to further the topic or support their newly stated facts.

    BTW - "testicular fortitude"
    very well put.

    Cordova
  • by Cordova ( 16031 ) on Wednesday March 24, 1999 @09:35AM (#1964866) Homepage
    I've been frequenting /. for a few months now, and I have never seen such a harsh critique. Every comment so far except this one seems to think Lion's paper should be drawn and quartered.

    I know most people here are techies and Linux addicts - hell, I'm one of them - but why not actually try to figure out what the writer is trying to communicate instead of tearing specifics of the article apart?

    HUMBLE REQUEST
    If any of you have better opinions on where the industry is going, in reference to the methods/philosophy/ethics in development mentioned in the paper, I want to read them. Reply to this, or even post your own writings to /., I don't care. Give me better ideas, not "you didn't have every single little fact right in your article".

    Cordova

    "Can't lurk all the time"
  • Hey, Bone Crusher. I'm the guy who had to take over your project after you left. You talk about a "final product". That's a mythical beast; there will always be a 1.01, a 1.2, a 2.0 and so on - unless you're writing code for a space probe that will never again be touched by human hand (and even then, they're likely to tinker via radio).

    Without the documentation that you so kindly glossed over, I had to make guesses about what your code was supposed to do. Don't kid yourself - there were bugs in your code. I could tell what your code was actually doing, but sometimes it was hard to tell what it was supposed to do. But I did the best I could. And because you didn't document what you had done in your 45,000 lines of code, I had to go over every single line, trying to find where you had implemented that function they wanted changed RIGHT NOW. Don't get me wrong - it was fun to have half the organization breathing down my neck, wanting to know why I couldn't just read the specs, find the function, and change it. And your reasons for choosing the original architecture were doubtless sound, but a month or so after you left, the Big Boss came back from a seminar and said we should have done it as an intranet-based, centralized, client-server, structured, object-oriented module. And I didn't have the time to analyze the entire application and tell him why he was wrong, and you hadn't left any record of WHY your architecture was better, so I had to capitulate.

    This probably pleases you. You probably feel like Mel, the real programmer, whose code was fast, functional, and incomprehensible. You think if I was only as good as you are, I wouldn't have these problems - I'd be writing my own 1.0s instead of maintaining your code. You may be right, and I hope it gives you warm fuzzies. I hope you enjoy feeling like an artist. But I hope you realize that there is nothing left of your art. By the time we hit version 6.24.06a, not a single line of your code was left. It was like you had painted this beautiful vision of the Mandelbrot set on the wall of the Louvre, only they changed their mind and asked me for a surrealist landscape, so I sandblasted it a bit at a time and transformed your vision into mine.

    I'll be leaving the project pretty soon now - going to work for Microsoft. Bill says it's important - something about how they're going to be the next Red Hat. Anyway, I know that whoever takes my place won't be as good as me, and won't have my experience on this project, but at least they will be able to understand my thought-processes while developing the program. I bit the bullet, sucked it up, and followed the documentation procedures.

    I don't really disagree with your points on freedom or one-size-fits-all templates. But don't skimp on documentation. Those who have to maintain your code thank you.
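
    If it helps anyone downstream, this is the sort of header block I mean - a made-up example, not code from the project above:

      /*
       * recalc_discount() - recompute a customer's discount tier.
       *
       * WHY: marketing wanted tiers keyed on trailing-12-month volume
       *      rather than lifetime volume (change request #412).
       * HOW: volume is in whole units; we deliberately round down so
       *      a borderline customer never silently gains a tier.
       *
       * Returns the discount as a percentage (0-25).
       */
      int recalc_discount(long trailing_volume)
      {
          if (trailing_volume >= 100000L)
              return 25;
          if (trailing_volume >= 10000L)
              return 10;
          return 0;
      }

    Two minutes of WHY up front saves the next guy a week of reverse engineering.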
  • Well, I guess we /.'ed the server... I only made it through about half before it went south. Oh well.

    Granted, there are numerous errors, historical and otherwise; however, there is a point I think the author is trying to make. Not having quite finished the article, I am going out on a limb here, but I think the point is that one size does not fit all.

    All the IT organizations I have interacted with in recent years recite this singular thought like a mantra.

    I don't know about others, but I get results because I still treat systems architecture and software development as an art. I cut corners in the "software engineering," "process documentation," and other more bureaucratic areas where it makes sense, without compromising the final product. This drives IT types crazy.

    I have noticed this more in recent years than in the '80s. Back in the days of PDPs, Vaxen, BSD, etc., each project had its own set of challenges. We were allowed the freedom to overcome those by whatever methods were available.

    The 90's were just the opposite...try to do every project using the same template.

    So is the pendulum swinging back? Have we come full circle? I don't know. But I do know that OSS gives me the tools I need to crank out solutions at an ever-increasing rate to meet demand. I also know that IT types are watching very closely, trying to figure out "How'd they do that!!!".

  • The paper has dozens of grammatical errors. Cliches are everywhere. I didn't see any evidence of original thought.

    Background research is nearly nonexistent. The author doesn't even refer to basic works in the field by Yourdon and others. In fact, only three actual books are mentioned in the bibliography, and that counts the New Hacker's Dictionary! (Although "Microserfs" is quoted, it is not in the bibliography.)

    That was truly "written as university research"?
    It would have barely rated a "B" as a Grade 7 report.

    Faugh.


  • I disagree. I think this would be an "A" paper at least through the 10th grade, and possibly all the way through some of the weaker secondary schools.

    Unfortunately this says more about schools than it does about the quality of the paper...
  • Why is it that the majority of these pseudo-academic papers that get posted to Slashdot contain SO many spelling and grammar mistakes? You would think that if someone were going to write about the IT industry, they would at least learn how to use the spelling and grammar checkers in their word processor.
  • Oh, how I hate these LaTeX2HTML-generated pages. You read three paragraphs, then have to wait for the next page to load, and footnotes are on separate pages again. This may be useful for 100-page documents, but this little thing could have easily been put into a single HTML file. So many authors seem to think LaTeX2HTML is so scientific and makes their documents look more important.
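    (For what it's worth - and this is from memory, so check the man page - latex2html will emit everything as one file if you pass it -split 0, e.g. "latex2html -split 0 paper.tex". The multi-page output is the default, not a requirement.)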
