News

Slashback: Interoperability, Royalty, Fire

Read on for clarification about the alleged Gnome/KDE collaboration reported a few days ago, which ... ain't. And about the project to put Linux on the Royal DaVinci, which promises slow but steady progress. There's also infernally good news for anyone intrigued by the recent open-source Plan 9 release.

Pardon me, sir, are you in the market for a nice strong bridge? Aaron J. Seigo writes: "A letter from Mosfet can be found at knews.derkarl.org which clearly states the official KDE position regarding the recent 'news' about Gnome and KDE getting together on a common component model. Which is: it isn't happening. And for good reason.

KDE2 is in the final stages of preparation, so this is not the time to go messing with the foundations of things. Also, KParts wasn't designed on a whim. The KDE team put a lot of thought into it and came up with something that has some very real benefits (speed, overhead, etc.). While interoperability would be nice, don't expect it at the component level just yet. Be happy with drag 'n drop and the like. For now."

Fair enough. Also on the KDE front, Joseph points you to knews.derkarl.org, which seems like a useful one for anyone looking for KDE updates.

Will a Linux PDA become their strong suit? jsinnema writes "News on the Linux-powered Royal daVinci from Wayland Bruns, CEO/CTO/Chief Geek of CompanionLink Software, at PDA Buzz: 'Unfortunately, development is not on the timeline originally hoped. What's shaping up is two 16MB ROM/16MB RAM units, one 4-color grayscale for a low price, the other full color for a higher price. Size and weight are about the same as a Palm III. The color unit will have a flash slot.' and
'One of the interesting aspects of the project is that this is the first time we can directly compare performance of a particular app on both PC and PDA. I'm happy to report the PDA units are surprisingly powerful, except to note that memory access is relatively slow.'"

It sure would be neat if Linux becomes the default OS for palm-top computing; will Royal's project, though, stand a chance against the flashier ones which keep peeking like Monty Python animation over the horizon?

I'm sorry, but I'll have to call you back after I set my computer on fire. rpeppe writes "those who were intrigued by the Plan 9 release but don't have the appropriate hardware, or in fact anyone interested in new languages and OSes, should be interested in the following:

vita nuova has released a new edition of the Inferno OS, source code and all, under a new licence, which allows distribution of core OS source code to inferno subscribers only, but unencumbered personal and commercial use of the binaries and the rest of the source code (including a javascript capable Web browser).

inferno is a cousin to Plan 9, but includes a virtual machine and a new language, limbo, and can run hosted under linux, free bsd, windows and other OS's, as well as natively on x86, ARM, MIPS, 68000, 68020 processors. because the whole operating system is virtualised, programs written for inferno are completely portable, something it would be difficult to say about java, for instance.

the language, limbo, deserves some attention - it's C-like, and OO in the deeper sense, but avoids the inheritance pitfalls that languages like java fall into. it's a joy to write in.

in my opinion, inferno was the coolest thing ever to have come out of bell labs CSRG - and we've now got exclusive rights to it, and intend to make as much of this excellent technology as we can. i hope others will too!"

This discussion has been archived. No new comments can be posted.

  • Linux on my PDA... but the OS is burned in firmware!! How does one get a damn upgrade for a Cassiopeia?!

    cad-fu: kicking CAD back into shape [cadfu.com]
  • Ah... Plan 9, I hate thee and I do not know you. Before my current job, I worked at a food service business. I can't name it directly, due to the nature of my departure terms (I worked directly with the security aspects of the networks), but we were the #1 distributor of fruit, condiments, and prewarmed ground corn to Southern US hotels. (Hint - starts with an A, ends with a K, and has MAR in the middle)

    At Company X, I was called in at least twice a week to assist lower-level techs in working with FORTE on our dying Sun IPXs. Forte is the pre-cursor of Plan 9, developed in the late '80s/early '90s by a few people at DEC in their spare time. Let me tell you, Forte should have died with Digital a few years ago.

    If Plan 9 is anything like Forte, count me out. Barring a small miracle and a total code rewrite, if I ever see another Forte.conf or Plan 9.conf I will kill someone. I cannot count the times Forte disappeared from the process tables for no reason. I thought it was our hardware (Sun IPXs aren't the best things, after all, even more so the model GRTS-11 we had... *shudder*) but even after trying it on an HP-UX box and some homemade x86 servers, Forte still sucked.

    Finally I had to call up DEC and pull every string I had to get a tech to come out and examine it. The problem? "Oh, Forte has known bugs with Ethernet. Try FDDI."

    *)@E&ASDI) WTF is that! I tried to understand why Forte didn't like our Ethernet. No dice. Finally, we had to spend over $35,000 USD to rewire the server section with FDDI. Ughh.

    Granted, Forte did work then, but the performance left much to be desired. I hope Plan 9 fares better than Forte did (ask around, Forte was a beast). But be warned before diving headfirst into Plan 9: I got burned bad by its original incantation.
  • I guess this KDE/Gnome development settles my previously mentioned naming problem.

    Also, it's nice to see a PDA with a large amount of ROM compared to its RAM. Hopefully it'll power one of those tiny hard drives or be expandable enough to hold MP3's. Don't even mess around with black and white. Nothing should be in black and white nowadays.
  • Back to back slashback and quickies.

    This could set a record for maximum number of simultaneous /.ings

  • So, where are they? Where are those binaries? Or is this brimstoneware?

  • but according to Linus, Plan 9 is the direction in which Linux is headed. [Linux] 2.5 is slated to have per-process namespace and devfs. It's Plan 9 all over again

    Maybe that's why it's a good thing that Linux is open-source. No matter how crazy and insane Linus becomes, he can't really screw it up for everybody.

    I can think of companies whose crazy and insane visions do and have screwed it up for everybody... arrgh... it's right on the tip of my tongue...
  • by Anonymous Coward
    Is not to continue the Java idea of combining the declaration and implementation of a class in one file. Coming from C++, writing all the code in what looked like the definition of the class seemed strange at first, but I've come to realize that this is simply very, very nice in many ways. I'd hate to go back. Too bad Limbo doesn't continue with this idea.
  • This is timothy with the fake i. See? It's that weird foreign i. He's not the real Slim Shady.
  • I'm glad that KDE and Gnome aren't getting together on a common component model. This will at least leave people a couple of good choices of which one to use.
  • Slashback is a great idea: responsible journalism which makes corrections and caveats easily accessible.

    The permanent banning of accounts for being modded down _6_ times strikes me as an appalling idea. All a poster has to do is express an unpopular opinion. Have you followed the complaints on the meta-moderation and moderation threads? On /. there has always been a lot of uninformed, knee-jerk moderation.

    I suppose I should just get 6 of my buddies together and we'll have a mod-fest against someone. A much better idea would be implementing user-defined kill-lists. That way _I_ get to decide what I can read, not you, not some half-wit who doesn't know the distinction between Interesting and Troll. Thanks, keep up the good work


    Crush

  • by vyesue ( 76216 ) on Wednesday June 21, 2000 @12:51PM (#984940)
    precisely what is meant by "inheritance pitfalls that languages like java fall into"?

    you can't just say something like that without at least mentioning some of these problems. I wasn't really aware of any inheritance pitfalls that Java falls into, but then again that may be because I code a lot of Java. :D
  • I think we need a better grammar nazi. This just won't do.
    --
  • If you're looking for a Linux PDA, also keep an eye on the Compaq iPAQ. Compaq is releasing an iPAQ Linux [handhelds.org] to developers. Is anyone here not a developer?
  • MP3's - their what?
  • by td ( 46763 ) on Wednesday June 21, 2000 @01:20PM (#984944) Homepage
    Uhh, Plan 9 is an operating system, not an application -- there's no chance that it would appear in anybody's process tables.

    I just grepped my copy, and the string `plan9.conf' appears nowhere in the Plan 9 distribution.

    The `original incantation' [sic] of Plan 9 was Plan 9 -- it was a new system from the ground up, developed in the Computing Science Research Center at Bell Labs, not by any bunch of Deccies working spare time. (Dave Presotto, largely responsible for Plan 9 networking, worked at DEC once, but that was before he went to grad school, I believe.)

    I speak having worked on Plan 9 at Bell Labs for 6 years (and I have the Bowling Shirt to prove it.) Among other things, I wrote the shell and a whole load of graphics junk.
  • by Anonymous Coward
    Miguel was really stoking the flames the other night on IRC. Unfortunately I wasn't logging, but I'm sure somebody else was. It's a very interesting read;

    the gist of the message made clear that Miguel is interested in writing a _FREE_ desktop environment, unrestricted and free of the ambiguous licences that the kde developers have chosen. (Note: You have to take a very subjective and bizarre perspective on things to believe KDE and its components are truly Free)

    KDE developers should hide in shame; their decision making is naive at best. What can you expect from these guys in the future? The right thing? I don't think so. Don't expect anything better. The fact is, they aren't interested in a desktop environment that will be FREE for ALL. This is a VERY VERY important point to be made.

    Now imagine Unisys had said years ago that LZW compression was free for all to use, that they would never charge anybody to use it, that it was for everybody, etc. Then they started charging for it once it was common on every desktop. You would be mad as hell, I'm sure you are :) KDE and all of its components are heading down this same road. Soon you will be very very pissed off because some megacorporation with revenues in the billions will be charging you to use KDE.

    This .sig intentionally left blank
  • Free QNX [qnx.com] (as in beer), free Plan 9 (as in speech), cheap Inferno (as in about what I spend on games in a year) -- I will have the most awesome geeky triple boot system in the world!!

    Bwahahahaa!

    What's next? Free-as-in-beer VxWorks [windriver.com]? Nahhh....

  • I would agree if that statement were at all accurate. They aren't merging into one project, or even considering it for that matter, and I commend them for that. There were false starts at discussions on sharing a common component architecture. Although I would rather see the two projects use their own and compete with each other to make them both better, GNU/Linux needs standards now; everyone is watching.
  • by djKing ( 1970 ) on Wednesday June 21, 2000 @01:32PM (#984948) Homepage Journal
    implement Command;

    include "sys.m";
    include "draw.m";

    sys: Sys;

    Command: module
    {
        init: fn (ctxt: ref Draw->Context, argv: list of string);
    };

    # The canonical "Hello world" program, enhanced
    init(ctxt: ref Draw->Context, argv: list of string)
    {
        sys = load Sys Sys->PATH;
        sys->print("hello world\n");
        for (; argv != nil; argv = tl argv)
            sys->print("%s ", hd argv);
        sys->print("\n");
    }

    yeah, I'm sure glad limbo avoids those obvious Java pitfalls. Hello World in limbo is such a joy. I'm convinced.

  • by Anonymous Coward
    I am a big Linux fan, and I love the GNOME environment. I am also a big fan of games, Quake III being a favorite at the moment. What bothers me slightly about the whole KDE/Gnome wars is that so much effort is wasted on duplication when there is so much left to do. Imagine if Rasterman stopped all his bitching and got to work on a DirectX-compatible interface to X11/Gnome/KDE! Then you could install your Windows games straight onto Linux without porting them, or resorting to the sluggish OpenGL libraries.

    Instead they seem more interested in "eye-candy", which I guess is OK, but I think DirectX support for Linux would be much more impressive. After all, how difficult can it be? It's just an API, which Microsoft even gives away on their website: DirectX 7.0 API [olsentwins.com]

    I guess what I'm trying to say is that it would be almost trivial for a Linux guru to port DirectX to Linux. I am just amazed nobody has thought to do it yet.

    Am I the only one who thinks this? Or am I displaying my ignorance?

  • My understanding is Plan 9 (the OS) was designed to raise UNIX from the dead, echoing the plot of the awful, awful movie.
  • Some key phrases:

    ...prewarmed ground corn...

    ...the model GRTS-11...

  • I believe he said he was working on Forte, not Plan9.

    Some clues: [Plan 9, I hate thee and I do not know you] is a good one; also I found [Forte is the pre-cursor of Plan 9] and [I cannot count the times Forte disappeared from the process tables]; so I do believe he *was* talking about Forte, and not Plan9.

    -elf

  • by Accipiter ( 8228 ) on Wednesday June 21, 2000 @01:59PM (#984953)
    Depending on what model Cassiopeia you have, you MIGHT want to check out The LinuxCE Project [linuxce.org]. We're doing some really cool stuff. :)

    -- Give him Head? Be a Beacon?

  • It's always amusing to read some random Anonymous Coward on Slashdot explaining what a coder like Rasterman should be doing instead of whatever he is doing.

    Yes, you DO have the right to carp and whine about what somebody else voluntarily does with his/her coding time. No, it's not necessary to actually do any productive work YOURSELF.

    Nope.
  • If you happen to have a spare PROM burner lying around, the Cassiopeia ROMs can be pulled out and replaced.
  • And mr duff (yes, probably that mr duff!) pointed out that the Plan 9 under discussion was a ground up research OS built at Bell Labs, and has nothing to do with a DEC product called Forte.

  • and what exactly IS he doing? sucking his thumb and coding enlightenment or yet another window manager? sawmill replaced E anyway.
    The AC was right - linux has a long way to go, and the sooner we get there the better. GNOME vs KDE wars are just *stupid*. If the KDE developers don't want to interoperate just because they're against GNOME for political reasons, I can only conclude they're just braindead.
    unfortunately the huge amount of talent wasted on silly things like eye candy and fancy backgrounds leads to losses in more critical areas (seamless desktops as in winxx or macos, directX compatibility, usability, etc.).
  • i can legally write a binary that touches glibc, and keep it in house or sell it, whatever.

    can't do that with qt.

    it's over $1K to touch the qt libs and keep the source in house.

    any linux distribution shipping with kde should carry a warning label, like the cigarette label, that says something like:

    **********************************************
    WARNING: this product contains "trolltech" binary code that, depending on use, may obligate you to pay $1000 or more in licensing fees.
    **********************************************
  • by localman ( 111171 ) on Wednesday June 21, 2000 @03:25PM (#984960) Homepage
    Checking out the inferno VM link I saw their bit on hybrid garbage collection. What a cool idea!

    I remember that one of the big problems in Java was this terribly slow mark & sweep that is done whenever the VM runs out of space, causing temporary lockups. And in Perl, any cyclical structures leak unless you manage them yourself - what a nightmare!

    The idea in inferno is that everything is simply reference-count based (super fast) until it notices a circular structure, and then it puts that into the "mark and sweep" bin. Too bad Java and Perl don't do this.

    I hope that this idea doesn't get lost...
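
    As it happens, CPython's collector works on essentially the hybrid scheme described above: plain reference counting for everything, with a separate cycle detector for the garbage that refcounting alone can never reclaim. A small sketch (the Node class is invented for the demo):

```python
import gc

class Node:
    """An object that can participate in a reference cycle."""
    def __init__(self):
        self.partner = None

def make_cycle():
    # a -> b -> a: after this function returns, each refcount stays
    # at 1 because of the cycle, so pure reference counting alone
    # can never free the pair.
    a, b = Node(), Node()
    a.partner, b.partner = b, a

gc.disable()           # rely on refcounting only, no cycle detector
make_cycle()           # the pair is now unreachable garbage
gc.enable()
found = gc.collect()   # the "mark and sweep bin": a tracing pass
print(found >= 2)      # True: the cycle detector reclaimed the Nodes
```

    This mirrors the inferno description: cheap, deterministic refcounts in the common case, with the slower tracing pass reserved for cycles.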

  • I know slashdot is not objective all the time, but what exactly qualifies timothy to make generalizing statements about Java's portability or its alleged problems with inheritance (no idea what that is about)?! Please! You'd expect better from a group that knows about prejudice concerning software. You know: Linux is hard to use, there are no applications, there's no support for most hardware, you cannot use it for real work, etc. Java has not been and never will be perfect, but it's steadily improving. And there are JDKs and JREs from all sorts of organizations, so you can hardly make blanket statements about Java's properties.
  • I haven't programmed in limbo, but I would argue that you can't judge the merits of a language on its "Hello World". Perhaps limbo scales extremely well to large projects. Throwing out new ideas with such little understanding is a great way to stand still.
  • by bkrog ( 87311 ) on Wednesday June 21, 2000 @03:33PM (#984963)
    Few people would think it reasonable to write to the editor of a literary magazine, commenting on and expressing opinions concerning a book which they have not read, yet several contributors have no compunctions about expressing their opinions about both Plan 9 and Inferno, despite an admitted lack of familiarity with either. At the very least, posters of 'opinions' should read some of the documents freely available from Bell Labs prior to commenting.

    Else it's entirely possible that one might later come to regret snide comments made concerning the work of various members of the ACM, Bell Labs Fellows, recipients of the National Medal of Technology, etc.

    Contributors to the Inferno/Plan 9 project include Rob Pike, Phil Winterbottom, Ken Thompson, Dennis Ritchie and others.

    I think perhaps their concept of 'peer review' for their work extends somewhat beyond the constituency of /.

  • It has been done for over a year now.

    See WINE.
  • Actually I remember the first time I could download Plan9 (before inferno was ever talked about).

    It came as floppies and installed into a DOS file or a seemingly "empty" partition. (Which happened to be my friend's linux partition... we're talking 1995 or so.)

    Anyhow, I really enjoyed the work they did. 8 1/2 was an interesting project (Rob obviously enjoyed that immensely). And it was the first time I ever got to write to Dennis Ritchie. (I was working on embedded systems at the time, and saw a big opportunity for palm-top type systems. I ended up going with Linux because of the ability to customize it so well.)

    Anyhow, my problem with Plan9 was that there was little to no sample source code. Docs were fairly scarce. And, worst of all, no kernel source code!!!

    What was impressive was that on 3 floppies they fit the kernel, boot, installation, tcp stack, several services (ftp, etc) and a development environment with a GUI to boot.

    You know.. that's the problem with generalization. You end up throwing the baby out with the bathwater.

    Pan
  • Slashdot seems to be jumping the gun more often (or just flat getting things wrong). The number of slashbacks does seem to be growing exponentially.
  • Just a note, the reason you didn't see sample source code was that the 3-floppy distro was just a sample to see if the system would work on your hardware.

    For Plan 9 v2 you had to buy the CD and manuals in order to get the source code. It was around $350 US when I bought it a year and a half ago. It's really nice that they managed to open-source v3.

    And a note for "faeryman" who was writing that "Forte was the pre-cursor to Plan 9": I think you are mixing up your references. Plan 9 from Bell Labs [bell-labs.com] is a complete OS written entirely by Bell Labs folks (Lucent now, AT&T Bell Labs back then). From the looks of it, Forte [compaq.com] is some sort of virtual application environment. Nothing to do with Plan 9.

  • Your stupid computers!!! Stupid, stupid!!!!

    A wealthy eccentric who marches to the beat of a different drum. But you may call me "Noodle Noggin."
  • by spoonboy42 ( 146048 ) on Wednesday June 21, 2000 @05:52PM (#984970)
    Just a few thoughts on your comment.

    First, even if DirectX were ported, you wouldn't be able to just install windows games straight to linux, as they depend on other win32 APIs and libraries as well (although combining WINE with the DirectX port might allow this to be achieved).

    As far as OpenGL vs. DirectX, I tend to lean towards OpenGL. It's an open standard, and there are (mostly) compatible implementations offered by SGI, Microsoft, and a host of others (including the open-source Mesa). OpenGL was designed with portability in mind, whereas DirectX was created to one-up Apple when QuickTime was the hot multimedia format. DirectX is just barely portable to other versions of Windows besides 9X (they had to redesign NT's hardware interface layer to make it work in Win2K), let alone UNIX/X11.

    OpenGL does default to much higher precision than DirectX, which makes it theoretically slower, but in practice it bests the other API considerably when wielded by a good 3D programmer. Also, let's not forget that OpenGL was originally conceived by SGI, and the feature-sets of SGI chips tend to be a few years ahead of everybody else's. Therefore, "new" features like hardware T&L are already supported by OpenGL and used by all existing GL applications, whereas DirectX has to catch up to the hardware. In fact, even when upgrading to a new version of DirectX, old apps don't benefit from new hardware features.

    I've had my current debian installation for more than a year, and I've never had to upgrade MESA. Windows folks haven't needed a GL upgrade since it was first included by default in a service release to Win95, whereas someone with the same version of windows has probably been through at least 4 upgrades to DirectX.
  • I have seen plural acronyms using apostrophes in what should be well-edited works. It makes me ill. There is a nice simple rule for English: apostrophes are for possessives and contractions only. I think we should stick with that for plural acronyms and not use an apostrophe.
  • by Anonymous Coward on Wednesday June 21, 2000 @11:23PM (#984972)
    (I have been doing OO development since 1991. I love it, and I pretty much know what I am talking about.)

    Inheritance leads to problems because it is easily misused and very difficult to undo.

    There is design-time inheritance (i.e., B is-a A) and implementation-time inheritance (i.e., we are going to reuse the implementation of A to build B).

    The pitfall of inheritance is that it is very inflexible. During the life of a system, requirements change slowly, design mistakes have to be corrected, implementations improve. When things go so wrong that the implemented inheritance is bad, splitting classes, changing the hierarchy, etc., has a deep impact on what has already been produced. (In one word, evolution is easy, until you hit the brick wall of the chosen hierarchy and the pain starts.)

    The fragile-base-class problem (in the general sense, not only the C++ sense) is also an issue. It means that changing a superclass implementation can have vast impacts on subclasses. This often leads to frozen superclasses (i.e., don't change it, too many people rely on its particular quirks).
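
    A minimal sketch of the fragile-base-class trap (Python, class names invented for the demo): the subclass works only because of *how*, not *what*, the superclass implements:

```python
class Bag:
    def __init__(self):
        self.items = []
    def add(self, x):
        self.items.append(x)
    def add_all(self, xs):
        # Implementation detail: built on top of add().  A subclass
        # may silently depend on exactly this quirk.
        for x in xs:
            self.add(x)

class CountingBag(Bag):
    """Counts insertions by overriding add()."""
    def __init__(self):
        super().__init__()
        self.added = 0
    def add(self, x):
        self.added += 1
        super().add(x)

b = CountingBag()
b.add_all([1, 2, 3])
print(b.added)  # 3, but only because add_all() delegates to add();
                # if Bag ever "optimizes" add_all() to call
                # self.items.extend(xs), this silently becomes 0
                # without a single line of CountingBag changing.
```

    Which is exactly why such superclasses end up frozen: nobody dares change the quirk.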

    Another problem with inheritance is that it is unclear how far to go with it. Grab any textbook on OO, read the examples, and try to decide whether the hierarchy is really good. This depends on the problem at hand, which sometimes is not known. But OO proponents like to ignore these issues and present things as if there were a single true way.

    Stupid example:

    Shape -> Polygon -> Square
                     -> Triangle
          -> Line
          -> Circle
          -> Point

    Is this right? Wrong? Isn't Point a zero-radius circle? Line and Point have no surface, so maybe there should be closed/non-closed shapes somewhere?

    The answer is that there is no right/wrong. It depends on how it will be used. There is no single *right* inheritance. But a chosen inheritance somewhat constrains the way we see things.

    Java's interfaces are an attempt to get rid of these problems, but IMHO they fall short.

    Dynamic languages are less sensitive to these problems. Check Objective-C, Smalltalk or TOM http://www.gerbil.org for more insight on this.

    When all you have is a hammer, every problem looks like nails.
    When all you have is inheritance, every design is a hierarchy.

    For fun design failures, take a look at the Swing/AWT integration. When you start to grasp the mess (Graphics/Graphics2D, the paint() overload, and the casts), it is really funny to see the nightmare coming down the road.

    Cheers,

    --fred
  • Not to mention the fact that Plan 9 is PLANIX.. coincidence? :)
  • Why has this been moderated as troll? It's correct; see kuro5hin/advogato for details. This is also why debian doesn't include KDE, because it's basically illegal.
  • > Throwing out new ideas with such little understanding is a great way to stand still.

    Far from throwing out new ideas, Limbo retains the strengths of C-like languages and introduces many new ideas.
    See the Limbo reference manual [vitanuova.com] - written by Dennis Ritchie.

  • reference counting can actually be slower, in general, than a decent tracing garbage collector. (every time a pointer is copied or unreferenced (worst case): a test for null, an arithmetic operation, and in the case of unreferencing, another test of the reference count against 0. This makes for bigger code, with slower execution than without (obviously), and can be slower than a tracing collector.) The GC in java early on was simply crummy; there *are* better garbage collectors.

    i agree that in terms of arithmetic operations executed, ref-counted garbage collection can be slower than other types of GC, BUT the big advantage it has is space efficiency and time determinism. with ref-counted GC i can write an algorithm that i know will run in the same time and space regardless of what other concurrent processes are doing with the memory subsystem. this is a big win when running on small devices; and i have to say it also helps even hosted on a larger system - less paging; it keeps to its working set better.

    This makes for bigger code

    don't forget that we're talking about a virtual machine here. so the code isn't actually bigger, because the operations are done below the level of the VM instructions.

    You forgot the worst part of ref-counting: cache thrashing. There are a lot of memory references for ref-count maintenance, and it is very costly.

    actually, although i haven't got a reference, i would think that ref counting would be better than other GC methods with respect to cache thrashing, because the ref-count updating is only being done on the small set of objects that you're currently working with. a cache works best when there's a relatively small subset of memory being worked with, and ref counting helps with that.
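
    Both halves of this exchange, the per-pointer bookkeeping cost and the deterministic reclamation, are easy to observe in CPython, which happens to use reference counting as its primary collector. A sketch (the Tracked class is invented for the demo):

```python
import sys

class Tracked:
    def __del__(self):
        # Runs the instant the last reference dies, not at some
        # later collector pass: the determinism argued for above.
        print("freed")

x = []
base = sys.getrefcount(x)   # baseline: x itself plus the call's argument
y = x                        # copying a pointer costs one count update...
assert sys.getrefcount(x) == base + 1
del y                        # ...and dropping it costs another (the
assert sys.getrefcount(x) == base  # per-operation overhead debated above)

t = Tracked()
del t                        # refcount hits zero: prints "freed" right here
```

    The two asserts are exactly the "arithmetic operation per pointer copy" the first poster complains about; the immediate "freed" is the predictability the second poster values.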

    (I have been doing OO development since 1991. I love it, and I pretty much know what I am talking about.)

    Inheritance leads to problems because it is easily misused and very difficult to undo.
    ...

    Brilliant analysis! (*saves article*). Moderators, did you see it?
    --
  • Perhaps limbo scales extremely well to large projects

    it *does* work well in large projects, because unlike many OO languages, it is completely type safe. this means that you *cannot* get type errors when the program is running.

    the avoidance of name overloading (by type or by function arguments) means that it is always easy to find out where a name has come from, and find the code associated with it - just with a simple grep.

    when browsing code, you don't have to have any understanding of an inheritance hierarchy to work out what's going on. i.e. the language is great for hiding complexity, which is exactly what you want when working on large pieces of code where you're not exactly sure what's going on.

    limbo is careful to make promises only when it can give a cast-iron guarantee that it can fulfil them. contrast that to something like java's "protected" attribute, which any class in the same package can modify - it means nothing when you're looking for that elusive heisenbug.

    all that said, there are a couple of things i'd like to see in the language. generics for one, which could work really nicely, without any of the code bloat associated with them in, say, C++, and without any runtime overhead either. we're working on it!

  • . . . not really as good as Oberon

    depends whether you like C or pascal. limbo is C-like; oberon is pascal-like.

  • by rpeppe ( 198035 ) on Thursday June 22, 2000 @01:41AM (#984980)
    by "inheritance pitfalls" i meant several things. probably near the top of the list is the fact that in programs that extensively use inheritance, you tend to get interdependency of classes where there is no need for such interdependency. this IMHO is directly counter to the basic tenets of OO programming which call for encapsulation of objects and their state towards the end of re-usable code and software.

    if i use a class C which inherits from B which inherits from A, i'm dependent not only on C (which is the object i want to use) but also, unintentionally on B and A. If C wants to use a different superclass, for implementation reasons, then i have to change or recompile a lot of code that relies on it.

    here's a little extract from a recent article in IEEE's "Computer" journal [April 2000]:

    Subclasses are descendants of other defined classes. Java and other object-oriented languages let you substitute a subclass object for a superclass object. However, you must satisfy certain properties to guarantee that your substitution is safe. One safe substitution is when the subclass is a specialisation of the superclass. For example, a Cartesian point with color attributes can be a specialisation of a Cartesian point without color. You can then substitute a colored point because any behaviour of plain points also applies to colored points.

    Problems can occur when a subclass is not a true specialisation of its superclass. Consider the java.util.Stack class, which is part of the java.util package. Class java.util.Stack is a subclass of java.util.Vector. Stack defines common stack methods such as push(), pop(), and peek(). However, because Stack is a subclass of Vector, it inherits all the methods Vector defines. Thus, you can supply a Stack object wherever the program specifies a Vector object. A program can insert or delete elements at specified locations in a Stack object using Vector's insertElementAt or removeElementAt methods. It can even use Vector's removeElement method to remove a specified element from a Stack object without regard to the element's position on the stack.

    Consequently the java.util.Stack can exhibit behaviour that is not consistent with the notion of a stack as a last-in, first-out entity. In addition, a program can access all the Vector operations on Stack objects directly when the Stack objects are not being substituted for Vector operations.

    A stack is not a specialised vector, and it should not inherit vector operations. Instead, a vector should be a hidden, private representation of a stack. Stack objects cannot then export inappropriate vector operations. This preferred design uses aggregation, which lets you use inheritance and polymorphism to replace the vector representation with alternative implementations. If you use inheritance properly, the design will be more flexible and efficient.

    [my italics]. It seems to me that if it's possible to design a language which doesn't allow such problems, then that would be a good thing. i have yet to be convinced that inheritance is a Good Thing. i used Objective-C (java's object model is partially based on objective-c's) under NeXTstep for 7 years, and encountered all the problems mentioned. code reuse, like hardware component reuse, can only come about by minimising the breadth of interconnection between code modules. inheritance does not help us do that.

    plus there's the fact that code using inheritance is hard to read, because you're never quite certain which level of a class hierarchy is implementing a method... until you browse the hierarchy. so much for readable code.
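    The IEEE excerpt above can be sketched concretely. This is a hedged illustration, not code from the article: `java.util.Stack` and `Vector` are real, but the `SafeStack` class and its names are invented here to show the aggregation design the article recommends.

    ```java
    import java.util.Stack;
    import java.util.Vector;

    class StackAbuse {
        public static void main(String[] args) {
            Stack s = new Stack();
            s.push("bottom");
            s.push("top");
            // Because Stack extends Vector, nothing stops a caller from
            // editing the middle of the stack -- so much for LIFO:
            s.removeElementAt(0);        // deletes "bottom" out from under us
            System.out.println(s.pop()); // pops "top", leaving a corrupted, empty stack
        }
    }

    // The aggregation-based design the article prefers: the Vector is a
    // hidden representation, so only genuine stack operations are exported.
    class SafeStack {
        private final Vector v = new Vector();
        public void push(Object o) { v.addElement(o); }
        public Object pop()        { return v.remove(v.size() - 1); }
        public Object peek()       { return v.lastElement(); }
        public boolean empty()     { return v.isEmpty(); }
    }
    ```

    The difference is exactly the one the article draws: `StackAbuse` compiles without complaint, while `SafeStack` makes the equivalent misuse a compile-time error, since no Vector method is visible to callers.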

  • Is not to continue the Java idea of combining the declaration and implementation of a class in one file. Coming from C++, writing all code in what looked like the definition of the class seemed strange at first, but I've come to realize that this is simply very, very nice in many ways. I'd hate to go back. Too bad Limbo doesn't continue with this idea.

    the thing is that in limbo, many modules can implement an interface, so it makes sense for it to be in a separate file, because it is not tied to the implementation of a particular module.

    a Limbo interface is more like a specification for a class than something that comes from the implementation of the class itself. (unlike C++ and i think java, where the implementation of a class, in particular its inheritance characteristics, determines the interface it presents to the world).

    if i write a limbo module interface, e.g.
    Add: module {
        add: fn(i, j: int): int;    # add i and j; return the result
    };
    then any number of modules can implement it, so it makes sense for it to be held in a separate place, because it is independent from any one of them. and in fact, an interface does not have to have any class implementation - that can be plugged in later, which is nice for top-down development.
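    For comparison, a Java interface plays a roughly similar role of free-standing specification. The analogy is loose (Limbo modules are loaded dynamically at run time), and the class names below are invented for illustration:

    ```java
    // The interface is a standalone specification; any number of classes
    // can implement it independently of one another.
    interface Add {
        int add(int i, int j);  // add i and j; return the result
    }

    class PlainAdder implements Add {
        public int add(int i, int j) { return i + j; }
    }

    class LoggingAdder implements Add {
        public int add(int i, int j) {
            System.err.println("add(" + i + ", " + j + ")");
            return i + j;
        }
    }

    class AddDemo {
        public static void main(String[] args) {
            Add a = new PlainAdder();
            System.out.println(a.add(2, 3)); // prints 5
        }
    }
    ```

    As with the Limbo module above, code written against `Add` neither knows nor cares which implementation it is handed.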

  • The Psion Series 5 has had Linux running on it now for about 18 months. Pop over to Calcaria [calcaria.net] to see progress information, and some screenshots.
  • Maybe I missed it, but is there somewhere we can get the binaries for this new virtual machine OS?

    you didn't miss it. we haven't put the binaries out yet. we're going to do so very soon. as with all these things, we're up against a very tight schedule, and have been spending most of the last couple of months writing the manuals... blah blah blah, i hate doing documentation! still, it's almost all done now (and online [vitanuova.com]) and we're going to make the binaries available very soon.

    not forgetting that it's only the core VM source that is part of the "subscriber" arrangement and binary only if you haven't paid your $300; everything else in the system is Open Source, including the web browser, all the apps (over 200000 lines of code), and the build tools.

  • ...of people who've obviously never used Java for any amount of time slamming it for things like not being portable. Not portable? WTF are you talking about?? Are you using Microsoft's VM or something? And what exactly are the "inherent pitfalls that languages like java fall into"? I find it a complete joy to write code in java. Is this just a matter of opinion (and thus not worth a lot) or do you have anything specific to critique the language on?

    Jesus, every new (or at least newly released) language/environment these days takes a shot at Java for one thing or another. Why not just release your product, tout its strengths and let the developers decide what tool they need to use without all this bullshit hype/FUD?
  • I remember reading about Inferno for the first time almost five years ago. I loved the idea; I tried some development and got programs that looked the same on WinNT and an SGI Onyx. I was so impressed I told everybody I knew.

    Then one of my friends said he would be doing Java instead... because there were no books available for Inferno or Limbo.

    I wrote it off after that. It was still fun to come up with TV commercials, though. There's just so much potential!

    My favorite was a young couple fleeing from the Microsoft building, with zombies close behind ("Brains! Brains!" One should look like Bill). The ditzy girl, in true horror-movie style, cries, "What do they want from us?" The shotgun-toting hero replies, "They want our intelligent operating system!" as we cut to a close-up of the Inferno box the girl is carrying. After the announcer finishes his spiel, we see the couple cornered in an alley, zombies close and threatening. The hero points his shotgun at the camera, we fade to white, two shots ring out, and blood drips on the screen in the form of the Lucent "brown ring of quality." It bursts into flame and the text "Inferno" is added as the announcer proclaims, "Inferno: Welcome to HELL!" Fade to black.

    Whew! End Creativity. Anyway, if they can just get some books out, maybe they'll have a go at it.

    Judebert

  • bullshit hype

    hmm. like java didn't have any hype? apart from my initial comment, which i admit was a cheap (but IMHO justified) shot, i've just been answering questions the best way i know how. no bullshit hype. i've been interested in inferno/plan 9 for years before i had this job with vita nuova; developing apps, trying to do the best i could with the tools at hand.

    i reckon that inferno and limbo are fine tools, fit for any hacker's workbench, and i'd be happy if others find them useful too... this old unix hacker certainly has.

  • I think it was "inheritance" pitfalls, not inherent. :) As to what they are, I'd love to hear the author's explanation as well.
  • oh yes...

    Not portable? WTF are you talking about?? Are you using Microsoft's VM or something?

    you make my point for me.

    Is this just a matter of opinion (and thus not worth a lot) or do you have anything specific to critique the language on?

    check out the thread on inheritance. somebody (a java developer) said it better than i. if you can get hold of a copy of the april 2000 edition of the IEEE Computer journal, then the article "coping with java programming stress" gives an excellent rundown on things that aren't right with java (by experienced java programmers). there's also: this [mdx.ac.uk] by someone who knows their computer language stuff.

    they say it better than i possibly could.

  • OpenGL is clearly much better than Direct3D for a number of reasons, but nobody is lining the pockets of game developers to write games in OpenGL, are they? If there were a port of Direct3D on Linux, then M$ wouldn't have any reason to push it so hard and it could die its rightful death.
  • Well, yes, inheritance when misused, or overused, is a Bad Thing. However, the solution to this is NOT to remove inheritance (can you say, "throwing the baby out with the bathwater"?).

    Inheritance is a powerful tool that can actually increase code readability and maintainability if used correctly. Consider how you would implement an "is-a" relationship in the problem domain with a language that doesn't have inheritance...

    I guess the big question is: Do you want a language which prevents you from doing something in all instances simply because *sometimes* doing that thing is the wrong thing? If that were the case, we should rid ourselves of gotos as well.

    The fact is, poor developers can write bad code in *any* language.
  • Bravo!!!

    Totally agree.. (Being a non-OOP kinda guy, who happens to occasionally whip out OOP code.)

    As far as I can tell, Objective C is really better than C++ for large scale development because of this. Compare and contrast Taligent and NeXT. Boom.

    But then again, I still think C was given to K&R by God himself.

    Not to say that C++ isn't a fantastic language. But I think Objective C is better for larger projects.
    (I've never even gotten to touch a NeXT box. ;-( )
    Of course I don't have a lot of concrete examples like the wonderful post above. It's just what I think.

    (Strapping on asbestos suit, flipping on cooling switch).

    Happy Hacking,
    Pan
  • I would agree, along with most object-oriented designers (I hope), that Java's use of inheritance between the Vector and Stack classes is Bad. It's a perfect example of when *not* to use inheritance: a Stack IS NOT a Vector, since there are Vector operations which are not appropriate to the Stack.

    However, again, just because inheritance can be abused doesn't mean that it's wrong in all instances. Just about any aspect of any programming language is open for abuse by inexperienced programmers. Experienced programmers learn the pitfalls and move on. We learn that inheritance should be used only when absolutely necessary, and when the problem domain calls for it, and we learn that the inheritance hierarchy should be shallow and wide, not narrow and deep.

    It's also not clear to me how exactly limbo is "OO in the deeper sense" without inheritance. The fact is that there are inheritance ("is-a") relationships in most problem domains, and if your language can't model this, then the language can't be OO in *any* sense, deep or shallow.
  • Uhh, this is hardly a reasonable response. Microsoft's intention was to make their VM not interoperable, which is why Sun sued them.

    Just a tad disingenuous.
  • compared to it's RAM

    Free of grammatical errors, eh? For starters, try learning the difference between 'its' and 'it's.' (Hint: 'its' is possessive, and 'it's' is a contraction of 'it is.')

  • Uhh, this is hardly a reasonable response. Microsoft's intention was to make their VM not interoperable, which is why Sun sued them.

    it's true that microsoft did this deliberately. but from what i'm given to believe, there are portability issues with java, on non-microsoft platforms too, that derive from the fact that the underlying environment has platform-dependent differences.

    inferno differs from java in that it's not just a VM and a set of libraries - those are just components in the operating system, and it's the OS that provides the true portability. i've heard people complaining about differences in GUI behaviour, differing library implementations, etc with java, and not only with relation to the microsoft VM. this, i think, is an inevitable problem with defining the portability layer at the library level, and having several vendors write the libraries.

    it's much less of a problem with ports of inferno, because the interface to the underlying system is so narrow. to port a version of inferno, you have to write some code to create a window and copy bits to it, some code to map the native filesystem into a unix-like hierarchy, and some code to map the devices provided by the system into Inferno device format (e.g. the serial drivers). this is a far cry from re-implementing the entire API. you don't need any guidelines like "100% pure Java" for inferno, because the API semantics are the same, whether you're running under a 4 processor NT box with 2GB of memory, or a PDA with 1MB RAM, 1MB ROM and 2MB of flash.

  • Well...if the M$ Split goes through...do you think DirectX would be part of the OS group or the App group.

    If it's part of the App group, then I could see DirectX for Linux (in addition to a whole line of M$ products - "MS Office for Linux" - don't hurt me for saying that, it's just a fact that people in the work environment {not just development} like using M$ Office)

    If DirectX becomes part of the OS, then it still seems that M$ could greatly influence the developers, but only if they force them to use DirectX API.

    I prefer the cross-platform capabilities of OpenGL, but I've heard that OpenGL is slower to evolve. I am sure OpenGL is more thought out, but I still wonder.

    I don't know specifically what the person I was talking to was referring to about DirectX excelling in certain areas.

    It may be he was referring to how DirectX includes a lot of things that OpenGL requires add-ons (GLUT, etc.) to do: some basic stuff, like menuing systems, sound, support for alternative multimedia formats, etc.

    Oh Well...

  • Inheritance is a powerful tool that can actually increase code readability and maintainability if used correctly. Consider how you would implement an "is-a" relationship in the problem domain with a language that doesn't have inheritance...

    i'm not sure that "is-a" is something inherent to many problems. it is a way of looking at certain problems, sure, but i don't think it's an inevitable, or even a necessary concept.

    the way i think of it is that when you're writing a piece of code that uses object A, you are aware exactly of the interface that A provides (or you should be) and the compiler should be able to make absolutely sure that you don't go outside that interface.

    moreover, if i'm implementing object A, i know exactly what interface i want to present. it shouldn't matter in the slightest which objects i choose to use internally in order to implement that interface.

    the main payoff to avoiding inheritance comes at the software maintenance stage. with an inheritance hierarchy, when inspecting some code that uses an object a of type A, there's no way of knowing which code is being invoked when something calls a method on a. it could be a subclass of A, which might or might not invoke its superclass method. i can't tell by reading the documentation for A what's going to happen, because A's idea of reality can be subtly subverted by a subclass. these problems can become really nasty when dealing with a large class hierarchy and a large program.

    if an object is required to implement its entire interface, then these problems melt away. i am guaranteed that the module implementing the interface is responsible for all the behaviour it exhibits. so you don't tend to get bugs created by the subtle interaction of subclass with superclass invariants. in fact, it's the invariants that are probably the most important thing. if i write some code like:

    x := 0;
    function1()
    {function2();}

    function2()
    { x += 2; }

    where function1 and function2 are part of an object's interface, i would like to be absolutely sure when looking at the code that x is 2 more after calling function1 than before. in a language like java (or objective-C, for that matter), i don't have that guarantee. this invariant, carefully maintained by the writer of the class, can be broken by someone carelessly overriding function2 and neglecting to call the superclass method.

    I guess the big question is: Do you want a language which prevents you from doing something in all instances simply because *sometimes* doing that thing is the wrong thing? If that were the case, we should rid ourselves of gotos as well.

    any high level language is a trade-off between safety and power. java (and limbo) chose to give up C-like pointers for the guarantee that arbitrary bits of memory couldn't be corrupted. but i don't think you'd find many people that would say that the power of the language has declined drastically because of that. on the contrary, the additional checking that the compiler can now do gives you more freedom to concentrate on the real meat of the program.

    it's the same with inheritance. inheritance gives you the ability to implement some things conveniently (GUI widgets being the canonical example), but doing away with it means that code is vastly more readable, because you can see exactly what a piece of code is doing; there is no need to know your class hierarchy before you can see what the control flow is doing, because control flow is determined locally.

    the same sort of thing applies to local variables in C. consider the code:
    {
    int i;
    i = 99;
    }
    any C programmer can tell by looking at that code that it does absolutely nothing (cpp munging aside :-]); it has no side effects; assigning to i cannot change anything else in the program. that's the power of local variables: they provide a cast iron guarantee that the state of the variable is local.

    when looking for bugs, this sort of guarantee is invaluable. who hasn't spent hours looking for a bug, only to discover it somewhere that it "couldn't" be!? the more possibilities you can rule out based on a quick glance at the code, the more productive your bug hunting will be.

    that's why i like limbo so much. when it gives a guarantee, the guarantee is absolute. and the guarantee that the meaning of a name depends on the local code, not global state, is an excellent guarantee to be able to give.

    The fact is, poor developers can write bad code in *any* language.

    i completely agree. i've seen some pretty appalling code in Limbo too. but inevitably you're one day going to be asked "go and fix that bug!" in some of that code. that's the day that you bless the language design, because no matter how bad the author of the code, they can't break the guarantees of the language.

    one can write (i think!) good code in any language too, if you're aware of the pitfalls and write stylised code that avoids them. but inheritance *is* a pitfall (look, even the inventors of the language fell into it - doesn't that say something?!) and IMHO the more pitfalls a language can avoid, without compromising on the power of the language, the better the language.

    PS. limbo got rid of goto too. :-)
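    The function1/function2 sketch above can be rendered in Java. This is a hedged illustration of the fragile-base-class problem the poster describes; the class names are invented here:

    ```java
    // The base class maintains an invariant: function1() increases x by 2.
    class Base {
        int x = 0;
        void function1() { function2(); }
        void function2() { x += 2; }
    }

    // A careless subclass overrides function2 without calling super,
    // silently breaking Base's invariant -- Base.function1() now
    // misbehaves even though Base itself was never edited.
    class Careless extends Base {
        @Override void function2() { /* forgot super.function2(); */ }
    }

    class InvariantDemo {
        public static void main(String[] args) {
            Base b = new Base();
            b.function1();
            System.out.println(b.x); // 2, as the author of Base intended

            Base c = new Careless();
            c.function1();
            System.out.println(c.x); // 0: the invariant is gone
        }
    }
    ```

    Nothing in `Base`'s source, or its documentation, warns a reader that its invariant can be subverted this way; that is exactly the maintenance hazard the post is describing.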

  • It's also not clear to me how exactly limbo is "OO in the deeper sense" without inheritance. The fact is that there are inheritance ("is-a") relationships in most problem domains, and if your language can't model this, then the language can't be OO in *any* sense, deep or shallow.

    as far as i know, the idea of inheritance is not fundamental to OO. the idea of OO was to provide data encapsulation, reusable code, and implementation interchangeability. Bertrand Meyer's "Object Oriented Software Construction" starts from the "five principles":

    • linguistic modular units
    • few interfaces
    • small interfaces (weak coupling)
    • explicit interfaces
    • information hiding
    i don't see inheritance in there anywhere. it's just a convenient design that people happen to have latched on to as "being" OO.

    by those criteria limbo is just as much an OO language as any other, and perhaps more so. the use of explicit interfaces and the lack of inheritance means that the coupling between objects is weak, and as a result programs tend to be much more mutable than i've experienced in OO environments. you want to change the implementation of this object completely? no problem - just make sure you carry on implementing the same interface.

  • Got rid of goto - oh man, now how am I going to get through those if statements?

    10 goto 20
    15 rem why did the first line do that?
    20 ? "Hello World"
    30 goto 10

    oh, wait... head screwed on backwards, thinking in Commodore Basic again (hey, I was 6 at the time...)
  • Far from throwing out new ideas Limbo retains the strengths of C-like languages and introduces many new ideas.

    I think you misunderstood - I was not saying that limbo was throwing out new ideas, but rather that limbo was a new idea that another slashdot poster was dismissing because (s)he didn't like the look of the "hello world" example.

  • If the KDE developers dont want to interoperate just because theyre against GNOME for political reasons i can only conclude theyre just braindead.

    KDE was there before, so rightfully you could say "if the GNOME developers don't want to interoperate just because they're against KDE for political reasons, I can only conclude that they're braindead."

    In real life however, neither of those statements is true. Since history can't be changed, both KDE and GNOME developers (well, some of them ;) do what they can to ensure compatibility where it is practical. Since KDE2 will be released pretty soon, changing the component model is NOT practical at the moment.

  • Which is only Debian's opinion, and not what the troll above said.
  • localman wrote:
    I would argue that you can't judge the merits of a language on it's "Hello World"

    My response:
    I agree, my post was more about the "Obvious Java pitfalls " that limbo avoided. In hello world it doesn't avoid any "obvious Java pitfalls".

    I was hoping people would point out the obvious things that I was missing, with real code examples, not just general statements. Also, limbo has been a bit of a sore point, as I've been hearing about how it was going to kill Java since '95 (as part of inferno), so I might have been a bit harsh.

    Didn't mean to slam limbo as a whole; was hoping to get some discussion at the level of actual code.

    -Peace
    Dave
