
Two Years Before the Prompt: A Linux Odyssey

tim1980 writes "Derek Croxton has written a rather long editorial on how he sees the Linux and Open Source communities, along with his personal experiences with Linux. The editorial, titled Two Years Before the Prompt: A Linux Odyssey, runs over 3,500 words. Excerpt: 'A novice's greatest fear is sitting in front of a motionless command prompt with no idea what to type; or, as so frequently happens, knowing a command that he copied verbatim from a document discovered on the internet somewhere, but with no idea of what it means or how to alter it if it doesn't behave exactly as advertised.'"


  • .so hell (Score:3, Informative)

    by BoldAC ( 735721 ) on Friday September 10, 2004 @09:40AM (#10211860)
    I agree with this completely! Anybody have a solution? I know it is out there... somewhere...

    In Windows, this issue is known as ".dll hell." In Linux, you might call it ".so hell" ("so" being the extension for these "shared objects"). It has probably caused me more frustration and hair-pulling than all the other issues in Linux combined. In principle, the issue seems simple: you can't install a program if the shared objects that it depends on - its "dependencies" - are not on the system. Any attempt to install the program will generally inform you what dependencies are missing, and it usually isn't too much trouble to go find the needed files online somewhere. The problem comes if you need, say, version 4 for your new program, but you already have version 3 installed. You can't simply overwrite version 3, because then all the existing programs that depend on it will break. Apparently you can't just have separate copies of 3 and 4, since I have yet to find an installation tool that will let you do this. Instead, you... well, frankly, I don't know what you do. I have yet to solve this problem, and it continues to bother me.
  • Re:.so hell (Score:5, Informative)

    by Xner ( 96363 ) on Friday September 10, 2004 @09:44AM (#10211898) Homepage
    I simply use "apt" or "yum" and let them sort it out. The only time they failed me was with a Fedora test release, but I knew the risk I was taking.
  • by Anonymous Coward on Friday September 10, 2004 @09:47AM (#10211937)
    That's why God created apt-get.

    It's been years since I had to worry about dependencies.

    2 reasons:

    Apt-get from Debian has been ported to Fedora/Red Hat. I use Fedora. (laptop)

    I use Debian. (desktop)

    That's it.

    What to patch?

    apt-get update && apt-get upgrade

    Wham bam thank you mam!

    Want mplayer, but Fedora doesn't ship with the ability to play DVDs or MP3s?

    Head on down to Dag's RPM repositories, follow his directions and go:

    apt-get update
    apt-get upgrade
    apt-get install mplayer libdvdcss xmms

    done and DONER!!!!

    Apt-get IS the killer application for Linux.

    Update everything, patch everything. Not just the core system, like in Windows!

    No MORE DEPENDENCY HELL.

    It's really quite nice. Install Debian, upgrade to unstable. I've been running it for 2 years, no sweat, and completely up to date.
  • Education. (Score:5, Informative)

    by Raven42rac ( 448205 ) on Friday September 10, 2004 @09:47AM (#10211938)
    Any halfway decent teacher/guide will include an overview of the "man" command. So if you don't know what a command does or what its arguments are, just type "man command"; that will teach you fairly quickly.
  • Re:.so hell (Score:2, Informative)

    by Xner ( 96363 ) on Friday September 10, 2004 @09:49AM (#10211948) Homepage
    That's why it's usually not the way it's done. If you remember the switch between libc5 and glibc (libc6?), the two libraries co-existed on the same system with different filenames for quite a while.

    Of course, not every single person maintaining a library is as careful as the glibc people, and screwups do happen. It is the distributor's task to make sure all the programs that ship with a given distribution work with the libraries included in it.

  • Re:.so hell (Score:2, Informative)

    by wjwlsn ( 94460 ) on Friday September 10, 2004 @09:52AM (#10211977) Journal

    I used to resolve this by having a directory that specifically contained old versions of library files. Try ldconfig... a snippet from the man page:


    • ldconfig creates the necessary links and cache (for use by the run-time linker, ld.so) to the most recent shared libraries found in the directories specified on the command line, in the file /etc/ld.so.conf, and in the trusted directories (/usr/lib and /lib). ldconfig checks the header and file names of the libraries it encounters when determining which versions should have their links updated. ldconfig ignores symbolic links when scanning for libraries...
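    The approach described above can be sketched like this (the directory and library names are hypothetical, and editing /etc/ld.so.conf requires root, so treat this as a configuration sketch rather than something to paste blindly):

```shell
# Keep old library versions in a dedicated directory and register it
# with the runtime linker (system configuration; run as root):
mkdir -p /usr/local/oldlibs
cp /backups/libfoo.so.3 /usr/local/oldlibs/   # hypothetical old library
echo '/usr/local/oldlibs' >> /etc/ld.so.conf
ldconfig    # rebuild the ld.so cache so the old copy is found again
```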

  • Re:.so hell (Score:3, Informative)

    by ratboy666 ( 104074 ) <fred_weigel@[ ]mail.com ['hot' in gap]> on Friday September 10, 2004 @09:53AM (#10211989) Journal
    First, you can run programs with a local library path. Second, multiple library versions can co-exist. Third, under Solaris, APIs are versioned.
  • by savagedome ( 742194 ) on Friday September 10, 2004 @09:53AM (#10211991)
    Check out the old story.

    The Command Line - Best Newbie Interface? [slashdot.org]
  • Re:.so hell (Score:2, Informative)

    by Neil ( 7455 ) on Friday September 10, 2004 @09:55AM (#10212022) Homepage
    I don't recognise the description of the problem at all. Normal shared object handling in Linux and other Unixes seems to have this covered entirely. If you need versions 3 and 4 of shared library "libfoo" then you typically have something like:

    /usr/lib/libfoo.so (symlink to libfoo.so.4)
    /usr/lib/libfoo.so.3
    /usr/lib/libfoo.so.4

    When you run a binary which was originally dynamically linked against version 3 or version 4, the runtime linker (ld.so) loads either .so.3 or .so.4, as appropriate.

    The libfoo.so symlink tells ld which version to default to using when compiling new binaries with a "-lfoo" on the link command line.
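    The layout described above can be reproduced in a scratch directory to see how the pieces fit together ("libfoo" is hypothetical; the files here are empty stand-ins, not real libraries):

```shell
# Two versions of a shared object coexisting, plus the unversioned
# symlink that the compile-time linker follows for "-lfoo":
mkdir -p /tmp/libdemo && cd /tmp/libdemo
touch libfoo.so.3 libfoo.so.4      # stand-ins for the two versions
ln -sf libfoo.so.4 libfoo.so       # default for newly compiled binaries
ls -l libfoo.so*
```

    A binary built against version 3 keeps loading libfoo.so.3 at run time, because the soname recorded in the binary is the versioned name, not the bare libfoo.so.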
  • Goto tldp.org (Score:4, Informative)

    by Anonymous Coward on Friday September 10, 2004 @09:56AM (#10212035)

    The linux documentation project is great. Lots of howtos, but also great guides.

    I always recommend for a newbie to read:
    Introduction to Linux - A Hands on Guide
    Bash Guide for Beginners
    The Linux System Administrators' Guide (for "power users")
    Advanced Bash-Scripting Guide (for "power users")

    And maybe the network administrator guide.

    All these are cool because they are generally distro-agnostic. Anybody can benefit from their knowledge.

    AND remember GOOGLE!!!!

    The command line IS your friend. It's another form of user interface, and combined with a GUI like X it gives Linux (and other Unix-like operating systems) the most flexible and powerful user interface available.

    At times it may not be friendly to newbies, but once you have a decent idea what is going on, it's definitely worth it.

    Those guides will give you the necessary tools to understand and become comfortable with your Linux installation. No more fighting through layers of obfuscation and a deep divide between the knowledge of MS insiders/advanced administrators and that of ordinary Windows users. In Linux, users can be as knowledgeable as the best programmer or developer.

    But you don't have to be any more, thanks to people like Gnome/KDE/Fedora/Redhat/Suse/Mandrake etc. Now it's just a matter of what you want and what you feel most comfortable with.
  • Mirrored (Score:5, Informative)

    by paulproteus ( 112149 ) <slashdot AT asheesh DOT org> on Friday September 10, 2004 @09:57AM (#10212041) Homepage
    Here [jhu.edu], all you freeloaders ;-). I'll take it down later today.

    I just spoke with him on the phone, too; cool guy. I don't think he was expecting anyone to actually call him :-).
  • Off topic: DLL Hell (Score:3, Informative)

    by Davak ( 526912 ) on Friday September 10, 2004 @09:57AM (#10212049) Homepage
    If there are any Windows users actually in this thread who get trapped in a DLL situation, I would suggest trying these programs...

    http://www.sysinternals.com/ntw2k/freeware/procexp.shtml [sysinternals.com]

    It'll show all the processes associated with a running program.

    http://www.sysinternals.com/ntw2k/freeware/listdlls.shtml [sysinternals.com] will show all the loaded DLLs.

    Between these two programs, you can sort through most of the dll errors without killing yourself.

    disclaimer: I don't know this guy... I just use his software. :)

    Davak
  • Re:From TFA (Score:5, Informative)

    by fafaforza ( 248976 ) on Friday September 10, 2004 @09:59AM (#10212076)
    I'd agree with the documentation part. From experience, most help comes from stringing together incomprehensible Usenet posts and articles found on Google.

    The documentation for the most part is poorly written, and poorly laid out. A lot of times I find docs diving straight into each command or option with its own set of triggers, etc, without first giving a broad overview. I do not have specific examples; just an overall feel from a few years of using Linux and FreeBSD.

    Can't really lay blame on anyone, though. People developing software for open source systems would rather create it than write documentation aimed at the greenest of Linux users, or support the software on forums and newsgroups.
  • Re:From TFA (Score:5, Informative)

    by UnderScan ( 470605 ) <jjp6893@netscap e . n et> on Friday September 10, 2004 @10:04AM (#10212125)

    The real article.

    Two Years Before the Prompt: A Linux Odyssey
    Derek Croxton, 09 September 2004


    A novice's greatest fear is sitting in front of a motionless command prompt with no idea what to type; or, as so frequently happens, knowing a command that he copied verbatim from a document discovered on the internet somewhere, but with no idea of what it means or how to alter it if it doesn't behave exactly as advertised. Linux has never quite escaped its reputation as an OS for geeks who like those command prompts. I made the plunge into Linux at the start of 2002 under the assumption that things had improved enough that I could get around Linux without the command prompt at all, or at least with minimal exposure to it.

    What I want to report on here is some of the "gotchas" of being a new Linux user. I've tried at least half a dozen different distributions, and along the way I've been hit with just about every problem an inexperienced user could face. Partly this is because I push Linux to do so many things for me - web server, video player, email server, database backend, programming environment - including many that I had not previously tried on Windows. In spite of my title, however, I'm not going to try to make this article into a ripping yarn along the lines of Richard Henry Dana's book Two Years Before the Mast. I'm also not writing a critique of Linux distributions, although I hope some developers out there might read this and get some ideas. My main purpose is to prepare new users for likely sticking points, as well as to reassure them about things that will not be as hard as they had feared. I went into my Linux experience with very little idea of what I was getting into; that makes it an adventure, but it's probably not the best way to go about things. Sometimes it helps to read that such-and-such a process is difficult. At least then you can work your way through it slowly, without the persistent fear that there is some easy, one-command feature that you could be using if only you knew what it was.

    All of the distributions I have used are of the more user-friendly type. The reason I have gone through so many is that I keep discovering different things in them. What works on one might not work on another, but hopefully I can learn enough from one distribution that I can tweak another to my satisfaction. It is, in fact, the very diversity of various distributions that makes using Linux such a challenge. Ask a friend and he may suggest a solution to your problem that doesn't exist on your distribution. Naturally, anything can be configured, but it may be more trouble to get it to that point than to find a different solution. Therefore, I will not go into the details of each distribution, but rather give my overall experience, highlighting where distributions differ.

    Linux distributions have put a lot of effort into making the install process as easy as possible, and this is definitely a good thing - if you can't get it installed, you aren't going to use it. I distinctly recall installing a version of Linux 6 years ago and trying to get XWindows (X11R6, for purists) running so I could escape the command line. I went through a lengthy setup process, but when it started asking questions like the horizontal and vertical refresh rates of my monitor, I knew I was in trouble. Nowadays, installation is often as simple as you make it: if you accept all the defaults, your only decision will be a password for the root user. I have had very little difficulty with any of my installs: keyboard, monitor, mouse, sound card, network card, and other essentials are usually automatically detected and configured without my having to do a thing. I did have one case of a disappearing monitor display and one of a non-functioning keyboard, and it

  • Re:.so hell (Score:3, Informative)

    by LnxAddct ( 679316 ) <sgk25@drexel.edu> on Friday September 10, 2004 @10:06AM (#10212146)
    Who moderated the parent as a troll? Yum or Apt is what everyone I know uses. They just take care of the ".so hell" that once plagued us all. Does anyone seriously not use a package manager nowadays? I do install from tarballs and source a bit, but the typical user doesn't need to, and even when I do, I haven't experienced ".so hell" in years. I actually forgot about it until this article. Somehow it magically disappeared for me. Anyone else care to comment?
    Regards,
    Steve
  • by tezza ( 539307 ) on Friday September 10, 2004 @10:07AM (#10212152)
    For all those saying man command as an answer: this can cause more confusion. Try man tar. There is a bewildering array of options, some for creating archives, some mentioning /dev/td0 [but not /dev/tdr0 that rewinds the device after completion]. Untarring and ungzipping is a fundamental operation, but it takes something like 30 steps to understand.

    Further, some man pages say 'this has moved to info'; info has a bastardised Emacs command set with pagination and hyperlinking, and the novice-friendly Emacs keybindings.

    Don't get me wrong, I use Linux/Cygwin/Solaris every day of my whole life, but geez, it took a long time to learn.
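    For reference, the fundamental untar/ungzip operation mentioned above boils down to two invocations; a self-contained sketch under /tmp:

```shell
# Create a small directory tree to archive
mkdir -p /tmp/tardemo/src && echo hi > /tmp/tardemo/src/file.txt
cd /tmp/tardemo
tar -czf demo.tar.gz src          # -c create, -z gzip, -f archive name
mkdir -p out
tar -xzf demo.tar.gz -C out       # -x extract, -C target directory
cat out/src/file.txt              # → hi
```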

  • by hey! ( 33014 ) on Friday September 10, 2004 @10:11AM (#10212194) Homepage Journal
    Let me clue you in on the sly reference in the title.

    It refers to Richard Henry Dana's autobiographical Two Years Before the Mast, which is hands down not only among the best maritime adventures ever written, but also one of my favorite books of any kind.

    Dana was a Harvard sophomore in the 1830s who came down with scarlet fever. As a result, his eyesight was suffering. The common prescription for this in those days was to take a sea voyage. Dana, despite being a young person of privilege, didn't take the normal route of travelling to Europe as a tourist. He signed on as a common seaman, a grueling, uncomfortable and by today's standards incredibly dangerous job. He joined a vessel that rounded the Horn in July (through the teeth of the Austral winter) bound for the wild and nearly uncharted region of California. The common seamen slept in the fo'c's'le, the part of the vessel at the bow; thus "Before the Mast". Two Years Before the Mast became historically important when gold was found at Sutter's Mill, setting off the great gold rush. At the time, it was practically the only book available that had any information about California.

    His account is exciting and riveting, and probably unique. Many talented writers have written of the sea, and have gone on sea voyages, but I can't think of anyone else of Dana's literary powers who actually lived as a common sailor, did that dangerous job, and slept and ate with the crew. Dana, who later became a lawyer and a great advocate of seamen's rights, comes across as a ready lad, brave, good-hearted and adventurous. A fine role model, I think, for people who buck the trend and go with Linux.
  • Re:.so hell (Score:1, Informative)

    by Anonymous Coward on Friday September 10, 2004 @10:14AM (#10212220)
    There is a reason .so's have version numbers on them. A good system should link a program directly against the .so it needs - and those can definitely co-exist. Header files, I can understand.
  • by magefile ( 776388 ) on Friday September 10, 2004 @10:17AM (#10212240)
    1.) Apt works not just for Debian packages, but for RPMs, and can be installed on Fedora.

    2.) If you want that, it's easily done with a few config file edits.

    3.) I can't remember the last time I couldn't find an RPM. Or I just built one myself using developer-supplied spec files. Debian's system has everything (or so I've heard). And you can always compile a program, or roll your own binary package very easily.
  • Re:.so hell (Score:4, Informative)

    by IamTheRealMike ( 537420 ) on Friday September 10, 2004 @10:26AM (#10212335)
    Of course, not every single person maintaining a library is as careful as the glibc people

    The glibc people aren't careful at all. They are quite, quite happy to break software if they believe users' programs to be "broken". They've broken even high-profile apps like Mozilla this way, because their interpretation of a spec was different from everybody else's. But no, Mozilla was "broken" and whoever wrote that function should be "punished", according to the maintainer.

    The glibc group, along with many, many other free software projects, usually believe they are maintaining backwards compatibility. In practice they are at most maintaining ABI compatibility, which is not enough: the interface of an API is so much more than the layout of its structures and the prototypes of its functions.

    I've written a guide on how to write good shared libraries. It deals a lot with versioning and how to avoid breaking software.

    I'm hoping it can be peer reviewed and published somewhere soonish, in the next few weeks. I'd post the draft here but the server hosting it seems to be down.

  • by grumbel ( 592662 ) <grumbel+slashdot@gmail.com> on Friday September 10, 2004 @10:30AM (#10212382) Homepage
    apt-get isn't a killer app, it's a distribution-specific workaround for a problem that should be fixed upstream. Besides that, even apt-get is FAR from perfect, since there are the following problems:

    * stable is way behind everybody else; if some piece of software doesn't support your hardware, say XFree86, you are in deep trouble. Hardly any newbie will stand this for more than a few days.

    * unstable, on the other hand, while being more current, can wreak havoc at any point. Besides, 'apt-get upgrade' isn't all that fun for modem users; hell, even 'apt-get update' is already a serious problem for modem users. testing is still more a game of luck; sometimes it works, sometimes it doesn't - nothing that I would put in the hands of a newbie.

    * apt-get doesn't fix dependency hell, it just works around it so that dependencies get automatically detected. apt-get however DOES NOT resolve conflicts in the way a user expects: if library foo and library bar are incompatible, apt-get will remove everything that depends on the other when I want to install one of them.

    * apt-get can't install different versions of -dev packages in most cases, since the includes conflict.

    * apt-get is a one-way thing; it doesn't provide a roll-back. A simple example would be Gnome2: once it got released I took a look at it and it sucked - lots of features removed and stuff didn't work. Was there a way to get Gnome1.4 back? No, I was stuck with Gnome2 thanks to apt-get. Sure, I could manually track each and every dependency and install it in a separate prefix, but that's nothing that a newbie wants to do.

    * apt-get depends extremely on the quality of the repository from which it grabs stuff. Unofficial packages often cause trouble, and if the maintainers of the repo do something ugly like removing a program that you depend on, bad luck for you; there is no easy workaround other than to not use apt-get and do everything manually.

    A solution to the dependency hell would be one that is not distro-specific and allows me to download and install any Linux stuff I want from the net, not just what some Debian maintainers thought would be worth packaging. It should also allow installing any number of different versions of the same program without relying on extra work from the maintainer to handle the situation.

    Sadly many of the problems are caused by the FHS layout, which is now standardized and thus basically unfixable for a long, long time.
  • Re:.so hell (Score:4, Informative)

    by Lodragandraoidh ( 639696 ) on Friday September 10, 2004 @10:54AM (#10212652) Journal
    Slackware has a package manager - but I usually don't bother because it uses plain tarballs; I download the tarballs from a Slackware mirror (the distro generally has everything I need) and extract what I need. Haven't run into any compatibility issues - ever.

    I think if you stay inside of your distribution you won't have problems. The problem is when you download something outside of the distro and try to integrate it with your system. Then you get what you are asking for.

    If you want to load a particular application - see if your distro has it on their package list first before downloading it from somewhere else. The distribution creators will have integrated it with your distro, eliminating headaches for you.

    If you must download something from outside of your distro - understand that you may have to do some integration work.

    That being said, I have had very good luck loading some packages outside of what Slackware provides. I attribute it to the following:

    1. Slackware conforms to the Linux filesystem standard.
    2. The applications I have downloaded also conform to the Linux filesystem standard.
    3. The applications I have downloaded did not use deprecated or experimental functions within the libraries they call (most libraries are good about staying backwards compatible for standard functions).

    Most of the problems I have had doing integration came from trying to get OSS applications to build under Sun Solaris. Sun packages change the default locations for various things (most notably, apps you would normally find under /usr/local/... end up in /opt/sfw/...) - and the apps I have to build by hand are looking elsewhere. I have also had environment problems - mostly missing path variables for libraries. Given that, the problem is not unique to Linux - but I would suggest it is less of a problem with Linux (provided you try to stay inside of your distribution - and only integrate outside apps when absolutely necessary) than with other operating systems, from my own experiences.
  • by IamTheRealMike ( 537420 ) on Friday September 10, 2004 @10:56AM (#10212669)
    Actually the main problem with making distro neutral packages is the lack of platform stability. In fact there isn't really a platform to speak of. The library your program needs will often not be there, or will be there but not in the right version. Dependency resolution gets you some way but doesn't solve the underlying problem, namely that popular libraries break backwards compatibility all the time.

    The second biggest problem is that of binary portability.

    Where files and such are put isn't actually that big a problem. I know that's what it looks like at first, I thought the same thing. But in practice, it's really not a problem.

    How do I know these things? Because for the last two years I've been working on a distribution-neutral packaging format that installs anywhere and is easy to use [autopackage.org]. Go check out the screenshots! There are demo packages you can use as well, like the Inkscape CVS nightly builds. Be warned, I think the Zile package isn't working quite right yet.

    Autopackage really needs more people working on it. Right now all 3 main developers are in a time crunch and it's slowed right down to a crawl. If you want to see easy one-click installs on Linux (easier than apt-get, even for newbies...) why not come on over and help out?

  • by Robert The Coward ( 21406 ) on Friday September 10, 2004 @11:34AM (#10213108)
    You are getting two issues mixed up.

    1) Most of the filesystems under the bulk of Linux distros already are LSB 2.x, or really close to it. Which is why ".so hell" is for the most part gone. That standard says where files are and how they are laid out.

    2) Unless it has changed, a properly set up library, at least under Red Hat & Debian, is in the basic format:

    libwrap.so.0 -> libwrap.so.0.7.6
    libwrap.so.0.7.6

    So if you have something that depends on libwrap.so.0.7.6 and something that needs libwrap.so.0.7.7, then you can have both installed and just point libwrap.so.0 to .0.7.7 and link the app to the old version.

    The problem is most software/distros don't link directly to .0.7.6 but to .0, or just to libwrap.so. However, there is the ability to override the library location in Linux using an environment variable, and a few other ways to deal with it.

    I haven't had ".so hell" since I tried to upgrade Red Hat 5.1 to 5.2 by installing just the RPMs.
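    The environment-variable override mentioned above is LD_LIBRARY_PATH. A minimal sketch (the library file is an empty stand-in, and 'old-app' is a hypothetical binary linked against the old version):

```shell
# Give one program a private directory of old libraries; the dynamic
# linker searches LD_LIBRARY_PATH before the system directories.
mkdir -p /tmp/compatlibs
touch /tmp/compatlibs/libwrap.so.0.7.6    # stand-in for the old version
# LD_LIBRARY_PATH=/tmp/compatlibs ./old-app   # hypothetical invocation
```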
  • by dastrike ( 458983 ) on Friday September 10, 2004 @11:39AM (#10213157) Homepage

    I strongly recommend installing straight to testing ('sarge') using the new Debian-Installer [debian.org].

    The current stable ('woody') release (3.0) is way too outdated (over two years old now) to provide an enjoyable desktop/workstation experience.

    The current testing, on the other hand, contains up-to-date versions of software and works very well. It will "soon" become the new stable release (3.1).


  • Available from Gutenberg here [gutenberg.net].
  • by menkhaura ( 103150 ) <espinafre@gmail.com> on Friday September 10, 2004 @12:17PM (#10213525) Homepage Journal
    The ports tools (portupgrade and the like) will do the trick for you; no need to manually chase dependencies. Want to upgrade all your applications? cd /usr/ports; make update; portupgrade -a (with an occasional pkgdb -F if you use an unstable ports tree).
    Go grab a coffee, and your system is upgraded from source. Additional customizations (such as alternative dependencies, per-application compile flags, etc.) can be made by editing /usr/local/etc/pkgtools.conf. You can't do much better than this.
  • by dcroxton ( 812365 ) on Friday September 10, 2004 @12:29PM (#10213654) Homepage
    Yikes! I had no idea my review would stir up so much controversy. I don't want to add to the antagonism, but I would like to make a couple of points:

    (a) It is precisely when using a package management system that one runs into problems with different library versions. I've had a number of managed installations fail because the required library conflicts with an existing one. If there is an easy solution, wonderful! I would like nothing better than to know what it is.

    (b) While apt is wonderful, not everything is in apt. If you've never wanted to install a program that wasn't available in your distro's dockyard, you're lucky. (Or maybe I'm just too demanding.)

    (c) I'm not bashing Linux! I love Linux and I have invested many hours reading books, man pages, and web sites, and experimenting with it. I would *much* prefer that my effort pay off with being able to switch 100% to Linux and ditch Windows altogether, rather than having simply to say, "Oh well, it was a nice idea, now back to Windows." I just wanted to point out that some things have been very difficult for me, in spite of the fact that I'm not an idiot (some of you will disagree with that judgment, I realize). I would like to push more friends and relatives into Linux, but I'm afraid to because I know they will run into some of the same problems, and I won't be able to help them.

    Sincerely, Derek
  • by 3D Lover ( 467981 ) on Friday September 10, 2004 @01:12PM (#10214061) Homepage
    Back in the Windows 95 era, there were a bunch of programs like CleanSweep that would monitor the install of a program (program files, DLLs, registry entries, etc.) and would build a log of what happened. When you wanted to uninstall it, you could get rid of everything.

    In the Linux/*nix world, apt-get and rpm can remove an installed program, but sometimes I have to install a program that's either not available in a package, or the version in the package has too many dependencies. So I resort to the ./configure; make; make install;

    What I want most of all is a way to monitor what happens during the make install. I want it to keep copies of old versions before they are overwritten, I want a full log of every file installed, and I want a quick way to uninstall it. I would be one of the last people on this earth to know how this might be programmed, but one thought would be to make a version of bash that can monitor the activities of the programs it spawns.

    Maybe this exists? Anyone want to start a SourceForge project?
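    Short of a monitoring bash, a poor man's version of the idea can be had with two find snapshots and comm; here the "make install" step is simulated by creating one file in a scratch prefix:

```shell
PREFIX=/tmp/fakeprefix
mkdir -p "$PREFIX"
find "$PREFIX" -type f | sort > /tmp/before.txt
# stand-in for "make install" dropping a file into the prefix:
mkdir -p "$PREFIX/bin" && touch "$PREFIX/bin/newtool"
find "$PREFIX" -type f | sort > /tmp/after.txt
comm -13 /tmp/before.txt /tmp/after.txt   # files added by the "install"
```

    The comm -13 output is exactly the list of files the install added, which is what a later uninstall step would need.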
  • by Anonymous Coward on Friday September 10, 2004 @01:45PM (#10214383)
    Check out checkinstall.
    http://asic-linux.com.mx/~izto/checkinstall/

    It will monitor the make install, build a redhat, debian, or slackware package, and install the package.
  • Re:.so hell (Score:3, Informative)

    by Fred_A ( 10934 ) <fred@f r e d s h o m e . o rg> on Friday September 10, 2004 @01:51PM (#10214448) Homepage
    It's the old debate between simplicity and completeness.

    The XP way is easy and opaque; the Linux way is complex, but you know what happens. The Linux crowd apparently prefers that option.

    It is however trivial to stick a front end on top that does what that XP gadget does. Mandrake has such a tool in 10.0, I think it was there in 9.x too, and other distributions probably have it too.

    However, it is a delicate choice to make. Letting the user fiddle with network settings that have huge implications for the way the network functions and security is handled, without really knowing what is going on...
  • by Minna Kirai ( 624281 ) on Friday September 10, 2004 @02:36PM (#10214967)
    That's far short of binary compatibility across time.

    Wrong. Linux upholds binary compatibility for userspace applications across time. (Watch what Linus does when someone proposes to remove obsolete syscalls... those things stick around FOREVER, in the name of "don't break userspace".)

    What Linus doesn't support is compatibility for binary kernel modules.
  • Re:.so hell (Score:3, Informative)

    by Rich0 ( 548339 ) on Friday September 10, 2004 @02:49PM (#10215094) Homepage
    The application should not be linked against lib.so.

    It should be linked against lib.so.1, or lib.so.1.0.0.3, or whatever. Most likely it should be linked against the major version.

    Usually when libraries make major changes that break compatibility, packages of both versions are created.
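    You can check what a binary was actually linked against with ldd; note that the recorded names are versioned sonames like libc.so.6, not the bare .so (exact output varies by system):

```shell
# List the shared objects a dynamically linked binary depends on
ldd /bin/sh
```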
  • Re:.so hell (Score:3, Informative)

    by Pig Bodine ( 195211 ) on Friday September 10, 2004 @03:23PM (#10215442)
    Most of the problems I have had doing integration came from trying to get OSS applications to build under Sun Solaris. Sun packages change the default locations for various things (most notably, apps you would normally find under /usr/local/... end up in /opt/sfw/...), and the apps I have to build by hand are looking elsewhere. I have also had environment problems, mostly missing path variables for libraries. Given that, the problem is not unique to Linux, but I would suggest it is less of a problem with Linux (provided you stay inside your distribution and only integrate outside apps when absolutely necessary) than with other operating systems, in my experience.

    Amen. And once you get any path issues worked out, there are often deeper problems. I just spent the morning compiling GNU Common Lisp under Solaris so I could compile Maxima to work out some rather tedious equations without making arithmetic errors in the coefficients. After pulling out most of my hair, I found that newer GCC versions on Solaris have revealed a bug in gcl, and I found a patch to gcl that solved the problem. (My undying gratitude goes to the GCL people for fixing this quickly.) This sort of thing almost always happens to me when compiling any large piece of free software under Solaris.

    It may not be fair to blame Sun. I know I bring some of this trouble on myself by having tools that are half GNU, half Solaris. But a lot of software requires the GNU tools to compile. And since Sun doesn't bundle a C compiler with Solaris by default, there is no standard Sun platform for people to write free software for. Everyone has a mix of Sun and GNU tools, and when compiling you always have to worry about which you are using. They may have made themselves some money selling compiler licenses, but by fragmenting the Solaris "standard" for compilation they seriously undermined the viability of Solaris for scientific workstation use. Any Linux distribution is a pleasure to work with by comparison. Presumably Sun only cares about selling servers.

    My all-time most annoying experience was with a patch-cluster installation script, downloaded from Sun's own website, that wouldn't run with Sun's own version of awk (not a bug exactly; a limit on the number of records in a line that was exceeded by the size of the patch cluster). Gawk, of course, worked just fine.

    OK. Having ranted, I now feel better.

  • Few remarks... (Score:2, Informative)

    by RWerp ( 798951 ) on Friday September 10, 2004 @06:54PM (#10217470)
    knowing a command that he copied verbatim from a document discovered on the internet

    Like rm -rf *?

    I made the plunge into Linux at the start of 1993 under the assumption that things had improved enough that I could get around Linux without the command prompt at all, or at least with minimal exposure to it.

    He was very naive, then. I wouldn't say that one can get around without the command prompt at all, or even with minimal exposure to it, even now, to say nothing of 1993.

    I have had very little difficulty with any of my installs: keyboard

    I always bitch at X11 for forcing me to count the keys on my keyboard: do I have 104, 105, or 106 keys? Isn't there a way for the X server to figure it out for itself?

    To be fair, however, most people never have to install Windows, so they never have to deal with the issue of hardware compatibility and settings.

    Sure. I remember installing sound card drivers for a CMI 8380 in Windows 95. It seemed that the order in which I installed them and other drivers determined whether the damn thing worked or not.

    power management. Apparently this is not compiled into the Linux kernel by default, which means you have to recompile it yourself

    Don't know what distributions he used, but mine has it compiled into the default kernel.

    By default, Linux usually opens programs with a single click

    The usual confusion: KDE does it (as its default), not Linux. GNOME uses double-click by default (in Nautilus).

    You see, when I right-click on a package in KDE, I get three different options for how to compress it, but nothing for how to un-compress it.

    Some strange case of "ark" misconfiguration. I can uncompress files with a mouse click, but I still prefer to do it in the console... I guess it just depends on personal habits.

Intel CPUs are not defective, they just act that way. -- Henry Spencer