Ubuntu 8.10 (Intrepid Ibex) Released

SDen writes "Bang on target, the new version of Ubuntu Linux is available for our downloading pleasure. Amongst various changes it sports updates to the installer, improved networking, and a new 'Mobile USB' version geared towards the blossoming netbook market. Grab a copy from the Ubuntu website, and check out Linux Format's hands-on look at the Ibex."
This discussion has been archived. No new comments can be posted.

  • by rikkards ( 98006 ) on Thursday October 30, 2008 @11:49AM (#25570431) Journal

    I have been experiencing a minor but annoying bug with Heron for a while now where the wireless connection drops randomly. The logs show that the wireless NIC hasn't talked to the AP for a while, so the driver assumes it is out of range. I have seen others with this issue but no solution. A workaround is to restart networking.
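
    A minimal sketch, in Python, of the restart-networking workaround mentioned above; the interface name (wlan0), the Ubuntu 8.x-era init script path, and the need for root privileges are assumptions on my part, not details from the post.

        #!/usr/bin/env python
        # Hypothetical helper: bounce the wireless interface and the networking
        # service when the association with the AP silently drops.
        import subprocess
        import sys

        def run(cmd):
            # Echo the command before running it so the user sees what is attempted.
            print("+ " + " ".join(cmd))
            return subprocess.call(cmd)

        def bounce_wireless(interface="wlan0"):
            # Take the interface down and back up, then restart networking via the
            # classic /etc/init.d script of that era; this must be run as root.
            run(["ifdown", interface])
            run(["ifup", interface])
            return run(["/etc/init.d/networking", "restart"])

        if __name__ == "__main__":
            sys.exit(bounce_wireless())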

  • New features (Score:4, Interesting)

    by apathy maybe ( 922212 ) on Thursday October 30, 2008 @12:08PM (#25570775) Homepage Journal

    Personally, despite some wonderful new features, I'm going to stick with 8.04 for a bit, at least until they work out the bugs. Of course, I won't ever be prompted to upgrade to 8.10, because 8.04 is a long term support release. Having a look at the release notes, at least one unacceptable (for me) bug is:

    On laptops with Intel 3945 or Intel 4965 wireless chipsets and a killswitch for the wireless antenna, starting the system with the killswitch enabled (i.e., with wireless disabled) will prevent re-enabling the wireless by toggling the killswitch. As a workaround, users should boot the system with the killswitch disabled. A future kernel update is expected to address this issue.

    Considering the regression from 7.10 with the wireless LEDs (the light used to flash when transmitting data; now it never comes on at all, which is a known bug), maybe they should take a long look at their wireless stack.

    Oh, and the CD eject bug...

    Yeah, I would like to have the latest GNOME, and OOo 3 without installing backports, but honestly, I don't think I'll bother.

  • by Yarcofin ( 1397091 ) on Thursday October 30, 2008 @12:18PM (#25570919)
    I'm a beginner trying to make the switch from Vista to Ubuntu, but for the past two weeks and about 5-10 failed attempts, I've been trying to get my AR5007 wireless card to work in Ubuntu 8.04 to no avail. Will an AR5007 card work with 8.10 right out of the box? Otherwise I'm not going to bother with it if I can't even get an internet connection. Most user-friendly distribution of Linux my ***...
  • by Anonymous Coward on Thursday October 30, 2008 @12:27PM (#25571081)

    Or how about the "I know the SSID and the WPA key, but I'm going to refuse to connect to the non-broadcasting AP anyway" bug. There's nothing quite like having to type in a 64-character WPA key every single time the laptop boots.

    Yes, the wireless network is listed in the saved list. No, you can't get it to connect from that list; you have to create a manual connection and re-enter everything. No, you can't paste the WPA key in; that doesn't work for some reason.

    Of course, the laptop could boot a lot less frequently if only suspend or hibernate worked. Sadly, while Ubuntu is quite capable of shutting the machine off, it can't quite get the hang of resuming.
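
    For what it's worth, a minimal sketch of one way around the hidden-SSID problem described above: skip the applet and hand wpa_supplicant a network block directly, using its scan_ssid=1 option for non-broadcasting APs. The output path, SSID, and passphrase below are placeholders, and driving wpa_supplicant by hand instead of through NetworkManager is my assumption rather than anything the poster tried.

        #!/usr/bin/env python
        # Hypothetical sketch: write a wpa_supplicant network block for a hidden AP.
        # scan_ssid=1 makes wpa_supplicant probe for SSIDs that are not broadcast.

        TEMPLATE = """network={
            ssid="%(ssid)s"
            scan_ssid=1
            key_mgmt=WPA-PSK
            psk="%(psk)s"
        }
        """

        def write_config(ssid, psk, path="hidden-ap.conf"):
            # wpa_supplicant would then be pointed at this file with its -c option.
            with open(path, "w") as f:
                f.write(TEMPLATE % {"ssid": ssid, "psk": psk})

        if __name__ == "__main__":
            write_config("MyHiddenNetwork", "replace-with-real-passphrase")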

  • by Aladrin ( 926209 ) on Thursday October 30, 2008 @12:29PM (#25571123)

    I saw this elsewhere and I'll ask the same thing:

    Why would I go with apt-p2p instead of debtorrent? I realize they are written by the same person, but debtorrent is already in the repositories and appears to do exactly the same thing.

  • by Theril ( 606664 ) on Thursday October 30, 2008 @12:51PM (#25571561)

    Again Intrepid seems to continue the trend of Ubuntu and Linux desktops in general: abandon the traditional and well-proven Unix ideology, where simple tools do well-defined tasks well. Instead, things "Just work", meaning that when they don't work, they "Just don't work" and the only solution is to wait for a new release. Yes, yes, you have the source, but hacking around all of the "Just works" bugs isn't very feasible even if one had the programming knowledge.

    I've been using the prerelease versions of Intrepid for about a month and have again witnessed a few cases of the new bloat methodology.

    For example, NetworkManager has been around for a while, displacing an architecture that could be comfortably tweaked from the command line and config files. Of course I don't have anything against GUIs that simplify this, but at the same time the command-line usage has been stripped away.

    A great example is the newly touted 3G automation, which does work quite nicely. However, for a more experienced user it seems quite odd, as there are no options to set the network interface or serial device to use. It turns out this only works with USB devices that are autoprobed, presumably by HAL (which itself has configuration files that few mortals can edit by hand). This leads to the system not supporting 3G over Bluetooth, even if I set up the rfcomm serial device myself.

    Another hallmark of the "Just works" methodology is to build tools into the DE itself rather than as separate programs. For example, in GNOME there's an applet to kill a misbehaving GUI program by clicking its window (i.e., an xkill replacement). But of course it runs as part of the gnome-panel process (or something like that). When the kill cursor is active it prevents switching VTs (why!?) and also jams the whole screen in the process. The usual solution would be to SSH into the box (since the VT switch is blocked) and kill the offending process from the command line, but with the new way of doing things there's no single process to kill.

    Another great thing is gconf. Sure, it's nice to use from a programmer's point of view, but it's practically unusable otherwise. With the GUI editor there's a chance of finding the proper configuration key in a reasonable time, but from the CLI it's nearly impossible. Sure, it stores its data in plain-text files, but they're in a horrible, illegible, uncommented XML format, nested several directories deep, with some "overlay" semantics so that the offending parameter can be anywhere.

    It's a fun way to spend time, trying to configure your screen through gconf after GNOME has decided to screw up your display using XRandR.

    These kinds of situations are already everywhere and getting more common with every distribution and DE release. In no time the big open desktop distros will reach the "Just doesn't work" level of Windows and OS X.

    Please don't get me wrong. I like the new stuff that makes computers simple to use (like automatic network setup). But it really shouldn't come at the expense of flexibility and the ability to fix things manually _when_ the automated stuff breaks.
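
    As a small aside on the gconf complaint above: the per-user settings live as %gconf.xml files under ~/.gconf, so a trivial recursive search at least narrows down where an offending key is hiding. This is a minimal sketch from memory of GConf's on-disk layout, so treat the path and file naming as assumptions.

        #!/usr/bin/env python
        # Hypothetical helper: find which %gconf.xml files under ~/.gconf mention
        # a given key or value fragment (the directory path mirrors the key tree).
        import os

        def search_gconf(needle, root=os.path.expanduser("~/.gconf")):
            hits = []
            for dirpath, dirnames, filenames in os.walk(root):
                for name in filenames:
                    if not name.endswith(".xml"):
                        continue
                    path = os.path.join(dirpath, name)
                    with open(path) as f:
                        if needle in f.read():
                            hits.append(path)
            return hits

        if __name__ == "__main__":
            for path in search_gconf("resolution"):
                print(path)

    If you would rather not touch the XML at all, gconftool-2 -R / dumps the whole tree recursively, which at least makes it greppable.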

  • by zbharucha ( 1331473 ) on Thursday October 30, 2008 @12:56PM (#25571661)
    This is certainly a stupid question, but here goes: Is there any difference between doing a clean install (format and install 8.10 afresh) and upgrading from 8.04 to 8.10 as detailed in the link? Is there any advantage of doing one over the other (other than the obvious you-don't-have-to-install-apps one)?
  • Re:The power of p2p? (Score:3, Interesting)

    by vux984 ( 928602 ) on Thursday October 30, 2008 @01:14PM (#25571941)

    If I'm going to be hungry later, and my apple supply isn't going to be refreshed until further down the road, you bet your ass I'd refuse to share.

    Your apple supply (i.e., your Comcast quota) refreshes monthly; presumably, in fact, the day after tomorrow.

  • by Eil ( 82413 ) on Thursday October 30, 2008 @01:20PM (#25572033) Homepage Journal

    My direct download from Canonical (releases.ubuntu.com) went at full speed for the full 11 minutes it took to download. Plus it didn't break my ssh connections, which bittorrent always does.

    Counter-anecdote: My bittorrent download from users across the net went at full speed for the full 2 minutes it took to download. Plus it didn't break my SSH connections, which are never adversely affected by bittorrent anyway.

  • by Anonymous Coward on Thursday October 30, 2008 @01:26PM (#25572125)

    For the challenge... I thought everybody knew it by now.

    Windows breaks down over time, even if you just use IE, watch some movies, and listen to music, so you need to reformat to bring everything back to a functional state.

    Apple works fine (of course, mainly because you are using hardware and software from the same company across only one or two hardware configurations, with just a couple of drivers to worry about), and then a "new" OS version or piece of hardware comes out and you are not cool anymore; you have to do something to maintain your coolness factor, so you upgrade and buy back your acceptance among the fanboys.

    With Linux you set up everything (after some pain and searching in crazy forums) and it works great... and then you are BORED. It is something like SimCity: you make everything work and then you unleash a disaster to test your city. So you install a new version of Ubuntu and start everything over.

    Just that simple

  • Re:Ibex? (Score:2, Interesting)

    by torry_loon ( 1180499 ) on Thursday October 30, 2008 @01:27PM (#25572133)
    Where on the Isle of Man will I find a Jackalope?
  • by edmicman ( 830206 ) on Thursday October 30, 2008 @01:33PM (#25572233) Homepage Journal
    Personally I'm upgrading from 8.04 to get newer software releases. A handful of programs I use regularly (FileZilla, Pidgin, Deluge, Firefox) are woefully behind in the Hardy versions compared to what is released on those programs' sites. I'm hoping 8.10 upgrades some or most of them. One of the biggest pluses I read about Ubuntu is the package management and not having to maintain and upgrade software yourself. But then the packages aren't updated at all, and here I am multiple point versions behind the ones I use on Windows at work, where I update them myself!
  • by Tekfactory ( 937086 ) on Thursday October 30, 2008 @01:45PM (#25572425) Homepage

    As I replied to the parent post, I have downloaded P2P files directly from Microsoft. So far this has been support or developer packages, not regular updates, but they are learning.

  • by Vancorps ( 746090 ) on Thursday October 30, 2008 @03:01PM (#25573579)
    Good thing too; it was completely useless for me, so I went with WICD. Heron seems to have a lot of little glitches, although a lot of them are admittedly caused by third parties like the ATI FireGL driver and VMware Server messing with the keyboard. I have a button on the panel that resets the keyboard so I can use Shift and Ctrl again, since something about VMware's quick switch doesn't release them properly. Never had that problem in the past, though.
  • by DerWulf ( 782458 ) on Thursday October 30, 2008 @03:15PM (#25573787)
    I think the BitTorrent protocol is great and reflects the nature of the internet quite well. On top of that it's very efficient, so if you want a realistic speed test of your downlink there is nothing better than grabbing a Linux distro (or any other large free software package) as a torrent. Well, unless your provider uses p2p 'traffic shaping', but then you don't need a speed test ;)

    Anyway, on MS and Apple using P2P: the main problem would be publicity, I think. Blizzard distributes their patches via torrents (with HTTP fallback, best of both worlds!) and if I recall correctly people were really upset at first, the reasoning going along the lines of "I'm paying for this, so you damn well pay for the bandwidth". Personally I think that's stupid. Yes, a company should not rely only on BitTorrent distribution, but they should definitely offer it. It's in everyone's best interest: HTTP downloaders have to share the bandwidth with fewer people, torrent downloaders get vastly improved download speeds, and the company saves money.
  • by spitzak ( 4019 ) on Thursday October 30, 2008 @03:29PM (#25573989) Homepage

    I have the problem that caused a huge amount of discussion here before, where the disk in the laptop repeatedly parks and then wakes a few seconds later to write something. I have used up about 1/3 of the lifetime limit on the disk for these parks. I have updated from Feisty to Gibbon and Hardy and it is still there.

    I have a script that turns off the disk timeout to fix this. However, I have to run it manually every time I wake the machine from sleep. I have followed the instructions to install it in init.d and for power on/off, but it appears to have no effect: either it is not run, or its changes are undone after it runs. This is very frustrating, and I see absolutely zero information about this.

    I heard there was some analysis of why Windows does not have this problem, and it was determined that Windows reads the disk *continuously* until it sleeps. This led the manufacturers to set the park timeout very low, so it is, in effect, a fast "Windows is sleeping" detector. Linux uses the disk far less often, but not zero, so the result is worse for the disk. It would be really cool if Linux actually stopped touching the disk at all when nothing is happening, but I realize this is difficult, so I think an acceptable solution is to replicate how Windows behaves.

    In any case, despite all the yakking about this earlier, I see no comments. Have they addressed this or not? Is it ever going to be fixed? Or can somebody tell me how to debug the init stuff and find out why my script has to be run manually?
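
    Not an answer to whether it has been fixed, but a minimal sketch of the usual hdparm-based workaround as something you could call from a resume hook instead of (or as well as) init.d, since init scripts don't re-run after a suspend/resume cycle. The device name, the APM value 254, and the hook locations mentioned below are assumptions on my part; the poster's own script isn't shown.

        #!/usr/bin/env python
        # Hypothetical resume hook: raise the drive's APM level so the heads stop
        # parking every few seconds (hdparm -B 254 is roughly "maximum performance
        # without disabling power management entirely"). Needs root to run.
        import subprocess
        import sys

        def set_apm(device="/dev/sda", level="254"):
            return subprocess.call(["hdparm", "-B", level, device])

        if __name__ == "__main__":
            sys.exit(set_apm())

    On setups that use pm-utils, a wrapper for this dropped into /etc/pm/sleep.d (handling the resume and thaw arguments) should fire after every wake-up; older acpi-support installs used /etc/acpi/resume.d for the same purpose, which may explain why an init.d-only install appears to do nothing after sleeping.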

  • by NereusRen ( 811533 ) on Thursday October 30, 2008 @03:44PM (#25574201)

    You were modded funny, but this may be the most interesting change I've read so far. I would have described that bug in exactly the same way... I can't believe how silly some of the button, toolbar, and other interaction semantics in GTK are, compared to Qt. If they're changing them, I'm interested!

  • by KWTm ( 808824 ) on Thursday October 30, 2008 @04:35PM (#25574917) Journal

    Here's another question: does Kubuntu have an LTS (Long-Term Support) version yet? Ubuntu 8.04 (the GNOME version) was LTS, but because the KDE developers decided to drop everything KDE3-related and go running after the KDE4 Holy Grail, Kubuntu 8.04 was not LTS. So now it's October, the next version of Ubuntu is out, and KDE 4 has been upgraded to KDE 4.1; do we have an LTS yet? Or will we have to wait four more versions, until Ubuntu 10.4 (Muckraking Manatee), before we get an LTS?

    Personally, I don't want to upgrade to KDE 4 because I just want my desktop environment to help me get things done and stay out of the way. I don't want to learn a new way of doing things. I hope Kubuntu 8.10 comes with a KDE 3 version.

  • by eennaarbrak ( 1089393 ) on Friday October 31, 2008 @01:09AM (#25579901)
    I'm sure using a Live CD would have helped him understand the problems better, but his central point is that even Ubuntu still has compatibility issues, and the primary reason is that the Linux community is splintered into different communities, none of which has enough market share to justify vendors committing the resources to fix these problems.
