Experiences with Replacing Desktops w/ VMs? 442

A user asks: "After years of dealing with broken machines, HAL incompatibility, and other Windows frustrations, I'd like to investigate moving to an entirely VM-based solution. Essentially, when an employee comes in in the morning, have them log in, and automatically download their VM from the server. This gives the benefits of network computing, in that they can sit anywhere, if their machine breaks, we can instantly replace it, etc., and the hope is that the VM will run at near-native speeds. We have gigabit to all of the desktops, so I'm not too worried about network bandwidth, if we keep the images small. Has anyone ever tried this on a large scale? How did it work out for you? What complications did you run into that I probably haven't thought of?"
This discussion has been archived. No new comments can be posted.

  • No 3D (Score:4, Interesting)

    by sarathmenon ( 751376 ) <srm@[ ]athmenon.com ['sar' in gap]> on Wednesday August 16, 2006 @11:19PM (#15924740) Homepage Journal
    There are a lot of complications using a VM - there's no 3D, no good audio, etc. Plus, if your base computer does not fit into the HAL, you can't expect much out of the VM. I am actually surprised at this - a VM will give you the benefit of portability, but if that was your goal you'd be better off giving a laptop to all your employees.
    • Re:No 3D (Score:5, Insightful)

      by Jugalator ( 259273 ) on Thursday August 17, 2006 @05:02AM (#15925562) Journal
      "there's no 3D, no good audio etc"

      These two are often not an issue in corporate environments though.
      Sure, some exceptions depending on what kind of work you do, but still exceptions.
    • Re:No 3D (Score:5, Informative)

      by SavvyPlayer ( 774432 ) on Thursday August 17, 2006 @06:21AM (#15925692)
      Under VMWare Player, the video drivers included in the latest version of vmware-tools do support partial hardware-accelerated 3d. From the site:
      Experimental support includes the following limitations: Workstation accelerates DirectX 8 applications, and DirectX 9 applications which use only the DirectX 8 subset. Performance/speed of 3D applications is not yet optimized. OpenGL applications run in software emulation mode. Not all aspects of 3D acceleration are enabled. Some 3D features that are not yet accelerated include: pixel and vertex shaders; multiple vertex streams; hardware bump-mapping and environment mapping; projected textures; and 1-, 3-, or 4-dimensional textures.
      This support is only going to improve over time.
  • Um, wouldn't a ... (Score:5, Interesting)

    by Bin_jammin ( 684517 ) <Binjammin@gmail.com> on Wednesday August 16, 2006 @11:20PM (#15924748)
    thin client be a cheaper and easier solution per seat?
    • by OriginalSpaceMan ( 695146 ) on Wednesday August 16, 2006 @11:24PM (#15924766)
      Plus, on a LAN, using thin clients will be just as fast, visually, as a local PC. Hell, I play videos over my RDP thin clients and it works quite well.
      • Re: (Score:3, Interesting)

        by t1n0m3n ( 966914 )
        "Plus, on a LAN, using thin clients will be just as fast, visually, as a local PC. Hell, I play videos over my RDP thin clients and it works quite well."

        Funny you should make that statement; video via RDP on my locally connected 100Mbps link is absolutely horrible. I have several computers, and I use RDP to access them all. Every time I try to watch a video, I find myself copy/pasting the link to my local computer to actually watch the video.
        • by moro_666 ( 414422 ) <kulminaator AT gmail DOT com> on Thursday August 17, 2006 @01:36AM (#15925146) Homepage
          Hmm, I used Debian Linux on this setup, with a clunky Realtek 3189 network card, and my video over the Xv extension of the X server worked flawlessly, sound came through aRts over the net, everything just worked.

          It's down to the configuration; the network itself can do it.
        • Re: (Score:3, Interesting)

          by quantum bit ( 225091 )
          Funny you should make that statement; video via RDP on my locally connected 100Mbps link is absolutely horrible.

          Ironically, I just discovered that video over X11 on a 100Mbps link worked much better than I expected.

          I recently converted my old desktop into a pseudo docking station for my laptop, since the laptop is a faster machine but lacks the dual DVI connectors of a real video card. The laptop gets connected via a crossover cable to the desktop -- the laptop only has a crappy built-in 100Mbps Realtek. Eve
    • Re: (Score:3, Interesting)

      Thin client is a much better solution. I devised a kiosk system to serve a web-based application to multiple branch offices for the Dept of Finance in NYC. They have about 1,200 thin clients across four boroughs that connect to a handful of Windows terminal servers, and it works like magic. The cost of the thin clients is cheap, you can find them for around $200, and they perform just fine at 100Mbit. The trick is managing memory and CPU limitations when using a Windows solution, but with the new 4x4s, lar
  • Citrix (Score:4, Interesting)

    by Bios_Hakr ( 68586 ) <xptical&gmail,com> on Wednesday August 16, 2006 @11:22PM (#15924761)
    Sounds like you want something like Citrix.

    Although, what you could do is automagically have a standard WinXP workstation login on startup. Next, have VMWare in the startup folder so that it begins as soon as the computer logs in. Finally, have VMWare point to a disk image loaded on your server. The employees will then see a full-screen VMWare ready to authenticate on the network and begin their day.

    If you really wanted to be fancy, have that image automagically map to a network drive on your SAN/NAS as the D:\ drive. Tell employees to use the D:\ drive to store all work-related documents.
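    A very rough sketch of what that logon-time glue might look like, assuming Python is available on the Windows host and VMware Player is on the PATH; the share names, the desktop.vmx path, and the vmplayer.exe invocation are illustrative placeholders rather than a tested recipe:

      import getpass
      import subprocess

      USER = getpass.getuser()
      VM_CONFIG = rf"\\fileserver\vms\{USER}\desktop.vmx"  # assumed UNC path to the user's VM config
      DATA_SHARE = r"\\nas01\workdocs"                     # assumed SAN/NAS share for work documents

      def start_session():
          # Map D: to the documents share so work files land on the server, not the local disk.
          subprocess.run(["net", "use", "D:", DATA_SHARE, "/persistent:no"], check=True)
          # Launch the VM; the guest then presents its own network logon to the user.
          subprocess.run(["vmplayer.exe", VM_CONFIG], check=True)

      if __name__ == "__main__":
          start_session()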

    It could work. But you'd be looking at maybe 5 minutes for the morning boot-up. Not to mention all the employees hammering the network for a 2-4GB image at 7am will really thrash the servers.

    If you insist on doing this, go a bit further. Activate that WoL crap and autoboot the workstations at staggered times between 6am and 7am.
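    For what it's worth, the staggered wake-up part is easy to script: a WoL "magic packet" is just 6 bytes of 0xFF followed by the target MAC repeated 16 times. A hedged sketch (the MAC list, delay, and broadcast address are placeholders):

      import socket
      import time

      WORKSTATION_MACS = ["00:11:22:33:44:55", "00:11:22:33:44:56"]  # hypothetical workstation MACs
      STAGGER_SECONDS = 120  # wake the next machine two minutes after the previous one

      def wake(mac):
          # Standard magic packet: 6 x 0xFF, then the MAC address repeated 16 times.
          payload = bytes.fromhex("FF" * 6 + mac.replace(":", "") * 16)
          with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
              s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
              s.sendto(payload, ("255.255.255.255", 9))  # UDP port 9 ("discard") by convention

      for mac in WORKSTATION_MACS:
          wake(mac)
          time.sleep(STAGGER_SECONDS)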
    • Re:Citrix (Score:5, Informative)

      by discord5 ( 798235 ) on Thursday August 17, 2006 @12:25AM (#15924938)

      Sounds like you want something like Citrix.

      Citrix (or another similar product) is exactly what he should be looking into. Downloading entire disk images over a network is just a pain in the ass every time someone boots. Citrix isn't the solution to all things, but it beats VMs for most practical applications.

      But you'd be looking at maybe 5 minutes for the morning boot-up. Not to mention all the employees hammering the network for a 2~4gb image at 7am will really thrash the servers.

      See, that's the big negative point in the entire setup. The bootup time is a pain in the neck, but people can live with that easily. They'll fetch their cups of coffee, have the morning conversation with coworkers, and will return about 10 minutes after their machines have booted up. The real issue is the server getting hammered every morning, slowing these boot times as more machines get added to the network.

      I can hear it now: set up a second server, set up a third... etc. Yes, set up a bunch of servers that do nothing all day but hand out images, and don't forget about the backup servers (you don't want one of those servers to crash in the morning and take out the entire accounting department). I'm seeing an entire rack of machines at this point doing nothing but handing out images, wired up to really expensive network gear, doing nothing really useful. Don't get me wrong in this last statement: the usefulness of this construction is that you can easily exchange PCs and images without having to worry about hardware, software installed on each user's PC, etc. But there are a lot of more cost-effective ways of achieving something that works similarly.

      Take that budget for those image servers, backup servers, VM software licenses, and network gear, and buy a single server and a good backup mechanism (or a backup server in failover). Spend some time on setting up profiles and think about what software is present on all machines. Take an image of every machine you install differently, and copy that to the server. Buy software like Citrix (or anything else resembling it) to have special applications available on one server (think backups here), and you have a pretty decent solution that doesn't hammer your network/servers every morning or give you a headache by 10am because some people aren't getting their images.

      I've seen the concept of VM images on a server, and I've seen people get bitten by it because they didn't foresee the amount of storage and network traffic involved. Most of these people didn't have a need for such an elaborate solution. Hell, I've seen half a server farm run VMware because "it was a good way to virtualize systems, and make things easily interchangeable" while those people would've been much more satisfied with a "simpler" failover solution (note those quotes, denoting that failover also requires thought, but usually ends up being a cheaper solution hardware-wise).

      On top of it all, using VMs for desktop operating systems uses up a lot of resources. You're running an operating system that runs software that runs another operating system. Some would say it's hardly noticeable, but why waste the resources? You'll make today's hardware run like last year's, which for most applications is not an issue, but most likely you're going to run last year's hardware like hardware from two years ago, because otherwise you'd have to invest in new desktops for the entire company.

      Let's talk mobility for a moment. Imagine your salesman with his laptop and flashy UMTS (or whatever standard they've cooked up) connection on the road. He's going to want to be able to check his mail on the road, so he'll have to get an image over a connection that can hardly manage streaming video... Nope, you're going to give him his operating system, install his software, and pray to God he doesn't send too many large documents over that very expensive UMTS connection. That sort of starts breaking the principle of having images f

      • Re:Citrix (Score:5, Interesting)

        by Nefarious Wheel ( 628136 ) on Thursday August 17, 2006 @01:20AM (#15925103) Journal
        What about staging the images overnight and keeping the backup image on the user's local C drive? If the network's up, use it to update the local cache overnight, as needed. If the network's down, use the cached image. You don't have to refresh the image daily, just when you want to make a change. The beauty is in ease of rollback when someone stuffs up a change on the client.
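        A rough sketch of that overnight refresh, assuming rsync is available and the image is published from a server as an rsync module; the paths and host names are placeholders. If the sync fails, whatever is already cached gets used as-is:

          import subprocess
          from pathlib import Path

          SERVER_IMAGE = "imgserver::vmimages/standard-desktop.vmdk"  # hypothetical rsync module/path
          LOCAL_CACHE = Path(r"C:\vmcache\standard-desktop.vmdk")     # locally cached copy of the image

          def refresh_cache():
              LOCAL_CACHE.parent.mkdir(parents=True, exist_ok=True)
              try:
                  # rsync's delta transfer means an unchanged image costs almost nothing to "refresh".
                  subprocess.run(["rsync", "--inplace", "--partial", SERVER_IMAGE, str(LOCAL_CACHE)],
                                 check=True, timeout=4 * 3600)
              except (subprocess.CalledProcessError, subprocess.TimeoutExpired, FileNotFoundError):
                  print("Network or server unavailable; falling back to the cached image.")
              return LOCAL_CACHE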
      • It's not so much thrashing the servers as it is saturating their network cards. Assuming that you're using a standard image across all desktops, a server with a large amount of RAM caching the image and a few of the new 10Gbps NICs into your routers and switches should be able to handle it without too much of a problem. If you're using PXE booting, the DHCP server should be set up with several of the different IP blocks pointing to different addresses on the same boot server (different NICs).

        Like suggested
      • Re:Citrix (Score:5, Interesting)

        by pe1chl ( 90186 ) on Thursday August 17, 2006 @03:57AM (#15925443)
        What you forget with your Citrix solution is that you move the problem from the network to the CPU and memory.
        When you have an entire office full of modern PCs (say with 512-1024 MB of RAM and a 2-3 GHz class CPU) you are wasting a large amount of real estate when you run ICA Client on all those and make the people work on one or a few Citrix servers where they all have to compete for a few CPUs and a lot less memory.

        Citrix is nice, but it is not the answer to everything. When the users run intensive or inefficient applications, it can be a severe performance problem.
        The solution he has in mind does not have that problem, because his applications run locally so they utilize the local resources available on the desktop.

        People actually use wakeup on lan on desktops?

         Yes, we use WoL to wake up Windows workstations on the weekend (or at night, in emergencies) and install/update software or hotfixes.
         So the user is not bothered with wait times or reboots after application installs.
        • Re: (Score:3, Informative)

          by misleb ( 129952 )

          Citrix is nice, but it is not the answer to everything. When the users run intensive or inefficient applications, it can be a severe performance problem.
          The solution he has in mind does not have that problem, because his applications run locally so they utilize the local resources available on the desktop.

          Some old-fashioned roaming profiles and Ghost (or some other imaging solution) action would seem to be the perfect compromise. Local CPU gets utilized. Network traffic is minimal. Users get good performa

    • So use rsync to download the images. Why on earth would you use WinXP just to start up VMware? Just do a quick Linux install of any distro and use VMware for Linux. In my experience the Linux version is much better anyway (for whatever reason).

      One caution: with some distros VMware has to actually interpret a lot of programs (due to guest/host VM spaces overlapping). On these system combinations VMware will be *really* slow... basically anything Fedora. On other systems you're talking maybe 5% averag
    • Re:Citrix (Score:5, Informative)

      by hdparm ( 575302 ) on Thursday August 17, 2006 @01:11AM (#15925075) Homepage
      Actually, with VMware Workstation you can keep base images on the workstation itself and load only the user's plugins / redo stuff over the network. That's what we do, and we don't see any network hit on a 100Mbps LAN. This gives you the ability to run a free (as in beer) Linux distro on all your workstations, which enormously helps with PC support issues compared to any Windows version. With a bit of clever scripting, a KDE session retrieves all the necessary stuff from a MySQL backend and users have their workstation (Windows, Linux, whatever) running full screen in no time. With good PC hardware (which is dirt-cheap these days) no one can tell that what they see and work on is a VM.

      Granted, for large network this solution is probably too expensive (we are .edu so we get really nice discounts on VM licenses).
      • Re: (Score:3, Informative)

        by gmplague ( 412185 )
        I don't know what VMware has, but qemu [bellard.free.fr] has support for its own COW (copy-on-write) disk image format. Essentially, you give it a base disk image, and then any changes to it are written to a special file. When the machine is loaded, this "diff" is applied on top of the base image, and you have the full altered system. The advantage is that the COW (the diff) image is much smaller than the whole filesystem.
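        As a concrete, if simplified, illustration of that layout using qemu-img's real backing-file mechanism: one shared read-only base image plus a small per-user overlay that records only that user's changes. The paths and the qemu invocation below are assumptions (the emulator binary is named qemu-system-x86_64 on newer installs):

          import subprocess

          BASE_IMAGE = "/srv/images/winxp-base.img"          # shared read-only base install (raw format)
          USER_OVERLAY = "/home/alice/winxp-overlay.qcow2"   # per-user copy-on-write layer

          # Create the overlay once; it starts out tiny and grows only with that user's changes.
          subprocess.run(["qemu-img", "create", "-f", "qcow2", "-F", "raw",
                          "-b", BASE_IMAGE, USER_OVERLAY], check=True)

          # Boot against the overlay; writes land in the overlay, never in the shared base image.
          subprocess.run(["qemu", "-hda", USER_OVERLAY, "-m", "512"], check=True)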
  • by scubamage ( 727538 ) on Wednesday August 16, 2006 @11:23PM (#15924763)
    Get some Sun Microsystems SunRays. Seriously... that's exactly how they work. Your session can be saved on the server and resumed anywhere else you plug in your smart card. One server and all of the terminals you need.
    • Re: (Score:2, Informative)

      by Sampizcat ( 669770 )
      We used Sunrays at my old workplace. They worked fine and were very reliable - just throw in your card, put in your password and away you went. I highly recommend them.

      And no, I don't work for / am not in any way affiliated with Sun Microsystems - I just really like their product.

      Sampizcat
    • by boner ( 27505 ) on Thursday August 17, 2006 @02:21AM (#15925248)
      Exactly!

      This is brought to you from a SunRay at home, talking to the server in the garage...

      Combined with Tarantella, you can have every Windows application you want. The latest revision of the SunRay server also works on Linux (RedHat I think)!

      I run my Windows apps in QEMU, but that is because only my wife and I share the SunRay server... (2.4GHz P4, 3GB RAM). From a user's perspective it's just perfect! Power on in the morning, insert your card, log in, and last night's session is still there. I just upgraded to the latest OpenSolaris build so I had to reboot the machine, but before that my machine had reached 317 days of uptime!

      In an office environment your mileage will vary, but I have always appreciated the silence of my office working on a SunRay.

      Regarding the GP, downloading VM images just doesn't make sense compared to a SunRay, especially if you already have GB ethernet. Make sure the servers have enough RAM and don't let them play Quake!

      (and yes, I work for Sun...)
  • Look at LTSP.ORG (Score:5, Informative)

    by EDinNY ( 262952 ) on Wednesday August 16, 2006 @11:25PM (#15924776) Homepage
    LTSP.ORG does something similar. You run X clients on a common "server" and view them with an X server on almost anything with 64 megs or more of memory.
    • by Wizarth ( 785742 )
      I wouldn't even bother with the 64MB of RAM; I've run scrap PCs as LTSP clients with 16MB, although I didn't work them hard.

      No applications run locally (except X), so there isn't even the normal desktop manager overhead.
    • Look at Edubuntu (Score:4, Interesting)

      by iamcadaver ( 104579 ) on Thursday August 17, 2006 @07:57AM (#15925977)
      The edubuntu distribution is basically a plug-n-play instant LTSP environment.

      I use it for junk laptops with busted hard drive controllers. I just wish wireless network cards had boot PROMs; I'm using MMC/SD cards to bootstrap.
  • by steppin_razor_LA ( 236684 ) on Wednesday August 16, 2006 @11:30PM (#15924791) Journal
    I'm a VMware/virtualization fan, but I don't think this is the best application. It seems to me that it would be smarter to use Terminal Services / Citrix / a thin client approach.

    If you were going to use VMware, make a standard image and push it out to the local hard drives. Don't update that image unless it is time to push out a new set of Windows updates/etc. If you need to update the image, though, that is going to be *hell* on your network/file servers.

    I think it makes more sense to run a virtualized server than a desktop.

    Also, you might end up paying for 2x the XP licenses since you'd have to pay for the host + guest operating systems.

    • by Vancorps ( 746090 ) on Thursday August 17, 2006 @01:43AM (#15925167)

      First off, I agree with you that this isn't a good application of a VM considering the number of alternative options that exist already. The one area I will disagree with is the licensing, since you're in no way required to run Windows as your host OS. Just run a Linux-based host OS and problem solved. VMware runs just as well on both. I'm not sure about other options like Virtual PC or QEMU, but last I checked QEMU only worked on Linux, so you're still in a good position not to have to throw more money at Windows licensing.

      Side topic: licensing has really gotten out of hand with pretty much every piece of commercial software. I think that's the real reason a lot of people are moving towards Linux. The learning curve required to administer Linux effectively is outweighed by the complicated licensing schemes of various companies, Microsoft especially. It is quite a challenge staying in compliance these days.

      Back on topic: you could have a file server or three dedicated to the task, using a DFS root to link them logically and to keep them synchronized. Then you wouldn't have to worry about pushing images killing server performance. Combined with network load balancing you could scale out as needed.

      • Good points all around. :)

        I've never managed a linux desktop rollout so I don't know how hard it would be to manage host OS deployments (i.e. is the Windows XP HAL more or less resistant to hardware changes than your average distro?)

        I'm not completely sure that I buy the argument of distributing the images across multiple servers solving the "image distribution" problem (i.e. 20+ people simultaneously pulling down 4GB images seems like a lot of network traffic), but I suppose that is just a matter of math +
        • 4GB images over a gigabit link wouldn't take that long to transfer. The obvious bottleneck is network connection for the servers. This is mitigated with 10gigabit links on the servers but now the price of the setup is getting more and more expensive. Still not unreasonable though since you can get 10gigabit modules for all the modular HP procurve switches out there for less than you probably think. I know it shocked me but I went all HP because Ciscos are needlessly expensive and don't provide me any additi

  • Still Windows (Score:2, Interesting)

    by klaiber ( 117439 )
    Well, you'd still be running Windows (if that's your poison), and so your users would still be subject to (say) all the Outlook or Explorer weaknesses and exploits. The main upsides I'd see are
    (a) presumably all VMs have the same device model, so you'd be running the same image everywhere, and
    (b) assuming you carefully partition out the users' data to a different volume, you can give them a "fresh" virtual machine (a fresh Windows registry!) every time.

    Nice and useful, but still not bomb-proof.
    • Re:Still Windows (Score:5, Informative)

      by Vancorps ( 746090 ) on Thursday August 17, 2006 @01:30AM (#15925134)

      You don't need to "carefully" do anything. Folder Redirection in Windows was created just for this task. It's a feature that was introduced with Windows 2000. Beyond that, you can use SMS and custom Office installs to have everything configured properly every time someone logs in. Mandatory profiles ensure that everything stays clean and spyware-free. Which weaknesses are you referring to?

      Beyond that, I'll go and say that this approach is bomb-proof, and by redirecting files onto the servers (which requires surprisingly little overhead) you ensure that when users float from machine to machine they have all their application preferences and data. Settings can vary from machine to machine with different versions of software and whatnot, but again, SMS will fix that.

      I think we can all agree this is not a good use of virtualization. It would be very resource-intensive, and a simpler PXE solution already exists. With PXE you don't even have to have all the same hardware, just the proper drivers. SMS will take it from there, installing the rest of the third-party apps, whatever they may be. It can be done from start to finish in under 30 minutes, which is about how long it takes to fully restore an image. Of course, over a gigabit link the time might be reduced, but Windows will take a good 10-15 minutes to install over the network, so it wouldn't be unreasonable for everything else to take another 15 minutes depending on how much there is. I know in my basic setup with Windows and Office it's about 20 minutes, give or take, depending on processing speed and quality of hardware.

  • The way we do it... (Score:4, Informative)

    by DarkNemesis618 ( 908703 ) on Wednesday August 16, 2006 @11:32PM (#15924800) Homepage
    Where I work, we have a domain, so a user can log onto any computer and have their email & favorites all set up. Their profile automatically maps their departmental network drives and their personal network drive (where they're supposed to save their documents). The normal programs are installed on every machine, and it's not hard to temporarily install any special programs they need on the machine they use in the event theirs is unusable. The only issue we have is that for some reason, no matter how much we tell them to save on the network, they apparently refuse to listen and save stuff on their hard drive. And then they subsequently blame us if their hard drive dies and they lose data. But that's another story.
    • Re: (Score:3, Informative)

      by ejdmoo ( 193585 )
      Configure folder redirection. Then the "My Documents" folder will be on the network, and users won't have to know anything special to save there.

      The desktop is still a problem though.
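      In practice folder redirection is configured through Group Policy rather than by hand, but as a rough illustration of what it boils down to on the client, here is a sketch that points the current user's "My Documents" shell folder at a network path (the share path is made up):

        import winreg

        DOCS_TARGET = r"\\fileserver\home\%USERNAME%\Documents"  # hypothetical redirected location
        KEY = r"Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders"

        with winreg.OpenKey(winreg.HKEY_CURRENT_USER, KEY, 0, winreg.KEY_SET_VALUE) as key:
            # "Personal" is the value Windows uses for the My Documents location.
            winreg.SetValueEx(key, "Personal", 0, winreg.REG_EXPAND_SZ, DOCS_TARGET)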
      • by killjoe ( 766577 )
        The whole Windows profile thing is just a pile. God help you if you ever want to change your domain, for example. How long ago did Unix invent the idea of mounting your home directory on a network server, anyway?
      • The "desktop" doesn't even matter if you are doing real profile redirection. The C:\Documents and Settings\%USERNAME% folder gets redirected to a location that the domain admin chooses. Put it on a file share on a server and be done with it. You can't really do anything about the local drive, especially if you have sloppy legacy programs that stupidly require local admin access.

        Regular users don't need local admin rights on their computer anymore with most apps, but it possibly may require descending int

  • I'm not experienced with a VM setup like the one you describe, but let me offer this - if you have them download their images every morning you may run straight into a brick wall. Performance testers call this "the 9am syndrome", and you'll need some fairly serious server bandwidth to handle everyone copying such a large file. This will turn your network, and the disc you're serving the images from, into a seething pile of molasses. OK maybe I'm being a shade gloomy, but I'd recommend not going the downloa

    • by grcumb ( 781340 )

      "Performance testers call this "the 9am syndrome", and you'll need some fairly serious server bandwidth to handle everyone copying such a large file. This will turn your network, and the disc you're serving the images from, into a seething pile of molasses."

      One word: Multicast [wikipedia.org].

      I've seen a room full of PCs simultaneously boot and load the same ~1GB Linux partition on a 100Mb network in no time. If they hadn't told me how it was working, I'd never have known they weren't loading a local partition.
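      The reason that works is that the sender pushes each block once and every receiver that has joined the group gets it; tools like udpcast add reliability and ordering on top. A minimal sketch of the underlying socket plumbing (the group address, port, and chunking are arbitrary choices):

        import socket
        import struct

        GROUP, PORT = "239.1.2.3", 5007  # arbitrary administratively scoped multicast group

        def send_chunk(data):
            s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
            s.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)  # keep traffic on the LAN
            s.sendto(data, (GROUP, PORT))

        def receive_chunk():
            s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
            s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            s.bind(("", PORT))
            mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
            s.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
            data, _ = s.recvfrom(65535)
            return data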

  • My experience... (Score:3, Informative)

    by Starbreeze ( 209787 ) on Wednesday August 16, 2006 @11:41PM (#15924827) Homepage
    I needed a quick and cheap solution for some Windows machines for our QA group to test things on. We bought some VMWare Workstation licenses and ran 6 VMs running XP on each beefy machine. (About the limit for a machine with 4GB RAM) Granted, there are better VM solutions than Workstation, but we wanted cheap and quick. Don't count on it for anything mission critical. About every two weeks, VMWare would basically eat itself and the Linux box. However, it was easy from a maintenance point of view, because I could VNC in and see all 6 VMs at once. Also, since VMWare has a cloning feature, anytime QA infected the machines with something nasty, or just pissed off XP, I could re-clone it. Also remember that any VM hogging resources can slow down other VMs on the same host.

    However, for the context that you are speaking about, I would take the advice of individuals below and look at Citrix or roaming profiles.
  • You can have remote profiles, and even link the desktop, my documents etc to remote folders.
    Why go through the overhead of a VM? Citrix is one idea, but the most efficient thing is to just make their profiles remote.
  • Enterprise Desktop (Score:3, Interesting)

    by phoebe ( 196531 ) on Wednesday August 16, 2006 @11:47PM (#15924840)
    Enterprise Desktop was recently announced by VMware; it sounds closer to what you are looking for.

    Enterprise Desktop Products

    Support the needs of a global workforce by providing virtualized computing environments to enterprise employees, partners, and customers to secure access and manage resources. Provide a full PC experience without compromising security or hampering users. Improve desktop manageability, security, and mobility by applying virtualization technologies to client PCs and the data center for server-hosted desktops.

    http://www.vmware.com/products/enterprise_desktop.html [vmware.com].

    • by mrbooze ( 49713 )
      Yes, VMWare definitely pushes solutions like this pretty heavily. I would recommend contacting VMWare and/or researching their offerings to see how they are architected. (You don't have to *use* VMWare, of course, you can just get an understanding of concepts/tools/practices and look at other vendors or open source solutions as well.)

      I saw a presentation by a representative of one of Cook County's departments that was deploying some solutions like this. VMWare's Enterprise Desktop running on servers in t
  • I can't give you the exact details on how this would be done because I haven't actually tried it, but it should be workable.

    The idea is that all your desktop machines would be running a minimal Linux install that can easily be replaced on short notice using various imaging techniques.

    Basically these machines would have just enough to run a graphical login; after a user logs in, it runs a script that fetches that user's QEMU disk image from some network drive and puts it on a local hard disk. It would th
    • by Sancho ( 17056 )
      Everyone goes on and on about how great QEMU is, but frankly, I think it's almost useless on Linux. The free VMWare solutions are faster and more stable by orders of magnitude (with or without the kqemu kernel module) if you need a full Windows environment, and for other emulation applications, there are generally better solutions like Wine or the various DOS emulators.

      QEMU is painfully slow even on a beefy machine and with the kernel module. IO with qcow is pretty bad, too (it's tolerable with raw disk a
  • by maggard ( 5579 ) <michael@michaelmaggard.com> on Thursday August 17, 2006 @12:00AM (#15924868) Homepage Journal

    So a lot of expensive desktops emulating, um, pretty much themselves, using funky, somewhat pricey software, running substantial images pulled off of expensive servers over an expensive network (because gigabit or not, a building full of folks starting up in the morning is gonna hammer you). Then comes the challenge of managing all of those funky images, reconciling the oddities of an emulated environment, etc.

    Could you make it work? Sure. But I gotta wonder if it'd be worth it.

    Is it gonna be any better than a well-managed native environment? Or going with Citrix clients? Or Linux/MacOS/terminal (choose your poison) boxes instead of MS Windows?

    I hear your pain, I just think you're substituting a known set of problems with a more expensive, more complex, more fragile, baroquely elaborate, well, more-of-the-same.

    It doesn't sound like much of an improvement really, just new and more complex failure modes, at extra cost.

    Though, I guess, if you're looking for a new, challenging, and complex environment this would be it; just take your current one and abstract it another level. I wouldn't want to be the one footing the bill, or trying to rely on any of it, but at least it'd be something different.

    • Re: (Score:2, Informative)

      by GreatDrok ( 684119 )
      We have bought a number of quad opteron machines recently because we do a lot of background number crunching and they need to run Linux. However, everyone has also been using laptops for Windows software. At my suggestion we have been configuring VMware images of XP Pro with Office for each user and installing vmware-player on each of these Linux workstations.

      We have a Linux server that runs Samba for roaming profiles to the current Windows laptops and this works OK as it does mean if a laptop dies the us
    • by innate ( 472375 )
      Could you make it work? Sure. But I gotta wonder if it'd be worth it.

      One advantage would be that you can revert the user to a pristine image each day. Not that this is the way to do it; it sounds like the OP should investigate something more like Citrix.
  • Back in school... (Score:4, Insightful)

    by SvnLyrBrto ( 62138 ) on Thursday August 17, 2006 @12:05AM (#15924884)
    They just used NIS and NFS, and the net effect was pretty much exactly what you describe... Sit down at any machine, log in, and your environment loads exactly the way you left it on the last machine; everything's safely backed up at the server end; the client machines are pretty much disposable and interchangeable; and so on. The only difference is that you're not farting around with virtual machines... i.e. you're not quite as "cutting edge", but on the desktops themselves, don't you want a more proven system? So why wouldn't you just do the same thing, and use said proven, if something of a pain to administer, system?

    As an alternative to NIS, Netinfo does much the same thing, only it wasn't designed by people quite so sadistic as NIS. You'd still be using NFS though...

    cya,
    john
  • by prisoner-of-enigma ( 535770 ) on Thursday August 17, 2006 @12:06AM (#15924886) Homepage
    First off, I don't think VM'ing your desktops is the answer. Current VMs really dumb down the hardware. You lose 3D, sound, and most of them run a bit slower than native (some quite a bit slower). Couple that with the size of most VM images (my Vista image is about 12GB) and you're really looking at a poor solution.

    Here's what you should be thinking about:

    - Get some kind of desktop management suite like Altiris. You can push software deployments easily, and it's very easy to lock machines down to the point where users can't fsck them up. I've consulted for companies that do this with hundreds of desktops and it's a very robust, reliable system.

    - Go with a thin client setup like Citrix or Terminal Server. Users run nothing on their local hardware. Instead, everything runs on the big server. Downsides are similar to VMs (thin clients are notorious for very lightweight support for anything but the most basic sound and graphics), but you are at least spared the massive network thrashing of hundreds of users logging on and pulling down VM images at 8AM every morning.

    - If it's users messing up machines that you're worried about, you might want to consider a solution from ClearCube. They take away everything except the keyboard, mouse, and monitor. The guts of the PC reside in a server rack in what is essentially a PC on a blade. The blades are load-balanced and redundant, so swapping them out is a breeze. And users *can't* load software on them because there are no USB ports, no floppy drive... nothing! Unless you allow them to download it from the Internet, *nothing* is going to get on those machines if you don't want it to.

    VMs make sense for server consolidation. I don't think they've yet gotten to the point where desktops run on them as a form of protection or reliability. There are too many other solutions that work better and have fewer downsides. The problem here isn't Windows per se; it's the fact that your workstations aren't locked down properly to prevent your users from doing stupid stuff in the first place. Fix that and suddenly you'll find a Windows workstation environment isn't the hassle it once was.
    • by RShizzle ( 983535 ) on Thursday August 17, 2006 @01:30AM (#15925135) Homepage
      "You lose 3D, sound, and most of them run a bit slower than native."

      Not quite true. Yes on the 3D. But the two main players (VMware and VPC) both support sound, and VMware even supports USB 1.1 passthrough.

      With the thin-client option, Microsoft Terminal Services (if you're on a Windows platform) has good scaling capabilities. Though it might not go into the hundreds or thousands, it should get you into the high dozens. Since most of the Microsoft tools' DLLs are loaded once and shared between the clients, it has pretty good performance.

      For Linux, while SSH is always a favorite, look at NX servers (http://www.nomachine.com/ [nomachine.com] and http://freenx.berlios.de/ [berlios.de]), which are like X forwarding with compression and caching.

      It'll be difficult to have a fully virtualized solution. Going with thin clients, or a PXE-served image, would be a more viable solution (no matter how beefy your servers and how fast your network).

  • The only advantage over Citrix is that each user can be allowed to screw up his daily copy of the VMware machine.
    Otherwise Citrix and thin clients are probably better. Well, thin clients would always be better, for this too.
    Then you just revert to a known-good snapshot for the user every day. No copying.

    Patching would be difficult, as you would have to patch x VMs rather than x/30 Citrix servers.
  • Is there some special reason why the users need to have their own XP image? If not wouldn't it be easier to just force them to save their work on a network share and ghost the machines back to the stock image every night?
    • by Sancho ( 17056 )
      I'm glad someone said it. The VM solution solves hardware problems, though. With VMs set up in this way, you don't have to worry about workstation hardware failing--if it does, just give them another computer configured to load the VM, and they're off. With ghosting, the hardware can still fail, and your image might not work on the next piece of hardware (this is mitigated if all the machines in your company are identical, but even then, some incompatibilities can arise because often the guts of two iden
  • An "unsupported configuration"...
  • Smells like X (Score:3, Insightful)

    by Baloo Ursidae ( 29355 ) <dead@address.com> on Thursday August 17, 2006 @12:26AM (#15924943) Journal
    Sounds like you're trying to solve the same problem X11 [wikipedia.org] is designed to solve. Have you looked into getting a bunch of X terminals and one super-powerful machine?
  • What size are you expecting each image to take? Windows XP isn't exactly light on storage space, and applications for it can take another gigabyte without difficulty. Preloading a few gigabytes does take a bit of time; I suppose after that you'd use Windows file sharing.

    I think the previous comments about Citrix or such are a better solution. Terminal services, while not exactly cheap, may also work well for you. For a Unix environment, xdmcp is feasible in many circumstances. But as far as smart clients go, I'd
  • by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Thursday August 17, 2006 @12:54AM (#15925036) Journal
    Hmm. Your main issue is going to be switching machines. I see three ways of doing this:

    Some virtual machines let you suspend to a file. This is nice if you must run Windows, or some other uncooperative OS. But, that still means suspend to a file, which will take some time. As for the disk, that would be fairly trivial -- your host OS would be Linux over NFS, so your disk image is an NFS file.

    Issue to watch for here: local cache. I don't care how fast your gigabit is, that server is going to feel some stress. I tried setting up gigabit just for file sharing, and it was never as fast as it should have been (yes, I was using jumbo frames; it was just a crossover cable; yes, it was Cat6). And even if that's flawless, there's the server at the other end. You probably want good local caching, probably local disk caching. InterMezzo would have been good, but they've pretty much died. You might try simply throwing tons of RAM at the problem, or you might try cachefs (never got it working, but maybe...) or maybe one of the FUSE things.

    Second way: Don't use VMs. VMs will never be as fast as a native OS. But "native OS" can still work roughly the way the VM image does above, if your hardware is identical. With Linux and Suspend2, you can suspend and resume from pretty much anything you can see as a block/swap device. So, all of the above caching issues apply, but just run it as a network OS, have one range of IPs for machines still booting and logging in, and another for fully functional machines. Here, when the user logs in, the bootstrap OS tells itself to resume the OS image from the network.

    You could also do this with Windows by copying a local disk image around -- after you hibernate, boot a small Linux which rsyncs the whole disk across the network, including hiberfil.sys. Everything besides the OS itself would be stored over the network already anyway (Samba).

    I don't know if this will work -- after all, no hardware is truly identical. But it may be worth a shot.

    Advantage: Both Linux and Windows XP know to trim the image a bit on suspend, so it won't be a whole memory image, just relevant stuff. Truly native speed.

    Disadvantage: If I'm wrong, then you won't be able to properly resume on a different box.

    Finally, you could stick to software which supports saving sessions and resuming them. I know Gnome at least, and maybe KDE, had this idea of saving your session when you log out -- and telling all applications to do so -- so that when you log back in after a fresh boot, it's like resuming from a hibernate.

    Advantages: Fastest and most space-efficient out of all of them. Least administrative overhead -- in the event of a crash, there isn't nearly as much chance for bad stuff to happen. Easily works cross-platform, native speed on any supported platform. Simplest to implement, in theory.

    Disadvantage: Not really implemented. 99% of all software may remember useless things like window size and position, but very few programs actually store a session. If you mostly roll your own software, this may be acceptable.

    And of course, you could always do web apps, but those won't be anywhere near native speed -- yet.

    All approaches share one flaw, though -- bad things happen when a box goes down. With a VM image (or a suspend image), if you crash, you'll obviously want to restore from a working image -- but what about the files? If they're on a fileserver, does your working image properly reconnect to the fileserver, or does it assume it's still connected (thus having weird things cached)? The third option (saving sessions) is the safest here, because in the event of a crash, programs behave the same way they would on a single-user desktop. But you still lose your session.

    What others are suggesting -- various terminal server options -- is much slower, but it also means that as long as the application server is up, so is your session. If you crash, you can switch to another machine and literally be exactly where you
  • VMware is still a relatively unproven technology firm. Since they are pushing the virtualized desktop environment that you're interested in, they should be able to provide some references. VM technology has been around for a long time, but desktop-side VMs are something I'd be cautious of without the vendor being able to demonstrate that it actually works in a real-world environment.

    That being said, I think that the business case could be made. People have been trying to come up with the same result usin
  • VMware ACE (Score:4, Informative)

    by BlueLines ( 24753 ) <slashdot@divisio ... m minus language> on Thursday August 17, 2006 @01:04AM (#15925059) Homepage
    http://www.vmware.com/products/ace/ [vmware.com]

    "With VMware ACE, security administrators package an IT-managed PC within a secured virtual machine and deploy it to an unmanaged physical PC. Once installed, VMware ACE offers complete control of the hardware configuration and networking capabilities of an unmanaged PC, transforming it into an IT-compliant PC endpoint."
  • One of my clients is a small accounting firm... 15 Windows workstations, 4 Windows laptops, and 1 Samba file server. I have roaming Windows profiles in place and they are trained to save their work to the server. However, only a handful of the employees use QuickBooks, and they have to keep many different versions of QuickBooks installed. Same with other, much more expensive pieces of software.

    If I could virtualize the machines and install only 5 seats of quickbooks, etc they'd save thousands every year. But since I
    • by julesh ( 229690 )
      Question: what are the quickbooks licensing terms? Specifically, what are you allowed to do with an installation? I suspect the EULA grants the rights to install on one "computer". A VM is not a computer, it's a simulation of a computer that runs on a computer. This is fine, but if you then move the VM between computers, an additional copy of the software is created that you don't have a license for. You would probably find that legally speaking, you'd need just as many copies anyway.
    • by addbo ( 165128 )
      Each version of the VM needs to be licensed as well... so you're basically adding at least the cost of an XP Pro license for each VM... for each version of QuickBooks... plus you still need the QuickBooks license for each of the VMs. I don't think you can say that you just need 5 seats, because in essence you are still installing it on all the machines...

      If you wanted to make it only 5 seats... you would probably set up a terminal server with the software installed, but only allow 5 people at a
  • by julesh ( 229690 ) on Thursday August 17, 2006 @01:55AM (#15925200)
    You can't legally do this with Windows. The (bulk-licensing) EULA states that you are allowed to install Windows on one computer and one virtual machine *that runs on the same computer*. Moving the image from computer to computer is specifically prohibited, IIRC (yes, I've considered doing this before).
  • You have two main logical problems.

    1. You still need an OS to run VMware. If it is Windows, you get typical Windows problems, and if it is Linux, you will probably find that your hardware is not really compatible with anything but Windows.

    2. Do you want to use one image, or a different image per user? If you use one image, you will immediately run into license problems with the software. If you use several images, you need a lot of storage space, and you need to copy the images back in the evening.

    But most
  • by bazorg ( 911295 ) on Thursday August 17, 2006 @03:08AM (#15925357)
    Essentially, when an employee comes in in the morning, have them log-in, and automatically download their VM from the server
    Your goals may be better accomplished with a different approach.
    1. Build your standard, clean virtual image to use on all workstations.
    2. Set the /home dir as a remote share. Tell users to keep their files on that share.
    3. Have all workstations load VMware Player on startup, running a local copy of the virtual machine you built.

    Now you have most of the benefits you asked for: you can have users switch places at random, you can replace physical computers and set them all up with the same VM... you can even have them all run Windows on a Linux host if this helps prepare for "the big switch".
    As for maintenance of the VMs, you can remotely log in to any of the workstations and replace the old VMs with new ones when you need to update something. Occasionally you can wipe out all files that are kept on workstations to ensure that no kiddie p0rn is found, and to further illustrate that it is essential to keep all work-related files on the server as instructed in 2). :)
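    A rough sketch of what that startup glue could look like on a Linux host, assuming the master VM sits on a read-only server mount and /home comes from NFS; every path, the export name, and the vmplayer invocation are placeholders rather than a tested setup:

      import shutil
      import subprocess
      from pathlib import Path

      HOME_EXPORT = "fileserver:/export/home"        # hypothetical NFS export for user files
      MASTER_VM = Path("/mnt/images/standard-vm")    # server-side master copy of the VM directory
      LOCAL_VM = Path("/var/lib/vms/standard-vm")    # local working copy used by VMware Player

      # Step 2: users' files live on the remote share, not inside the VM.
      subprocess.run(["mount", "-t", "nfs", HOME_EXPORT, "/home"], check=True)

      # Step 3: refresh the local copy of the standard VM only when the master has changed.
      if not LOCAL_VM.exists() or MASTER_VM.stat().st_mtime > LOCAL_VM.stat().st_mtime:
          if LOCAL_VM.exists():
              shutil.rmtree(LOCAL_VM)
          shutil.copytree(MASTER_VM, LOCAL_VM)

      subprocess.run(["vmplayer", str(LOCAL_VM / "desktop.vmx")], check=True)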

  • by mrcpu ( 132057 ) on Thursday August 17, 2006 @03:20AM (#15925376)
    VMware ACE would probably be a good choice; it allows you to lock down the host hardware, disabling various pieces.

    VMs can run off of network shares if you set things up right. With a fast network, you won't see a problem. I have run VMs off mirrored Ximeta NetDisks over 100Mbit with NTFS as the partition type, and it worked great, although it was only about 4 machines accessing it at one time. For office apps and such, it's a piece of cake.

    I encourage people to use VMware for laptops. Create an encrypted disk with the VMware image that they want to run; then if the laptop gets stolen, someone has to decrypt the disk before they can get to the really good stuff. Backups are easy, and it makes laptop "sharing" something you can do pretty easily if necessary; multiple shifts can share a PC easily as well. It's also easier to fix problems and test updates by just snagging a copy of the image and monkeying with it.

    Citrix and remote desktop have their places as well.
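    A hedged sketch of that encrypted-disk idea on a Linux laptop, using a LUKS container on a loopback file to hold the VM directory; the sizes, device names, and mount points are placeholders, and the whole thing assumes root:

      import subprocess

      CONTAINER = "/home/user/vm-container.img"   # sparse file that will hold the encrypted volume
      LOOP_DEV = "/dev/loop0"
      MAPPED = "vmcrypt"                          # appears as /dev/mapper/vmcrypt once opened

      def run(*cmd):
          subprocess.run(cmd, check=True)

      # One-time setup: ~20GB sparse file, LUKS-format it, then put a filesystem inside.
      run("dd", "if=/dev/zero", f"of={CONTAINER}", "bs=1M", "count=0", "seek=20480")
      run("losetup", LOOP_DEV, CONTAINER)
      run("cryptsetup", "luksFormat", LOOP_DEV)          # prompts for the passphrase
      run("cryptsetup", "luksOpen", LOOP_DEV, MAPPED)
      run("mkfs.ext3", f"/dev/mapper/{MAPPED}")

      # Daily use: open and mount the container, then point VMware at the image inside it.
      run("mount", f"/dev/mapper/{MAPPED}", "/mnt/vm")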
  • Our experiences (Score:3, Informative)

    by plopez ( 54068 ) on Thursday August 17, 2006 @09:46AM (#15926657) Journal
    I am working for an SME, and we are currently going from Remote Desktop to Citrix. Having most production applications hosted on either a web server or a remote server is a *huge* win for us in terms of support costs, especially since we have a number of custom apps to support (we are in a niche market and have yet to find a large vendor who creates useful apps for us). Most of the desktop costs are gone, in that you only have to upgrade the central server or servers, users cannot monkey with the config, everyone ends up using the same versions of the software, we have images of the server loads so if one does fail we can get it back fairly quickly, etc.

    For remote users we use Cisco VPN to the remote desktop.

    Citrix licensing is expensive, but you should first rough out some numbers as to how much it costs to support the desktops individually versus having the same tasks done by one or two techs on one or two servers, plus Citrix costs.

    We are using VMs in our development and test environments on an older AMD 64-bit machine. It still bogs down after 3-4 VMs are running, so my advice is to buy the biggest, fastest, and most reliable box you can. Lots of memory, fast disks, and fast CPUs. Newer 64-bit hardware would be sweet, as you should be able to set up 32-bit OSes on it and support older apps without having to upgrade everything to 64 bits all at once. Make sure it is not 'cutting edge'; rather, if it is for critical apps, make sure it is stable on the hardware side, even if you sacrifice a little speed. Think in terms of how mainframes do things.

    HTH

  • by GWBasic ( 900357 ) <slashdot@andreNE ... au.com minus bsd> on Thursday August 17, 2006 @01:04PM (#15928276) Homepage

    My employer uses ThinkPads with docking stations as standard issue. For those of us who need more power, we just use Terminal Server (or another remote access program for non-Windows computers). We use Connected Backup to back up the laptops on a daily basis over the network.

    While I personally would prefer a more powerful laptop (as I do serious development), I'd rather use a laptop than a generic workstation. I can telecommute with it anywhere in the world, and I can use it in meetings with a projector. This is more difficult with generic workstations.
