Ubuntu Linux

Does Ubuntu Now Require More RAM Than Windows 11? (omgubuntu.co.uk) 116

"Canonical is no longer pretending that 4GB is enough," writes the blog How-To Geek, noting Ubuntu 26.04 LTS "raises the baseline memory to 6GB, alongside a 2GHz dual-core processor, and 25GB of storage..." Ubuntu 14.04 LTS (Trusty Tahr) set the floor at 1GB — a modest ask when it launched more than a decade ago in 2014. Then came Ubuntu 18.04 LTS (Bionic Beaver), which pushed the number to 4GB and survived quite well in an era when 16GB was considered standard for mid-range laptops.... Ubuntu's new minimum requirement lands in an interesting spot when compared against Windows 11. Microsoft's operating system requires just 4GB of RAM, although real-world usage often tells a different story: usually, 8GB is considered the sweet spot to handle modern apps and multitasking.
The blog OMG Ubuntu argues this change comes "not because Ubuntu requires 2GB more memory than it did, but more the way we compute does." It's more of an honesty bump: the components that make up the distro — the GNOME desktop and extensions, modern web browsers (and the sites we load in them), and the kinds of apps we use (and keep running) whilst multitasking — are more demanding... The Resolute Raccoon's memory requirements better reflect real-world multitasking.

Ubuntu 26.04 LTS can still be installed on devices with less than 6GB of RAM (though not with less than 25GB of disk space). The experience may not be as smooth or as responsive as developers intend (so you don't get to complain), but it will work. I installed the Ubuntu 26.04 beta on a laptop with just 2 GB of memory — slow to the point of frustration in use, but otherwise functional.

If you have a device with 4 GB of RAM and you can't upgrade (soldered memory is a thing, and e-waste can be avoided), then alternatives exist. Many Ubuntu flavours, like Lubuntu, have lower system requirements than the main edition. Plus, there's always the manual option: use the Ubuntu netboot installer to install a base system, then build out a more minimal system from there.

This discussion has been archived. No new comments can be posted.


  • Answer: No (Score:4, Informative)

    by Samare ( 2779329 ) on Sunday April 05, 2026 @07:50AM (#66078152)

    "Any headline that ends in a question mark can be answered by the word no."
    https://en.wikipedia.org/wiki/... [wikipedia.org]

    • by quall ( 1441799 )

      Ahh yess. I too had learned this several decades ago.

    • And yet the answer is actually yes. Unless all you do is Linux command-line stuff, or browse static webpages using a browser that was last standards-compliant in the early 2000s, 4GB is no longer a viable minimum for anyone who doesn't also spend their evenings self-flagellating. It's masochistic to use an underperforming computer.

      • by Samare ( 2779329 )

        The question isn't "Does Ubuntu Now Require More than 4GB of RAM?", it's "Does Ubuntu Now Require More RAM Than Windows 11?".

    • Given that Windows now needs 16GB of RAM to work well, I'd say the answer is 'NO'.

  • by simlox ( 6576120 ) on Sunday April 05, 2026 @08:00AM (#66078162)
    Linux desktop with 16 Mb RAM was possible in the 90s. Even 8 Mb.
    • by martin-boundary ( 547041 ) on Sunday April 05, 2026 @08:04AM (#66078166)
      My Emacs only needs 8Mb! (but it's constantly swapping)
    • Linux desktop with 16 Mb RAM was possible in the 90s

      No, 2MB was never enough for a Linux desktop. I had 8MB on my 386 and it was only just sufficient.

      • by quenda ( 644621 )

        Linux desktop with 16 Mb RAM was possible in the 90s

        No, 2MB was never enough for a Linux desktop. I had 8MB on my 386 and it was only just sufficient.

        Yeah, Bytes vs bits. But who measures RAM in bits?
        I remember too 8MB being the minimum, but upgraded to 12MB so it was possible to do something else while the kernel was compiling.
        How did we get to the point where 8000MB is considered a bare minimum?

        • How did we get to the point where 8000MB is considered a bare minimum?

          Love of convenience, I guess. I often find it astonishing myself. The software might do 100 times more but it takes 1000 times as much memory...

        • Agreed! I used to RUN a 4KB BASIC interpreter in my T-16 back home!
          • by quenda ( 644621 )

            Yeah, but your T-16 wasn't running a full multitasking OS, network stack, X-windows, a word processor, and a web browser.
            Even if you got the targeting computer upgrade it had only shitty low-res vector graphics.

            That 8MB machine was comparable in function to modern PCs with 1000X the memory.

          • 4K BASIC was in ROM, not RAM.

            As I recall most Disc-based OSes of the ROM Basic era required 16K of RAM to support the disc operating system...

      • That was enough when I used one of the first Slackware versions in the early 1990s

    • by Alain Williams ( 2972 ) <addw@phcomp.co.uk> on Sunday April 05, 2026 @11:17AM (#66078356) Homepage

      The first Unix machine that I used had 1MB of RAM and it supported about 10 simultaneous users. OK: this was the 1980s and it was a PDP-11 (16 bit machine), but how times have changed!

      BTW: Mb means megabit, MB means megabyte.

      • by vbdasc ( 146051 )

        I learned Unix and C on a V7 Unix virtual machine with 256Kb memory under VM/SP on an IBM 4361.

    • by vbdasc ( 146051 )

      I installed my first Linux (Debian 2.1 slink) on a PC with 4Mb RAM and it ran well, although it had no web browsers to speak of.

    • A desktop with only 8 or 16 MB of RAM was also possible with Windows in the 90s. That was a common amount of RAM for a Win3.x PC and usually a baseline for a Win95 PC.
  • by Artem S. Tashkinov ( 764309 ) on Sunday April 05, 2026 @08:02AM (#66078164) Homepage

    The truth is that these requirements should have been updated years ago, as 4 GB has been insufficient for at least a decade unless you never use web browsers or modern applications built on CEF (Chromium Embedded Framework).

    The fact that Windows 11 still "requires" 4 GB of RAM is ridiculous. I recently installed Windows 11 from scratch with no OEM junk, and only installed the Intel GPU driver. On boot, the system RAM usage was around 5.9 GB with no applications running except obviously Windows Task Manager and Windows Explorer. This is all thanks to the PhiSilica Windows AI components that are now pre-installed automatically, as well as the WorkloadsSessionHost.exe application that runs at all times.

    Took me quite a while to delete all this junk and reduce memory usage to just below 4 GB, which still sounds crazy. 6 gigs of RAM wasted just to show your desktop (as most Windows users will never get to the bottom of it): that's what we are dealing with in 2026.

    • by pz ( 113803 ) on Sunday April 05, 2026 @08:29AM (#66078190) Journal

      Web browsers are absolute hogs, and, in part, that's because web sites are absolute hogs. Web sites are now full-blown applications that were written without regard to memory footprint or efficiency. I blame the developers who write their code on lovely, large, powerful machines (because devs should get good tools, I get that), but then don't suffer the pain of running them on perfectly good 8 GB laptops that *were* top-of-the line 10 years ago, but are now on eBay for $100. MS Teams is a perfect example of this. What a steaming pile of crap. My favored laptop is said machine, favored because of the combination of ultra-light weight and eminently portable size, and zoom works just fine on it, but teams is unusable. Slack is OK, if that's nearly the only web site you're visiting. Eight frelling GB to run a glorified chat room.

      The thing that gets my goat, however, is that the laptop I used in the late 1990s was about the same form factor as this one, had 64 MB (yes, MB) of main memory, and booted up Linux back then just about as fast. If memory serves, the system took about 2 MB, once up. The CPU clock on that machine was in the 100 MHz range. Even without accounting for the massive architectural improvements, my 2010s-era laptop should boot an order of magnitude faster. It does not.

      Why? Because a long time ago, it became OK to include vast numbers of libraries because programmers were too lazy to implement something on their own, so you got 4, 5, 6 or more layers of abstraction, as each library recursively calls packages only slightly lower-level to achieve its goals. I fear that with AI coding, it will only get worse.

      And don't get me started on the massive performance regression that so-called modern languages represent, even when compiled. Hell in a handbasket? Yes. Because CPU cycles are stupidly cheap now, and we don't have to work hard to eke out every bit of performance, so we don't bother.

      • by scrib ( 1277042 ) on Sunday April 05, 2026 @08:51AM (#66078202)

        What if AI coding goes the other way?

        AI is good at writing tons of code. We might actually move away from layers of libraries if AI directly includes all the support functions we've been too lazy to rewrite.
        AI, using its training on all those libraries, might end up in-lining only the parts of the libraries that are needed.

        • What if? In my experience AI-generated code is primarily sloppy, lengthy Python that includes so many libraries no single person could know all of them well enough to properly review the code.
          • What's more, you really have to know what you're doing to coax it into re-using code, rather than rewriting the same functionality with each prompt.

            • by scrib ( 1277042 )

              Why do you think code reuse is good?
              Maintainability? Testability? Comprehensibility?

              AI has the potential to be a paradigm shift in _avoiding_ code reuse. How many times have you come across a function with collection of parameter flags that make the function behave differently? "This is almost what I want, let me tweak it." Those are such a nightmare to maintain or unwind, especially if there are a lot of calls to them. You try to fix one bug and regress another one. You end up adding a new flag and kicking
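To make the pattern concrete, here is a hypothetical sketch (names and behavior invented, not from any real codebase) of a flag-accumulating function next to the split-apart alternative:

```typescript
// Hypothetical flag-accumulating function: each new caller adds a switch,
// and every fix risks regressing some other combination of flags.
function formatName(first: string, last: string, lastFirst = false, upper = false): string {
  const s = lastFirst ? `${last}, ${first}` : `${first} ${last}`;
  return upper ? s.toUpperCase() : s;
}

// The "avoid reuse" alternative: one small function per caller,
// so changing one cannot break the others.
const badgeName = (first: string, last: string) => `${first} ${last}`.toUpperCase();
const indexName = (first: string, last: string) => `${last}, ${first}`;
```

Both shapes compute the same strings; the difference is only in how many callers each change can break.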

          • by scrib ( 1277042 )

            I work on the typescript/npm side of the world and what I've seen recently (the latest updates made a big difference) is not the inclusion of a lot of libraries, but a rewriting of boilerplate code that was probably cribbed from all of open source libraries the model used to train.

            I agree with you that it makes code reviews difficult, but I think we'll be relying on tools for that soon enough, too.

            Every time there has been a major shift in tooling, there has been the "old guard" concerned about the loss of

            • I'm not concerned about any of that kind of thing: writing code commercially is not art. What I am concerned with is the amount of unmaintainable code being generated by AI and people handwaving this away with future-looking statements like "real soon now it will be able to ".
        • by ArchieBunker ( 132337 ) on Sunday April 05, 2026 @09:57AM (#66078256)

          Interesting proposition. Have it write a browser in assembly.

        • by pz ( 113803 ) on Sunday April 05, 2026 @11:18AM (#66078358) Journal

          I have not seen AI code that is *more* efficient than human code, yet. I have seen AI write efficient, compact code when pressed, very, very hard to do so, but only then. Otherwise, in my hands, and those of my developer colleagues, AI produces mostly correct, but inefficient, verbose code.

          Could that change? Sure, I suppose. But right now it is not the case, and the value system that is driving auto-generated code (i.e., the training set of extant code), does not put a premium on efficiency.

          • by scrib ( 1277042 )

            What's wrong with "inefficient, verbose code?"
            First, today's AI is the worst we're ever going to have. It'll just get better, or, at least, you'll always have today's so it won't get worse.
            Second, compact, efficient code is often hard to read. Not always, but a good .map().filter().reduce() can blow your cognitive load in a hurry.
            Thirdly, I'm working in TypeScript, so "efficiency" is hardly the name of the game. We have a lot of business rules to parse and a little bit of cryptography, but no real heavy lif
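A small invented example of the kind of chain being described: summing the squares of the even numbers, once as a .filter().map().reduce() chain and once as a plain loop.

```typescript
const nums = [1, 2, 3, 4, 5, 6];

// Chained version: compact, but three callbacks and two intermediate arrays.
const chained = nums
  .filter(n => n % 2 === 0)
  .map(n => n * n)
  .reduce((acc, n) => acc + n, 0);

// Loop version: more verbose, one pass, each step visible.
let looped = 0;
for (const n of nums) {
  if (n % 2 === 0) looped += n * n;
}
```

Both compute 56 for this input; which one reads better is exactly the cognitive-load trade-off in question.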

        • I'm already experiencing this. I asked a code assistant to eliminate lodash. Lodash had a CVE and there was no fixed version available. I looked at the change set and it looked good. It turned out lodash wasn't doing much. I intend to use code assistants to eliminate a lot of libraries in the months to come.
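As an illustration of the idea (not the parent's actual change set), a lodash helper such as `_.get` often reduces to a few lines when only simple dot-paths are used:

```typescript
// Hand-rolled stand-in for a simplified lodash _.get:
// walk a dot-separated path, returning a fallback if any step is missing.
function get(obj: unknown, path: string, fallback?: unknown): unknown {
  let cur: any = obj;
  for (const key of path.split(".")) {
    if (cur == null) return fallback;
    cur = cur[key];
  }
  return cur === undefined ? fallback : cur;
}
```

This covers only dot-paths; the real `_.get` also handles bracket syntax and array indices, which is worth checking before removing the dependency.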

          • by scrib ( 1277042 )

            This. So much this. Thank you for posting!

            There are pros and cons to not using libraries.
            Pros: not going to be subject to as many supply chain attacks.
            Cons: probably some new vulnerabilities unique to your code.
            Mitigation: who's going to know about those vulnerabilities? Run the "find vulnerabilities" AI and have it harden itself up.

            Will it be perfect? No, of course not.
            Is that library you were using perfect? No, of course not.

      • Web browsers are absolute hogs, and, in part, that's because web sites are absolute hogs.

        Yeah, I was gonna say... it's probably not Gnome itself that's the memory hog, it's almost certainly the demands from the web browser and / or email client. *

        We have a computing lab which runs Linux + Gnome. Students are in the GUI almost all the time, but they're mostly running various engineering applications - they're not checking their personal email, and typically they're not randomly browsing the web. If there's only one or two students on there (remote access does get used a lot), htop typically show

      • by kbahey ( 102895 )

        Totally agree with you ...

        But it is not only libraries, there are other factors at play, me thinks ...

        Developers don't have a culture of being economical with resources.
        It is not taught, nor do first jobs they get care about those aspects.
        For example, don't get me started on the infinity scroll which eats up RAM like crazy, rather than a pager of Next/Previous page.

        There are also the layers involved, especially with web development.
        It used to be HTML only, then CSS was added, then Javascript was added for ce

        • Developers don't have a culture of being economical with resources.

          That's because in say, the 60s and 70s, computer time was expensive. It behooved you to make your code as efficient as possible - like today's cloud services, they often billed by the CPU cycle. And the run-debug cycle was on the order of a day, so you didn't want to make a stupid error because it meant your job got delayed a day at the least.

          Sometime around the 80s and 90s, this flipped - human time was expensive. Computers were cheap and getting cheaper, RAM was plummeting as was hard drive space. The math started to work the other way - you don't want developers wasting time debugging code so libraries were popular - because it was more efficient (cheaper) to utilize the fact one person presents a well-debugged library that other developers could use and that means developers don't have to write that code, and they don't have to debug that code either.

          That's why we have bloat: because it's cheaper that way. You could have a developer write nice and tight code, but how much are you willing to pay for it? If it takes them an extra week to make their library run 10% faster, was it worth, say, the $5,000-10,000 it cost? ($5,000 a week is around $250K/year including benefits, or around $150K take-home pay plus benefits, while $10,000 a week is around $500K/year including benefits, or around $250,000-300,000 without benefits.) Will that improvement let the company make back that investment?

          You have to realize that if you want to charge $150K/year salary, spending a week optimizing costs the company $5000, so unless they can save that $5000 elsewhere (e.g., in reduced cloud compute fees, or customers will pay extra), there is no incentive to do it.

          And that's really a valuable consideration. Also, compilers are really good these days. Like, really good. They will often do very strange things to save a few cycles. Some, like Clang, can be "too smart" and apply closed-form mathematical transforms to your computation (e.g., if you attempt to sum the integers from 1 to n, and you do it the "stupid way" with a loop, Clang will recognize it and actually generate code that calculates n(n+1)/2, eliminating the loop).
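The transform described is easy to check by hand; the loop and the closed form the compiler substitutes agree for every n:

```typescript
// The "stupid way": sum the integers 1..n with a loop.
function sumLoop(n: number): number {
  let total = 0;
  for (let i = 1; i <= n; i++) total += i;
  return total;
}

// The closed form Clang is described as generating instead: n(n+1)/2.
const sumClosed = (n: number): number => (n * (n + 1)) / 2;
```

For n = 100 both yield 5050; the compiled code just gets there without iterating.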

          So it's a mix between the cost of a developer to optimize their code, the increasing intelligence of compilers to optimize code, and other things.

          If you want to learn more about how compilers generate code, including how they can add in 0 cycles (hint: the compiler uses the CPU's address-calculation unit instead of the ALU to do simple addition, subtraction, and even multiplication when it can, so the actual execution time is zero since the computation was done as part of operand calculation), Matt Godbolt of Compiler Explorer fame runs through a whole Advent of Compiler Optimization [xania.org] series. (Youtube: https://www.youtube.com/playli... [youtube.com] ). Trust me, it doesn't pay to outsmart the compiler.

    • by dvice ( 6309704 )

      I'm on Ubuntu 26.04, writing this using Firefox, and I am currently using a little under 2 GB of my RAM (16 GB in total). I'm not sure how much is used during boot.

    • Look on the shitty side.

      Microsoft in response could fire up an American memory manufacturing plant.

      That does nothing but soldered-on Microsoft-licensed WinMemory for exclusive use on Microsoft systems.

      Don't worry. The Microsoft desktop tax in your corporate paycheck will be small at first.

    • This seems out of touch. When I'm running a desktop and all my applications, including a web browser, it tops out at about 3GB of RAM. That includes media player, e-mail client, web browser, password manager, terminal, and LibreOffice. 4GB of RAM is usually more than enough for most basic usage on Linux.
  • by Rosco P. Coltrane ( 209368 ) on Sunday April 05, 2026 @08:05AM (#66078168)

    on all my machines. Once you get past the tiled window manager paradigm - if you've never used one before - you realize how fast and seamless it is, and it truly is the least common denominator in terms of memory usage.

    I left Mint (which is really an Ubuntu derivative) years ago, and now i3 / Sway let me have the same unified desktop on all my machines, fast or slow, new or old, and they all feel perfectly usable.

    I highly recommend spending the time to create an i3 or Sway config file. It's well worth the effort and it's a one-off.

    And if you just want to try i3 or Sway on your existing distro, install it and simply change the window manager for your user in the display manager: it lives totally independently of whatever you currently use, so it's risk-free.
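A minimal starting point for such a config could look like the sketch below (illustrative only; the bindings are a matter of taste, and i3's own generated config is the canonical reference):

```
# ~/.config/i3/config - minimal illustrative sketch
set $mod Mod4

# launch a terminal and an application launcher
bindsym $mod+Return exec i3-sensible-terminal
bindsym $mod+d exec dmenu_run

# move focus between tiles (vi-style; i3's default uses j/k/l/;)
bindsym $mod+h focus left
bindsym $mod+j focus down
bindsym $mod+k focus up
bindsym $mod+l focus right

# close the focused window
bindsym $mod+Shift+q kill
```

Sway reads an almost identical config, which is what makes the one-off effort portable between X11 and Wayland.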

    • The Macintosh could handle overlapping windows in 1984. Why would I EVER want tiled windows, or think that low RAM is an excuse for not having them?

      • Re: (Score:2, Flamebait)

        by bjoast ( 1310293 )
        Overlapping windows was a stupid idea in the 1980s and is still just as stupid. Not that this has anything to do with memory footprint, of course.
          • Overlapping windows are a solution to limited screen real estate that lets you keep context in each view, at window sizes that conflict with side-by-side viewing. That said, without focus-follows-mouse, KWin-style move / raise / lower / resize without needing the window edge (typically using modifier keys), the active window not being forced to the top, and similar features, its usefulness quickly diminishes.
          • I mostly run application fullscreen and switch between them. The only exception is when I'm comparing the content of two windows (in which case I tile horizontally or vertically) and file selection (floating).

            When an application uses the entire screen without the window decorations needed in a regular window manager, a screen's limited real estate is in fact better used in a tiled window manager.

            • The full screen without decorations actually sounds interesting and efficient, and in some far away past I have dabbled with a tiling window manager (ratpoison I think). Due to being stuck at work on Windows 11 it would be a hassle to switch to something too different. I tried but found it annoying. Now I use KDE mover sizer on Windows which makes it somewhat bearable.
          • Practically the first thing I do when I setup a new system is to switch to focus follows mouse. I don't know how the default went to click to focus. So painful with multiple windows. Excruciating.
            • I find most people have no idea there's a way to change defaults. My preferred settings are possibly unusable for most people. I have seen programmers use Apple devices. They couldn't configure them how I said I'd prefer to have a window manager behave. Supposedly I want something I shouldn't want... To each his own. I'm just happy there are options that can support my wishes.
              • I don't think Apple has focus follows mouse, for some reason. I started using window systems on suntools, where I think mouse focus was the default, and mouse focus was the default for most X11 window managers when I started using X11, around '89. I don't remember any *NIX systems that defaulted to click-to-focus, and I used several. Somewhere in the late 2000s I started to see click-to-focus get hard to change.

                It is exasperating to lose the visibility of a window I want to see because I had to

                • I hear you. I used HP-UX and Linux at university, then some Solaris, and various others, coming back to HP-UX professionally then Linux. Having a terminal partially below other windows but active and pasting and typing in them is hard to go without once it's in your workflow.

                  Nowadays, kids don't even learn moving windows around, getting spoonfed on tablets half the time. And crippled windows and Mac the other half.

                  BTW Windows doesn't even have edge resistance on move. It will be a cold day in hell befor

            • I frequently have a reference document in the background that I scroll through using the mouse's scrollwheel while typing in another window. Focus-follows-mouse would prevent me from doing that. As such I never turn that feature on. Though if I had a mouse button that could toggle it on/off I'd probably give it a try.

              • Never said I wanted to prevent others from using click focus. To me I like focus follows mouse, but if others prefer other models, great, just expose it so it can be easily changed. The older systems did make it prominent, but I find I have to search deep to find it in more modern distributions. And I think it should be possible to switch it back and forth via some mouse or key action. Not sure how though, I just never find myself wanting to use click focus.

                Another very frustrating issue I've had with click

              • I absolutely hate focus follows mouse. I remember encountering it back in the '80s and the first thing I did was figure out how to turn it off.

                I can't imagine a use case for it that would ever work for me.

          • Limited screen real estate is never not going to be an issue. I mean, I suppose if I wanted to swivel my head around all the time on a giant multi-monitor setup it might be possible to have enough screen real estate most of the time without overlapping windows, but meh, most of the time I'm on a laptop anyway. And focus follows mouse is just infuriating.

      • Tiled windows don't solve a problem. They're just a different workflow. I've used both for decades and neither is inherently faster or better. It's just what you prefer.

        At any rate, don't knock it till you try it.

  • by Baron_Yam ( 643147 ) on Sunday April 05, 2026 @08:15AM (#66078180)

    The OS is bloated with things you will likely never use, and the apps are ever-more frequently bloated themselves, running in inefficient Edge Webview processes.

    If you want to have more than a couple of things running in Windows 11 and want to be sure it'll run smoothly, you're wise to target 32 GB now with a 512 GB SSD. If you know what you're doing and are willing to spend a lot of time ripping out the unnecessary parts you can get it to run with 4 GB of RAM, but even at today's elevated memory pricing it's not worth the effort.

  • I used to like Ubuntu, but then the unity desktop happened. It was choppy even on a gaming laptop and I had never seen such a waste of desktop space. I left and never went back.

    • There are so many other Linux distros, including derivatives of the main ones - RedHat, Debian, Arch, Slackware - and then, beyond them, several DE options aside from GNOME. There's LXQt, Razor-qt, XFCE, Window Maker, Hyprland, and one can even go with minimalist windowing DEs

      Also, if one has 4GB of RAM or less, it's probably a good idea to stick to a 32-bit version of the OS, whichever it is

      • My MX Linux 26.1 64bit on a 2-core Celeron uses 1MB memory, 1% CPU on power-up. Just going to /. home page on fresh Firefox (no add-ons), the memory jumps to 49% and CPU 47%

        • 49% of how much memory?
          I am running FreeBSD with MATE. I am using firefox with two tabs open. I am also running jellyfin, and some other background stuff. And I am also running a terminal, and looking at "top" process.
          My total active memory being used is 675MB. I have a total of 12GB Ram, and I have about 5200MB free.
          Firefox does seem to be quite a hog.

  • by hcs_$reboot ( 1536101 ) on Sunday April 05, 2026 @08:54AM (#66078208)
    First of all, Ubuntu (Linux) reserves buffers “just in case” (for streams, files, etc.). This unused memory seems taken but it can actually be reclaimed at any time if needed. Was that taken into account?
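This is visible in /proc/meminfo, where MemAvailable includes reclaimable buffers and cache that MemFree does not. A sketch against sample data (the numbers are invented for illustration):

```typescript
// Lines in the format of /proc/meminfo (values invented for illustration).
const meminfo = `MemTotal:       16000000 kB
MemFree:         1200000 kB
MemAvailable:    9800000 kB
Buffers:          400000 kB
Cached:          8600000 kB`;

// Parse "Name:   <value> kB" lines into a map of kB values.
function parseMeminfo(text: string): Map<string, number> {
  const fields = new Map<string, number>();
  for (const line of text.split("\n")) {
    const m = line.match(/^(\w+):\s+(\d+)\s+kB$/);
    if (m) fields.set(m[1], Number(m[2]));
  }
  return fields;
}

const mem = parseMeminfo(meminfo);
// The gap between the two is memory the kernel can hand back on demand,
// which is why "free" can look scarce even when the system is fine.
const reclaimable = mem.get("MemAvailable")! - mem.get("MemFree")!;
```

Here MemFree suggests the machine is nearly out of memory while MemAvailable shows most of it can be reclaimed; a fair Ubuntu-vs-Windows comparison has to use the latter.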

    Then, it seems Windows is built by stacking new features on top of old ones. For example, if you look at how updates work: to go from, say, version 15 to 20, it asks you to update to 16, then 17; it can’t jump directly from 15 to 20, and often a reboot is required between two updates. It’s almost as if no one at Microsoft wants to maintain the older parts of the system anymore. It’s very likely that a good number of memory allocations would no longer be necessary if the older layers were removed or reworked. I’d be really surprised if, when comparing RAM usage between a freshly booted Ubuntu and Windows system (with no applications running), Ubuntu ended up using more.
    • I haven't touched a Windows system in years, so are you saying it would be easier to just install new, bypassing the intervening versions, instead of upgrading from version to version?
  • It's why I daily drive Alpine Linux, and use dwm as the window manager just so I can leave the memory to my applications.
    • by Anonymous Coward
      Russians burning down their own shopping centers for the insurance money again.
    • Bump.
      Just discovered Alpine for use as a virtual machine at my proxmox datacenter... got jabber, iptables, wireguard, bash, installed in no time. Total install, os + application + DNS utils < 300 MB. Wickedly fast ... and I love how slim she looks .... no bloat, all the familiar tools ... I love it

      Looks like it will run xfce too for desktop use.. could be the new devuan.
  • by allo ( 1728082 )

    Try KDE Neon instead. It's Ubuntu LTS with newest KDE.

    • by fr ( 185735 )

      KDE Neon was never really intended for general users. If you, for whatever reason, want an Ubuntu-based KDE desktop, Kubuntu would be a better choice.

      • by allo ( 1728082 )

        Kubuntu is a nice effort, but it never was a good KDE distribution. And I trust the KDE devs that their own Ubuntu-Fork (or more Ubuntu + their own repo) works as they intend KDE to work.

  • I have lots of older machines with 4GB of RAM running the latest Linux Mint, and they perform just fine with Cinnamon + Firefox + LibreOffice for casual use and browsing (as long as there's an SSD). The majority of RAM is eaten by whatever web browser you are using, and by how heavily you use it. That is what will usually dictate your RAM requirements under Linux, far more than the OS (unless you are gaming or doing something major like video editing).

    4GB is a bit lean, and has been, so I do agree 6GB is more realistic. But run

    • In either case, if one only has 4GB of RAM, just stick w/ the 32-bit version of the OS
      • Gonna be a problem if you want to run a modern web browser.

        Which is frustrating because there's no reason a script-enabled rich text viewer with networking should require this much f---ing memory.

  • If you give Windows 11 less than 16 GB you might as well not bother booting it. It is painfully slow, and that's why you're not seeing 8 GB Windows laptops even though Apple is out there selling an 8 GB laptop that's threatening the entire industry.

    And honestly you really, really want 32 GB for Windows 11, and 64 GB wouldn't hurt. It's one of the reasons the damned RAM crisis is so bad. Microsoft guzzles RAM for their slop generators and monitoring software. As a consumer if you're stuck running Windows yo
    • by dwywit ( 1109409 )

      My daily driver is an 8th-gen i7 with 12GB and it's fine.

      I'm not compiling linux kernels or doing CAD, but if I were I'd probably need more.

      And the machine I take to site visits is an 8th-gen i5 with 8GB. Again, it's mainly for browsing, RDP, word processing, that sort of thing but it's not hindering my productivity.

  • by dskoll ( 99328 ) on Sunday April 05, 2026 @11:11AM (#66078342) Homepage

    If you run a light desktop environment like XFCE or LXDE, 4GB is probably fine.

    However, the instant you spin up a modern browser or office suite, you're cooked. It's the massive applications that are the problem on Linux, not the OS or desktop environment (if you pick a lightweight DE.)

    • by vbdasc ( 146051 )

      My experience is that the bloated popular websites are the number 1 memory hogs. And certain "modern-looking" apps are the number 2 hogs. Libreoffice in my system behaves. Web browsers, while browsing "classical" websites, behave too.

  • by UnknowingFool ( 672806 ) on Sunday April 05, 2026 @11:32AM (#66078382)
    Windows 11 requires 4GB to "run". In my experience, MS defines "run" as "very low values of 'run'"
  • Let's see... Microware OS-9 did real-time multitasking in 64K, and RT68 did it with a few thousand bytes on the MC6800. Ref: https://github.com/linuxha/RT6... [github.com] and https://github.com/nitros9proj... [github.com]
  • Windows and Linux have long taken different approaches to memory and the majority of the linux crowd has never understood it.

    In linux, you've only had to deal with the memory actually requested by and allocated to applications. It's always been focused on servers and not the desktop. Windows has taken the view that unused memory is wasted memory. So, it's pre-fetched a lot of things into memory to keep the desktop experience smooth. These differences have made straight comparisons on memory usage between the OSs a futile effort for a very long time.

    Yes, windows does have a baseline of what it considers a modern desktop experience that linux doesn't. So, you can get by with less on linux than you can on windows. You will also have faster boot times because you're loading less, have less polish, etc. But, at the end of the day, the overall experience is what matters to most people....which is why linux has never been a slam dunk for the majority of people. It's not just how fast the system boots, but the experience of using it. The PCs on the shelves take the current baseline requirements for Windows into account. So, it's usually years after buying the PC that linux becomes a reasonable proposition to some and the rest upgrade to a new PC.

    • by CAIMLAS ( 41445 )

      You can experience over 40 years of UI design differences in Windows still, today: UI dialog panels from 3.1 days still exist in the latest Windows builds, and everything in between.

      I don't think you can honestly say Windows has more polish. It has more bloat - yes. But that's not the same thing.

      Meanwhile, Windows games (newer titles!) run better on Linux and Mac, emulated and passed through additional translation libraries, than on Windows.

      You also grossly misunderstand how prefetch/caching works, both on

      • You can experience over 40 years of UI design differences in Windows still, today: UI dialog panels from 3.1 days still exist in the latest Windows builds, and everything in between.

        Not doubting this, but are there really 3.1 dialogs? I can think of multiple control panels and other screens that haven't seen much change since NT4, but my memories of 3.1 dialogs are getting hazy at this point!

        "Overall experience" is also nonsense - most people don't have the capability or wherewithal to switch. They use what is given to them, and have only mild preference in that they want it to work for what they're doing. Nowadays, that means "a web browser" for well over 50% of all users being the primary requirement, if not the exclusive one.

        Yes! I get the feeling there are more than a few people on Slashdot (and elsewhere) that just don't get this. Over the last few years I've had multiple Gen Z coworkers for whom installing the desktop version of MS Office is _literally_ the first software they have installed outside of a mobile app

        • lol yes! There's at least an Open File dialog from Windows 3.1: https://old.reddit.com/r/Windo... [reddit.com]

          The ODBC Driver interface for configuration is tied to the old dialog.

          The interface for the drivers was designed around GetOpenFileName() as it was at the time.

          One of the features of GetOpenFileName/GetSaveFileName is that the structure passed in can include two special options- a function pointer to a hook routine, as well as a custom dialog template which windows will insert.

          The functions were improved in Windows 95 with the "Explorer style". Even old programs get this style at the very least, because Windows implies the flag, unless a template or hook routine is specified. If a hook routine or template is specified and the OFN_EXPLORER flag is not, Windows assumes the hook or template was designed for the old-style dialog, and falls back to the old-style dialog in that case so the program still runs and doesn't crash.

          The ODBC Driver configuration uses a dialog template to add the "Read Only" and "Exclusive" checkboxes. That is why it shows the old-style dialog.
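          The fallback rule described above can be modeled in a few lines. The OFN_* flag values are the real ones from commdlg.h, but the function itself is just an illustration of the documented behavior, not actual Win32 code:

```python
# Model of how Windows picks a file-dialog style for
# GetOpenFileName/GetSaveFileName. Flag values are the real
# Win32 constants; dialog_style() is an illustrative sketch.

OFN_ENABLEHOOK = 0x00000020      # caller supplied a hook routine
OFN_ENABLETEMPLATE = 0x00000040  # caller supplied a dialog template
OFN_EXPLORER = 0x00080000        # caller opted into Explorer style

def dialog_style(flags):
    """Return which dialog style Windows would show for these flags."""
    customized = flags & (OFN_ENABLEHOOK | OFN_ENABLETEMPLATE)
    if customized and not (flags & OFN_EXPLORER):
        # Hook/template written for the 3.1-era dialog: fall back
        # so the old program keeps working instead of crashing.
        return "old-style"
    # Otherwise Windows implies Explorer style even for old callers.
    return "explorer-style"
```

          Under this model, the ODBC configuration dialog (template, no OFN_EXPLORER) gets "old-style", while a plain call with no customization gets "explorer-style".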

  • All of the idiotic "packaging formats" basically bundle their own complete OS (or at minimum a complete standalone runtime).

  • I have some incredible moneymaking opportunities for you!

  • Of course it requires a lot of RAM and CPU power and GPU power and...
    Luckily, there are a shitload of options out there that use a lot less resources.

  • by joeblog ( 2655375 ) on Sunday April 05, 2026 @04:31PM (#66078736) Homepage
    Last time I used Ubuntu, gigabytes of krapware but not even a terminal or text editor installed by default.
    • A minimal Arch install still takes up an outrageous 3 GB of disk space. Even Fedora with GNOME doesn't take that much space.
  • This is likely a paid advertisement, brought to you by the same people who are trying to avoid the continued fracturing and disillusionment of the remaining non-professional Windows users who aren't hardcore gamers. It's right in line with the "make Windows better again" agenda (I'd argue, propaganda campaign - there's zero chance of it happening) out of Redmond.

    Windows hasn't been usable on less than 16GB of RAM since the tail end of the hard drive era (around Windows 7 SP2/3). Windows Vista was never usable

  • Linux does typically require more RAM than Windows. I mean Linux is a full operating system including network stack and multitasking, while Windows essentially is a shell for DOS... with some added benefits like having a GUI-Toolkit that supports multiple applications running at once due to cooperative multitasking.

    So while your typical Windows system was sold with 4 Megs of RAM, a Linux system typically isn't usable with less than 8 Megs of RAM.

    Of course this is now largely irrelevant, as RAM is fairly cheap

  • Check it for yourself. Have 64GB of memory? Windows still uses virtual memory, and you pay for it in SSD wear, all to make it seem less bloated.
  • I'm running the daily release on a 2 core/4 thread Intel NUC with 8GB of RAM. When absolutely idle with just me logged in to the desktop gnome interface, it's using a little over 2GB of RAM. I'm assuming it's still a bit porky due to all the debugging turned on. If I run firefox and load a piggy cnbc.com main page, it goes to about 3.8GB of RAM utilized.

    This is 10+ year old hardware and it's still quite usable.

    I don't see anything to be concerned about.

    Best,

  • My Debian system runs blazingly fast with 4 GiB RAM while running a 1 GiB VM.

    GNOME is rapidly approaching OSX levels of bloat.

  • I tried running Win 11 on 4GB. It did not go well.

  • Calculator: 1GB of RAM, 4 CPU cores at full usage.
    Text editor: 4GB of RAM, 8 CPU cores, because your machine doesn't have more cores and you notice it while typing.
    Web browser: 64GB of RAM, to load all your porn tabs.
    Compiler... don't get me started on this one.

  • I have a PC with the latest Mint Cinnamon and it can have a few browser tabs open and uses about 2GB of RAM. Meanwhile, Windows 11 is a dog at 8GB of RAM because of all of the Microsoft and Dell bloatware.
