Mozilla Enables WebRender By Default On Firefox Nightly
RoccamOccam writes: WebRender, an experimental GPU-based renderer for web content, written in Rust, is now enabled by default for Firefox Nightly users on desktop Windows 10 with Nvidia GPUs. The announcement was made on the mailing list.
Lin Clark provides an excellent overview of WebRender and, states, "with WebRender, we want apps to run at a silky smooth 60 frames per second (FPS) or better no matter how big the display is or how much of the page is changing from frame to frame. And it works. Pages that chug along at 15 FPS in Chrome or today's Firefox run at 60 FPS with WebRender."
In describing the WebRender approach, Clark asks, "what if we removed this boundary between painting and compositing and just went back to painting every pixel on every frame? This may sound like a ridiculous idea, but it actually has some precedent. Modern day video games repaint every pixel, and they maintain 60 frames per second more reliably than browsers do. And they do it in an unexpected way: instead of creating these invalidation rectangles and layers to minimize what they need to paint, they just repaint the whole screen."
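The contrast Clark describes, reduced to a minimal Rust sketch (the types and functions here are invented for illustration and are not WebRender's actual code):

    // Two repaint strategies: browser-style dirty rectangles vs.
    // game-style "repaint the whole screen every frame".

    #[derive(Clone, Copy)]
    struct Rect { x: u32, y: u32, w: u32, h: u32 }

    struct Frame { pixels: Vec<u32>, width: u32, height: u32 }

    impl Frame {
        fn paint_rect(&mut self, r: Rect, color: u32) {
            for y in r.y..r.y + r.h {
                for x in r.x..r.x + r.w {
                    self.pixels[(y * self.width + x) as usize] = color;
                }
            }
        }
    }

    // Traditional browser: track what changed and repaint only those rects.
    fn repaint_invalidated(frame: &mut Frame, dirty: &[Rect]) {
        for &r in dirty {
            frame.paint_rect(r, 0xFFFF_FFFF);
        }
    }

    // Game style: skip the invalidation bookkeeping and redraw everything.
    fn repaint_everything(frame: &mut Frame) {
        let whole = Rect { x: 0, y: 0, w: frame.width, h: frame.height };
        frame.paint_rect(whole, 0xFFFF_FFFF);
    }

    fn main() {
        let mut frame = Frame { pixels: vec![0; 640 * 480], width: 640, height: 480 };
        repaint_invalidated(&mut frame, &[Rect { x: 10, y: 10, w: 16, h: 16 }]);
        repaint_everything(&mut frame);
    }

Games accept the full-screen cost because it makes every frame's cost predictable; that predictability is what WebRender is after.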
Re:but... (Score:5, Informative)
Quite the opposite. If there are no invalidated states on the screen, no painting occurs. It only paints when it needs to. It actually consumes less power overall, because the code required to handle invalidating only parts of the screen is massively slow. This is why the browser renders so much faster: instead of 200ms per paint of a small section of the screen, it renders the entire screen in 15ms. The rest of the time, your CPU/GPU sits idle. Also, WebRender does all rendering on the GPU instead of the CPU, so it is better optimized for painting the scene (CPUs suck at this entirely).
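In other words, roughly this (a minimal sketch of the claim, with a hypothetical event loop rather than Firefox's real one):

    // When nothing is invalidated, nothing is drawn and the hardware idles.
    use std::time::Duration;

    fn scene_changed() -> bool { false } // stand-in for "a new display list arrived"

    fn render_whole_frame() { /* hand the full scene to the GPU, ~15ms */ }

    fn main() {
        for _frame in 0..3 { // a real loop runs until shutdown
            if scene_changed() {
                render_whole_frame(); // repaint everything, once
            } else {
                // No invalidated state: CPU/GPU sit idle until the next vsync.
                std::thread::sleep(Duration::from_millis(16));
            }
        }
    }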
Re: (Score:2)
I hope this is just an example rather than a measured result, because that would be drawing massive power.
Why would it be drawing massive power? This is leveraging the GPU to do things it does well instead of putting them on the CPU that does them poorly.
Re: (Score:2)
Continuously animate one single pixel on the screen and your assumptions break: the GPU will never be idle. Don't get me wrong, I like this a lot, but power efficiency is definitely not a reason. But it isn't a complete power hog either. Compared to 3D games, the shaders will be trivial: no matrix multiplies, for example, and no perspective divides. The fan on the GPU should stay off, and even the lamest integrated GPU should be able to handle it easily.
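The "trivial shaders" point, as a sketch (a hypothetical helper, not WebRender's actual shader code):

    // 2D web content needs only scale-and-translate to go from CSS pixels
    // to normalized device coordinates. A 3D game would compute
    // clip = P * V * M * v, then divide by w.
    fn css_px_to_ndc(x: f32, y: f32, viewport_w: f32, viewport_h: f32) -> (f32, f32) {
        (x / viewport_w * 2.0 - 1.0,  // no matrix multiply
         1.0 - y / viewport_h * 2.0)  // no perspective divide
    }

    fn main() {
        // Top-left of a 1920x1080 viewport maps to (-1, 1) in NDC.
        assert_eq!(css_px_to_ndc(0.0, 0.0, 1920.0, 1080.0), (-1.0, 1.0));
    }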
Re: (Score:2)
Theory and practice are not the same though. Check the actual docs and videos on webrender. The DOM + Compositing process currently used to re-render a single pixel is more expensive than cutting out that entire code path and rendering the entire scene. If you're only thinking of the final piece of pushing the actual pixel to the screen, that's the quickest part of the entire process. Figuring out what value that pixel should have in the first place is where all the CPU time is currently being consumed. That's the whole reason they're doing this the way they are.
Re: (Score:2)
Theory and practice are not the same though. Check the actual docs and videos on webrender. The DOM + Compositing process currently used to re-render a single pixel is more expensive than cutting out that entire code path and rendering the entire scene. If you're only thinking of the final piece of pushing the actual pixel to the screen, that's the quickest part of the entire process. Figuring out what value that pixel should have in the first place is where all the CPU time is currently being consumed. That's the whole reason they're doing this the way they are.
Check the actual docs and videos on webrender. The DOM + Compositing process currently used to re-render a single pixel is more expensive than cutting out that entire code path and rendering the entire scene.
The actual docs say the opposite. [mozilla.org] "The optimizations above have helped pages render faster in certain cases. When not much is changing on a page—for example, when there’s just a single blinking cursor—the browser will do the least amount of work possible."
This is practice, not theory. Again, don't get me wrong, I like this a lot, but for some cases it will eat a lot more battery than incremental rendering. Whether that is a problem in practice remains to be seen. My guess: not even a problem.
Re: (Score:2)
And if you continue reading beyond that line of text, it explains why that worked in the early days of browsers but doesn't work in modern browsers, due to the increased complexity of web pages.
Re: (Score:2)
Could you quote some text, please?
Re:but... (Score:4, Funny)
Sure thing, buddy.
There you go.
Re: (Score:2)
If you saw me you would compare me to a polar bear, not someone from Africa.
Re: (Score:2)
Didn't miss it. Z-culling will improve the blinking cursor case, but not fix it completely. Again, note: I like this, but let's not attribute magical powers to the technique that aren't there.
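For reference, the z-culling being discussed looks roughly like this (invented types, a sketch of the general early-z idea rather than WebRender's implementation):

    // Assign opaque items a depth, draw front-to-back, and let the depth
    // test reject covered pixels.
    struct Item { z: u32 } // smaller z = closer to the viewer

    fn draw_opaque(items: &mut Vec<Item>) {
        items.sort_by_key(|item| item.z); // front-to-back
        for _item in items.iter() {
            // With depth testing on, fragments hidden behind an item drawn
            // earlier fail the z-test and cost almost nothing. A blinking
            // cursor is in front, though, so its pixels are shaded anyway.
        }
    }

    fn main() {
        let mut items = vec![Item { z: 3 }, Item { z: 1 }, Item { z: 2 }];
        draw_opaque(&mut items);
    }

Occluded pixels get skipped, but anything actually visible, like the cursor, still gets shaded on every change, which is the point above.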
efficiency frontier hijack (Score:2)
CPUs suck at this, in much the same way that decathletes suck at high jump.
The only event that your GPU really loves is the marathon. Marathon anything—so long as it demands less cognitive agility than a biathlon.
Efficient frontier [wikipedia.org]
So put that universal boner tingle back in your pants: everything in life is a tradeoff if you so much as pull the lens back 1 mm.
Re: (Score:2)
Why do I suspect those numbers predate the 1950X Threadripper (which would tear your GPU's arms off in the hexakaidecathlon)?
I really don't think the word "sucks" should be applied to an elite athlete with 4% body fat who runs for a living, just because his pipes are too studly to ace the ultramarathon.
The reason you don't use your CPU to rasterize scenes in games is, quite frankly, because it sucks at it. The same thing applies here: of course you use the graphics processing unit to do this work, because that is exactly what it is suited for, and the CPU, while capable of doing it, is comparatively sucky at it. It doesn't matter how much praise you think a 1950X Threadripper deserves; the fact is, it is comparatively terrible at this task.
Re: (Score:2)
This is why the browser renders so much faster: instead of 200ms per paint of a small section of the screen, it renders the entire screen in 15ms. The rest of the time, your CPU/GPU sits idle. Also, WebRender does all rendering on the GPU instead of the CPU, so it is better optimized for painting the scene (CPUs suck at this entirely).
How does this work when the browser and GPU are on different machines? I run my browser in remote X. Will that be tonnes slower when it transfers entire renderings over the network?
(Never mind that "modern Xorg programmers" already killed LBX because they did not use it themselves so didn't see a need.)
Re: (Score:2)
Check right now: does GPU compositing work with your current setup? I bet it doesn't, honestly. So you'd still be stuck with the classic software rendering that you currently have.
Re: (Score:2)
Check right now: does GPU compositing work with your current setup? I bet it doesn't, honestly. So you'd still be stuck with the classic software rendering that you currently have.
Yes, but at least it only sends over the rectangles that change, and doesn't redraw the entire window. While X is no longer as frugal as when it had the LBX extension, it still handles partial updates. If I understand this correctly, they want to do away with that entirely and always update the entire viewport.
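Concretely, here's what the partial-update path saves over the wire (a rough sketch with invented types; real X damage tracking goes through the DAMAGE extension):

    // Only the union of the dirty rectangles crosses the network,
    // not the whole viewport.
    #[derive(Clone, Copy)]
    struct Rect { x: i32, y: i32, w: i32, h: i32 }

    fn union(a: Rect, b: Rect) -> Rect {
        let x = a.x.min(b.x);
        let y = a.y.min(b.y);
        Rect {
            x,
            y,
            w: (a.x + a.w).max(b.x + b.w) - x,
            h: (a.y + a.h).max(b.y + b.h) - y,
        }
    }

    fn main() {
        let cursor = Rect { x: 100, y: 200, w: 2, h: 16 };
        let glyph = Rect { x: 102, y: 200, w: 8, h: 16 };
        let damage = union(cursor, glyph);
        let partial = (damage.w * damage.h * 4) as usize; // 4 bytes per pixel
        println!("partial update: {} bytes vs full viewport: {} bytes",
                 partial, 1920 * 1080 * 4);
    }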
Re: (Score:2)
Re: (Score:2)
If you're on a laptop with an nvidia gpu, usually the gpu is powered down and the integrated gpu is running everything, saving power.
Now when Firefox is open, your nvidia gpu needs to stay powered up the whole time.
Re: (Score:2)
I've never seen a laptop with a discrete board that did not allow you to change which video card applications use, usually in performance settings. If you wanted Firefox to not use the nvidia card, you could just set it to static powersaving mode. 10 google seconds gave me this example: https://support.serato.com/hc/... [serato.com]
Re: (Score:2)
This Firefox feature only works on nvidia GPUs, as stated in the summary. How many CPUs have integrated nvidia GPUs?
If you want Firefox to not use the discrete GPU, you can't use WebRender.
Re: (Score:2)
Re: (Score:2)
Firefox is not adding back XUL. Google doesn't use it, and all web browsers just copy Google now.
But the reason I wake you is that Waterfox will let you use XUL addons, if you didn't know about it.
Re: (Score:2)
Classic Theme Restorer, Download Status Bar, and another one do not work with FF57+ and there are no replacements for them.
min specs? (Score:1)
Found the LUDDITE! (Score:2, Funny)
Only LUDDITES use WebRender. Modern app appers use AppApper!
Apps!
Re: (Score:2)
What about AppRender?
Security implications? (Score:2)
What are the security implications of letting web sites run arbitrary code on your GPU?
I bet they're more significant than you're expecting.
https://lib.dr.iastate.edu/cgi... [iastate.edu]
https://ieeexplore.ieee.org/st... [ieee.org]
Re: (Score:2)
Good thing they're not running the javascript vm on the gpu then.
Re: (Score:2)
Re: (Score:2)
What are the security implications of letting web sites run arbitrary code on your GPU?
This article is about the web browser using the GPU for rendering, not about web sites running arbitrary code on the GPU.
Re: (Score:2)
Because it's impossible to have an arbitrary code execution vulnerability in rendering software, and no one in history has ever chained exploits together to achieve a desired outcome?
Oh, wait-
https://security.stackexchange... [stackexchange.com]
And chaining vulnerabilities is very common.
Re: (Score:2)
Re: (Score:2)
Orly?
https://www.digitaltrends.com/... [digitaltrends.com]
Re: (Score:2)
Again, this is not about "letting web sites run arbitrary code on your GPU". But more to the point, you obviously didn't even read that paper. In order to do that, you need to first compromise the system at the root level so you can get access to the memory holding the keyboard buffer, then run a CUDA program with admin privileges to map that memory to the GPU and execute a compute kernel to read it. I.e., to do this you need to have completely compromised the operating system already.
Re: (Score:2)
You're driving a car down the road.
Do you make decisions solely based on what's directly in front of your bumper?
Or do you make decisions drawing from years of experience driving cars and what you see to either side and in the rearview mirror and your side mirrors and what you see further down the road and, dare I say it, common sense?
Re: (Score:2)
Or do you make decisions drawing from years of experience driving cars and what you see to either side and in the rearview mirror and your side mirrors and what you see further down the road and, dare I say it, common sense?
You seem very confused. You've posted links demonstrating that a privileged application can map kernel memory and run a CUDA program that can then access that memory. We already know that; it's not news (well, actually it does seem to be news to you). Whether the browser uses the GPU to render something or not (which has also been done for a long time and is not a new thing) has no bearing on that whatsoever.
There's nothing fundamental about Firefox's WebRender that makes it any more or less vulnerable.
Re: (Score:2)
While walking along in desert sand, you suddenly look down and see a tortoise crawling toward you. You reach down and flip it over onto its back. The tortoise lies there, its belly baking in the hot sun, beating its legs, trying to turn itself over, but it cannot do so without your help.
You are not helping.
Why?
Re: (Score:2)
If you can posit why a CUDA application executed via a privilege escalation bug is any more a risk to Mozilla's WebRender feature than to any other application, or indeed what about WebRender makes it particularly prone to some unspecified GPU-based attack, then I'm willing to listen. But I'm afraid it's pretty clear you don't understand, and likely lack the capacity to. I'm afraid I can't fix your stupid, sorry.
Privilege escalation bugs that allow arbitrary execution of code on the GPU or any other hardware are a problem for every application on the system, not for WebRender in particular.
Re: (Score:2)
Idiots Design Web Pages that Render 15 FPS (Score:4, Insightful)
Re: (Score:2)
Oh good, I'm glad to know that my Pentium MMX 166MHz with 512MB RAM is not the problem.
Imagine! (Score:2)
Imagine how wonderful /. will look!
Re: (Score:3)
win 10 3d animated fonts to the rescue!!!
Apps! (Score:2)
Okay, now that THAT'S out of the way, looks okay, but I think Randall should sue.
Where's Donald Becker? (Score:1)
I'd like to get a Beowulf Cluster of these!
60 FPS is great; any plans for ... (Score:3)
Any plans to target 120 Hz?
Re: (Score:2)
Re:60 FPS is great; any plans for ... (Score:4, Funny)
I want a plugin to limit the browser to 24 FPS to make it look more cinematic.
Re: (Score:2)
Amen, brother.
*lazily sips coffee from fresh-roasted beans*
I watch all of my browsing as shot on 35mm film, it's the only way I surf the internet nowadays.
*pushes black-rimmed glasses further up my hipster nose*
*adjusts suspenders*
*checks 4th gen. Apple watch*
Just imagine the speed increase (Score:2)
Re: (Score:3)
Even better, take out the Google spyware links. (Google is far from the only culprit, they just waste the most total time.)
Re: (Score:2)
After that, remove all the goddamn fucking 8K-resolution images; not everyone has a 30" quadruple-HiDPI display.
commas (Score:3)
Lin Clark provides an excellent overview of WebRender and, states, "with WebRender...."
I have no idea why I typed all of those commas.
Re: (Score:3)
Yay Mozilla (Score:3)
Firefox is my main browser for a lot of reasons, not just that Google doesn't dominate it. Great to see the Mozilla team leading the way on this, and it's a big validation for Rust. Any serious systems programmer ought to take a close look methinks.
You want 60fps (Score:3)
Or stop people from filling up their sites with bullshit JS and media.
Re: (Score:2)
If I want to write a simple application that runs on all platforms I shouldn't have to produce and distribute a binary for each one
Why not? Personally, I agree with you, but a lot of other Slashdot users claim to have had no problem with developing an application using Qt and "produc[ing] and distribut[ing] a binary for each one".
Re: (Score:2)
Don't program for a web browser.
Don't want to succeed, don't program for a popular platform.
Re: (Score:2)
The top 40 radio stations prove that.
Re: (Score:2)
That depends on what you want to achieve. A "good" provider for their family would like to run a successful business that makes money. In software, if your popularity is zero, so is your income. Your wife may not think you're very "good" for your idealistic choices of programming platform.
Also, turning your comment around: popular likewise doesn't mean bad. And as per TFA, quite clearly, if you want to achieve 60fps, whatever that boilerplate statement may imply, you can program for a web browser just fine.
So? (Score:3)
Re: (Score:1)
It's bitztream the autism-hating, custom EpiPen-hating, Musk-hating, Qualcomm-hating, Firefox tabs-hating, Slashdot editors-hating Slashdot troll!
The Mozilla Hacks article was posted on October 10 (Score:2)
This is extremely old news. It would be interesting to know if that tech has been enabled for more than Win10 and Nvidia GPUs by now.