Nvidia GPUs Can Leak Data From Google Chrome's Incognito Mode (softpedia.com)

An anonymous reader writes: Nvidia GPUs don't clear out memory that was previously allocated, and neither does Chrome before releasing memory back to the shared memory pool. When a user recently fired up Diablo 3 several hours after closing an Incognito Mode window that contained pornography, the game launched with snapshots of the last "private" browsing session appearing on the screen — revealing his prior activities. He says, "It's a fairly easy bug to fix. A patch to the GPU drivers could ensure that buffers are always erased before giving them to the application. It's what an operating system does with the CPU RAM, and it makes sense to use the same rules with a GPU. Additionally, Google Chrome could erase their GPU resources before quitting."
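
The fix the submitter describes can be sketched concretely. The snippet below is a minimal illustration at the CUDA level, placed in application code only because the driver itself is closed source; scrubbed_free() is a hypothetical name, not an existing API:

    // Hypothetical illustration of the proposed fix: scrub a device
    // buffer before handing it back to the shared allocator.
    #include <cuda_runtime.h>

    cudaError_t scrubbed_free(void *dev_ptr, size_t size)
    {
        // Overwrite the allocation so the next owner of this physical
        // VRAM cannot recover our pixels.
        cudaError_t err = cudaMemset(dev_ptr, 0, size);
        if (err != cudaSuccess)
            return err;
        // cudaMemset can be asynchronous with respect to the host;
        // make sure the scrub completed before releasing the memory.
        err = cudaDeviceSynchronize();
        if (err != cudaSuccess)
            return err;
        return cudaFree(dev_ptr);
    }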
  • by halivar ( 535827 ) <.bfelger. .at. .gmail.com.> on Monday January 11, 2016 @09:42AM (#51277445)

    Are you sure this isn't God judging your evil deeds?
    /duck
    /run

  • Ads (Score:5, Interesting)

    by bill_mcgonigle ( 4333 ) * on Monday January 11, 2016 @09:43AM (#51277447) Homepage Journal

    > Google Chrome could erase their GPU resources before quitting.

    Why blank it when you can write a gaming ad to the buffer instead? #incentives

    Why write a gaming ad when you can write a Radeon ad instead? #alsoincentives

    • Any chance you feel like patenting that atrocious, but clever, terrible plan to help ensure that someone who isn't even slightly joking doesn't go running with it?
      • by cfalcon ( 779563 )

        Patents cost money to file, and require a lot of effort to write. It's still a good idea, just an expensive one.

    • There is also the bug where Diablo 3 is displaying effectively random data [data it allocated in the GPU, but never initialized before displaying].

  • by Anonymous Coward

    Google have said they won't fix the bug.

    • by Anonymous Coward

      Google have said they won't fix the bug.

      In other shocking news, researchers have discovered that when an application releases storage resources, instead of writing a series of random data patterns onto the disk, the OS simply marks the space as "free". Even though this would be a simple fix, Google has chosen not to do it!

      • Oh, is Chrome caching incognito mode data on disk and failing to shred it? Crappy spyware.

        • Oh, is Chrome caching incognito mode data on disk and failing to shred it? Crappy spyware.

          That's not what this article says.

      • by cfalcon ( 779563 )

        There are some BIG differences there. First, there are OS tools available that try to handle this case. Second, there are great workarounds for this insecure-but-fast disk habit, such as storing the data encrypted, or on an encrypted partition. Third, the time tradeoff is much greater in the disk case: writing over a block of RAM on SHUTDOWN ONLY is not nearly as great a burden as writing over an arbitrary file on the disk.

        I could see Google's position on this (it's not technically their fault), but they could at least…

        • by darkain ( 749283 )

          Additionally, if you're using a Copy-On-Write file system like ZFS, the contents wouldn't be overwritten anyways.

        • You mean blank on de-allocation. Unless you are going to try and track every block you've ever used (and possibly released -- and possibly now owned by someone else) and do it at exit -- which smells like a memory leak waiting to happen. Otherwise you need to blank before de-allocating it, which depending on how much is being allocated/de-allocated could significantly impact performance.
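
          For ordinary heap memory, "blank before de-allocating" can look like the sketch below. Note that a plain memset() right before free() may legally be optimized away by the compiler, which is why glibc and the BSDs provide explicit_bzero(); the secure_free() wrapper name is hypothetical:

              /* Minimal sketch: wipe a buffer before releasing it. */
              #include <string.h>
              #include <stdlib.h>

              void secure_free(void *p, size_t size)
              {
                  if (p == NULL)
                      return;
                  explicit_bzero(p, size); /* a wipe the compiler cannot elide */
                  free(p);
              }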

          • by KGIII ( 973947 )

            Do you mind a dumb question?

            If the user has enough RAM, wouldn't it be possible to reserve a goodly chunk and then wipe the entirety (some resource hit here - I should think) when de-allocated/closed such as at the end of the session?

            I'm slowly, but surely, getting back into poking at code - it's been like 8 years since I've even really looked at any, and even longer since I've really done much of it. I'm seeing why this is happening (I think) but I'm not seeing why it's not being fixed. At least conceptually…

    • Why should Google fix Nvidia's fuck up? Like they even could.

      There's plenty to not like about some of Google's recent moves, but you're just being a hater here.

      • by Anonymous Coward

        Shouldn't an application that purports to be "secure" be paranoid enough to zero out any memory it used, "just in case"? Sure, the driver/OS *should* do this, but the application arguably shouldn't trust that they will, IMHO.

          Google Chrome actually has a big disclaimer when you start incognito (aka private) mode pointing out that it isn't actually "secure"; it simply won't retain any cookies or history from that private session after you close it, and anyone observing your web traffic can see everything you are doing whether or not the session is "private".
  • by Anonymous Coward

    cmon no one uses it for anything else.

    • by Austerity Empowers ( 669817 ) on Monday January 11, 2016 @10:19AM (#51277711)

      There is value in using that mode for porn (although your IP address is still exposed, and it's unclear that anyone is going to understand why you were at LustyHotBabes.com for any non-auto-erotic pursuits). But it is also incredibly useful for the times you want to visit a site that caches credentials locally or otherwise relies on client-side tracking, but you don't want that behavior. I do not like to leave data for Gmail, Facebook, LinkedIn, etc. on my work machine, for example; I don't own it, and IT can seize it at any time.

    • Not just PornMode (Score:5, Interesting)

      by crow ( 16139 ) on Monday January 11, 2016 @10:39AM (#51277861) Homepage Journal

      I use "incognito mode" all the time. Anytime I see some interesting link on Facebook, I always open it in incognito mode. Just one more level of protection against associating the link with my account or leaving behind unwanted trash.

      I also find it very useful for news sites that let you have a certain number of articles free before throwing up a paywall. Using incognito mode resets the counter back to zero.

      • by Anonymous Coward

        Incognito mode is actually more of a developer tool, for when you want to make sure you see the site you are developing "fresh" without having to worry about pre-existing cookies or local storage values.

        • by rhazz ( 2853871 )
          That might be one use, but hardly the primary purpose, or even the marketed purpose of the feature.
      • by MobyDisk ( 75490 )

        I use private/incognito mode to access my bank, so that I can be sure there are no XSS attacks. It's also useful for browsing pages you are working on so that you know nothing is cached. Plus you can have two simultaneous sessions going with the same browser, but cookies and history won't be shared. It is also good for testing supercookies.

        I kinda like the idea of every tab being a "private mode" tab. It's kinda how the web was intended to be in the first place.

    • Re: (Score:3, Informative)

      by Anonymous Coward

      Use it to browse sites that need cookies to work but then use those cookies to fix or mess with prices against you, like airline sites and travel search engines that will sometimes raise prices if you search from a browser with the same cookies.

      Use it to follow links you don't want messing with other tracked histories. You see an article on weird stuff for sale but don't want Amazon or other sites suggesting related stuff every time you visit in the future?

      Having trouble with sites that stupidly use cookies to track…

      • Good tip, while we were Christmas shopping my wife asked me why in the fark a sample of uranium ore was in my Amazon suggested items. It took me several seconds to realize that it was likely there from reading "The worst stuff for sale" blog which I generally find highly amusing. In fact, I am pretty glad it found the radioactive gag gift instead of several of the other postings that could have come up :-p I'd provide a link, but I can't be arsed.
    • It's also useful for logging into other Google accounts that you want to keep separate. It's either use a different browser like Edge or go incognito.
  • by grumbel ( 592662 ) <grumbel+slashdot@gmail.com> on Monday January 11, 2016 @09:46AM (#51277485) Homepage

    The AMD Open Source Driver on Linux does the same thing. It's not really a new or spectacular bug; graphics cards and drivers have done that stuff for quite a long while. Once there was also a fun bug that would make large text in Firefox 'bleed' into the desktop background image, so it wasn't just showing old content, but actively manipulating content of another application.

    • It's honestly somewhat surprising (perhaps a testament to the fact that with great power comes a nontrivial chance of really crashing the hell out of things, perhaps just the availability of softer targets) that GPUs don't get involved in much scarier evil more often.

      Easily among the most powerful devices in the system that isn't the CPU (and, while not necessarily ideal for things that aren't specific compute workloads, Turing complete), plenty of RAM to store payload to be injected in assorted places or data…
    • by The MAZZTer ( 911996 ) <megazzt.gmail@com> on Monday January 11, 2016 @10:49AM (#51277955) Homepage

      Yeah. Your GPU was not designed with security of the information stored in it in mind. It was designed to play video games and a few other things, and it's not a big deal if a few of your game textures leak, if it means the GPU can be slightly faster at managing its memory. The responsibility should be Chrome's to clear out its GPU memory in incognito mode after it's done using it.

      • Re: (Score:3, Insightful)

        by Anonymous Coward

        > Chrome's to clear out its GPU memory in incognito mode after it's done using it.

        The driver manages the GPU memory and there is no particular reason to assume that if Chrome did that it would actually write to the same RAM location that had the sensitive data and not some other random memory area it was assigned temporarily.
        Thus the calls for Chrome to fix it are nonsense. Yes, it might work. But it might break any time with a driver update. This needs to be in the drivers.

      • by Ed Tice ( 3732157 ) on Monday January 11, 2016 @12:18PM (#51278719)
        It used to be that the programmer was responsible for clearing sensitive data out of general-purpose memory to ensure that no other process got access to the data. It didn't work out very well. Now, the OS is responsible for clearing out memory prior to handing it to another process. It doesn't really make sense to have every application do something that could be implemented one time, correctly, in the operating system.
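
        That division of labor is easy to demonstrate: anonymous pages freshly mapped from the kernel always arrive zeroed, no matter what their previous owner left in them. A small Linux/POSIX sketch:

            /* Fresh pages from the kernel are always zero-filled. */
            #include <assert.h>
            #include <stddef.h>
            #include <sys/mman.h>

            int main(void)
            {
                size_t len = 1 << 20; /* 1 MiB straight from the kernel */
                unsigned char *p = (unsigned char *)mmap(NULL, len,
                                        PROT_READ | PROT_WRITE,
                                        MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
                assert((void *)p != MAP_FAILED);
                for (size_t i = 0; i < len; i++)
                    assert(p[i] == 0); /* kernel zero-fills anonymous pages */
                munmap(p, len);
                return 0;
            }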
      • So what do you want to bet a clever hacker could then write a WebGL script to download pics of your bank information? Some banks serve pages with JavaScript ads, and I can see this being used to steal information.

    • I still have bleeding visual buffers when using MPlayer, mpv, Firefox, or any OpenGL application on Linux.
    • by sad_ ( 7868 )

      It goes back to the beginning of computing.
      I remember well on the Amiga that after a reset you could run several rippers that would pull out the gfx and music (as long as it was not overwritten) from the memory of the game/demo you were playing. Later I saw similar programs in DOS for the PC that did the same thing.
      Does the Linux kernel zero out the bits it deallocates? I don't know and would assume not; it's an expensive operation.

  • Comment removed based on user account deletion
    • by Anonymous Coward

      Most programs don't bother zeroing memory after using it for that very reason. Unless the memory has something important in it, usually.

      • Comment removed based on user account deletion
        • by Delwin ( 599872 ) on Monday January 11, 2016 @10:03AM (#51277585)
          The performance hit is real - and without custom silicon it's quite expensive. This bit me on the ass recently on a GPGPU project I was working on, because the amount of time taken to clear the buffer before use was about 10x the amount of time to actually do the computation.
          • Comment removed based on user account deletion
          • Out of curiosity, in terms of 'what should be done', is the idea that an application should be responsible for clearing memory before releasing it considered a good practice; or is it considered a least-worst option to deal with the fact that the OS can't necessarily be trusted to do the job properly?

            Speaking as a complete layman, I would think that, just as handling memory allocation is usually left to the OS, in an ideal world the OS' memory allocation mechanism would also be responsible for clearing sensitive data…
            • by Anonymous Coward
              It's considered good practice; the application could also zero it out rather than assume the memory manager has 'cleared' the memory. This is how heap attacks happen, though, in some cases. Trusting what's in memory is generally considered to be bad, even if it's 'your' memory. Now, when it comes to GPUs, performance is king -- millionths of a second matter, as the impacts are generally multiplied by literally billions of operations. And what you're describing I believe is sometimes attributed to garbage collection…
            • by gwolf ( 26339 ) <[gwolf] [at] [gwolf.org]> on Monday January 11, 2016 @11:55AM (#51278519) Homepage

              The GPU memory is not handled by the OS; it runs on a separate piece of hardware, a full computer system if you will, that does not run an OS by itself.

              The CUDA API for using nVidia cards for GPGPU operations is quite simple and straightforward; when requesting memory, it allocates a chunk; when releasing it, it's just marked as "not yours anymore". Due to the massively parallel programming model, there is even some *value* in not clearing it, as for algorithmic iterations sometimes you can save the cost of populating and freeing memory blocks if you know you will get the same pieces of RAM (or if it does not really matter, and each algorithmic pass can work exclusively on a given set of data until a certain point has been reached — think e.g. symmetric encryption schemes).

              Due to increasingly intelligent C compilers (and of course higher-level constructs) we have gotten used to memory being zeroed out on allocation, but AFAICT no standard mandates that. I would place the burden of cleaning the memory on the *initialization* of the new application. After all, be it pr0n or just random flipped bits, Diablo looks bad by starting with the display of digital noise.

              I don't think Chrome cleaning up before closing a tab should be *too much* of a concern. Yes, there is a certain thing about it being "incognito mode" that should be honored, and, as a special case, it *should* make sure to clean up its act. But the main fault, I'd say, is with Diablo.
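
              What that allocation model implies can be sketched in a few lines of CUDA: cudaMalloc() hands back device memory with whatever contents it already had, and nothing stops a program from copying the leftovers to the host for inspection. Illustrative only, not anyone's actual exploit:

                  // Illustrative: read back an uninitialized device allocation.
                  #include <stdio.h>
                  #include <stdlib.h>
                  #include <cuda_runtime.h>

                  int main(void)
                  {
                      size_t len = 16 << 20; // 16 MiB of device memory
                      void *dev = NULL;
                      unsigned char *host = (unsigned char *)malloc(len);

                      cudaMalloc(&dev, len); // no clearing is promised here
                      cudaMemcpy(host, dev, len, cudaMemcpyDeviceToHost);

                      // Count leftover bytes from a previous owner of this VRAM.
                      size_t dirty = 0;
                      for (size_t i = 0; i < len; i++)
                          if (host[i] != 0)
                              dirty++;
                      printf("%zu of %zu bytes were not zero\n", dirty, len);

                      cudaFree(dev);
                      free(host);
                      return 0;
                  }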

              • by Anonymous Coward

                there is even some *value* in not clearing it, as for algorithmic iterations sometimes you can save the cost of populating and freeing memory blocks if you know you will get the same pieces of RAM

                Jesus Christ! There are people who know how to program on this site. You're going to give someone a heart attack with comments like that!

                Also, it's not Chrome's responsibility to clear the memory. That might be a mitigation strategy, but no OS or driver should ever hand uncleared memory to a different application. Ever! What if Chrome crashes before it can clear the memory? We've had the performance discussion before, when some deemed clearing system RAM too costly. The answer is always the same.

                • by tepples ( 727027 )

                  there is even some *value* in not clearing it, as for algorithmic iterations sometimes you can save the cost of populating and freeing memory blocks if you know you will get the same pieces of RAM

                  Jesus Christ! There are people who know how to program on this site. You're going to give someone a heart attack with comments like that!

                  I don't think gwolf was referring to exploiting the undefined behavior of use after free. I think it was more along the lines of object pooling [wikipedia.org].
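
                  Object pooling in that sense might look like the toy sketch below: freed device buffers go on a per-process free list and come back uncleared, which is safe precisely because the reuse never crosses a process boundary. Names and sizes are hypothetical:

                      // Toy fixed-size buffer pool; reuse stays in-process.
                      #include <stddef.h>
                      #include <cuda_runtime.h>

                      #define POOL_SLOTS 8
                      #define BUF_BYTES  (4 << 20)  // fixed 4 MiB buffers

                      static void  *pool[POOL_SLOTS];
                      static size_t pool_count;

                      void *pool_alloc(void)
                      {
                          if (pool_count > 0)
                              return pool[--pool_count]; // recycled, contents intact
                          void *p = NULL;
                          cudaMalloc(&p, BUF_BYTES);
                          return p;
                      }

                      void pool_release(void *p)
                      {
                          if (pool_count < POOL_SLOTS)
                              pool[pool_count++] = p;    // keep for the next pass
                          else
                              cudaFree(p);               // pool full, hand it back
                      }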

                "Diablo looks bad by starting with the display of digital noise."

                This brings up an interesting point. I think what we really have here is an exploit in the NVidia drivers rather than a "Super Scary oh noze mine pr0nz gots pwnd" privacy issue. What if next time, instead of stills from your latest crush fetish video, your VRAM was sitting on a call to "glGetTexImage()" that pulls an arbitrary instruction onto your stack? It is basically telling us that read-after-release is possible, and DMA can be a lot of fun in the "right" hands.

              • "AFAICT no standard mandates that. I would place the burden of cleaning the memory on the *initialization* of the new application."

                Common sense mandates that a multi-user system separates users and processes from each other. If I log off from a workstation, the next user should not be able to do screen captures (potentially confidential documents, emails with passwords) using software that exploits this "feature".

                • by gwolf ( 26339 )

                  And it happens as you describe — on the main system's memory. However, the GPU is not the main system. It is more akin to a peripheral computer, with lots of intelligence and RAM of its own. Think of it as a printer. Do you really care if your printer blanks its buffers between jobs? No, as long as one job's leftover contents will in no way corrupt the next job's output.

          • Surely the GPU can zero memory quickly in hardware, though.
          • by Arkh89 ( 2870391 )

            I cannot believe you on this. How was this done? Using cudaMemset, clEnqueueFillBuffer or something equivalent? Or your own code (coalesced memory access...)?
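
            For reference, the two options named here might look as follows: the runtime's own fill call, or a hand-rolled grid-stride kernel doing coalesced 32-bit stores. An untuned sketch, not the poster's actual code:

                #include <cuda_runtime.h>

                // Option 1: the runtime's one-liner.
                //   cudaMemset(dev_ptr, 0, bytes);

                // Option 2: hand-rolled clear with coalesced 32-bit stores.
                __global__ void clear_kernel(unsigned int *buf, size_t words)
                {
                    size_t i      = (size_t)blockIdx.x * blockDim.x + threadIdx.x;
                    size_t stride = (size_t)gridDim.x * blockDim.x;
                    for (; i < words; i += stride)  // grid-stride loop
                        buf[i] = 0u;
                }

                void clear_buffer(void *dev_ptr, size_t bytes)
                {
                    // Assumes bytes is a multiple of 4; handle the tail otherwise.
                    clear_kernel<<<256, 256>>>((unsigned int *)dev_ptr, bytes / 4);
                    cudaDeviceSynchronize();
                }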

          • by cfalcon ( 779563 )

            The performance hit for overwriting 4 gigs (or usually less) of RAM on *process or thread shutdown* shouldn't be that onerous... should it? I could be missing something about when it frees it.

          • If that's the case then Nvidia should really be on the hook for this rather than Google.

            If the memory space is unallocated then the card should zero out any memory space that was recently released whenever the card isn't under heavy load.
            • Nope, it's Google selling "porn mode"; it should be up to Google to keep your feeelthy pictures from lying around.

  • I'm less concerned with GPUs not clearing their memory when done (known bug in PCs) and more with the fact that Diablo 3 is just using whatever happens to be in the buffer.
    • It likely only shows for a single frame or two until Diablo renders its content to it.
    • by dkman ( 863999 )
      This is what I came to say. Why would an application draw from memory it hasn't written to yet? I know that games often go to a black screen at launch. Is it just chance that it chose an area that was zeroed vs random garbage? I would think that if it just pulled from the beginning of memory that people would see some old image pretty often. Maybe that is the case and we just perceive it as a flicker unless it hangs for a moment as it did for him.
      • Also, why did the GPU driver assign the exact same start address for the frame buffer? If it randomized this somewhat, I would think you would end up with much less of a chance of this happening without taking a performance hit to clear the buffer whenever launching / cleaning up a thread.

      • by rl117 ( 110595 )

        Because you might be streaming content into the texture after its creation, e.g. with glTexSubImage2D (or 3D). You might already be running the render loop a few times over before it's fully filled. Ideally the driver would only ever give you blanked memory, so it would be transparent and imperceptible.
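
        That streaming pattern, as a rough sketch (assuming a current desktop GL context; helper names are illustrative): glTexImage2D with a NULL pointer allocates storage whose contents are undefined, and glTexSubImage2D fills it in piecemeal later.

            #include <GL/gl.h>

            GLuint make_streaming_texture(int w, int h)
            {
                GLuint tex;
                glGenTextures(1, &tex);
                glBindTexture(GL_TEXTURE_2D, tex);

                /* Allocate storage only; contents are undefined here. */
                glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
                return tex;
            }

            void stream_rows(GLuint tex, int y, int h, int w, const void *pixels)
            {
                /* Later, possibly over several frames: fill one band. */
                glBindTexture(GL_TEXTURE_2D, tex);
                glTexSubImage2D(GL_TEXTURE_2D, 0, 0, y, w, h,
                                GL_RGBA, GL_UNSIGNED_BYTE, pixels);
            }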

    • I guess you've never rebooted a computer with either an nVidia or an AMD/ATI video card in it. It's very common for the boot screen (and even the desktop immediately after login) to briefly display old content from prior to the reboot. I see this on Linux, OSX and Windows machines every single day. It's not a Diablo problem, it's that the video drivers don't even bother initializing video RAM to a known state at driver startup let alone before allocating memory to applications.
  • Why does a web browser need a GPU for basic web browsing? Unless that is a Flash or HTML5 driven pron site?

    • Using a GPU to render a website allows faster rendering with lower power usage.

      Think of all the elements on a page that can be composited with something designed to do it with different levels of transparency.

      If you want laptops and mobiles to run faster and last longer on battery power then part of that is using computer resources more efficiently. Lots of stuff right now is wasted and the CPU is busy with memory IO due to poor algorithms.

      • by tysonedwards ( 969693 ) on Monday January 11, 2016 @10:10AM (#51277647)
        You mean using a graphics processing unit to process graphics? What level of weapons grade bullshit is this?
        • The OP was viewing adult content. It was most likely video. And the GPU has dedicated video decoding hardware. Chrome was probably offloading this and somehow some last rendered frame was in the GPU RAM when the next application launched.
      • In the case of most modern OS window managers, don't most programs end up getting their output scrawled onto some surface that the GPU manipulates, even the seriously retro ones that predate the concept of 'GPU' as anything other than a RAMDAC and some primitive fixed-function elements?

        Chrome, and similar, interact substantially more than that; but I thought that most of the various desktop transparency/preview/fancy-window-swooshing/etc. stuff was handled by drawing program output to something that the GPU then composites…
    • by Anonymous Coward

      (GPU is irrelevant.)

      This browser shows things on the screen, so that the user can see it. (It's not the legendary braille or text-to-speech browser that HTML purists are always warning you that some of your website's users are using.)

      And of course, if you show things on the screen, then they're going to get into video memory. (So that the user can see what the computer is trying to show them.)

      And though I said the GPU is irrelevant to the situation, it turns out there is nevertheless an answer to your question…

    • Wait, a video porn site that uses Flash?

      Can't be...

  • by Anonymous Coward

    It's been shown that you can randomly snag other running applications' data by initializing new framebuffers and seeing what happens to be in them.

    The problem is that your graphics card simply can't zero out chunks of RAM every time an application requests them, not if you want your high performance rendering for your video games. This issue is an old one, tied to the hardware architecture itself, and can't be fixed as easily as the submitter seems to think.

    • It's true that zeroing upon reallocation is slow, but wouldn't zeroing in the background before deallocating do the trick?
    • by tepples ( 727027 )

      Consider the following policy: If a process requests video memory, and all of this memory previously belonged to the same process, don't clear it. Otherwise, zero it. And while the GPU is idle, zero some of the memory released by processes in the background, especially by processes that have ended. How would such a policy interfere with "high performance rendering for your video games"?
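
      Sketched as hypothetical driver-side logic (real drivers track allocations very differently, and gpu_zero_fill() is an invented primitive), that policy might read:

          #include <stddef.h>
          #include <stdint.h>

          void gpu_zero_fill(void *base, size_t size); /* hypothetical primitive */

          struct vram_block {
              void    *base;
              size_t   size;
              uint32_t last_owner_pid; /* 0 = never owned or already scrubbed */
          };

          void *vram_alloc(struct vram_block *blk, uint32_t requester_pid)
          {
              /* Zero only when the block crosses a process boundary. */
              if (blk->last_owner_pid != 0 && blk->last_owner_pid != requester_pid)
                  gpu_zero_fill(blk->base, blk->size);
              blk->last_owner_pid = requester_pid;
              return blk->base;
          }

          void vram_idle_scrub(struct vram_block *blk)
          {
              /* Run from the driver's idle path on freed blocks, so a later
               * cross-process allocation pays nothing. */
              gpu_zero_fill(blk->base, blk->size);
              blk->last_owner_pid = 0;
          }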

    • not if you want your high performance rendering for your video games.

      Can't we have a choice? Security or speed?

      The last time I played a computer game, it was DOS based. Or maybe Colossal Cave? I don't need a GPU (and would not buy anything from NVidia if I did; their support for Linux is terrible, and Nouveau does not even work).

      It's been shown that you can randomly snag other running applications' data by initializing new framebuffers and seeing what happens to be in them.

      The problem is that your graphics card simply can't zero out chunks of RAM every time an application requests them, not if you want your high performance rendering for your video games. This issue is an old one, tied to the hardware architecture itself, and can't be fixed as easily as the submitter seems to think.

      Of course it can. If there is hardware support for it, clearing memory is practically free. Remember, DRAM works by constantly refreshing itself; you could opt not to refresh a region and thus blank it. Even without that, zeroing using a modern GPU's enormous parallel power is also cheap.
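
      The cost claim is easy to measure: time a cudaMemset() over a large buffer with CUDA events. At typical VRAM bandwidths of hundreds of GB/s, a gigabyte clears in a few milliseconds. A minimal sketch:

          #include <stdio.h>
          #include <cuda_runtime.h>

          int main(void)
          {
              size_t len = 1ull << 30; // 1 GiB
              void *dev = NULL;
              cudaMalloc(&dev, len);

              cudaEvent_t t0, t1;
              cudaEventCreate(&t0);
              cudaEventCreate(&t1);

              cudaEventRecord(t0);
              cudaMemset(dev, 0, len); // the scrub being timed
              cudaEventRecord(t1);
              cudaEventSynchronize(t1);

              float ms = 0.0f;
              cudaEventElapsedTime(&ms, t0, t1);
              printf("zeroed %zu bytes in %.2f ms\n", len, ms);

              cudaFree(dev);
              return 0;
          }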

  • by Anonymous Coward

    Personally I just think Chrome is becoming more mucked up version by version. I used to think it was the best browser, but I definitely think Google is going down the Firefox path of overplaying serious bugs and continuing to update without fixing the important stuff. I mean, it might set all the records for compliance and standards.
    But crashes happen regularly, Pepper Flash is awful, and I get so many page rendering issues and CPU cycle pegs from the Chrome helper on both Macs and Windows that it's not funny anymore.

  • Leave the memory better than you found it.

    Been doing this stuff for years and it never even occurred to me that this could be an issue. By "cleaning up" the memory usage on shutdown I just made sure I freed the memory; I never cleared it on the way out.

    JVM and .NET VM makers take note - You could add this to your GC and shutdown code and give all programs automatic support for this easily.

    • JVM and .NET VM makers take note - You could add this to your GC and shutdown code and give all programs automatic support for this easily.

      It would be better to put this in the OS cleanup code, clearing the buffers in a background thread before they're returned to the free pool. If the cleanup is left up to the application or framework then the application could exit due to a crash without getting a chance to clear the buffers.
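
      At application level (the point stands that only the OS can guarantee the cleanup runs after a crash), a background scrubber might look like this hypothetical sketch: a worker thread drains a queue of released buffers, wipes each one, and only then really frees it.

          /* Hypothetical background scrubber for ordinary host memory. */
          #include <pthread.h>
          #include <string.h>
          #include <stdlib.h>

          struct dead_buf { void *p; size_t n; struct dead_buf *next; };

          static struct dead_buf *queue;
          static pthread_mutex_t  lock = PTHREAD_MUTEX_INITIALIZER;
          static pthread_cond_t   cond = PTHREAD_COND_INITIALIZER;

          void release_later(void *p, size_t n) /* called instead of free() */
          {
              struct dead_buf *d = (struct dead_buf *)malloc(sizeof *d);
              d->p = p; d->n = n;
              pthread_mutex_lock(&lock);
              d->next = queue; queue = d;
              pthread_cond_signal(&cond);
              pthread_mutex_unlock(&lock);
          }

          void *scrubber(void *unused) /* background cleanup thread */
          {
              for (;;) {
                  pthread_mutex_lock(&lock);
                  while (queue == NULL)
                      pthread_cond_wait(&cond, &lock);
                  struct dead_buf *d = queue; queue = d->next;
                  pthread_mutex_unlock(&lock);
                  memset(d->p, 0, d->n); /* wipe before the real free */
                  free(d->p);
                  free(d);
              }
              return unused;
          }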

      • Agreed.

        But if you're writing a secure program it's something to keep in mind. (Something like a cfree or cdelete might be overkill/difficult to implement because of performance issues or sheer complexity but things like clearing video memory or other specific resources might still be a good idea.)

        Honestly, after all the secure software initiatives and outcries that came out in the last decade I'm surprised this didn't pop up until now...

  • by Anonymous Coward

    The multi-tab view in iOS has a similar "bug" where if you're viewing something in private mode the jpg preview for that tab remains even if you change pages/open new/close that specific tab. Can't recall the exact steps to reproduce consistently but I've noticed it several times.

  • Question from a layman: Do GPUs have a physical-virtual memory mapping? Ex: Could process A get space on the GPU, then when process B requests memory, the GPU would give process B the physical pages process A used to have (while copying the actual data over) to defragment the physical pages, leaving B with a copy of what A used to own? Or, perhaps process B requests so much memory that the texture space requested by process A gets paged out to main memory?

  • This has been the case for video cards pretty much since the beginning. You could even write batch files into video RAM and then, after a warm boot, have a program pull them back out and run the code. Did this as a proof of concept on a VGA/Hercules combo (two separate cards). So, it's a feature, if you decide to see it that way.
  • Has anyone noticed the same sort of thing with different Linux distributions and AMD hardware? The next login will briefly show the last thing from the previous session. It only shows the bottom 1/3 for me most of the time, with lots of noise around the middle.
  • I have seen yesterday's game come on the screen when first booting into some environments. Data remains. For a long time.
  • Even on Mac OS, running Minecraft, I see this. When Minecraft starts up, it opens a window, displays whatever is in the graphics memory, and then eventually clears it out and shows its welcome window.

    That graphics memory can be anything from screen rendering pieces, to other window data, etc.

    I wonder if anything would survive a logout, and then someone else logging in?

    • by grumbel ( 592662 )

      Graphics memory doesn't just survive logouts, it survives soft reboots as well (e.g. the login screen on Linux likes to show garbage from the previous boot). To clear the memory you have to switch off the computer completely.
