
Intel To Pay NVIDIA Licensing Fees of $1.5 Billion

wiredmikey writes "NVIDIA and Intel have agreed to drop all outstanding legal disputes between them and Intel will pay NVIDIA an aggregate of $1.5 billion in licensing fees payable in five annual installments, beginning Jan. 18, 2011. Under the new agreement, Intel will have continued access to NVIDIA's full range of patents."
  • Wonder if Intel.. (Score:4, Interesting)

    by RightSaidFred99 ( 874576 ) on Monday January 10, 2011 @05:58PM (#34829684)
    Wonder if Intel will be able to use any of NVidia's patents to bolster their GPUs, which is really their only sore spot at the moment (Atom vs. ARM might be a sore spot, but there's hope there).
    • by icebike ( 68054 )

      Wonder if Intel will be able to use any of NVidia's patents to bolster their GPUs, which is really their only sore spot at the moment (Atom vs. ARM might be a sore spot, but there's hope there).

      I rather suspect Intel was using Nvidia's patents all along, and that was what the big fight was all about.

      I doubt they will stop now, since they are now paying for the privilege. I also suspect Nvidia got something besides money in return, such as access to certain Intel patents.

      • by icebike ( 68054 )

        I note that AMD CEO Meyer resigned. [mercurynews.com]

        Perhaps this agreement was the writing on the wall for him?

        • It could be coincidence, or it could be incredibly telling.

          The best thing AMD had going was that Intel's onboard GPUs sucked. AMD has a new chip architecture coming out in the next few months, and no one really knows how well it performs except AMD. It was pretty much a given that they would have a better integrated GPU since they have ATI building it, but the CPU portion of the chip is still an unknown.

          We'll assume for the sake of argument that AMD knows that on the CPU side they'll be weaker,
          • Although AMD is cheaper, and when you're squabbling about a minor speed difference across 4 cores just to read your email, it's pretty irrelevant.
            Might be relevant for workhorse scenarios but nothing else.

            • They're not cheaper because they want to be, they're cheaper because they have to be. They can't compete on performance, so they try to do it on price. Look at the financial results for the last few quarters for both companies. Notice that one made landmark profits whereas the other suffered a loss.

              I hope AMD can stay in the game, if for no other reason than that Intel got incredibly lazy with NetBurst; if AMD hadn't stepped up, Intel probably wouldn't have cared all that much.
              • They're not cheaper because they want to be, they're cheaper because they have to be. They can't compete on performance

                Intel is more expensive because they have to be. They can't compete on value.

                See what I did there?

                The only metric worth anything is performance per dollar. You have not used that metric but tried to draw a conclusion as if you did.

                Before you reply in fanboy rage, let's try it with cars:

                Ford is not cheaper (than Ferrari) because they want to be, they're cheaper because they have to be. They can't compete on performance, so they try to do it on price.

                With cars your bullshit logic has no teeth. Now wh

                • The only metric worth anything is performance per dollar.
                  BULLSHIT

                  From a customer's point of view, what matters depends on whether the workload can/will be split across multiple nodes or not. If it can't/won't, the question is performance at a given price point and whether the extra performance from moving to a higher price point is worth the extra money. If it can and will, then what matters is performance per dollar of the ENTIRE SYSTEM (potentially including things like hosting, power, software and even admi
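
                  To make that distinction concrete, here is a minimal Python sketch with made-up scores, prices, and overhead (hypothetical numbers, not benchmarks of any real chip):

                      # Hypothetical chips: (benchmark score, unit price in dollars).
                      chips = {"fast_chip": (100, 200), "cheap_chip": (85, 120)}

                      # Chip-only metric: cheap_chip wins points-per-dollar even
                      # though fast_chip wins on raw performance.
                      for name, (score, price) in chips.items():
                          print(f"{name}: {score / price:.3f} points/dollar (chip only)")

                      # Whole-system metric: add a fixed per-node overhead
                      # (hosting, power, admin) to each node's cost.
                      overhead = 500  # hypothetical fixed cost per node, in dollars
                      for name, (score, price) in chips.items():
                          print(f"{name}: {score / (price + overhead):.3f} points/dollar (system)")
                      # Once the overhead dominates, the ranking flips: fast_chip now
                      # delivers more points per system dollar than cheap_chip.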

            • When you're squabbling about a minor speed difference across 4 cores just to read your email, it's pretty irrelevant.

              Not true, some of my family members have bloated their e-mails to keep pace with Moore's law.

              "Wow, my new computer can handle 200 animated gifs of kitties at a time!!! [loads up outlook]"

          • by gfody ( 514448 )
            Integrated GPUs suck in general and will continue to suck for some time. Let's assume for the sake of argument that AMD and Intel both understand that x86 performance matters little and the future is power-efficient SoCs with decent graphics performance. Let's also have a look at what they've done recently:
            AMD - created an atom-like chip with decent graphics performance
            Intel - created an even more powerful x86 chip with OK graphics performance
            NVidia - announced an ARM chip together with MS announcing win
            • by Moryath ( 553296 )

              So long Wintel, Hello Winvidia! I think it's clear that AMD and Intel are crapping their pants and swiftly ousted AMD's CEO to prepare for the merger of AMD and Intel.

              Merger of AMD and Intel? Not likely. The phrase "antitrust lawsuit" comes swiftly to mind.

              More likely, what you'll get with ARM-based Windows is an attempt to compete in the tablet/smartphone market. Let's face it, Win7phone is a joke, Android OS was never designed for tablets and Google has admitted so, leaving the need for "something"

              • Comment removed based on user account deletion
                • That made a hell of a lot more sense to me than trying to fit some sort of backwards compatibility for x86 onto ARM, or trying to get those millions of Windows apps ported.

                  I see a lot of apps on my Windows XP machine that already work on an ARM CPU: Firefox, Java, Pidgin, Chrome, Geany, Thunderbird, Gimp, OpenOffice.org, Evince, Abiword, Gnumeric.
                  Oh you meant craptastic non-free software; nevermind. :)

                  • It seems that you don't have much experience of what we call "the real world". Browsers are easily interchangeable.

                    Email clients and messenger programs are slightly more awkward but still possible, sure. I use either Thunderbird or Evolution myself, but they both have issues with Exchange server.

                    And the rest of those in place of Photoshop and MS Office? Again, technically possible (and again, I use both of these myself), but good luck doing it in practice in a real business. It's often just not worth even trying, though

              • Android OS was never designed for tablets and Google has admitted so, leaving the need for "something" to compete with iPads down the road.

                Obviously not an important point to quibble over, but the iPad runs iOS, which was designed for phones too, and I like Android fine on my Dell Streak (which is basically halfway between a phone and a tablet). I think it would run fine on a tablet. It's certainly better than Windows, and I would expect on a par with iOS.

                I've had very little experience with iOS, but I found its lack of a "back" button to be extremely annoying. I think that's one of the nicest features of Android, being able to go back not just in

            • by ogdenk ( 712300 )

              Integrated GPUs suck in general and will continue to suck for some time.

              A few years ago I'd agree, but the GeForce 9400M in my 2009 Macbook is certainly a more capable GPU than most users utilize. The newer incarnations of it certainly don't suck for most apps and are well-supported for OpenCL apps, under Mac OS X at least.

              If you want to play the latest and greatest at 1600x1200 at max settings and 4x antialiasing it ain't gonna cut it, but for light gaming and ACTUAL WORK they do just fine.

              Now Intel's integrated GPUs suck. So do most of the ATI offerings but they have a few

            • Intel needs AMD really badly; the only thing that keeps Intel out of monopoly charges is that they can point to AMD and say, with full honesty:

              "See, the x86 marketplace is really competitive, the one time we tried to do something about it (Itanium) it backfired badly when AMD invented the x64 extensions."

              Terje

            • Keep in mind that the ARM design house is just that, a design house that produces no actual hardware except for a limited number of test chips. Because of that, they make their money by licensing the design to anyone who wants to use it, and both AMD and Intel can easily pay those fees if it turns a profit.

              MS announced an ARM based Windows for the same reason that Apple switched to Intel. Keeping their options open and the NT kernel (used in Win2k/XP/Vista/Win7+) was designed for easy porting to new hardware whi

        • Or perhaps it's a sign that Bulldozer will suck and he wants to get out before the shit hits the fan.

      • If Intel was already using nVidia's patents, then why did their graphics chips suck so badly? You'd think if they were stealing nVidia ideas, they could come up with better graphics, wouldn't you?
        • by icebike ( 68054 )

          Follow the money.

          If they weren't already using their patents, do you think they would be paying $1.5 billion?

      • Because Intel just doesn't take drivers seriously. Unlike, say, audio or networking hardware, the drivers for GPUs implement a lot of the actual API (OpenGL or Direct3D). Drivers are a crucial component in the GPU system design, and Intel just never got that.

          That is one of the (several) reasons why NVIDIA won't open source their drivers. They actually do have significant trade secrets in that domain.

    • by Renraku ( 518261 )

      If Intel needs anything, it's to work on their integrated graphics for laptops. Intel graphics chipsets are so bad that they struggle immensely with low-resolution Flash animations.

      • by smash ( 1351 )
        No they don't. They're fine for business-grade laptops performing business-grade activity. I've had several of them. An OPTION for a high-end part would be nice, but for my ultra-portable 12.1" work machine I do not miss the lack of an nvidia GPU at all.
      • I have an Intel GPU on my desktop at home and it works fine with Flash animations. It does have difficulties with recent games, I concede that, but Flash animations aren't too much of a challenge for Intel's GPUs.
        • Try running Flash videos at 720p or 1080p. That's where AMD and nvidia hardware acceleration shines.

          • by Nutria ( 679911 )

            But isn't GPU-assisted Flash only about 6 months old?

          • Which I have no problem doing on my hardware. Seriously, take a look at the latest (read: 2 years old) stuff coming out of Intel and see how well it performs with common tasks required by 95% of users. Your example, HD YouTube videos, works pretty well on this 2-year-old machine using integrated Intel graphics. I had a nVidia PCI-Express card (with some 1GB of DDR RAM) for a while and I didn't notice much difference when I switched back to Intel graphics -- only notable difference was found in games released
    • by Nursie ( 632944 )

      Use patents to bolster their GPUs?

      Hahahaha... you crack me up. It's almost as if you think that patents are a publication of technological ideas and techniques designed to exchange a limited monopoly for full disclosure!

      Poor fool, patents are a mechanism whereby an intentionally vague and generic description of something can be used for a legal land-grab and territorial pissing contest!

  • umm...I for one welcome our new GeF-tel overlords?

    I know, I know - but who cares if Microsoft != NVIDIA.

  • It looks like NVIDIA really is betting the company on ARM. Godspeed.

    • by smash ( 1351 )
      Given that the market for ARM is many times larger than x86 (mobile phones, tablets, desktop phones, embedded systems), that's not such a bad move.
    • by jonwil ( 467024 )

      Why doesn't NVIDIA buy Via? They'd get the x86 license (and given the recent rulings by the FTC that Intel is abusing its market power, I think Intel would be dumb to end Via's x86 license just because Via was acquired by NVIDIA).

      NVIDIA could use the acquired CPU tech alongside its ION GPU tech to produce a viable competitor to Atom in the netbook space.

      • by Lashat ( 1041424 )

        NVIDIA does not have the money to buy VIA. They are HUGE in China.

      • I can only imagine that NVIDIA has concluded that, without any fab abilities of their own, playing #3 in the x86 market would be a cruel, low-margin game (assuming they even managed to make a profit at it). You already have Intel, whose GPUs are anemic but who has the best core designs and superb in-house fabs. Then you have AMD, whose cores and (formerly in-house) fab capabilities lag those of Intel, but whose GPUs are approximately equal, on average, to NVIDIA's, and whose CPUs kick the hell out of Via'
        • by mikael ( 484 )

          Nvidia have probably deliberately chosen not to get involved with fab production - that's not really their core business (chip design is, not chip manufacture). It's safer for them or the board manufacturers to lease multiple fab plants, just in case any one gets disrupted for whatever reason. Leave that up to the board manufacturers and just give them a reference design.

          • Oh, I don't think that NVIDIA would be any better off if they tried to break into the fab business (or even that they would survive the attempt). My point was just that, due to their superior fab prowess, Intel can afford (except during those periods where their architecture guys have really dropped the ball, as in much of the P4 period) either to beat any other x86 vendor on price without losing money or to occupy the high-margin areas. AMD, for their part, has ATI's superior GPU designs and, now that they h
            • by mikael ( 484 )

              That sounds like the path SGI and other Silicon Valley vendors like Cray took back in the 1990s.

              European electronics companies used to refer to those workstations as "big iron". They knew back then that they couldn't compete against these companies in terms of performance, because the Silicon Valley companies would always outmanoeuvre any new competitor by adding larger and more integrated ASICs, more boards, larger chassis, bigger power units, faster networking, more disk space and so on...

              So the Europe

  • by divide overflow ( 599608 ) on Monday January 10, 2011 @06:08PM (#34829814)
    I can't help but wonder if this was primarily a fig leaf for Intel licensing or acquiring NVIDIA's GPU technology with which to compete with AMD, which acquired ATI and is incorporating its graphics products into its own silicon. This may have advantages over the alternative of Intel making an offer to purchase all of NVIDIA.
    • Considering AMD paid $5.4 billion for ATI, many were worried that they may have overpaid.

      I think that, considering they bought the whole damn company (IP, assets, and all) for $5.4 billion while Intel LICENSED some GPU technology from NVIDIA for $1.5 billion, it doesn't look so bad now.

  • No x86 or Chipset. (Score:4, Insightful)

    by rahvin112 ( 446269 ) on Monday January 10, 2011 @06:09PM (#34829838)

    Looks like nvidia finally gave up on getting the x86 or chipset license. Guess the CEO is now going to bet the farm on ARM and Linux and thinks they can pull it off with closed-source drivers! Either that or ARM Windows, which in my opinion will be DOA. Those patents were nVidia's best hope for an x86 license; Intel appears to have bargained with the bottom line being no x86.

    • by Junta ( 36770 )

      ARM Windows, which in my opinion will be DOA

      But look at all the success they have had with the Windows editions for MIPS, Alpha, Itanium, and PowerPC! (No, I don't count the kernel on the Xbox 360 in the same realm here.)

    • Don't forget Windows is now running on ARM...

      • Don't forget Windows is now running on ARM...

        I read this and picture Popeye, powered up from a fresh dose of canned spinach, suddenly flexing and (for reasons that only begin to make sense in terms of his current situation) a picture of an anthropomorphic window-pane with arms and legs appears on his bicep. And the window is running...

    • by Kjella ( 173770 )

      What good would an x86 chipset license do them? No matter what, Intel has moved the GPU onto the die, meaning the only thing nVidia could do is add cost by adding another mediocre GPU. Intel may or may not have enough legitimate grounds (in that AMD is doing the same) to win an anti-trust suit, but either way the best nVidia could hope for was cash, not their market back. nVidia knows long term it needs to find another niche as Intel's graphics suck less each generation and graphics cards are approaching over

      • A year or two ago the chipset business was nearly 1/3 of nVidia's business. When the i7 was introduced, Intel refused to license it to nVidia, and 1/3 of nVidia's revenue and profit died. The CEO of nVidia filed an anti-trust complaint and threatened to sue. It got really nasty. Without replacing that revenue there will be a very significant drop in nVidia's stock price and value.

    • by smash ( 1351 )
      You're overlooking the hundreds of millions of iPhone/iPad-like devices that are coming onto the market in the next few years. The PC market is saturated. For many people their hardware is plenty "good enough". Trying to compete/sell new machines in that space is going to be a lot more difficult than in the phone/tablet market.
    • Unless of course they know something we don't.

    • I don't know why Nvidia doesn't buy VIA, or whoever has the necessary patents from Transmeta's portfolio...
      • The VIA license is non-transferable. In the event Via changes ownership, the x86 license terminates. The only way VIA and nvidia could merge with Via retaining the license would be for Via to buy nVidia.

        • by h4rr4r ( 612664 )

          Easy enough: Nvidia buys X shares of Via for $Y billion. Via uses that money to buy Nvidia.

    • Guess the CEO is now going to bet the farm on ARM and Linux and thinks they can pull it off with closed-source drivers!

      AFAIK none of the ARM SoC offerings are particularly 'open' with respect to drivers. Hopefully the Nouveau drivers are somewhat reusable. Maybe the horrid driver for GMA500 can be adapted for GPUs sharing a PowerVR SGX?

  • Let's hope Intel and NVIDIA can end their fighting so that NVIDIA can make chipsets for the latest Intel CPUs again.

    • Don't get your hopes up. Part of the agreement specifically amends the old chipset license to say that NVIDIA can't make chipsets for Sandy Bridge, Westmere, Nehalem, etc. chips that have a memory controller built-in. NVIDIA can make discrete graphics for these, of course, but the MCP line is D-E-D dead.

  • by IYagami ( 136831 ) on Monday January 10, 2011 @08:08PM (#34831086)

    See http://www.amd.com/us/press-releases/Pages/amd-appts-seifert-2011jan10.aspx [amd.com]

    Some very interesting analysis can be found at:
    http://www.brightsideofnews.com/news/2011/1/10/coup-at-amd-dirk-meyer-pushed-out.aspx [brightsideofnews.com]
    "Remember, Dirk Meyer’s three deadly sins were:

    1) Failure to Execute: K8/Hammer/AMD64 was 18 months late, Barcelona was deliberately delayed by 9 months, original Bulldozer was scrapped and is running 22 months late -I personally think this is not true; Dirk Meyer was AMD's CEO from July 18, 2008 until January 10, 2011; he could not be responsible for K8 nor Barcelona, however Bulldozer...-
    2) Giving the netbook market to Intel [AMD created the first netbook as a part of OLPC project] and long delays of Barcelona and Bulldozer architectures -this is interesting, after Intel has a serious failure with the Pentium 4, it's mobile division is the one who changes everything with Intel Core 2, designed from a mobile perspective-.
    3) Completely missing the perspective on handheld space - selling Imageon to Qualcomm, Xilleon to BroadCom -I think this is the key; no one expected this market to be as successful as it is at the moment-"

  • This will help AMD because, to cover the costs, Intel has to raise their prices slightly. That means AMD can compete more in the cost-vs-performance battle, so hurray for AMD, except you also have to realize that the customers get screwed. The only time AMD should do better is when they make better processors. THAT benefits us. When they do better without as much motivation to advance their processor performance, things go downhill for the customers because they get a slower chip in the long run.
    • Well, the deal is $1.5 billion over five years. $300 million/year. Given that they bring in over 100 times that in revenue each year, I don't think that is going to be a big deal for them.
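
      As a quick sanity check on that math, in Python (the revenue figure is approximate, from memory):

          # Settlement terms from the summary: $1.5 billion in five annual installments.
          total, years = 1.5e9, 5
          annual = total / years               # $300 million per year
          intel_2010_revenue = 43.6e9          # Intel's 2010 revenue, roughly
          print(intel_2010_revenue / annual)   # ~145, i.e. well over 100x the payment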
  • Consider: Intel traded NVIDIA a P4 FSB license for access to NVIDIA patents back in 2004. Begin the Intel nForce era. What did Intel get? Posit: Intel implemented the NVIDIA patents in their CPUs, and Intel now doesn't wish to stop using that patented technology or they'd have to revise the Centrino/Core Duo platform.

    It's pretty safe to assume that Intel didn't want GPU-specific patents, since they haven't developed a miraculous high-end GPU peer, and their integrated GPUs are plodding along as ever. Intel w
