Intel To Pay NVIDIA Licensing Fees of $1.5 Billion
wiredmikey writes "NVIDIA and Intel have agreed to drop all outstanding legal disputes between them, and Intel will pay NVIDIA an aggregate of $1.5 billion in licensing fees, payable in five annual installments beginning Jan. 18, 2011. Under the new agreement, Intel will have continued access to NVIDIA's full range of patents."
Wonder if Intel.. (Score:4, Interesting)
Re: (Score:2)
Wonder if Intel will be able to use any of NVidia's patents to bolster their GPUs, which is really their only sore spot at the moment (Atom vs. ARM might be a sore spot, but there's hope there).
I rather suspect Intel was using Nvidia's patents all along, and that was what the big fight was all about.
I doubt they will stop now, since they are now paying for the privilege. I also suspect Nvidia got something besides money in return, such as access to certain Intel patents.
Re: (Score:3)
I note that AMD CEO Meyer resigned. [mercurynews.com]
Perhaps this agreement was the writing on the wall for him?
Re: (Score:3)
The best thing AMD had going was that Intel's onboard GPUs sucked. AMD has a new chip architecture coming out in the next few months and no one really knows how well it performs, except AMD. It was pretty much a given that they would have a better integrated GPU since they have ATI building it, but the CPU portion of the chip is still an unknown.
We'll assume, for the sake of argument, that AMD knows that on the CPU side they'll be weaker,
Re: (Score:3)
AMD is cheaper, though, and when you're squabbling over a minor speed difference across 4 cores just to read your email, it's pretty irrelevant.
Might be relevant for workhorse scenarios, but nothing else.
Re: (Score:2)
I hope AMD can stay in the game, if for no other reason than that Intel got incredibly lazy with NetBurst; if AMD hadn't stepped up, Intel probably wouldn't have cared all that much.
Re: (Score:3)
They're not cheaper because they want to be, they're cheaper because they have to be. They can't compete on performance
Intel is more expensive because they have to be. They can't compete on value.
See what I did there?
The only metric worth anything is performance per dollar. You have not used that metric but tried to draw a conclusion as if you did.
Before you reply in fanboy rage, let's try it with cars:
Ford is not cheaper (than Ferrari) because they want to be, they're cheaper because they have to be. They can't compete on performance, so they try to do it on price.
With cars your bullshit logic has no teeth. Now wh
Re: (Score:2)
The only metric worth anything is performance per dollar.
BULLSHIT
From a customer's point of view, what matters depends on whether the workload can/will be split across multiple nodes or not. If it can't/won't, the question is performance at a given price point, and whether the extra performance from moving to a higher price point is worth the extra money. If it can and will, then what matters is performance per dollar of the ENTIRE SYSTEM (potentially including things like hosting, power, software and even admi
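To put numbers on the whole-system version of the metric, here's a toy sketch in C; the two boxes, their performance figures and every dollar amount are invented purely for illustration:

/* Toy whole-system perf-per-dollar model; every number is invented. */
#include <stdio.h>

struct node {
    const char *name;
    double perf;  /* relative performance */
    double hw;    /* hardware cost, $ */
    double opex;  /* lifetime hosting + power + admin, $ */
};

int main(void) {
    struct node nodes[] = {
        { "cheap box", 1.0,  500.0, 1500.0 },
        { "fast box",  1.6, 1200.0, 1600.0 },  /* opex barely grows */
    };
    for (int i = 0; i < 2; i++) {
        double whole = nodes[i].hw + nodes[i].opex;
        printf("%-9s  perf/$ (hw only) %.5f   perf/$ (whole system) %.5f\n",
               nodes[i].name, nodes[i].perf / nodes[i].hw,
               nodes[i].perf / whole);
    }
    return 0;
}

Hardware-only perf/$ picks the cheap box (0.00200 vs 0.00133), while whole-system perf/$ picks the fast one (0.00057 vs 0.00050) -- which is exactly the parent's point.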
Re: (Score:2)
When you're squabbling over a minor speed difference across 4 cores just to read your email, it's pretty irrelevant.
Not true, some of my family members have bloated their e-mails to keep pace with Moore's law.
"Wow, my new computer can handle 200 animated gifs of kitties at a time!!! [loads up outlook]"
Re: (Score:2)
AMD - created an Atom-like chip with decent graphics performance
Intel - created an even more powerful x86 chip with OK graphics performance
NVidia - announced an ARM chip together with MS announcing Windows on ARM
Re: (Score:1)
So long Wintel, Hello Winvidia! I think it's clear that AMD and Intel are crapping their pants, and that AMD's CEO was swiftly ousted to prepare for the merger of AMD and Intel.
Merger of AMD and Intel? Not likely. The phrase "antitrust lawsuit" comes swiftly to mind.
More likely, what you'll get from ARM-based Windows is an attempt to compete in the tablet/smartphone market. Let's face it, Win7phone is a joke, Android OS was never designed for tablets and Google has admitted so, leaving the need for "something" to compete with iPads down the road.
Re: (Score:2)
Re: (Score:2)
I see a lot of apps on my Windows XP machine that already work on an ARM CPU: Firefox, Java, Pidgin, Chrome, Geany, Thunderbird, Gimp, OpenOffice.org, Evince, Abiword, Gnumeric. :)
Oh you meant craptastic non-free software; nevermind.
Re: (Score:2)
It seems that you don't have much experience with what we call "the real world". Browsers are easily interchangeable.
Email clients and messenger programs are slightly more awkward, but still possible, sure. I use either Thunderbird or Evolution myself, but they both have issues with Exchange server.
And the rest of those over Photoshop and MS Office? Again, technically possible (and again, I use both of these myself), but good luck doing it in practice in a real business. It's often just not worth even trying, though
Re: (Score:2)
Android OS was never designed for tablets and Google has admitted so, leaving the need for "something" to compete with iPads down the road.
Obviously not an important point to quibble over, but the iPad runs iOS, which was designed for phones too... and I like Android fine on my Dell Streak (which is basically halfway between a phone and a tablet). I think it would run fine on a tablet. It's certainly better than Windows, and I would expect it to be on a par with iOS.
I've had very little experience with iOS, but I found its lack of a "back" button to be extremely annoying. I think that's one of the nicest features of Android, being able to go back not just in
Re: (Score:2)
Integrated GPUs suck in general and will continue to suck for some time.
A few years ago I'd agree, but the GeForce 9400M in my 2009 Macbook is certainly more GPU than most users ever utilize. The newer incarnations of it certainly don't suck for most apps and are well supported for OpenCL apps, under MacOS X at least.
If you want to play the latest and greatest at 1600x1200 with max settings and 4x antialiasing it ain't gonna cut it, but for light gaming and ACTUAL WORK they do just fine.
Now Intel's integrated GPUs suck. So do most of the ATI offerings, but they have a few
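Since the parent mentions OpenCL support: a minimal device-enumeration sketch in C. The API calls are standard OpenCL 1.x; the build lines assume an installed runtime (-framework OpenCL on MacOS X, -lOpenCL elsewhere):

/* Minimal OpenCL GPU enumeration sketch.
 * Build: cc cldevs.c -framework OpenCL   (MacOS X)
 *        cc cldevs.c -lOpenCL            (elsewhere) */
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/cl.h>
#else
#include <CL/cl.h>
#endif

int main(void) {
    cl_platform_id plat;
    cl_uint nplat = 0;
    if (clGetPlatformIDs(1, &plat, &nplat) != CL_SUCCESS || nplat == 0) {
        puts("no OpenCL platform -- no capable driver installed");
        return 1;
    }
    cl_device_id dev[8];
    cl_uint ndev = 0;
    if (clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 8, dev, &ndev) != CL_SUCCESS) {
        puts("no OpenCL-capable GPU on this platform");
        return 1;
    }
    for (cl_uint i = 0; i < ndev; i++) {
        char name[256];
        clGetDeviceInfo(dev[i], CL_DEVICE_NAME, sizeof name, name, NULL);
        printf("GPU %u: %s\n", i, name);   /* e.g. "GeForce 9400M" */
    }
    return 0;
}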
Antel - AMD merge? No way! (Score:2)
Intel needs AMD really badly, the only thing that keeps Intel out of monopoly charges is that they can point to AMD and say, with full honesty:
"See, the x86 marketplace is really competitive, the one time we tried to do something about it (Itanium) it backfired badly when AMD invented the x64 extensions."
Terje
Oops: Intel - AMD merge? No way! (Score:2)
Sorry about the title typo. :-(
Re: (Score:2)
Keep in mind that ARM is just that, a design house: it produces no actual hardware except for a limited number of test chips. Because of that, they make their money by licensing the design to anyone who wants to use it, and both AMD and Intel can easily pay those fees if it makes a profit.
MS announced an ARM-based Windows for the same reason that Apple switched to Intel: keeping their options open. The NT kernel (used in Win2k/XP/Vista/Win7+) was designed for easy porting to new hardware, whi
Re: (Score:2)
Not until Intel provides some real low-power CPU options. And if nVidia just licenses out their patents to some other company that creates the actual ARM/nVidia chips, then that should avoid Intel getting their grubby mitts on it, surely?
Re: (Score:2)
Or perhaps it's a sign that Bulldozer will suck and he wants to get out before the shit hits the fan.
Re: (Score:2)
Re: (Score:2)
Follow the money.
If they weren't already using NVIDIA's patents, do you think they would be paying $1.5 billion?
Re: (Score:2)
Because Intel just doesn't take drivers seriously. Unlike, say, audio or networking hardware, the drivers for GPUs implement a lot of the actual API (OpenGL or Direct3D). Drivers are a crucial component in the GPU system design, and Intel just never got that.
That is one of the (several) reasons why NVIDIA won't open source their drivers. They actually do have significant trade secrets in that domain.
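A small illustration of how much of the API lives in the driver; this sketch assumes GLFW for context creation, and the GL_GLEXT_PROTOTYPES define is a Linux/Mesa convenience (other platforms would use a loader like GLEW). The GLSL compiler invoked by glCompileShader() ships inside the vendor's driver:

/* Sketch: the driver doesn't just push registers, it implements GL itself,
 * including a full GLSL compiler. Build on Linux: cc gl.c -lglfw -lGL */
#define GL_GLEXT_PROTOTYPES       /* expose GL 2.0+ prototypes (Mesa) */
#include <stdio.h>
#include <GLFW/glfw3.h>

int main(void) {
    if (!glfwInit()) return 1;
    glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);           /* offscreen context */
    GLFWwindow *win = glfwCreateWindow(64, 64, "", NULL, NULL);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);

    /* both strings are reported by the driver itself */
    printf("renderer: %s\n", (const char *)glGetString(GL_RENDERER));
    printf("version:  %s\n", (const char *)glGetString(GL_VERSION));

    const char *src = "void main() { gl_FragColor = vec4(1.0); }";
    GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(fs, 1, &src, NULL);
    glCompileShader(fs);              /* compiled inside the driver, at run time */

    GLint ok = 0;
    char log[1024] = "";
    glGetShaderiv(fs, GL_COMPILE_STATUS, &ok);
    glGetShaderInfoLog(fs, sizeof log, NULL, log);      /* the driver's compiler output */
    printf("shader compile: %s\n%s", ok ? "ok" : "FAILED", log);

    glfwTerminate();
    return 0;
}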
Re: (Score:2)
If Intel needs anything, it's to work on their integrated graphics for laptops. Intel graphics chipsets are so bad that they struggle immensely with low-resolution Flash animations.
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
Try running Flash videos at 720p or 1080p. That's where AMD and nVidia hardware acceleration shines.
Re: (Score:2)
But isn't GPU-assisted Flash only about 6 months old?
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Use patents to bolster their GPUs?
Hahahaha... you crack me up. It's almost as if you think that patents are a publication of technological ideas and techniques designed to exchange a limited monopoly for full disclosure!
Poor fool, patents are a mechanism whereby an intentionally vague and generic description of something can be used for a legal land-grab and territorial pissing contest!
Given Intel's impending divorce from Microsoft.... (Score:2, Insightful)
umm...I for one welcome our new GeF-tel overlords?
I know, I know - but who cares if Microsoft != NVIDIA.
Still no x86 license. (Score:1)
It looks like NVIDIA really is betting the company on ARM. Godspeed.
Re: (Score:2)
Re: (Score:2)
Why doesn't NVIDIA buy Via? They'd get the x86 license (and given the recent rulings by the FTC that Intel is abusing its market power, I think Intel would be dumb to end Via's x86 license just because Via was acquired by NVIDIA).
NVIDIA could use the acquired CPU tech alongside its ION GPU tech to produce a viable competitor to Atom in the netbook space.
Re: (Score:1)
NVIDIA does not have the money to buy VIA. VIA is HUGE in China.
Re: (Score:3)
Re: (Score:2)
Nvidia has probably deliberately chosen not to get involved with fab production - that's not really their core business (chip design is, not chip manufacture). It's safer for them or the board manufacturers to lease multiple fab plants, just in case any one gets disrupted for whatever reason. Leave that up to the board manufacturers and just give them a reference design.
Re: (Score:2)
Re: (Score:2)
That sounds like the path SGI and other Silicon Valley vendors like Cray took back in the 1990s.
European electronics companies used to refer to those workstations as "big iron". They knew back then that they couldn't compete against these companies in terms of performance, because the Silicon Valley companies would always outmanoeuvre any new competitor by adding larger and more integrated ASICs, more boards, larger chassis, bigger power units, faster networking, more disk space and so on...
So the Europe
Re: (Score:2)
Re: (Score:1)
x86-64 is owned by AMD. Somehow I doubt they'll be licensing it to their main GPU competitor.
Somehow I doubt they'd be legally allowed to NOT license it to their main GPU competitor.
Re: (Score:2)
Of course they don't have to license it to a competitor. The whole purpose of patents is to be able to license (or not) your stuff to whomever you want. Sometimes a monopoly may be found guilty of anti-competitive behavior, and part of the remedy may be forced licensing of patents, but that is not normally the case.
Re: (Score:2)
It would be hard to argue that AMD has a monopoly, so you are right - it is unlikely they'd be forced.
However, if NVIDIA really needs that tech, they can just start violating the patent. AMD sues. NVIDIA countersues AMD for violation of NVIDIA patents (it is almost guaranteed that AMD violates some of NVIDIA's patents). In the end, they either fight it out in court, or reach a settlement. Either way, the resolution of the conflict is that somebody determines the difference in value between the pat
Re: (Score:1)
x86-64 is a CPU architecture.
Why the fuck should a GPU company get the rights to a CPU architectural extension?
Besides, I bet they're already working with Intel's stuff - it would explain why Fermi burned like a bitch first-gen; they must've tried implementing NetBurst.
Re: (Score:2)
Because a GPU caches texture memory in 2D and 3D pyramids (MIP-mapping), and a CPU does code and data page caching in 1D. Somehow, these two just might merge, especially with shader languages and trillion-point data sets.
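For the curious, the 2D pyramid in question: each MIP level halves both dimensions until 1x1, so a w x h texture has floor(log2(max(w,h))) + 1 levels. A quick sketch in C:

/* MIP pyramid for a 1024x768 texture: each level halves both axes. */
#include <stdio.h>

int main(void) {
    unsigned w = 1024, h = 768;
    int level = 0;
    while (w > 1 || h > 1) {
        printf("level %2d: %4ux%u\n", level++, w, h);
        if (w > 1) w /= 2;
        if (h > 1) h /= 2;
    }
    printf("level %2d: %4ux%u\n", level, w, h);   /* 1x1 */
    return 0;
}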
Re: (Score:2)
x86-64 is a CPU architecture.
Why the fuck should a GPU company get the rights to a CPU architectural extension?
Besides, I bet they're already working with Intel's stuff - it would explain why Fermi burned like a bitch first-gen; they must've tried implementing NetBurst.
The fuck because they also used to be a chipset company in addition to a GPU company. That said, a license to x86-64 is not the problem; it's LGA1156 that's the problem.
Re: (Score:2)
x86 probably already has, but not the subsequent extensions like SSE/3DNow!/amd64. Building a 486 CPU clone probably isn't very profitable.
Re: (Score:2)
Re: (Score:2)
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
IANAL; someone enlighten me on when the x86 patents run out (am I even in the right ballpark, legally)? I mean, x86 has been around... forever.
It depends what you mean by x86. x86 is being constantly extended and those extensions are almost certainly being patented.
Any patents on the 8086 itself will almost certainly have long expired. Hell, even the 80486 is probably pretty much clear by now (though I can't say for sure due to the craziness of patent law in some countries, including the US) and IIRC there are a few com
Re: (Score:1)
Software adoption of new instruction sets has always trailed by years, simply because you would have to rewrite already working and tested code. For what benefit? What if performance was OK with the old instruction set? I suspect making a CPU architecturally equivalent to a Pentium I that runs at 1000MHz is fairly easy if there is no licensing cost and the spec is open. And it's more than good enough to drive Windows XP - which is still current.
Re: (Score:2)
Software adoption of new instruction sets has always trailed by years.
Trailing years isn't enough though. Patents last 20 years or so.
And it's more than good enough to drive Windows XP
I'm not so sure about that. Afaict the only 233MHz Pentium had MMX, as did all K6 chips. I dunno if there are any patent issues with MMX, but if there are it could be a problem.
which is still current.
Currently supported, but afaict no longer available to OEMs, and by 2013 when the Pentium becomes 20 years old it will be very ne
Re: (Score:2)
It ran out. But what nVidia needs to go back to making Intel chipsets is access to the patents on the chip interface (e.g. socket 1156/1155), which are recent (last 5 years).
Re: (Score:2)
Outside of Core i7 monsters (Crysis, specialist scientific pursuits), the future is low-power SoCs.
With Intel putting their GPU on-chip, the writing's on the wall. Hence the Tegra 2.
Very Interesting Vis-a-vis AMD/ATI Acquisition (Score:5, Interesting)
Makes the AMD/ATI Deal look not so bad... (Score:2)
Considering AMD paid $5.4 billion for ATI, many were worried that they may have overpaid.
Considering that they bought the whole damn company (IP, assets, and all) for $5.4 billion, while Intel LICENSED some GPU technology from NVIDIA for $1.5 billion, it doesn't look so bad now.
No x86 or Chipset. (Score:4, Insightful)
Looks like nVidia finally gave up on getting the x86 or chipset license. Guess the CEO is now going to bet the farm on ARM and Linux and thinks they can pull it off with closed-source drivers! Either that or ARM Windows, which in my opinion will be DOA. Those patents were nVidia's best hope for an x86 license; Intel appears to have bargained with the bottom line being no x86.
Re: (Score:3)
ARM Windows, which in my opinion will be DOA
But look at all the success they have had with the Windows editions for MIPS, Alpha, Itanium, and Power! (No, I don't count the kernel on Xbox 360 in the same realm here.)
ARM and Linux? (Score:2)
Don't forget Windows is now running on ARM...
Re: (Score:2)
Don't forget Windows is now running on ARM...
I read this and picture Popeye, powered up from a fresh dose of canned spinach, suddenly flexing and (for reasons that only begin to make sense in terms of his current situation) a picture of an anthropomorphic window-pane with arms and legs appears on his bicep. And the window is running...
Re: (Score:2)
What good would an x86 chipset license do them? No matter what, Intel has moved the GPU onto the die, meaning the only thing nVidia could do is add cost by adding another mediocre GPU. Intel may or may not have legitimate enough grounds to win an antitrust suit, given that AMD is doing the same, but either way the best nVidia could hope for was cash, and not their market back. nVidia knows long term it needs to find another niche, as Intel's graphics suck less each generation and graphics cards are approaching over
Re: (Score:2)
A year or two ago the chipset business was nearly 1/3 of nVidia's business. When the i7 was introduced, Intel refused to license it to nVidia, and 1/3 of nVidia's revenue and profit died. The CEO of nVidia filed an antitrust complaint and threatened to sue. It got really nasty. Without replacing that revenue there will be a very significant drop in nVidia's stock price and value.
Re: (Score:2)
Gaming being what % of the portable market?
The rest of the market is moving towards fanless, long-battery-life appliances. Nvidia's target isn't the high end here, but rather Atom smashing.
Re: (Score:2)
Re: (Score:2)
Unless of course they know something we don't.
Re: (Score:2)
Re: (Score:2)
The VIA license is non-transferable. In the event VIA changes ownership, the x86 license terminates. The only way VIA and nVidia could merge with VIA retaining the license would be for VIA to buy nVidia.
Re: (Score:2)
Easy enough: Nvidia buys X shares of VIA for $Y billion. VIA uses that money to buy Nvidia.
Re: (Score:2)
AFAIK none of the ARM SoC offerings are particularly 'open' with respect to drivers. Hopefully the Nouveau drivers are somewhat reusable. Maybe the horrid driver for GMA500 can be adapted for GPUs sharing a PowerVR SGX?
Let's hope Intel and NVIDIA can end their fighting (Score:1)
Let's hope Intel and NVIDIA can end their fighting so that NVIDIA can make chipsets for the latest Intel CPUs again.
Re: (Score:2)
Don't get your hopes up. Part of the agreement specifically amends the old chipset license to say that NVIDIA can't make chipsets for Sandy Bridge, Westmere, Nehalem, etc. chips that have a memory controller built-in. NVIDIA can make discrete graphics for these, of course, but the MCP line is D-E-D dead.
Re: (Score:2)
The MCP line is D-E-D dead.
Yeah, it got de-rezzed a few years back.
In related news... AMD CEO resigns! (Score:4, Insightful)
See http://www.amd.com/us/press-releases/Pages/amd-appts-seifert-2011jan10.aspx [amd.com]
Some very interesting analysis can be found at:
http://www.brightsideofnews.com/news/2011/1/10/coup-at-amd-dirk-meyer-pushed-out.aspx [brightsideofnews.com]
"Remember, Dirk Meyer’s three deadly sins were:
1) Failure to Execute: K8/Hammer/AMD64 was 18 months late, Barcelona was deliberately delayed by 9 months, original Bulldozer was scrapped and is running 22 months late -I personally think this is not true; Dirk Meyer was AMD's CEO from July 18, 2008 until January 10, 2011; he could not have been responsible for K8 or Barcelona. Bulldozer, however...-
2) Giving the netbook market to Intel [AMD created the first netbook as a part of the OLPC project] and long delays of the Barcelona and Bulldozer architectures -this is interesting; after Intel had a serious failure with the Pentium 4, its mobile division was the one that changed everything with Intel Core 2, designed from a mobile perspective-.
3) Completely missing the perspective on the handheld space - selling Imageon to Qualcomm, Xilleon to Broadcom -I think this is the key; no one expected this market to be as successful as it is at the moment-"
bad news for us (Score:1)
Re: (Score:2)
Intel needs NVIDIA patents used in Centrino/Core Duo (Score:1)
Consider: Intel traded NVIDIA a P4 FSB license for access to NVIDIA patents back in 2004. Begin the Intel nForce era. What did Intel get? Posit: Intel implemented the NVIDIA patents in their CPUs, and Intel now doesn't wish to stop using that patented technology, or they'd have to revise the Centrino/Core Duo platform.
It's pretty safe to assume that Intel didn't want GPU-specific patents, since they haven't developed a miraculous high-end GPU, and their integrated GPUs are plodding along as ever. Intel w
Re: (Score:2, Funny)
Die nVidia. Die in a fire.
that sounded like it came from a stereotypical bearded islamofascist
Re: (Score:2)
Re: (Score:2)
FTFY.
Re:nVidia needs to die in a fire (Score:5, Insightful)
Re: (Score:2)
It's sad how that one skit from Saturday Night Live in 1976 is still relevant and still accurate:
http://www.hulu.com/watch/4163/saturday-night-live-ernestine [hulu.com]
Seriously, what the heck is it with telecom companies?
Re:nVidia needs to die in a fire (Score:5, Insightful)
What's wrong with nVidia? They don't provide open source drivers, but they do provide the *best* drivers for Linux. While I'd rather have good and open source drivers, good is a higher priority to me. I guess ATI has been getting better, but I've never had bad experiences with nVidia drivers.
And it's worth noting that they don't provide open source Windows drivers either and likely never will. Complaining because they don't do more for Linux users than they do for Windows users seems strange to me.
Re: (Score:1)
Maybe it's because I have an outdated card, but I don't get the hate. I thought Nvidia released good drivers for Linux and all. I'm a Linux user, I have a GeForce and performance seems comparable to that of Windows. Proprietary drivers, yes, but good ones. Am I missing something?
Re: (Score:1)
Maybe it's because I have an outdated card, but I don't get the hate. I thought Nvidia released good drivers for Linux and all. I'm a Linux user, I have a GeForce and performance seems comparable to that of Windows. Proprietary drivers, yes, but good ones. Am I missing something?
Open source drivers from both camps suck ass.
Closed source drivers from AMD suck ass.
Closed source drivers from NVidia are competent.
Linux Neckbeard Warriors will never publicly support installing the closed source drivers, but every single one of them will do it.
Re: (Score:2)
I haven't used the ATI drivers on Linux so I can't comment on them, but I have also found that the Nvidia drivers provide the same performance as on Windows.
Re: (Score:2)
Re: (Score:1)
I'm so thoroughly done with my ex-wife; it's worse than the hate for Toyota or nVidia.
She may or may not be the performance leader or whatever she thinks she is, but her abuse of me is just too bad when compared to her competitors, hot 20-year-olds. I'm about to order a new one and guess what? She'll have huge knockers. She just has no excuse for playing these crap games any more.
Die ex-wife. Die in a fire.
Re:nVidia needs to die in a fire (Score:5, Insightful)
Get this: even if Windows is better for some stuff, die-hard zealots will stick to Linux; it's about being free/open source.
ATI contributes code in the open; even if it sucks, it's preferable (for the die-hards) to the better-working but proprietary nVidia code.
Comment removed (Score:5, Insightful)
Re: (Score:2)
I'm guessing that you are using "die hard zealot" as a pejorative. There are a couple of things that I don't get:
1) Why do you care what I prefer to run?
2) How do you not understand the value of being able to compile my own drivers?
Look, there is obvious value in having more performance in a piece of software. But that's not the only value. I keep my computers for a very long time. I use them for things that other people don't anticipate. The last thing I need is to try to upgrade the kernel and find o
Re: (Score:2)
If only they would work on my Alienware M11xR2 with Optimus hybrid graphics. They won't, and they never will. They totally locked Linux users out.
Re: (Score:2)
Shut up! I keel you!!
look what's left of DEC-Alpha employees... (Score:3)
after Compaq sold the rights to Intel
You want to assume that some of them are still working Alpha goodness into Intel products, but it is just as likely that Intel killed the tech and kept that talent out of the light of day.
Re: (Score:1)
Re: (Score:2)
That is what the news at the time reported. AMD and Alpha had already cooperated on HyperTransport, and the Alpha engineers joined the Athlon team, which might have helped make it the fastest available CPU for several years.