This discussion has been archived. No new comments can be posted.

NVIDIA Open-Sources Tegra K1 Graphics Support

  • by Anonymous Coward on Saturday February 01, 2014 @09:27PM (#46131415)

    Tegra has been a horrid disappointment for Nvidia until now, and the competition in the ultra-mobile SoC market is ramping up at a terrifying rate.

    -Tegra 1. The equivalent of Microsoft's Windows 1 and 2. If it ever existed, no one noticed.
    -Tegra 2. Horribly late, missing NEON, and missing hardware acceleration for H.264 video decode. Used in devices only because Nvidia was forced to give it away.
    -Tegra 3. First ARM SoC part from Nvidia worth using. Late, but good enough to still get some major contracts as a highish-end part.
    -Tegra 4. Pretty much an unmitigated disaster. Late and expensive enough to lose the small progress Tegra 3 had made. Wrongly specced, so Nvidia had to announce the 4i.

    -Tegra 5, renamed the K1. Built on the wrong process (not really Nvidia's fault: TSMC and others have failed to make the shrink progress expected years ago when this part was first planned). Using the wrong ARM core (the A15), so Nvidia had to announce a later version of the K1 that will come with Nvidia's own 64-bit ARM core. Of course, this means the first K1 is already obsolete, long before it is on sale. First Tegra with PC-class GPU cores, but not the NEW Maxwell GPU architecture Nvidia launches on the desktop in a few weeks' time (the 750 Ti). So, the GPU is also out of date before the K1 goes on sale.

    The Tegra 5/K1 has a lot of graphics clout for an ARM SoC, BUT it cannot use that power in a phone/normal tablet form factor. Therefore, Qualcomm and Apple will best the K1 in performance per watt, once again.

    So, Nvidia has zero (ZERO!!!!!) to lose by throwing out all the tech details of the K1 into the public arena. Intel pulls the same stunt with its laughably poor integrated GPUs on its current CPU chips. If you can't compete, make your documentation open-source in the hope this will boot-strap some extra business.

    • by mrbluze (1034940) on Saturday February 01, 2014 @09:34PM (#46131461) Journal

      If you can't compete, make your documentation open-source in the hope this will boot-strap some extra business.

      Too little, too late. For YEARS we have been screaming for nvidia drivers that aren't buggy, closed, and unstable, to the point of writing Nouveau, an open source hack (remarkably good but still crippled). Rot in hell, NVIDIA - I have wasted enough money on your hardware.

      • by Anonymous Coward

        NVIDIA Linux drivers are closed source and complex, but you must admit they have good quality.

        They used to be far better than ATI/AMD's proprietary stuff. Only Intel writes good open-source graphics drivers, but their hardware was so limited that their drivers didn't need the complexity of serious OpenGL drivers.

        • by Anonymous Coward on Saturday February 01, 2014 @10:37PM (#46131649)

          Stop spouting this rubbish. Intel's pretty decent performance-wise these days, and NVIDIA's drivers have always sucked no matter how much better the performance was.

          NVIDIA's graphics drivers don't f'ing work without head pounding. I shouldn't have to f around with the terminal for hours to install a proprietary driver that only half works on a select set of distributions.

          The days where Intel's graphics sucked are long over. It's not the 1990s. Intel's graphics are pretty good. Intel's 3rd-generation graphics were decent, almost comparable at the low end with NVIDIA. The newer 4th-generation stuff is pretty impressive, although unfortunately Iris Pro has been restricted to soldered (BGA) CPUs, and thus no socketed CPUs have it. As a result, motherboard manufacturers have chosen to opt out in protest. Nobody ships an Intel Iris Pro mini-ITX motherboard. In fact there are very few Iris Pro systems. I have one of the very few that exist, in fact. It's an ultrabook-like form factor with a 14" screen.

          AMD's drivers still suck and they are still non-free despite the public relations stunt to “open” them.

          While I hope this actually helps improve the free drivers for NVIDIA's graphics chips, I'm doubtful. I'm not that familiar with these chips, although I'm pretty sure they are targeted at, and only available in, cellular devices and similar. It won't help desktop users.

          • Re: (Score:2, Insightful)

            by JDG1980 (2438906)

            The days where Intel's graphics sucked are long over. It's not the 1990s. Intel's graphics are pretty good. Intel's 3rd generation graphics were decent. Almost comparable at the low end with NVIDIA. The newer 4th generation stuff is pretty impressive although unfortunately Iris Pro has been restricted to integrated CPUs and thus no socketed CPUs have it. As a result motherboard manufacturers have chosen to opt out in protest. Nobody ships an Intel Iris Pro mini itx motherboard. In fact there are very few Iris Pro systems. [...]

            • by slacka (713188) on Sunday February 02, 2014 @07:56AM (#46133149)

              It is decisively beaten by pretty much every graphics card over $100. You can't game at 1080p or use madVR with maximum settings on Iris Pro.

              To be competitive on the desktop, Intel needs something about as powerful as a Radeon HD 7850 or GeForce GTX 650 Ti Boost. As of now they aren't even close.

              I built my desktop with a 3.4GHz Core i5-3570K Ivy Bridge. Anyone telling you the HD Graphics 4000 is "good enough" for gaming is full of shit. Even on my low-res 1200x1080 monitor, most games struggled to hit 30 FPS at anything above the lowest detail level. When I got into Dota 2, that was the final straw. I caved in and bought a Radeon HD 7850 for $150. The difference is night and day. Integrated graphics are still garbage, worthless for anything beyond Angry Birds.

              I dual-boot to Linux and have a decent Steam library. The only thing I'll give Intel is that they do make decent open-source drivers that perform nearly as well in Linux as in Windows. The AMD open-source drivers are terrible for gaming. They get 30-80% of the proprietary drivers' FPS and have major issues with micro-stuttering. And yes, I use the dev drivers from the edgy PPA along with all the tweaks like the SB backend. They still suck.

        • Re: (Score:3, Informative)

          by Khyber (864651)

          " but you must admit they have good quality."

          When a driver update kills the fan, and thus kills the GPU by failing to cool it (320.18), do you call that good quality?

          Really?

        • you must admit they have good quality.

          "good" evidently does not mean what you think it does.

          Nvidia Linux drivers are diabolical, as in "work of the devil" and I am bitterly disappointed. The devil (and presumably Microsoft, if a different entity) may well be pleased.

      • Not at all. It's never too late to see the light and do the right thing.

      • Re: (Score:2, Interesting)

        by LWATCDR (28044)

        So you are stuck with Intel, with their FOSS drivers but terrible GPUs. AMD also was all proprietary, and their FOSS drivers are not as feature-rich as the closed-source ones. So you are mad about them not supporting FOSS in the past... but they are doing it now and you are still mad... Seems counterproductive to keep complaining after a company goes FOSS.

        • by Arker (91948) on Saturday February 01, 2014 @10:47PM (#46131679) Homepage Journal

          "AMD also was all proprietary and their FOSS drivers are not as feature rich as the closed source."

          Their Free drivers may not be as 'feature rich' but they're a heck of a lot more stable and compatible than the blobware.

          I'm planning to buy new video hardware about the middle of the year and their chances of getting my money just went from 0 to... well to nonzero at least.

          • by tick-tock-atona (1145909) on Sunday February 02, 2014 @01:09AM (#46132133)

            Don't get too excited. It's not like nvidia are actually opening up here:

            The scope of this work is strictly limited to Tegra (although given the similarities desktop GPU support will certainly benefit from it indirectly), and we do not have any plan to work on user-space support. So do not uninstall that proprietary driver just yet. ;)

            This is only about leveraging the hard work already done by Nouveau hackers, in order to bring their embedded SoC product to market more quickly. There was no documentation drop, and they're specifically disclaiming any idea of desktop Linux support.

      • For Linux systems, NVIDIA was the worst graphics card manufacturer. Except, of course, for all their competition.

        Now, with the nouveau drivers starting to mature, things are getting better. I just installed Linux Mint 16, and for the first time since I began installing Linux on systems with nvidia graphics (probably in 2004?), I didn't install the nvidia graphics driver and things work at an acceptable speed.

        Thanks, nouveau developers, for your excellent work so far. I hope Nvidia gives you more support in the future.

    • by MacDork (560499)
      I'm actually pretty excited about this SoC. The problem with Tegra 4 is that it is not available on anything but a couple of Chinese phones, and those don't even come with the software-defined radio. The other problem is that it is not CUDA. The K1 is CUDA, so presumably I should be able to install the CUDA Toolkit along with something like rpud and take maximum advantage of those 192 cores. I'll definitely wait for the 64-bit variety and hope for a phone with >4GB of RAM.
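      What the parent describes, driving the K1's 192 GPU cores through the standard CUDA toolchain, would look roughly like this plain CUDA C sketch (illustrative only: rpud wraps this sort of thing for R, and cudaMallocManaged assumes CUDA 6 or later, which is what the K1 shipped with):

      ```cuda
      #include <stdio.h>
      #include <cuda_runtime.h>

      // Illustrative SAXPY kernel: each GPU thread handles one element
      // of y = a*x + y, spread across the chip's CUDA cores.
      __global__ void saxpy(int n, float a, const float *x, float *y) {
          int i = blockIdx.x * blockDim.x + threadIdx.x;
          if (i < n) y[i] = a * x[i] + y[i];
      }

      int main(void) {
          const int n = 1 << 20;
          float *x, *y;
          // Managed memory: on an SoC like the K1 the CPU and GPU
          // share physical RAM anyway.
          cudaMallocManaged(&x, n * sizeof(float));
          cudaMallocManaged(&y, n * sizeof(float));
          for (int i = 0; i < n; i++) { x[i] = 1.0f; y[i] = 2.0f; }

          saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
          cudaDeviceSynchronize();

          printf("y[0] = %f\n", y[0]);  // 2*1 + 2 = 4
          cudaFree(x);
          cudaFree(y);
          return 0;
      }
      ```

      Build with `nvcc saxpy.cu -o saxpy`; whether it actually runs on a given board depends on the CUDA support Nvidia ships for it.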
      • The Linux drivers are far better than anything else. Look at the System76 hardware or any other preconfigured systems. All of them use nVidia as far as I can see. I have yet to find anyone using AMD. The reason is the nVidia Linux systems outperform AMD and are much less troublesome (i.e. expensive) to support. In Linux, nVidia > Intel > AMD > everything else in my experience.
    • by JDG1980 (2438906)

      -Tegra 5, renamed the K1. Built on the wrong process (not really Nvidia's fault- TSMC and others have failed to make the shrink progress expected years ago when this part was first planned). Using the wrong ARM core (A15), so Nvidia had to announce a later version of the K1 that will come with Nvidia's own 64-bit ARM core. Of course, this means the first K1 is already obsolete, long before it is on sale. First Tegra with PC class GPU cores, but not the NEW Maxwell GPU architecture Nvidia launches on the desktop [...]

      • 64-bit is better because you use up all your cache memory in half the time. It's better when your cache is full, right?

      • ARMv8 supports both AArch32 (32-bit ISA) and AArch64 (64-bit ISA), similar to how AMD (and now Intel) CPUs support both x86 and amd64 ISAs.

        Meaning, you can run a 32-bit OS on a 64-bit chip, and get access to all the improvements to the architecture, and it will run like a faster 32-bit chip.

        Or, you can run a 64-bit OS on the 64-bit chip, and still run 32-bit apps, and get access to all the improvements to the architecture, and it will run like a 32-bit chip with access to a full 64-bit address space.
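        The distinction above is easy to see from userland. This tiny C program (nothing ARM-specific, just an illustration) prints the pointer width the binary itself was compiled for: the same source built with an AArch32 toolchain prints 4, and with an AArch64 toolchain prints 8, regardless of the chip it runs on.

        ```c
        #include <stdio.h>

        int main(void) {
            /* sizeof(void *) reflects the ISA the binary targets, not
             * the silicon: an AArch32 build reports 4 bytes even on a
             * 64-bit chip; an AArch64 build reports 8. */
            printf("pointer width: %zu bytes\n", sizeof(void *));
            return 0;
        }
        ```

        Cross-compiling the same file with, for example, arm-linux-gnueabihf-gcc and aarch64-linux-gnu-gcc (Debian's cross-compiler names) shows both answers on a single ARMv8 board.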

    • by edxwelch (600979)

      You're partially right; Tegra has been pretty poor so far. However, it's too early to make a call on Tegra 5 yet. There have been no independent benchmarks done on it. Also, we do not know which process it uses. It could well be 20nm, which has been in test production since the start of 2013.

  • finally pops on.

  • See, if you get management's attention with a few precisely calibrated and executed gestures, stupidity can sometimes end.

    • Whoa, what have we got here, an Nvidia manager with mod points? Come on, admit it: closing your driver source when you're trying to sell hardware is just plain stupid, no two ways about it.

  • by deviated_prevert (1146403) on Saturday February 01, 2014 @10:52PM (#46131715) Journal

    To those who jump all over how terrible Tegra SoCs are: the chips still power a crapload of cheap devices all over the planet. What opening up the source on these chips will do is make it easier for smaller companies to create Android and other OS-based devices for the expanding cheap-device market. What some here refuse to realize is that China is a have and have-not market. Those who cannot afford iPads and iPhones will go for the best cheap alternatives, and Samsung's products are not significantly cheaper than Apple's. The low-end stuff they make could easily be blown out of the water by other companies that clone the iPad, the iPhone, and high-end Samsung products like the top-end Note series and Galaxy 4 phones. You can bet that within a very short period of time there will be a flood of cheap knock-offs that do everything these devices do, with just as much grunt, but much cheaper.

    The high-end portable device market that is run by SoCs is undergoing the same thing the PC market went through: aggressive competition, and a patent portfolio will not adequately stop the production of knock-offs. This is most likely what NVidia is counting on; all they care about is selling a gazillion SoCs as fast as they can, just like everybody else that relies on hardware sales for revenue. NVidia realizes that their SoCs are not going to make it into iPads, and Win8 RT is a complete bust, so they are instead taking a run at Samsung by opening up their software specs and making cheap but much more powerful versions of Android, and even things like Firefox OS and Ubuntu on ARM, a real possibility. No doubt this will make many more powerful cheap devices possible than what we currently see coming out of the east.

    This sort of game change is only to be expected. Even if many would like to see NVIDIA FOD, they are in a position to change the game simply by not playing by the old closed-source software design rules that killed many manufacturers in the PC marketplace. My prediction is the next company to bite the dust will be Creative, unless someone like NVidia buys them out and teams up with someone like Lenovo to produce killer pro devices as well as consumer doodads. There will be a huge consolidation in the industry, and this time Microsoft and their so-called "hardware partners" could be left in the dust; perhaps NVidia sees the writing on the wall this time and is breaking free from Redmond's apron strings for a change.

  • Microsoft's loss (Score:3, Informative)

    by EmperorOfCanada (1332175) on Saturday February 01, 2014 @11:15PM (#46131775) Homepage
    One of the greatest strengths Microsoft has had was its library of drivers. Quite simply most manufacturers would be foolish to make their drivers for anything but Windows first and foremost. Thus when a company would deploy their resources they could ask the question is it better to spend some resources for porting the drivers to things like Linux or just put more effort into the Windows version. Thus at best the Linux version (if any) played second fiddle to Windows (or third after Mac).

    This resulted in Microsoft effectively having billions of dollars' worth of drivers that they didn't even have to pay for; a serious competitive advantage. But as many power users have moved over to Linux for various needs such as servers, rendering, and large-scale computing, certain classes of drivers have become valuable for hardware manufacturers to port properly (or to assist in porting).

    This won't kill Windows but it is a nice step toward leveling the playing field somewhat.
    • by Bert64 (520050)

      Microsoft doesn't have a library of drivers for ARM. In fact it's the other way round: Windows on ARM has an extremely limited set of hardware support, and Linux is far ahead because a lot of the x86 drivers can be trivially recompiled.

      • Ah, but the key is that the open source community needed to cook up the bulk of those drivers, often without the cooperation of the hardware manufacturer. Of late, manufacturers have a higher chance of either cooperating or actually writing their own OS drivers. The key is that MS could sit back and effectively let this asset build up; it would be sort of like all the guide books, yellow pages, and travel websites only ever mentioning your hotel chain and ignoring the others. [...]
  • deceptive title (Score:5, Informative)

    by Gravis Zero (934156) on Sunday February 02, 2014 @12:12AM (#46131989)

    The title should be "NVIDIA publishes Nouveau patches to support Tegra K1"

    Nothing has been "open sourced", as it was never closed source to start with. It's all original code written specifically for Nouveau.

    • If you feel like being a pedant, consider that it was closed until released.
      Or, do you want to argue about whether software with an open source license, that sits on a disk inaccessible to anyone but the author, is open?
      I understood that the code was released with an open license. Finding that it is just patches does not significantly deviate from that meaning, certainly not enough to deserve bold.

      • by Anonymous Coward

        Perhaps you should realise that they were trying to distinguish between writing small patches to improve Nouveau with the intention of releasing those patches vs open sourcing existing code from their proprietary codebase.

    • by Zimluura (2543412)

      this is probably a good way for nvidia to test the waters with regard to oss.

      they probably have some brass that see it as:
      "paying our engineers to write drivers, to give away for free to our enemies"

      hopefully nothing will scare them off.
      no one make any sudden moves! whisper! ...and try not to breathe too much!
