Intel Gets Called Out Again For Their M.I.A. 3.0 X.Org Driver (phoronix.com) 110
An anonymous reader writes: The xf86-video-intel 3.0 DDX driver has been in development for the past two and a half years without an official release. Even the last development snapshot of xf86-video-intel 3.0 from Git, the 2.99.917 release, is now 13 months old. At that time Intel's lead DDX developer said, "3 months have passed, we should make one more snapshot before an imminent release." Since then there has been no communication about a stable release of this DDX driver, which makes SNA the default acceleration architecture over UXA. Over on the intel-gfx mailing list, users are once again raising the state of xf86-video-intel 3.0 and asking why it hasn't been released, questioning whether Intel is "able to maintain its own device driver in a usable way?"
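For readers stuck on the 2.x series in the meantime: the acceleration backend is already selectable per device in xorg.conf, without waiting for 3.0. A minimal sketch using the AccelMethod option documented for xf86-video-intel (the Identifier string and file name here are arbitrary):

    Section "Device"
        Identifier "Intel Graphics"
        Driver     "intel"
        # "sna" selects the newer acceleration architecture; "uxa" the older one
        Option     "AccelMethod" "sna"
    EndSection

Dropped into /etc/X11/xorg.conf.d/ (e.g. as 20-intel.conf) and followed by an X restart, this switches backends; the Xorg log shows which one actually loaded.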
Re: (Score:2, Informative)
Oh yeah, that's right... they give away the chips for free...
Re:Heh (Score:5, Insightful)
How is buying their GPUs not paying them for driver development?
Re: (Score:3)
If you buy something based on future development potential, you are quite simply an idiot. No, buying their GPU does not obligate them to give you anything other than the GPU itself and whatever driver comes with it at that point.
Re:Heh (Score:4, Informative)
Well, there are still commits going into the git repo - so there's some work being done.
If the complaint is that they haven't "released" anything in years, then, well, it's OSS - technically you can call each new checkin a new release.
So it's far from a dead project, and it'll be far from the only OSS project where the last release was years ago and everyone just pulls the latest source code and settles with that.
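For anyone who does want to pull the latest source and settle with that, a rough sketch of the usual autotools build (the freedesktop.org repository URL is the driver's long-standing home; the prefix and required dev packages vary by distro):

    git clone git://anongit.freedesktop.org/xorg/driver/xf86-video-intel
    cd xf86-video-intel
    ./autogen.sh --prefix=/usr   # needs autotools plus the X.Org dev headers
    make
    sudo make install

Restart X afterwards and the freshly built driver replaces whatever the distro shipped.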
Re: (Score:2)
Exactly. Don't buy the damn thing.
Re: (Score:3)
If you buy something based on future development potential, you are quite simply an idiot.
No, they are buying something based on future development promises, which is still stupid but which is also not quite the situation you make it out to be. Intel has promised to develop these drivers; it's not as if people are buying the hardware merely because Intel has the capability to develop them. Intel stated plans to complete them, and is now obligated to do so, for some very small value of "obligated".
Re: (Score:2)
Re: (Score:2)
As for how much they should get paid, I believe wages for developers are reasonably well established a
Re: (Score:2)
Software developers should absolutely be paid for their time. Neither they nor the company they work for should be paid for anything beyond their actual time.
You say "should" like there is some kind of absolute morality. Hint: there is not. What software developers should get paid for is what users are willing to pay for. What users should pay for is what will serve them. Somewhere in between lies reality, which is why the world not only does contain but actually should contain a mixture of open and closed software. However, also involved in that reality is the fact that users often work against their own interests, so the balance is far more towards the closed
Re: (Score:2)
Absolute morality? I'd not go that far. It is simply the fact that ideas are not unique, nor are they property. We've created the concept that you can own an idea and supported it by force; it is not a natural or innate thing. The key in my statement that justifies my saying what they should or should not get as an absolute is "artificial restriction." The evidence does not support the idea that we would not have any given solution if t
Agile (Score:5, Funny)
Re: (Score:2)
Synergize!
Failed commit (Score:3)
Intel going Windows only and without AMD doing (Score:2)
Intel is going Windows-only, and with AMD not doing much, they can get away with it.
Re: (Score:2)
Apple is lagging / thinking about going ARM.
Re: (Score:1)
Apple is lagging / thinking about going ARM.
I'm SURE that they have Prototype Macs running AMD, ARM, and maybe even something as prosaic as MIPS.
Apple has always wanted to keep their options open. They got caught up once, being kind of beholden to IBM and its PowerPC product timeline, and I doubt they will EVER make that mistake again.
And since OS X is pretty damned processor-agnostic, they can easily jump ship if the mood/market dictates.
Re: (Score:2)
It wouldn't surprise me if there's a team developing a top-secret version of OS X on the iPad Pro for the day when they're forced to compete head-to-head with the Surface Pro. Why wouldn't OS X MacBook Air users benefit from a detachable keyboard and an iPencil...
Re: (Score:1)
It wouldn't surprise me if there's a team developing a top-secret version of OS X on the iPad Pro for the day when they're forced to compete head-to-head with the Surface Pro. Why wouldn't OS X MacBook Air users benefit from a detachable keyboard and an iPencil...
I thought that the instant I saw the iPad Pro, too.
Re: Intel going Windows only and without AMD doing (Score:2)
Apple has already put OS X on ARM with the iPad and iPhone, and well-written programs can compile to it as well (the only difference between an "app" and an application is the UI).
The problem is that there are no ARM processors that hold a candle to AMD/Intel offerings.
Someone may be working on it, but ARM (the company) doesn't manufacture processors and has an entirely different market, namely the making and selling of customized processor blueprints.
Re: (Score:2)
The issue is basically the same as with the old T1s and upload bandwidth: the technology is kept ultra-expensive because it contains features that are desirable for the enterprise market, and nobody wants to make hardware that would take the bottom out of their heavily marked-up enterprise markets. So the latest and greatest ARM chips with a boatload of cores, virtualization extensions, ultra fast gpu on chip,
Re: (Score:2)
1-4x SMP within a single processor cluster, and multiple coherent SMP processor clusters through AMBA® 5 CHI or AMBA 4 ACE technology.
That will provide for a very
Re: (Score:2)
They don't mind shitting on Mac users either. They suck Microsoft dick hard, it's where the money is at and they are whores after all.
Re: (Score:1)
They don't mind shitting on Mac users either. They suck Microsoft dick hard, it's where the money is at and they are whores after all.
Listen, asshole: I work developing Windows Application Software.
So I think I know which is better.
Re: (Score:2)
At the very least you know where the money is.
Re: (Score:1)
At the very least you know where the money is.
And that's the ONLY thing (good) that can EVER be said of Windows Development.
Re: (Score:2)
Depending on what you do, you might be able to get some of the latest ARM stuff in boxes being sold out of China as "TV boxes".
Just Install Windows Version Ten Point Zero (Score:1)
Re: (Score:3, Informative)
Re: (Score:1)
HD 520 on my Core i5 6200U works just fine on Debi (Score:2)
Jessie would revert to software rendering, which makes working in Blender nearly impossible. I had to upgrade from Jessie to Stretch/testing for this to work, though, which upgrades the entire graphics stack from the kernel up to X.
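A quick way to confirm which path you landed on is glxinfo (from mesa-utils, assuming it's installed); the renderer string distinguishes the hardware driver from the software rasterizer:

    glxinfo | grep "OpenGL renderer"
    # hardware acceleration: something like "Mesa DRI Intel(R) HD Graphics 520"
    # software fallback: the string mentions "llvmpipe" or "softpipe"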
Probably focusing on Wayland (Score:2)
X is pretty much deprecated at this point. No one wants to maintain drivers for that shit.
Re: (Score:2)
Re: (Score:1)
Nope, but you do severely limit what hardware you can use.
So the fuck what?
It all comes down to "Do you want to work WITH your Computer" or "Do you want to work ON your Computer".
I got all the "work ON my computer" out of my system (no pun) in the Apple ][ days. Now, I'd just as soon get some REAL work done, thankyouverymuch!
But for some people, it's the only way they can feel like they have control over some small part of their empty existence.
Note that this does NOT apply to those who are actually DOING the Yeoman's work of DEVELOPING F/OSS Projects. For
Re:Is it the year of the Linux desktop yet? (Score:5, Insightful)
I love my Macs, but Apple does not sell anything that represents a performance machine, and never has. In fact, that is why some of us learned to hate them in the 90s: we could put together a much faster machine for less than their not-so-fast machines, built for users we, frankly, disrespected. Now I'm older and have a life, and I am sensitive to the argument that I want to use the machine, not constantly tinker with it; and although I have designed computers literally from copper traces, I respect the investment Apple makes in building a very high-quality machine that can last and requires very little TLC. They are the best machines out there for casual use.
But I still wish they made one with a high-end processor and a high-end GPU (hint: AMD does not make any, but then neither does Intel). I don't want to hear about "not needing the performance"; that is a horrible answer on many different levels, and in point of fact, is wrong for some of us.
So, Apple having forsaken us, we're forced to use the next-best OS (Linux) and cope with whatever drivers the gods of Proprietary Hell (right above Special Hell) deign to give us, and frequently bitch and moan about their idiocy. I don't especially care about Intel graphics myself (I always replace it), but I can understand the attraction, given the wider array of form factors some of these low-end machines come in, where Intel graphics is a key feature.
Re: (Score:1)
But I still wish they made one with a high-end processor and a high-end GPU (hint: AMD does not make any, but then neither does Intel). I don't want to hear about "not needing the performance"; that is a horrible answer on many different levels, and in point of fact, is wrong for some of us.
The problem is, the market for such a machine is vanishingly-small.
I would never deny that SOME people might benefit from the extra 5-10% of performance (let's even get it to 20%); but you're only talking a couple hundred thousand machines WORLDWIDE. For a company the size of Apple, the numbers just don't add up, for the extra R&D (both hardware and software), extra testing/support, and extra supply and distribution logistics.
Re: Is it the year of the Linux desktop yet? (Score:2)
Not just that, but the cost as well. A bump up to the latest-gen model's CPU easily increases its price by $500-1500 for what is often only a faster clock (same-speed interconnect, cache, etc.) or 2 more cores. And the yields for those are so low you can't guarantee an order of more than a few thousand at any given time.
Apple makes computers that are price-competitive with feature-matched HP/Dell machines and has superior service and quality. Sure, you can get cheaper 2-4-generation-old systems, but then you can also
Re: (Score:2)
Re: (Score:2)
I do not agree - Apple used to be price-competitive. A Dell Precision is a very different beast than a MacBook Pro (Quadro and FirePro graphics, multiple storage device support, you can easily work on the hardware, the current generation of mobile Precisions has Xeons and ECC RAM, etc.)
Re: (Score:2)
When every action you perform on a machine is instant, and there are no longer progress bars or hourglasses no matter how many things you do in parallel or what operation you are performing, THEN you no longer benefit from a higher-performing machine, so long as it remains this way in the face of all new software released for the rest of your life. We aren't the
Re: (Score:1)
Everyone benefits from 10-20% more performance. There might be a relatively small number of people who NEED 10-20% more, but everyone benefits from it. When every action you perform on a machine is instant, and there are no longer progress bars or hourglasses no matter how many things you do in parallel or what operation you are performing, THEN you no longer benefit from a higher-performing machine, so long as it remains this way in the face of all new software released for the rest of your life. We aren't there on any of our machines yet. You can buy the biggest and most powerful machine in any data center and it still won't be performant enough to achieve this today, let alone be future-proof.
No, EVERYONE does not benefit from an increase in performance; only those who bump into the upper end of their machine's performance actually benefit from the extra headroom. Otherwise, it is just wasted potential (and energy). Not to mention that the additional cost often does not justify the extra speed if the bottlenecks are few and far between, unless the intended processes are truly CPU/GPU-bound and the slowness is not due to some other constraint that is already in the top echelon of its per
Re: (Score:2)
"When every action you perform on a machine is instant and there are no longer progress bars or hour glasses"
A progress bar or hourglass is someone bumping into the upper end of their machine's performance. Okay, admittedly some of these are hard-coded to display as animations, so there is a minimum time they'd appear regardless, but
Re: (Score:2)
I would never deny that SOME people might benefit from the extra 5-10% of performance (let's even get it to 20%); but you're only talking a couple hundred thousand machines WORLDWIDE. For a company the size of Apple, the numbers just don't add up, for the extra R&D (both hardware and software), extra testing/support, and extra supply and distribution logistics.
Explaining to me the business concerns does not change my opinion on the product I need; that is not my problem. I also think you grossly unde
Re: (Score:2)
Explaining to me the business concerns does not change my opinion on the product I need; that is not my problem.
Maybe not; but it IS Apple's "problem". And like every manufacturer of every product, they have to do a cost/benefit analysis for each and every major design decision of each and every product. In your little "me-centric" world, you may not recognize that metric; but then you've likely never been the CEO of a company the size of Apple (nor have I).
The Mac Pro is targeting a vanishingly small segment of the market, but Apple has stood behind it...
Barely. When was the last time you heard "Mac Pro" in an Apple Keynote? Heck, they don't even include it when they haul out the "Mac Lineup" slides at those events.
Re: (Score:2)
Re: (Score:3)
For desktop use, Linux is second best. The Linux desktop has become significantly worse in the past few years since Canonical mostly abandoned it; I remain optimistic against all odds that it will improve, but OS X is much cleaner and more responsive. It manages to combine the best parts of Unix with the best parts of a modern UI. There are things I'd change, but compared to Linux, where even on a fast machine X responds slowly and with high latency, I use both and prefer OS X.
For server use, for multi-user
Re: (Score:3)
Re: (Score:1)
> I got all the "work ON my computer" out of my system (no pun) in the Apple ][ days.
I REMEMBER THOSE DAYS TOO!
Re: (Score:1)
> I got all the "work ON my computer" out of my system (no pun) in the Apple ][ days.
I REMEMBER THOSE DAYS TOO!
Yeah, back when if you ran your Apple ][ with the cover ON (because you didn't get into the guts very often), you were considered lame...
I was QUITE the hardware/software developer for that wonderful little machine. Honestly, and in all modesty, probably one of the top ones in the country (USA) outside of some engineering employees at Apple HQ itself.
Good times, good times...
Re: (Score:3)
For some of us those two things are so intertwined there is little difference. I don't want to spend much time getting my computer up and running, but playing with all the latest and greatest toys and understanding how they work inspires me when it comes time to work on things which get me paid.
Re: (Score:1)
'It all comes down to "Do you want to work WITH your Computer" or "Do you want to work ON your Computer."' For some of us those two things are so intertwined there is little difference. I don't want to spend much time getting my computer up and running, but playing with all the latest and greatest toys and understanding how they work inspires me when it comes time to work on things which get me paid.
I understand; and I truly do envy you for having that kind of free time; but for me, those days are (somewhat sadly) over...
Re: (Score:1)
I'm always amused by idiots that get all hard over the Apple II. An expensive computer with no hardware sprites, sound synthesizer, or speech?
And I'm always amused by idiots that forget/revise history...
You DO realize, of course, that the Apple ][ was developed in 1976, long before Jay Miner's brilliant (but CPU-cycle-stealing) custom chipsets (Copper, Blitter, etc.) were anything other than scribbles in his Engineering Notebook. Time marches on, and I also loved my Commodore 64's SID and its "GPU" with its Sprites, etc.; but you have to remember, that machine was designed nearly FOUR YEARS after the Apple ][ (which was, in a lot of ways, really
Re: (Score:3)
Yup, only with a Mac you fall out of support entirely - a Mac Pro released in 2006 wasn't supported by an OS Apple released just 6 years later. Runs Windows 10 fine, however. How's that for stupidity?!
Re: (Score:2)
And yet a MacBook Pro from 9 years ago (2007) works with all of the latest Apple software, including El Capitan.
I have a feeling my hardware will fail before Apple's support will.
Re: (Score:3)
I fully expect to replace my laptop more frequently than every 6 years.
The only computer I genuinely used as a primary machine for more than 6 years was the Apple ][+ which lasted for 10. Things have changed since then.
Re: (Score:2)
Then you are, literally, an idiot.
I'm trying this on an eight-year-old laptop with a dual-core, 2.2 GHz Intel Core 2 processor and a GeForce 9800 GTS. Full HD screen. It can run any app I need it to. It can run most of the games I'm interested in. If I really want power, I use my desktop.
For the last two years I've debated with myself about once per quarter, because a shiny new laptop would be neat. Yet every time I decide that what my $800
Re: (Score:2)
Quieter (fan noise) and less energy usage would be the main benefits then.
You might save $10/quarter on electricity with a new machine. Then there's the cost of a battery replacement for an old machine.
Though you might have to buy adapters for all your 'legacy' equipment and buy an external DVD drive and put up with other people's fingerprints on your touchable screen.
Re: (Score:2)
I end up with people's fingerprints all over my non-touchscreen anyway. I don't get why some people can't point without touching it. They also get offended when I keep pointing out that it's NOT a touchscreen and therefore you're not meant to touch it and cover it with greasy fingerprints.
Re: (Score:2)
What he meant was that the Apple computers he has bought since have all failed within 6 years.
Seriously, the rate at which the mobos go bad should be alarming. Who cares if it's encased in aluminium if it just breaks some subsystem or another whenever it wants to? Just today an HD controller broke on one over here. Solution? Buy a new Mac, of course.
Re: (Score:2)
What he meant was that the Apple computers he has bought since have all failed within 6 years.
Seriously, the rate at which the mobos go bad should be alarming. Who cares if it's encased in aluminium if it just breaks some subsystem or another whenever it wants to? Just today an HD controller broke on one over here. Solution? Buy a new Mac, of course.
It's the batteries that go bad first. But that is common to all laptops. The choice of replacement is based on other factors.
Re: (Score:3)
I'm typing this on a 20-year old machine. Granted, I have replaced some parts here and there at various times (like the cpu, keyboard, screen, ram, motherboard, drives, case, video card, nic, modem, mouse, etc.) and it's still working fine...
Re: (Score:2)
The number 5 key on my current desktop gaming PC just failed on me today. But WASD still works, so I'll be fine.
Re: (Score:2)
Re: (Score:2)
It's doing just fine, although someone put an axe through it a while back, so I had to get the handle replaced.
Re: (Score:2)
The only computer I genuinely used as a primary machine for more than 6 years was the Apple ][+ which lasted for 10. Things have changed since then.
I would say that depends on what you buy. My primary machine is a first-gen Core i7 (920) I built years ago. Still fine today.
I'm debating whether I should build a new machine from Kaby Lake or Cannonlake when they are released. The main reason for upgrading is to move to a physically smaller computer that runs cooler, not performance.
Re: (Score:2)
Right now, today, I have a P3 desktop running CentOS 6 (32-bit) as a network monitor. It's old as the hills. My phone beats it handily in performance. But it runs on about 15 watts, and does the job so reliably that, in 10 years, it's never skipped a beat. It started with the original Red Hat 6.1 (before RHEL was a thing).
I know it won't actually set any records, but I'm sure it's one of the oldest 0.01%, maybe even 0.001%, of computers in terms of durability. It would be a remarkable machine if it wasn't otherwise
Re: (Score:1)
Yup, only with a Mac you fall out of support entirely - a Mac Pro released in 2006 wasn't supported by an OS Apple released just 6 years later. Runs Windows 10 fine, however. How's that for stupidity?!
However, for the savvy Slashdot reader, you can easily make [lowendmac.com] that first-revision 2006-2007 Mac Pro run versions of OS X later than 10.7 (Lion). I will admit there are a few gotchas; but all in all, it works.
Here's another article [pro-tools-expert.com] that goes beyond the first one, giving Yosemite (and very likely beyond) compatibility. Oh, and the next Mac Pro hardware revision, 2008, apparently has no problems running the latest OS X. So, that would be eight years of support for that model (so far).
Re: (Score:1)
Re: (Score:1)
I don't think it is fair to claim that hardware has 8 years of support when you have to apply third-party hacks to get the newer versions of OS X to work. Isn't that pretty much the definition of unsupported?
I don't know. If we were talking about Linux, I think that you'd be arguing the other direction.
But I admit I see your point.
Sure, you can make it work, but Apple won't help you get it to work; they even go and deliberately make it harder to accomplish than it needs to be.
I agree with the first clause of your sentence; but why do you say that Apple has deliberately made it harder to accomplish than it needs to be? Just because they decided to change/delete a Framework or an API, that does NOT imply that Apple did so in order to "deliberately make it harder to accomplish". It's the "deliberately" part that I take exception to; software changes, OSes c
Re: (Score:2, Informative)
Uh, Intel developers are PAID to work on Linux drivers for their hardware. It's not like they work on Linux stuff on a volunteer basis during their lunch break.
Re: (Score:1)
How do you know this? Have you spoken to them on this subject? What were their names?
Oh AC you so lazy.
Here's their propaganda:
https://01.org/about
For more than two decades, Intel has employed thousands of software engineers around the world to ensure open source software delivers top-notch performance, scalability, power efficiency and security on Intel platforms—across servers, desktops, mobile devices and embedded systems.
Intel Open Source Technology Center
The Intel Open Source Technology Center (OTC) is a world-class international team dedicated to working within open communities. Intel commits quality code that optimizes the latest in Intel platform features to drive software reliability, accessibility, security, and performance from enterprise to mobile, web technologies to virtualization, and the cloud.
OTC is part of Intel’s Software and Services Group with a mission to define and deliver product-quality open source software and technology innovation that unlocks the potential of Intel hardware and creates software business opportunities.
Intel has one of the largest software organizations in the world, many working in open source. With software engineers and user experience designers at sites around the world, Intel continues to set the pace. For the past decade, Intel has been a top contributor to Linux Graphics, Linux Bluetooth and the Linux Kernel, as well as one of the top three contributors to Webkit*, Chromium* and Android* projects. Today Intel works in open source communities to support everything from enterprise, Big Data, clients, and the Internet of Things.
As a leader in the silicon industry, we are unique in that we significantly invest in open source software optimization, helping to ensure that a breadth of solutions run exceptionally well on Intel architecture. This gives operating system and software application vendors, device manufacturers, and service providers the freedom, power and choice to build and deliver revolutionary solutions faster, better and at lower cost.
Working with software leaders like SuSE, Google, Red Hat, SAP, Oracle, SAS, Canonical, IBM, Mozilla, and our mutual OEM customers means, as Linus Torvalds put it, "Intel's open source team leads the industry, making sure their hardware is well supported..." Our deep open source experience and working relationships are geared to deliver solutions that work best for our customers and partners.
Intel® Graphics for Linux*
The following people from the Intel Open Source Technology Center are working on this project:
Intel Development Team
Anuj Phogat
Ben Widawsky
Carl Worth
Chad Versace
Chang Zhou
Chris Wilson
Chuanbo Weng
Damien Lespiau
Daniel Vetter
Eduardo Lima
Gwenole Beauchesne
Haihao Xiang
Ian Romanick
Jordan Justen
Paulo Zanoni
Jesse Barnes
Kenneth Graunke
Kristian H
Re: (Score:2)
I know for a fact that Intel employs Jack Vogel to work on the FreeBSD (and possibly Linux) port of their Ethernet drivers, as I've talked to him on their mailing list on more than one occasion. OSS drivers are in no way "special projects" at Intel.
Re:Is it the year of the Linux desktop yet? (Score:4, Interesting)
Apple doesn't make video cards. I've had to deal with shit like this on a Mac too. Too many times I bought things that supposedly had Mac drivers but they were out of date and didn't work with the latest version of the OS. When you call them they act just like they do to linux users. The sneer in their voice is audible.
Re: (Score:2)
Three OSes: Windows, Linux, OS X.
Three drivers, of equal complexity and thus about the same cost.
90+% of their customers run Windows.
So when 60% of their software development budget time, the sneer is hardly a surprise.
It's worse than that, though. Linux's driver interface moves fast - every couple of kernel releases they need a recompile, and there are many different distros with just enough differences to make testing very time-consuming. OS X is not so bad. But Windows? A major release every couple of years
Re: (Score:2)
Ah, the curse of slashdot!
"So when less than 10% of their users take more than 60% of their software development time budget..."
I used angled brackets before.
Re: (Score:2)
I highly doubt that. The vast majority of Intel graphics chips are likely in servers, and only a tiny fraction of those would be running Windows. Windows servers are for Exchange, RDP, and a few poorly written apps that are so industry-specialized they are hard to get rid of.
Re: (Score:3)
Servers outnumber personal computers...? Wha?
Re: (Score:2)
End users care about graphics so personal
Re: (Score:2)
I'd guess the vast majority of Intel chips are in laptops. Most laptops under $800 or so have Intel video.
And server operators don't care about xorg drivers, so you wouldn't count them in the consumer base that counts for developing graphics drivers.
Re: (Score:2)
Re: (Score:2)
Debian dumping Linux Standard Base was really a big slam against interoperability.
Re: (Score:2)
I understand it. It doesn't make me hate them any less, though. I can't do anything about it, not really, but I fuck them every little way I can. When Nvidia started making their cards work on Linux, even though it was with a binary blob, I was so happy that I haven't built a Radeon (now AMD) system since. It's insignificant and petty, I know..... but I don't care. Every single friend's, coworker's, or family member's computer I've built in all those years contained an Nvidia graphics card. All because of a snarky su