Apple Reportedly Developing 5K Retina Thunderbolt Display With Integrated GPU (hothardware.com) 296
MojoKid quotes a report from HotHardware: If you head over to Apple's website, the Cupertino outfit will happily sell you a 27-inch Thunderbolt display for $999, at least until its inventory runs out. Word on the web is that it's nearly out of stock and Apple doesn't plan to replenish them. Instead, Apple will launch a new version of its Thunderbolt monitor, one that's been upgraded to a 5K resolution and has a discrete GPU stuffed inside. It's an interesting product actually, if you think about it. Depending on the task, it can take some serious graphics muscle to drive a 5K resolution display. It amounts to over 14.7 million pixels (5120x2880), compared to Apple's current generation Thunderbolt display which runs at 2560x1440, or less than 3.7 million pixels. Apple's thinking is likely that if it integrates a GPU capable of driving a 5K resolution into the display itself, it won't have to worry about trying to balance graphics performance with thin and light designs for its future Mac systems.
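To put the summary's figures in perspective, here is a quick back-of-the-envelope calculation (a minimal sketch; the 60 Hz refresh rate and 24-bit color are assumptions, not figures from the report):

```python
# Pixel counts quoted in the summary
pixels_5k = 5120 * 2880        # 14,745,600 (~14.7 million pixels)
pixels_qhd = 2560 * 1440       #  3,686,400 (~3.7 million pixels)
print(pixels_5k / pixels_qhd)  # 4.0 -- exactly four times the pixels

# Rough uncompressed signal bandwidth at an assumed 60 Hz, 24 bits per pixel
bits_per_second = pixels_5k * 60 * 24
print(bits_per_second / 1e9)   # ~21.2 Gbit/s before blanking or overhead
```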
Modularity Revolution (Score:2)
...but not from Apple yet (Score:2)
Sounds great.... (Score:2, Insightful)
Now give us the 17" retina Macbook Pro.
Some of us do real work and need a portable workstation. Not everyone is an internet blogger who can live on a low-power, paper-thin 13" stack of paper.
Re: (Score:2)
ACs often claim to represent a large population of dissatisfied potential users for a 17" MBP, but actual sales figures proved that to be false. The 17" MBP was eliminated because it served a tiny market that needed disproportionate resources.
Re: (Score:2)
I would really, really like a 17" MBP. Not AC, not alone. I do real work, I need pixels.
My hunch is that getting "retina" pixels in a 17" form factor is probably expensive/difficult. I'd be happy with their best effort.
Re: (Score:2)
I'd like one myself. Apple does not exist to assuage our every whim, however, and unfortunately the number of us who would buy one is insufficient for them to make enough money to justify making it.
Re: (Score:2)
Re: (Score:2)
Getting old sucks. Bite the bullet & get glasses so you can clearly read 1920x1200 on a rMBP. Apple isn't going to bring back the 17" as it was always a minuscule market that needed disproportionate resources.
Re: (Score:2)
Had I bought my Mac a year earlier I'd have been exactly in your shoes, but as I waited for a refreshed 17", the rMBP came out & I had to admit that I needed glasses for the 15".
If more people had actually bought the 17" it would still be around, but they didn't, it isn't, and no longer lying to myself has had its benefits, among them a much lighter Mac.
Re: (Score:2)
Re: (Score:2)
Planned Obsolescence (Score:4, Insightful)
Integrated GPU just means that you'll be looking to upgrade your 5k monitor in a year or two.
Nope, no thank you, Apple.
External graphics make sense for laptops (Score:5, Interesting)
Integrated GPU just means that you'll be looking to upgrade your 5k monitor in a year or two.
That's what most people do anyway. The only people who upgrade piecemeal are geeks like us and even then most of us don't bother. Most people just buy a whole new system when they buy a new computer. Apple knows this better than anyone. What you are saying isn't silly but the numbers don't lie. Most people just go the simple route and upgrade everything.
Honestly I've wondered for a long time why nobody made an external graphics system - either integrated into a monitor, a separate box, or a docking station. It would be SUPER useful for a laptop or other portable device - maybe even for a tablet. Then you can have your industrial-strength graphics at your permanent desk, but when you are traveling or doing light-duty work and don't need it, you don't have to lug the extra hardware or suffer the attendant power drain. It makes a lot of sense if you have a fast enough interconnect. Apple sells a ton of laptops, so external graphics processing actually makes a ton of sense for a certain segment.
Most people vs Geeks (Score:2)
Yup, most people replace their whole laptop after a few years.
And most people will be okay with the GPU power built into their 5k display anyway.
It will be good enough for most people's use:
- watching video: unless the entire world switches to a completely new codec (the Daala/Thor/VP10 mash-up that is supposed to come out of AOMedia) *AND* drops the MPEG AVC/HEVC and Google VPx codecs forever, it should still be working 2 years down the line
- casual gaming: the OpenGL/Vulkan capability of the GPU should still be okay
Re: (Score:2)
casual gaming - the OpenGL/Vulkan capability of the GPU should still be okay)
Define casual. If it doesn't need many textures, it should be fine. But I'm betting that they will skimp on RAM, and then the bus speed will become an issue where it otherwise wouldn't.
Re:External graphics make sense for laptops (Score:4, Insightful)
That's what most people do anyway. The only people who upgrade piecemeal are geeks like us and even then most of us don't bother
I'd say that monitors, keyboards, and mice are probably the exceptions to these. That was part of the motivation for the Mac Mini - most people already have the peripherals and so can plug in a computer. Generally, monitors are upgraded less frequently than computers and a faster GPU is one of the main reasons for upgrading the computer, now that CPU speeds have plateaued.
Upgrading vs "upgrading" (Score:2)
I'd say that monitors, keyboards, and mice are probably the exceptions to these.
I'm pretty sure Apple has substantial data regarding this. I'm equally sure they've examined it. Outside of the enthusiast market I think most people actually do "upgrade" those things when they change computers. They often don't throw away the old machine, so the new one needs its own peripherals. Some do use the old stuff of course, but it's hardly unusual for them to buy new as well.
That was part of the motivation for the Mac Mini - most people already have the peripherals and so can plug in a computer.
The Mac Mini isn't Apple's best selling computer. They sell FAR more laptops and I seem to recall the iMac tends to out
Comment removed (Score:4, Interesting)
Re: (Score:2)
External graphics on docking stations (Score:2)
People have made docking stations with integrated graphics cards for quite some time.
None you could use by plugging in a single standard cable. I'm aware of a few clumsy attempts at it from days of yore but nothing anyone would actually buy as you point out. If Apple is really doing it and doing it properly, there is a stronger than average chance it will actually be done well and catch on to whatever degree the market will support. Personally I think it could make a ton of sense. I would absolutely buy such a thing for my laptop which I often tote between work and home. I only need he
Re: (Score:2)
Re: (Score:2)
Non Apple route? (Score:2)
There have been external GPU chassis for this exact purpose in the PC world already.
Mainly, a small plug into the external ExpressCard connector of the laptop going to an expander box with the big GPU inside.
Maybe some of these companies would manage to build a similar Thunderbolt-to-PCIe expander box?
And then you plug a vanilla 5k display into it using a regular DP or HDMI cable.
So 2 years down the line you can upgrade the GPU independently of the monitor.
The main drawbacks are:
- hot-plug: not all GPU ar
Re: (Score:2)
These have been around for years for Thunderbolt. Search on "Thunderbolt 2 PCIe Expansion Chassis"... You'll find product announcements from 2010.
Re: (Score:2)
Re: (Score:2)
The average person doesn't upgrade CPUs very often either. Usually they take whatever is in their new computer and leave it there. If they play games, they're mostly running them from web pages in IE. The gamer market is very small compared to the people-using-computers market.
The big thing about this is that it will let you plug a 5K monitor into a laptop. Once you get into that high of a resolution, it gets too big for integrated GPUs to handle, and you can't use a current consumer-standard single video connector, they all top out at 4K now. And this will do so with a connector that can already handle dumb video at 4K.
Re: (Score:2)
The big thing about this is that it will let you plug a 5K monitor into a laptop. Once you get into that high of a resolution, it gets too big for integrated GPUs to handle, and you can't use a current consumer standard single video connector, they all top out at 4K now. And this will do so with a connector that can already handle dumb video at 4K.
But I ask again, which consumers are going to buy a 5K monitor, especially when there's very little actual media in 4K? They have been fine with 1080p displays for a long time. Professionals need 5K for the extra resolution when they work in multiple windows. I agree with you that this product might appeal more to consumers in that they don't need to buy any extra hardware to run a 5K monitor, but I don't think they need it to begin with. For those that do buy this monitor it will be a while before 8K or whatever the nex
Re: (Score:2)
It's still a lot better than an integrated WHOLE FUCKING COMPUTER. Especially one with soldered-only RAM like modern iMacs. Unless you're a serious FPS gamer, you probably don't need to upgrade your GPU every year or two. And if you are, you probably already have a computer with, you know, actual card slots that you can plug those $1000 graphics cards into.
This is meant to let you use a real GPU with a computer that has no slots, like a Mac Mini, for those who need better than the now half-decent CPU integ
Re: (Score:2)
Re: (Score:2)
It also makes no sense to try to sell an ultra-expensive monitor for the low-budget Mac mini. Sure, some people might want that setup, but the Mac mini isn't a huge seller to begin with compared to the iMacs and the laptops.
Re: (Score:2)
Re: Planned Obsolescence (Score:2)
How would it be cheaper? Except for modularity, it would be a worse product. The video card would need to sit on a riser (or be perpendicular to the display). It would be bulkier, physically less robust, harder to cool, harder to manage compatibility, and probably more expensive due to the extra components and cooling.
There is probably a market for a 5K monitor with user-replaceable video card, but I think a fixed GPU makes more sense for almost all of Apple's target market.
User replacable cards (Score:4, Interesting)
Except for modularity, it would be a worse product.
Maybe but maybe not. That's a bit like arguing that a PC with a discrete graphics card is a worse product than one with integrated graphics. Worse in what way? You have to consider the entire product and the equation isn't just as simple as More Complicated = Worse.
The video card would need to sit on a riser (or be perpendicular to the display).
A discrete card could be mounted in a gap in the monitor's own board, flush with it. It wouldn't necessarily need a riser, though that is possible. Even if it did, is that really a problem? I don't see it as one. If my monitor is 4 inches deep vs. 6 inches deep, is that really a big concern?
It would be bulkier, physically less robust, harder to cool, harder to manage compatibility, and probably more expensive due to the extra components and cooling.
Probably a bit bulkier but not much and not enough to really matter in most cases. Physical robustness isn't likely to be a challenge unless you plan on moving it a lot which kind of defeats the entire purpose of a device like this. Plus it would be roughly as robust as a desktop PC which is demonstrably fine. Cooling is a concern but a well understood and manageable one. We're not talking bleeding edge water cooling here. It definitely would be more expensive than a regular monitor but that doesn't prevent it from being good value for money.
There is probably a market for a 5K monitor with user-replaceable video card, but I think a fixed GPU makes more sense for almost all of Apple's target market.
I think there is almost no chance it will be a user replaceable GPU. Would be nice but it would be pretty contrary to Apple's standard MO. If Apple can make it catch on however it wouldn't surprise me to see something in the PC enthusiast market for user replaceable cards. Unclear how much of a market there is for that but I could see it happening.
Re: (Score:2)
Not to mention the power supply issues. Some of those high-end cards need a LOT of power, even having dedicated power connectors on the card. At that point you might as well go to a Thunderbolt-to-PCIe card cage, at which point you would realize that TB is only PCI-e x2 or x4, not the x16 that most video cards want. You could make it fit (PCI-e is designed that way), but you would lose 3/4 of the potential bandwidth from the start.
And the target market of this seems to be laptop users, who wouldn't have 16
Re: (Score:2)
except... this is not what is going to happen. They will have both 2D and 3D in the monitor.
Re: (Score:2)
For games, yes. But the purpose of the GPU in this case is mainly to handle the ridiculous number of pixels. So not the same scenario. You wouldn't buy this for gaming.
There are many reasons gamers would not buy this monitor, including that most games will probably target 4K (3840x2160) eventually rather than 5K (5120x2880), since 4K already doubles the pixels both horizontally and vertically compared to current games. Scaling to 5K is much more difficult. 5K monitors are really for professionals like game and video developers who need to see the whole resolution of a 4K frame while working on it.
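A small worked example of the "see a whole 4K frame while working on it" point (a sketch; the window sizes are just the standard resolutions quoted above):

```python
# A full 4K (3840x2160) viewport fits pixel-for-pixel inside a
# 5K (5120x2880) desktop, with room left over for editing UI.
leftover_w = 5120 - 3840   # 1280 px of horizontal space for palettes
leftover_h = 2880 - 2160   #  720 px of vertical space for timelines/scopes
print(leftover_w, leftover_h)
```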
Re: (Score:2)
Rene Ritchie, said "no", so no new display (Score:2)
But I think at least one hardware announcement will sneak in.
Re: (Score:2)
Almost correct. (Score:2)
Rene Ritchie specifically said that there was not going to be a display with an integrated GPU introduced at WWDC or in the near future.
That doesn't mean that there isn't going to be any Thunderbolt Display refresh.
Dan Aris
Re: (Score:2)
He didn't say no new display. He said no new display with integrated GPU.
Some innovation finally! (Score:2)
Re: (Score:2)
You can get thunderbolt-attached PCIe cardcages that let you put basically any PCIe card you would normally install internally in an external enclosure; and various outfits with a focus on mac users have integrated peripherals that are the same basic conc
Re: (Score:2)
Not really innovative (Score:4, Insightful)
I can think of a lot of disadvantages though. Can't be repaired/upgraded separately. Destroys the thin profile of modern monitors. Overly complicated purchase choices (thinking ahead to a future when x different monitors and y different GPUs are available, you have to pick from x*y monitor/GPU combos, instead of just picking them separately for x+y choices). Hotspot created by GPU could damage the portion of the monitor it's adjacent to. Fan to cool the GPU is stuck in the monitor, so you can't shove it and the computer into a closet with only KVM cables leading to your desk, for some peace and quiet,
Put the GUI there too. (Score:2)
The sensible thing is to have the GPU, or at least some of the GPU, in the display, along with a lightweight processor so that the display alone can run a decent GUI. Essentially something like the Quartz display server, but with a few extras. First, you can set up surfaces in the display GPU and stream content to them via either HDMI/VGA/DVI interfaces etc. and over a gigabit ethernet cable, possibly even wifi. You need a programmable API for how the GUI actually works, etc. Basically, you end up reinventi
Re: (Score:2)
Isn't that basically an iMac?
VR & portable high performance GPUs (Score:3)
Anchoring the GPU to the desk may be entirely the wrong choice now: there will likely be a lot of demand for portable high performance GPUs from VR headsets.
Re: (Score:2)
Thunderbolt 3 dubious for external GPU (Score:3)
T3 has only 4 lanes (x4) of PCIe gen 3.0 (and 2 slower lanes) [wikipedia.org]. Given that most discrete GPU adapters want to be in an x16 slot, it suggests to me that external GPUs will be crippled in PCIe connection bandwidth. However, I assume the beautiful monitor will accept 4K DP over T3 to give you great performance for on-system GPUs.
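Rough numbers behind the x4-versus-x16 concern (a sketch based on the published PCIe 3.0 per-lane rate; how much of the 40 Gbit/s Thunderbolt 3 link the display traffic consumes is left as an open assumption):

```python
# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding
per_lane_gbps = 8 * 128 / 130           # ~7.9 Gbit/s usable per lane
x4_gbps  = per_lane_gbps * 4            # ~31.5 Gbit/s (~3.9 GB/s)
x16_gbps = per_lane_gbps * 16           # ~126 Gbit/s (~15.8 GB/s)
print(x4_gbps / x16_gbps)               # 0.25 -- an x4 link offers a quarter of x16

# Thunderbolt 3 tops out at 40 Gbit/s total, shared between DisplayPort
# and PCIe traffic, so the effective GPU bandwidth can be lower still.
```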
Re: (Score:3)
Natural evolution (Score:2)
This has seemed the natural evolution of monitors to me for some time. Powerful GPUs are getting cheap enough that they should be able to be built-in to all TVs and monitors without raising costs inordinately. Once this becomes common place, CPU-oriented devices can drop their power and size requirements substantially.
The only pain point to this is hand-held gaming devices. Obviously they can't depend on an external monitor.
Take that Mac-hating gamers! (Score:2)
Now maybe the major studios will release on OS X at the same time as other platforms.
Too late, already probably false (Score:4, Funny)
The news cycle on /. is so slow sometimes. If you follow Apple news, you were hearing about this a few days ago, and YESTERDAY we already heard from Rene Ritchie claiming that this isn't happening.
So don't expect anything. And for those of you already mad at Apple for a product that hasn't been officially announced and is probably imaginary, calm thine tits. At least be mad at Apple for the stuff they ACTUALLY do, not the made up stuff that hasn't happened.
Why would you want the GPU outside the box? (Score:2)
The GPU needs to communicate closely with the CPU. It needs to have access to huge amounts of data. Never mind the pixels involved in a 5K display. The GPU has to take many inputs that would likely be much higher total resolution, and digest them into a 5K output. There's a reason the GPU is tightly integrated with the mother board. It seems to me that moving it to the monitor would be like tying one hand behind your back, when it comes to performance.
Re: (Score:2)
Send 5k compressed video to the monitor, and have the built-in GPU decode it. Of course, no one makes 5k video, but say you are doing video editing: now you can have a 4k video at full resolution and some controls on screen at the same time.
Re:How? (Score:5, Informative)
Thunderbolt can, among other things, encapsulate PCIe. You effectively end up with a discrete GPU that has slightly higher latency than the attached one, but all of the rendering is done on the GPU and the final image is output directly to the display. You'll upload textures, geometry, and shaders to the external GPU via Thunderbolt, but you won't be streaming rendered images over the connection.
This isn't a very surprising development. At least one third-party has been providing external GPUs for Macs since shortly after they started shipping with Thunderbolt (you can also buy a simple PCIe enclosure that plugs into Thunderbolt and lets you plug in other cards). Moving the GPU into the display (which, in an Apple monitor, already uses the PCIe parts of Thunderbolt to provide USB, Firewire 800, Gigabit Ethernet, audio, and a camera) is a pretty logical step and one that several people had predicted.
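A rough sense of why sending commands and assets beats streaming finished frames (a sketch; the 2 GB asset size is an illustrative assumption, not a real workload measurement):

```python
# Usable bandwidth of a PCIe 3.0 x4 link (8 GT/s per lane, 128b/130b encoding)
pcie_x4_bytes_per_s = 8e9 * 4 * 128 / 130 / 8       # ~3.9 GB/s

# One-time cost: upload an assumed 2 GB of textures/geometry/shaders
upload_seconds = 2e9 / pcie_x4_bytes_per_s           # ~0.5 s, paid once per scene

# Ongoing cost if finished frames were streamed back instead
frame_bytes = 5120 * 2880 * 4                        # 32-bit pixels, ~59 MB/frame
stream_fraction = frame_bytes * 60 / pcie_x4_bytes_per_s
print(upload_seconds, stream_fraction)               # streaming would eat ~90% of the link, continuously
```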
Re: (Score:2)
Not a huge problem if you just want a lot of screen to do relatively undemanding things; but if Apple hasn't entirely finished chasing off their workstation customers they will probably find the performa
Re: (Score:2, Informative)
That's no problem. You can always buy the new model next year.
Re: (Score:2)
This isn't the sort of thing likely to bother Apple; but the major downside will be that the monitor will be stuck with whatever GPU was integrated for its entire life;
It would probably not be something Apple would do, but you could easily make that GPU replaceable.
Re: (Score:3)
5k is too high a resolution for DP 1.2 to drive at 60Hz, so you would be limited to DP 1.3 or 1.4 parts, or nonstandard hacks of 1.2
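The arithmetic behind that limit (a sketch; 24-bit color and zero blanking overhead are assumed, which makes this a best case for DP 1.2):

```python
# DisplayPort 1.2 (HBR2): 4 lanes x 5.4 Gbit/s, 8b/10b encoding
dp12_payload_gbps = 4 * 5.4 * 8 / 10        # 17.28 Gbit/s of usable payload

# 5K at 60 Hz, 24 bits per pixel, ignoring blanking entirely
needed_gbps = 5120 * 2880 * 60 * 24 / 1e9   # ~21.2 Gbit/s

print(needed_gbps > dp12_payload_gbps)      # True -- it doesn't fit in DP 1.2
```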
Re: (Score:2)
It could be made modular(though Apple probably wouldn't be the vendor to do it); but I suspect that it would be in serious danger of falling into the same sort of niche that 'MXM' GPU modules for laptops have. Those are theoretically standardized and swappable; but relatively rare and often thermally or mechanically limited such that only a few specific upgrades can be made.
I am on the fence as to whether Apple will at least offer BTO GPU options for these displays, and I agree that, if this catches-on, other mfgs. could come out with aftermarket GPU-card upgrades.
Second, I think this is a different situation than with the MXM modules, simply because of Form-Factor. Laptops live-and-die by crunching-out millimeters and tenths-of-millimeters, but in a monitor/TV, there is automatically a TON of space to allow for cards with heatsinks, fans, etc., so there is LOT more design-f
Re: (Score:2)
This isn't the sort of thing likely to bother Apple; but the major downside will be that the monitor will be stuck with whatever GPU was integrated for its entire life
Yeah, you've pretty much described Apple products since the aughts. I don't think I've seen a significant number of video upgrades for Apple computers since the 90s. I remember maybe 10 years ago, there were a handful of aftermarket cards available for the Mac Pro but you had to buy the Apple version. Or get the PC version and a PC to flash the Apple-compatible BIOS onto them. The Mac Pros may have had expansion slots but there wasn't a heck of a lot of hardware you could plug into them.
So you're right.
Re: (Score:2)
I've only ever plugged two things into my Mac pro expansion slot: An eSATA interface card (which worked perfectly) and an infiniband adaptor (Which was not detected by any OS). Based on that record, I wouldn't buy any PCI-e expansion for a mac pro without checking the returns policy, just in case.
Re: (Score:2)
Re: (Score:2)
For the customer who just wants to be able to use a macbook air or the like and still en
Re: (Score:2)
Re: (Score:2)
This isn't the sort of thing likely to bother Apple; but the major downside will be that the monitor will be stuck with whatever GPU was integrated for its entire life; and odds are that a nice 5k panel will be good enough for the job longer than the GPU that would fit within a suitably slim power budget.
OTOH, there is a possibility that GPUs can be in a dedicated, but standardized, card-slot in the display, allowing the display OEMs the ability to easily offer a range of performance/cost options without having to spin-off an entirely new model of the entire display. Also the possibility of user Upgrades to the GPU card could create aftermarket and OEM add-on sales.
Done right, this is a very interesting development.
Re: (Score:2)
You could also send compressed video (e.g. HEVC) to the display and decode it there, saving bunch of your Thunderbolt bandwidth.
Re: (Score:2)
That's going to take pricy hardware and a lot of power, and it's really unsuited for anyone who works in content production and needs artefact-free display.
Re: (Score:2)
TBH I was only thinking of the scenario where you've already got compressed video, not encoding first before transmitting it over the Thunderbolt connection. Perhaps you were thinking there'd be an encode step, hence your comment about artefacts? That'd be a pretty poor encoder BTW.
Re: (Score:2)
Re: (Score:2)
What resolution do they work in? Not 5K I presume, and the link is already designed to handle 4K of raw display data. Worst case, uncompressed video of the desired resolution can be sent, and will get up-scaled in the display's GPU to Retina resolution. It just goes over a bit more wire to get to the GPU. And there's nothing to keep someone doing 4K video production from just, you know, getting a computer with a built-in GPU and a dumb high-res monitor. They're also not going to need a GPU that pushes polyg
TB3 is only pci-e X4 MAX (Score:2)
TB3 is only PCIe 3.0 x4 max, and display / other data on the same chain eats into that. Now if Apple can get PCIe switches that can pool 2 TB3 buses at the display to get PCIe 3.0 x8, and also have at least 2 TB3 buses at the CPU, then this may work.
Re: (Score:3)
Correct.
I would rather see a separate box that you just attach to the monitor. However this is only useful for laptops.
Dear Apple, your anorexic design language is played out. iMacs, MacBook Pros, and Mac Pros do not have to be the smallest and thinnest devices.
Give your customers a MacBook Pro that uses M.2 for SSDs and allows them to upgrade the ram! Give your desktop customers a line with at least one PCI-e slot and that uses desktop CPUs. Give your MacPro customers a tower with M.2, SATA, bays for drive
Re: (Score:2)
Dear Apple Consumer,
Our real customer is Wall Street. They like it when you pay $400 for something that is available at NewEgg for $100.
Best,
Tim
Re: (Score:2)
Mac sales are down. iPhone sales are down. iPad sales are down.
Wall Street is not happy with the lack of the next big thing.
The source is *INSIDE* the monitor (Score:5, Insightful)
Thunderbolt isn't only Display Port.
Thunderbolt is also PCIe.
The idea is that to drive a 5k monitor, you need a 5k-capable source.
i.e.: a quite big GPU.
But instead of putting the big discrete GPU inside the laptop and sending a regular 5k picture over the display port
(which would have negative impact on battery life, weight and thickness - which doesn't seem to align with Apple's current goals which seem to boil down to "Make a laptop thin enough that you can cut cheese with it")
You put a huge honking GPU inside the screen (say an Nvidia Pascal or AMD Polaris), and run a PCIe link to the laptop.
Thus when the laptop is connected to the screen, it has access to a big enough GPU on its PCIe bus, but when you disconnect it, the extra weight and power consumption stay inside the monitor and the marketing department can continue touting the Mac Air as being so thin you can almost see through it.
Plus it has the nice advantage to lock you even further into Apple's hardware:
you need to buy Apple's Monitor+GPU combo in order to use it with Apple's Mac Airs.
You won't get 5k out of a regular 5k monitor with vanilla DisplayPort or HDMI inputs.
But this also raises a big security problem:
as the GPU is inside the monitor, the texture uploads happen to RAM located *on the graphic card inside the monitor*.
If the monitor isn't powered down between uses, a hostile party could plug into the monitor and, instead of uploading new textures/windows to it, dump its memory contents and get a good idea of what was displayed last.
And remember that nowadays games aren't the only things uploading textures to a GPU. Desktop compositors (like Apple's Quartz Extreme) use it to composite the desktop too.
Re: (Score:2)
Yes I was thinking this would save the switch from integrated to discrete graphics cards that my MBP does when I connect an external monitor. Good for battery life, although I wonder if a future monitor would be supplying power back to the laptop (USB-C connection). If they get rid of the discrete graphics card, could this be the first step to them making Intel QuickSync available to end-user applications such as video encoders?
Re: (Score:2)
I wouldn't say the security problem is impossible... just when the monitor is unplugged, have all RAM get flipped to all 1s, then back to 0. Very quick, and would ensure that nothing is displayed that shouldn't be.
However, this is something that is really original. I would pay for a monitor that had its own GPU so the laptop wouldn't need as much silicon to power up and cool down.
I do wonder if this functionality should be in a docking station as well, think the PowerBook Duo, or the IBM docking station o
Re: (Score:2)
More likely a technology akin to DisplayLink, which drives a monitor over USB. A DisplayLink chip is used either in the monitor itself (like in the ASUS MB169B+ or Philips P-Line 231P4QUPES) or inside a docking station with some monitor connection. The operating system treats the USB device like a graphics card and you can do the usual multi-monitor setup.
I have a docking station on my desk (which drives 1920x1080 via USB3) and it works fine for productivity. It's not so good for gaming as there seems to b
Re:How? (Score:5, Informative)
Don't send pixels.
You're sending the instructions that go into the GPU, not the pixel data that comes out of the GPU. So you're sending polygon vertices and textures and so on. Worst case is video, and then you are sending an mpeg stream, which is a lot less bandwidth than number of pixels times number of frames.
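To put a number on "a lot less bandwidth" (a sketch; the 50 Mbit/s figure is an assumed bitrate for a high-quality compressed stream, not something from the comment):

```python
raw_gbps  = 5120 * 2880 * 60 * 24 / 1e9   # ~21.2 Gbit/s of raw 5K pixels
hevc_gbps = 50 / 1000                     # assume a 50 Mbit/s AVC/HEVC stream
print(raw_gbps / hevc_gbps)               # roughly 400x less bandwidth when compressed
```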
Re:How? (Score:5, Interesting)
If you think about how a Blu-ray player works, for example, turning the video into raw pixels and then sending it over HDMI is actually quite stupid. Rather, if your display can decompress h264 in hardware, you can just stream the raw h264. With a few decent royalty-free standards things could work so much better, albeit against a number of entrenched proprietary interests (which is why I don't hold my breath).
Re: (Score:3)
But then you have to upgrade your display every time a new codec comes out, or have a player capable of real-time transcoding.
Re:How? (Score:4, Informative)
It could be as easy as updated GPU firmware to support a new codec (but not for your GPL Codec of the Week). In the worst case, it's still over a link that is already designed to run a 4K monitor, so unless you had a full 5K source, you could send the raw decoded video and the GPU in the monitor would upscale to full screen resolution.
This isn't some kind of crappy USB3 here, Thunderbolt is set up to be basically a PCI-e x2 or x4 port. The question is, what's its future? Apple has had some pretty cool technologies that died when the rest of the world finally caught up with a different technology, in particular Firewire. Intel has accepted it, I don't see any other technology leap-frogging it yet, and it is possible that Thunderbolt over USB-C could actually catch on. (which could make the mini-DisplayPort version fall to the curse, ha ha)
I hate this a lot less than cramming the whole computer (something you want to upgrade more often than the screen) into a nice screen like current iMacs. Making a KVM switch for it could be tricky, though.
Re: (Score:3)
Thunderbolt is set up to be basically a PCI-e x2 or x4 port.
Ugh. A graphics card on a PCIe x4.
That's going to be fast.
Re: (Score:3)
This entire discussion may be moot, as it appears there won't be a 5K monitor [imore.com] coming from Apple at all; at least, not in the near future.
Way to be up-to-date on the Mac rumors, Slashdot.
Re: (Score:2)
This isn't some kind of crappy USB3 here, Thunderbolt is set up to be basically a PCI-e x2 or x4 port. The question is, what's its future? Apple has had some pretty cool technologies that died when the rest of the world finally caught up with a different technology, in particular Firewire. Intel has accepted it, I don't see any other technology leap-frogging it yet, and it is possible that Thunderbolt over USB-C could actually catch on. (which could make the mini-DisplayPort version fall to the curse, ha ha)
The coolest thing here is that it doesn't require any hardware changes to the computer-side of things, so long as the computer supports Thunderbolt or USB-C. Heck, with a Display-List-Transfer protocol like this, even plain-old USB 3.0 would have enough bandwidth by far.
Now, who says Apple just repackages other people's ideas?
Re: (Score:2)
even plain-old USB 3.0 would have enough bandwidth by far.
Now, who says Apple just repackages other people's ideas?
Every USB 3.0 display adapter manufacturer.
Re: (Score:2)
A minor point to make: Intel didn't just accept Thunderbolt, they invented it. Apple just happens to be the only mainstream adopter that I can think of, and if something like this was in their long term plans it makes sense why they started including it years ago in the MBP line.
Not exactly true.
While Apple is far and away the leading evangelist and adopter of Thunderbolt (after all, they co-developed the copper-wire version in partnership with Intel), there are Wintel PCs from major OEMs [infoworld.com] with it (sorry I couldn't find a more recent article), and their number is increasing.
I would bet that this idea of Apple's is a game changer, though. The time for attempting to pour raw pixels to the monitor is past. "Display-List-Transfer" is FAR more efficient, even at the expense (no pun) o
Re: (Score:2)
Of course this assumes companies design their
Re: (Score:3)
Re: (Score:3)
Re: (Score:3)
Well, there are a number of cameras coming out, getting into decent price ranges, that shoot 4K+ video... and with my photo editing, that high resolution really allows me to truly see my high-rez images while editing them.
I can also fit a LOT of open windows on a 5K monitor which is nice too....
But there are applications for a 4K or 5K computer monitor.
I had thought recently to get one of the iMac
Re: (Score:2)
Re: (Score:2)
Except... there's no standard for a 5K display. There are proposed standards, and ones coming down the pipeline, but... no standards now. Which is why it is impossible to connect (easily) a 5K display. Sure, you CAN buy 5K displays now, but they're dual-DisplayPort displays (basically two monitors that your OS mashes together as multi-monitor), but that's... inelegant - you need two DP cables, you need to configure your OS correctly, and twiddle any cables and your monitor layout may reconfigure itse
Re: (Score:2)
Re: (Score:2)
Have you seen how much a Mac costs? Apple can't simply take a $500 PC and sell it for $1000 verbatim. Apple has to push the edge because that's the only way to command premium prices. Things like super fast SSDs (Apple was among the first to put PCIe SSDs into consumer PCs), 5K displays (which there is no standard for driving, yet, except using some awful "near lossless" compression
Re: (Score:3)
Why are you connecting a work laptop over WiFi? Especially overnight! How cheap are they that you don't have a gigabit wired Ethernet connection?
And why are they such drooling morons that they don't require you to use wired Ethernet, but force you to run antivirus crap on OS X? Yeah, all those overnight updates to everybody's computers via WiFi? No wonder it's flaking out all the time.
Protip: go to the wireless icon in the menu bar, Turn Wi-Fi Off, Turn Wi-Fi On. Much faster than rebooting.
Gonna agree wi
Re: (Score:2)