
Apple Reportedly Developing 5K Retina Thunderbolt Display With Integrated GPU (hothardware.com) 296

MojoKid quotes a report from HotHardware: If you head over to Apple's website, the Cupertino outfit will happily sell you a 27-inch Thunderbolt display for $999, at least until its inventory runs out. Word on the web is that it's nearly out of stock and Apple doesn't plan to replenish them. Instead, Apple will launch a new version of its Thunderbolt monitor, one that's been upgraded to a 5K resolution and has a discrete GPU stuffed inside. It's an interesting product actually, if you think about it. Depending on the task, it can take some serious graphics muscle to drive a 5K resolution display. It amounts to over 14.7 million pixels (5120x2880), compared to Apple's current generation Thunderbolt display which runs at 2560x1440, or less than 3.7 million pixels. Apple's thinking is likely that if it integrates a GPU capable of driving a 5K resolution into the display itself, it won't have to worry about trying to balance graphics performance with thin and light designs for its future Mac systems.
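As a quick sanity check on the pixel math in the summary (a trivial sketch, not from the article; the resolutions are the ones quoted above):

```swift
// Pixel counts for the two panels mentioned in the summary.
let currentTB = 2_560 * 1_440    // 3,686,400 pixels (~3.7 million)
let retina5K  = 5_120 * 2_880    // 14,745,600 pixels (~14.7 million)
print("5K pushes \(Double(retina5K) / Double(currentTB))x the pixels")  // 4.0x
```

In other words, the rumored panel would carry exactly four times the pixels of the current Thunderbolt Display.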
  • I've hoped Apple would take this design route for years; their other product lines would benefit from a superior display that brings capabilities of its own. Think of a future where a more universal video connector lets everything from iPhones to MacBook Airs connect and display video on it seamlessly. That is the crucial gap in modern multi-device households - no single visual interface, even when all the devices are in the same room. That will change now.
  • Sounds great.... (Score:2, Insightful)

    by Anonymous Coward

    Now give us the 17" Retina MacBook Pro.

    Some of us do real work and need a portable workstation. Not everyone is an internet blogger who can live on a low-power, paper-thin 13" stack of paper.

    • by phayes ( 202222 )

      ACs often claim to represent a large population of dissatisfied potential users for a 17" MBP, but actual sales figures proved that to be false. The 17" MBP was eliminated because it had proven to be a tiny market that needed disproportionate resources to serve.

      • I would really, really like a 17" MBP. Not AC, not alone. I do real work, I need pixels.

        My hunch is that getting "retina" pixels in a 17" form factor is probably expensive/difficult. I'd be happy with their best effort.

        • by phayes ( 202222 )

          I'd like one myself. Apple does not exist to assuage our every whim, however, and unfortunately the number of us who would buy one is insufficient for them to make enough money to justify making it.

  • by Anonymous Coward on Friday June 03, 2016 @06:13AM (#52240353)

    Integrated GPU just means that you'll be looking to upgrade your 5k monitor in a year or two.
    Nope, no thank you, Apple.

    • by sjbe ( 173966 ) on Friday June 03, 2016 @06:27AM (#52240399)

      Integrated GPU just means that you'll be looking to upgrade your 5k monitor in a year or two.

      That's what most people do anyway. The only people who upgrade piecemeal are geeks like us and even then most of us don't bother. Most people just replace the whole system, peripherals included, when they buy a new computer. Apple knows this better than anyone. What you are saying isn't silly, but the numbers don't lie. Most people just go the simple route and upgrade everything.

      Honestly, I've wondered for a long time why nobody made an external graphics system - either integrated into a monitor, a separate box, or a docking station. It would be SUPER useful for a laptop or other portable device - maybe even for a tablet. Then you can have your industrial-strength graphics at your permanent desk, but when you are traveling or doing light-duty work and don't need it, you don't have to lug the extra hardware or take the attendant power drain. It makes a lot of sense if you have a fast enough interconnect. Apple sells a ton of laptops, so external graphics processing actually makes a ton of sense for a certain segment.

      • Yup, most people replace their whole laptop after a few years.
        And most people will be okay with the GPU power built into their 5K display anyway.
        It will be fast enough for most people's use:
        (watching video - so unless the entire world switches to a completely new codec (the Daala/Thor/VP10 mash-up that is supposed to come out of AOMedia) *AND* drops the MPEG AVC/HEVC and Google VPx codecs forever, it should still be working 2 years down the line;
        casual gaming - the OpenGL/Vulkan capability of the GPU should still be okay)

        • casual gaming - the OpenGL/Vulkan capability of the GPU should still be okay)

          Define casual. If it doesn't need many textures, it should be fine. But I'm betting that they will skimp on RAM, and then the bus speed will become an issue where it otherwise wouldn't.

      • by TheRaven64 ( 641858 ) on Friday June 03, 2016 @07:49AM (#52240753) Journal

        That's what most people do anyway. The only people who upgrade piecemeal are geeks like us and even then most of us don't bother

        I'd say that monitors, keyboards, and mice are probably the exceptions to these. That was part of the motivation for the Mac Mini - most people already have the peripherals and so can plug in a computer. Generally, monitors are upgraded less frequently than computers and a faster GPU is one of the main reasons for upgrading the computer, now that CPU speeds have plateaued.

        • I'd say that monitors, keyboards, and mice are probably the exceptions to these.

          I'm pretty sure Apple has substantial data regarding this. I'm equally sure they've examined it. Outside of the enthusiast market, I think most people actually do "upgrade" those things when they change computers. They often don't throw away the old machine, so the new one needs its own peripherals. Some do use the old stuff, of course, but it's hardly unusual for them to buy new as well.

          That was part of the motivation for the Mac Mini - most people already have the peripherals and so can plug in a computer.

          The Mac Mini isn't Apple's best-selling computer. They sell FAR more laptops, and I seem to recall the iMac tends to outsell it as well.

      • This definition of what people do with computers is constantly evolving. Who would have thought that my 7-year-old computer (an i7 950-based system) would still be perfectly usable today?
    • There have been some external GPU chassis for this exact purpose already in the PC world.

      Mainly, a small plug into the laptop's external ExpressCard connector going to an expander box with the big GPU inside.

      Maybe some of these companies could manage to build a similar Thunderbolt-to-PCIe expander box?
      And then you plug a vanilla 5K display into it using a regular DP or HDMI cable.
      So 2 years down the line you can upgrade the GPU independently of the monitor.

      The main drawbacks are:
      - hot-plug: not all GPUs are hot-pluggable

      • by Dr. Evil ( 3501 )

        These have been around for years for Thunderbolt. Search on "Thunderbolt 2 PCIe Expansion Chassis"... You'll find product announcements from 2010.

    • Who is "you"? The average person doesn't upgrade monitors very often. Sure, consumers eventually ditched CRTs for flat screens, but most of them probably waited for widescreen ones. Many of them might still be on 720p. As for 5K monitors, you realize that is not only cutting edge but also extremely niche. The only people who might want 5K monitors are pros who do video. Game developers, maybe, but not gamers. Unless the industry goes to 5K as a standard for media, they are not going to upgrade that monitor.
      • by Megane ( 129182 )

        The average person doesn't upgrade CPUs very often either. Usually they take whatever is in their new computer and leave it there. If they play games, they're mostly running them from web pages in IE. The gamer market is very small compared to the people-using-computers market.

        The big thing about this is that it will let you plug a 5K monitor into a laptop. Once you get into that high of a resolution, it gets too big for integrated GPUs to handle, and you can't use a current consumer standard single video connector, they all top out at 4K now. And this will do so with a connector that can already handle dumb video at 4K.

        • The big thing about this is that it will let you plug a 5K monitor into a laptop. Once you get into that high of a resolution, it gets too big for integrated GPUs to handle, and you can't use a current consumer standard single video connector, they all top out at 4K now. And this will do so with a connector that can already handle dumb video at 4K.

          But I ask again, which consumers are going to buy a 5K monitor, especially when there's very little actual media in 4K? They have been fine with 1080p displays for a long time. Professionals need 5K for the extra resolution when they work in multiple windows. I agree with you that this product might appeal more to consumers in that they don't need to buy any extra hardware to run a 5K monitor, but I don't think they need it to begin with. For those that do buy this monitor, it will be a while before 8K or whatever the next standard is comes along.
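(A rough, back-of-the-envelope check of the "cables top out at 4K" point above; the payload figures are approximate published rates for HDMI 2.0 and DisplayPort 1.2, and the math ignores blanking and protocol overhead, so the real requirement is even higher:)

```swift
// Uncompressed pixel data for 5K (5120x2880) at 60 Hz, 24 bits per pixel.
let fiveKBitsPerSec = 5_120.0 * 2_880.0 * 60.0 * 24.0  // ~21.2 Gbit/s of raw pixel data
let hdmi20Payload = 14.4e9    // HDMI 2.0 usable payload, roughly 14.4 Gbit/s
let dp12Payload   = 17.28e9   // DisplayPort 1.2 usable payload, roughly 17.28 Gbit/s
print(fiveKBitsPerSec > hdmi20Payload, fiveKBitsPerSec > dp12Payload)  // true true
```

Which is roughly why the early 5K panels needed two cables, or something faster than a single 2016-era consumer link.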

    • by Megane ( 129182 )

      It's still a lot better than an integrated WHOLE FUCKING COMPUTER. Especially one with soldered-only RAM like modern iMacs. Unless you're a serious FPS gamer, you probably don't need to upgrade your GPU every year or two. And if you are, you probably already have a computer with, you know, actual card slots that you can plug those $1000 graphics cards into.

      This is meant to let you use a real GPU with a computer that has no slots, like a Mac Mini, for those who need better than the now half-decent CPU-integrated graphics.

      • by Megane ( 129182 )
        ...or as someone mentioned in another post, this is really targeted at laptop users. That actually makes a lot more sense than the Mac Mini use case. It's kind of hard to fit a monster desktop NVidia card into a laptop. Just leave the display+GPU on the desk at home, plug it in when you get back.
        • It also makes no sense to try to sell an ultra-expensive monitor for the low-budget Mac mini. Sure, some people might want that setup, but the Mac mini isn't a huge seller to begin with compared to the iMacs and the laptops.

      • No gamer is going to buy a 5K monitor to run a 4K or 1080p game. You'd think they would know better. This is for creative professionals, who are not likely to upgrade their monitors every other year.
  • A seemingly consistent, well-placed source (Rene Ritchie) said "no", so no new display at WWDC.
    But I think at least one hardware announcement will sneak in.
    • It's possible that Apple is working on it, but like many companies, many things that they work on might not be released for various reasons and get cancelled in development. For example, iPad development actually started before the iPhone. When Apple realized they could shrink everything into a phone, development shifted to smartphones first, then tablets afterwards. I heard somewhere that the first tablets were Atom-based, as Apple had shifted to Intel for their computers. They reportedly co
    • Rene Ritchie specifically said that there was not going to be a display with an integrated GPU introduced at WWDC or in the near future.

      That doesn't mean that there isn't going to be any Thunderbolt Display refresh.

      Dan Aris

    • He didn't say no new display. He said no new display with integrated GPU.

  • An actually interesting way of utilising Thunderbolt beyond a simple combination of DisplayPort and data bus. I wonder what other PCIe devices could benefit from being externalised? It feels like this would actually reduce obsolescence (yeah, I know it's Apple though).
    • PCIe devices don't really benefit from being handled over Thunderbolt (it's effectively only a PCIe x4 link, with slightly higher latency than an internal slot); but if what you have is a laptop, pretty much any PCIe card is better external than not at all.

      You can get Thunderbolt-attached PCIe card cages that let you put basically any PCIe card you would normally install internally into an external enclosure; and various outfits with a focus on Mac users have integrated peripherals that are the same basic concept.
    • by Solandri ( 704621 ) on Friday June 03, 2016 @07:14AM (#52240619)
      Putting the GPU in a separate box [razerzone.com], sitting between the computer and monitor and connected via a high-bandwidth cable like Thunderbolt, has already been done. This is just that idea, but combining the box and monitor. The only advantages I can think of over a separate GPU box are: you don't need a separate power cable because you can mooch off the monitor's power supply, and you could conceivably bypass any cable speed limits by running a direct channel from the GPU to the monitor (thinking ahead to when resolutions are higher than even Thunderbolt can support).

      I can think of a lot of disadvantages though. Can't be repaired/upgraded separately. Destroys the thin profile of modern monitors. Overly complicated purchase choices (thinking ahead to a future when x different monitors and y different GPUs are available, you have to pick from x*y monitor/GPU combos, instead of just picking them separately from x+y choices). The hotspot created by the GPU could damage the portion of the monitor it's adjacent to. The fan to cool the GPU is stuck in the monitor, so you can't shove it and the computer into a closet with only KVM cables leading to your desk for some peace and quiet.
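(The parent's x*y-versus-x+y point, made concrete with made-up catalog sizes; the numbers are purely illustrative:)

```swift
// Hypothetical catalog: 5 monitor models and 6 GPU options.
let monitors = 5
let gpus = 6
let combinedSKUs = monitors * gpus       // 30 distinct monitor+GPU products to choose between
let separatePicks = monitors + gpus      // 11 options total if each part is picked on its own
print(combinedSKUs, separatePicks)       // 30 11
```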
  • The sensible thing is to have the GPU, or at least some of the GPU, in the display, along with a lightweight processor so that the display alone can run a decent GUI. Essentially something like the Quartz display server, but with a few extras. First, you can set up surfaces in the display GPU and stream content to them via either HDMI/VGA/DVI interfaces etc. and over a gigabit ethernet cable, possibly even wifi. You need a programmable API for how the GUI actually works, etc. Basically, you end up reinventi

  • by ooloorie ( 4394035 ) on Friday June 03, 2016 @08:31AM (#52240907)

    Apple's thinking is likely that if it integrates a GPU capable of driving a 5K resolution into the display itself, it won't have to worry about trying to balance graphics performance with thin and light designs for its future Mac systems.

    Anchoring the GPU to the desk may be entirely the wrong choice now: there will likely be a lot of demand for portable high performance GPUs from VR headsets.

    • by Megane ( 129182 )
      Hint: who said anything about "anchoring" to the desk? You can plug this into a laptop. And this may be more about 5K (go ahead, see what's available right now even from a desktop, yeah, that's right, even the cables only support 4K!) than about moving a high-performance GPU out of the computer.
  • by Glasswire ( 302197 ) on Friday June 03, 2016 @08:48AM (#52241027) Homepage

    T3 has only 4 lanes (x4) of PCIe gen 3.0 (and 2 slower lanes) [wikipedia.org]. Given that most discrete GPU adapters want to be in an x16 slot, that suggests to me that external GPUs will be crippled in PCIe connection bandwidth. However, I assume the beautiful monitor will accept 4K DP over T3 to give you great performance with on-system GPUs.
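(A quick sketch of the x4-versus-x16 gap described above, using nominal PCIe 3.0 line rates; real eGPU throughput also depends on Thunderbolt's own PCIe tunnelling limits and protocol overhead, so treat these as upper bounds:)

```swift
// PCIe 3.0: 8 GT/s per lane with 128b/130b encoding.
let perLaneGBps = 8.0 * 128.0 / 130.0 / 8.0  // ~0.985 GB/s usable per lane
let x4Link  = perLaneGBps * 4.0              // ~3.9 GB/s, the x4 link the parent describes
let x16Slot = perLaneGBps * 16.0             // ~15.8 GB/s, a typical desktop GPU slot
print(x4Link, x16Slot)
```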

  • This has seemed the natural evolution of monitors to me for some time. Powerful GPUs are getting cheap enough that they should be able to be built into all TVs and monitors without raising costs inordinately. Once this becomes commonplace, CPU-oriented devices can drop their power and size requirements substantially.

    The only pain point to this is hand-held gaming devices. Obviously they can't depend on an external monitor.

  • Now maybe the major studios will release on OS X at the same time as other platforms.

  • The news cycle on /. is so slow sometimes. If you follow Apple news, you were hearing about this a few days ago, and YESTERDAY we already heard from Rene Ritchie claiming that this isn't happening.

    So don't expect anything. And for those of you already mad at Apple for a product that hasn't been officially announced and is probably imaginary, calm thine tits. At least be mad at Apple for the stuff they ACTUALLY do, not the made up stuff that hasn't happened.

  • The GPU needs to communicate closely with the CPU. It needs to have access to huge amounts of data. Never mind the pixels involved in a 5K display. The GPU has to take many inputs that would likely be much higher total resolution, and digest them into a 5K output. There's a reason the GPU is tightly integrated with the motherboard. It seems to me that moving it to the monitor would be like tying one hand behind your back when it comes to performance.
