VESA Publishes DisplayPort 2.0 Video Standard, Offers Support For Beyond-8K Resolutions and Higher Refresh Rates For 4K/HDR (vesa.org)
The Video Electronics Standards Association (VESA) today announced that it has released version 2.0 of the DisplayPort (DP) audio/video standard. DP 2.0 is the "first major update to the DisplayPort standard since March 2016, and provides up to a 3X increase in data bandwidth performance compared to the previous version of DisplayPort (DP 1.4a), as well as new capabilities to address the future performance requirements of traditional displays," it said in a statement. From a report: These include beyond 8K resolutions, higher refresh rates and high dynamic range (HDR) support at higher resolutions, improved support for multiple display configurations, as well as improved user experience with augmented/virtual reality (AR/VR) displays, including support for 4K-and-beyond VR resolutions. The advantages of DP 2.0 are enjoyed across both the native DP connector as well as the USB Type-C connector, which carries the DP audio/video signal through DisplayPort Alt Mode. DP 2.0 is backward compatible with previous versions of DisplayPort and incorporates all of the key features of DP 1.4a, including support for visually lossless Display Stream Compression (DSC) with Forward Error Correction (FEC), HDR metadata transport, and other advanced features.
The increased video bandwidth performance of DP 2.0 carried over the USB-C connector enables simultaneous higher-speed USB data transfer without compromising display performance. DP 2.0 leverages the Thunderbolt 3 physical interface (PHY) layer while maintaining the flexibility of DP protocol in order to boost the data bandwidth and promote convergence across industry-leading IO standards. In addition, the new data rates of DP 2.0 come with a display stream data mapping protocol common to both single-stream transport and multi-stream transport. This common mapping further facilitates multi-stream transport support of DP 2.0 devices for a single DP port on the source device to drive multiple displays either via a docking station or daisy-chainable displays. First products incorporating DP 2.0 are projected to appear on the market by late 2020.
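Those bandwidth claims are easy to sanity-check. A minimal back-of-the-envelope sketch in Python; the ~10% blanking overhead and the chosen display modes are assumptions, and real video timings vary:

```python
# Sanity check of the "3X" claim: effective link rates vs. what common
# display modes need uncompressed.
DP_14A = 4 * 8.1 * 8 / 10   # HBR3: 4 lanes x 8.1 Gbps, 8b/10b -> 25.92 Gbps
DP_20 = 77.37               # UHBR20 effective rate quoted by VESA, in Gbps

def stream_gbps(h, v, hz, bpc, blanking=1.10):
    """Approximate uncompressed RGB stream rate in Gbps (~10% blanking assumed)."""
    return h * v * hz * 3 * bpc * blanking / 1e9

print(f"DP 2.0 / DP 1.4a = {DP_20 / DP_14A:.2f}x")   # ~2.98x, i.e. "up to 3X"
for name, mode in [("4K120 10-bit HDR", (3840, 2160, 120, 10)),
                   ("8K60 10-bit HDR", (7680, 4320, 60, 10))]:
    need = stream_gbps(*mode)
    fits_14a = "fits" if need <= DP_14A else "needs DSC"
    fits_20 = "fits" if need <= DP_20 else "needs DSC"
    print(f"{name}: ~{need:.1f} Gbps (DP 1.4a: {fits_14a}, DP 2.0: {fits_20})")
```

Uncompressed 8K60 HDR lands under the DP 2.0 budget but well over DP 1.4a's, which is the practical meaning of the headline.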
Is this a TB upgrade, too? (Score:3)
Does this mean then that Thunderbolt is getting a speed boost, too? The article claims that DP2.0 will leverage the "Thunderbolt 3 physical interface", but TB3 itself is limited to 40 Gbps.
Re:Is this a TB upgrade, too? (Score:4, Informative)
No, it just means it'll use more of the TB3 bandwidth, probably up to 40 Gbps. HDMI 2.1 is up to 48 Gbps, but that part isn't ready yet (there are no certification tests, tools, or hardware for 48 Gbps). Faster speeds for TB3 would require revalidation and testing, and that's a troublesome process.
TB3 at 40 Gbps is already out there, running and validated, which basically means the standard is ready for vendors to build hardware against. Granted, they use DSC, which isn't ideal...
Re: (Score:2)
So then where does it get the other 37 Gbps from, for the claimed total of 77+?
Re:Is this a TB upgrade, too? (Score:4, Informative)
TB3 is 40 Gbps full duplex, i.e. it can send 40 Gbps in both directions simultaneously. It does that using two 20 Gbps lanes in each direction; in both cases this is the "raw" number, and the usable bandwidth after decoding is slightly less.
DisplayPort 2.0 takes the same lanes and encoding but runs them all in one direction (to the screen), resulting in 80 Gbps "raw" bandwidth and 77.37 Gbps usable video bandwidth.
It also has 4x10 and 4x13.5 Gbps modes for less demanding applications; right now it seems likely that only the 4x10 Gbps mode will be available on passive copper cables.
Ref: https://www.anandtech.com/show/14590/vesa-announces-displayport-20-standard-bandwidth-for-8k-monitors-beyond
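Spelling out the parent's arithmetic as a quick sketch (the gap between ~77.58 and the quoted 77.37 Gbps is additional link-layer overhead on top of the line coding):

```python
# Thunderbolt 3 PHY: 4 lanes of 20 Gbps raw, wired 2 each way.
tb3_per_direction = 2 * 20         # 40 Gbps raw each way, full duplex

# DisplayPort 2.0 UHBR20: same lanes, all four pointed at the display.
dp2_raw = 4 * 20                   # 80 Gbps raw, one direction
dp2_payload = dp2_raw * 128 / 132  # 128b/132b coding keeps 128 of every 132 bits

print(f"TB3: {tb3_per_direction} Gbps raw per direction")
print(f"DP 2.0: {dp2_raw} Gbps raw -> {dp2_payload:.2f} Gbps after coding")
# ~77.58 Gbps; the 77.37 Gbps usable figure also subtracts a little
# extra link overhead.
```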
Re: (Score:2)
That says nothing about 77 Gbps transport over Thunderbolt/USB-C.
Re: (Score:2)
Ah, blew it in that last sentence. Apparently you cannot connect to a DP monitor with a TB cable and have the fallback work. Did not expect that.
Lossy cable (Score:3)
Remember that "visually lossless" actually means lossy when referring to video compression, in this case the Display Stream Compression (DSC) [vesa.org] mechanism. Note the "verified by subjective testing" part in the table at the link. This lossy transmission mechanism was already used in DisplayPort 1.4 [wikipedia.org] for high-resolution display modes (8K 60 Hz or 4K 120 Hz).
Yes, they argue you probably cannot perceive it; the same argument was made for DivX video encoding.
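For scale, a sketch of why DP 1.4 needed DSC for those modes at all. This assumes 8K60 at 10 bits per channel with blanking ignored, and DSC's commonly cited up-to-3:1 ratio:

```python
# 8K60 10-bit doesn't fit DP 1.4's effective bandwidth uncompressed.
dp14_effective = 25.92                 # HBR3 after 8b/10b encoding, Gbps
need = 7680 * 4320 * 60 * 30 / 1e9     # ~59.7 Gbps, blanking ignored

for ratio in (1, 2, 3):                # DSC compresses up to ~3:1
    compressed = need / ratio
    verdict = "fits" if compressed <= dp14_effective else "too big"
    print(f"DSC {ratio}:1 -> ~{compressed:.1f} Gbps ({verdict} in DP 1.4)")
```

Only at roughly 3:1 does the stream fit, so "visually lossless" is doing real work at those modes.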
Re: (Score:2)
I never once saw DivX advertised as "visually lossless". But then, they would have been right: DivX at really high bitrates was visually lossless.
Complaining about lossy compression without talking about the method and degree of compression employed is pointless.
Yes 60 Hz! (Score:3)
Anything that gets us off the crappy 4K 30 Hz refresh rate is MUCH appreciated.
HDR is just a bonus.
Looks like we're moving towards what I call the Trinity of Graphics ...
* 4K
* 10-bit HDR
* 120 Hz
Re:Yes 60 Hz! (Score:5, Informative)
DisplayPort has supported 4k60 for a while now. I've been running it on my 4K monitor ever since I got it. And yes, I've had both 4k60 and 4k30 running at the same time (I had to test a 4k HDMI display, and HDMI 1.4b only does 4k30; 4k60 on it is a hack mode using YUV420).
If people are running 4k30, it's most likely because they're using HDMI. Only fairly recent cards support true native HDMI 2.0/2.0a bandwidth, which is what 4k60 over HDMI needs.
We've had 4K monitors at work for a couple of years now, and for the computers that needed a video card upgrade, we gave them midrange cards. I think we only just moved to nVidia 1050s or so in the past year (we bought 860s and 960s in the past), and they did 4k60 just fine.
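That experience checks out on paper. A rough sketch, assuming ~7% reduced-blanking overhead (exact timings vary):

```python
# DP 1.2 HBR2: 4 lanes x 5.4 Gbps, 8b/10b encoding.
dp12_effective = 4 * 5.4 * 8 / 10     # 17.28 Gbps usable

def need_gbps(h, v, hz, bpc, blanking=1.07):
    """Approximate stream rate with reduced blanking (assumption)."""
    return h * v * hz * 3 * bpc * blanking / 1e9

print(f"4K60 8-bit:  ~{need_gbps(3840, 2160, 60, 8):.1f} Gbps")    # ~12.8
print(f"4K60 10-bit: ~{need_gbps(3840, 2160, 60, 10):.1f} Gbps")   # ~16.0
print(f"DP 1.2 budget: {dp12_effective:.2f} Gbps")
# Both fit, which is why 4k60 (even 10-bit) has worked over DP 1.2;
# HDMI 1.4b's ~8.16 Gbps effective budget is why it tops out at 4k30.
```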
Re: (Score:2)
No way, man. HDR is the star of the show.
You can throw more resolution at me until the cows come home - I won't even notice. I watch my TV from the couch, not with my nose to the screen. On the other hand, HDR makes a dramatic difference to the immersiveness of a scene, and is what really makes newer display technologies shine.
Re: (Score:2)
So are the two 4k60 displays I'm driving right now at 10-bit color over DisplayPort 1.2 a figment of my imagination?
You've been able to get away from crappy 4k30 for over a year now.
Re: (Score:2)
144 does it pretty well for me. I definitely notice the difference in the jump from 60 to 144, but not so much beyond that. The issue is that display resolution keeps going up too, so we keep needing more and more bandwidth to provide the same framerate at ever-increasing resolutions.
Re: (Score:2)
Score: +5, Holodeck
Re: (Score:2)
*mashes face against screen*
Re: (Score:2)
It's very hard to define a fixed limit to ideal resolution, for two reasons:
1) Optimum resolution depends on viewing distance. This is why there's a real benefit to giving phones higher DPI than desktop monitors, and why desktop monitors have higher DPI than large screen TVs. If you want a bigger monitor at the same viewing distance, you need to increase the total number of pixels to maintain the same viewing experience.
2) There isn't a hard limit to what the eye can do. Think about reading an eye chart...
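The distance dependence in point 1 is easy to quantify. A minimal sketch; the example displays and viewing distances are assumptions, and ~60 pixels per degree is the usual rule-of-thumb proxy for 20/20 acuity:

```python
import math

def pixels_per_degree(h_pixels, width_in, distance_in):
    """Horizontal pixels per degree of visual angle at a viewing distance."""
    fov = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
    return h_pixels / fov

# Hypothetical but typical setups: (label, horizontal px, width in, distance in)
setups = [('6" phone at 12"', 1170, 2.8, 12),
          ('27" 4K monitor at 24"', 3840, 23.5, 24),
          ('65" 4K TV at 8 ft', 3840, 56.7, 96)]
for label, px, w, d in setups:
    print(f"{label}: {pixels_per_degree(px, w, d):.0f} ppd (acuity ~60 ppd)")
```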
Re: (Score:2)
There is no "exact" limit of human vision, for a number of reasons. One is individual variability. Another is the need to specify contrast and color.
Approximately speaking, if you don't move your head and the screen wraps around your full field of vision, anything beyond 8k is pointless. If you're allowed to move in close or walk along a huge display, it's unreasonable to try to specify an upper limit.
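A quick check of that 8K claim, assuming ~120 degrees of useful binocular field of view and the same ~60 ppd acuity rule of thumb:

```python
ACUITY_PPD = 60       # rule-of-thumb pixels per degree for 20/20 vision
H_PIXELS = 7680       # 8K horizontal resolution
USEFUL_FOV = 120      # assumed useful binocular field of view, in degrees

ppd = H_PIXELS / USEFUL_FOV
print(f"8K over {USEFUL_FOV} degrees: {ppd:.0f} ppd")   # 64 ppd
# Right at the acuity threshold for a fixed head position, which is why
# "beyond 8K is pointless" holds -- until you're allowed to lean in.
```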
Re: (Score:2)
Except that more resolution means bigger displays don't look like shit.
If our video formats are based on the same L x W pixel count, then an 80 inch display scales that big by increasing pixel size. In a 1080p format, it looks like shit unless you are sitting pretty far away, which largely defeats the purpose of the bigger display. Move up to 4k and that 80 inch display doesn't look absolutely terrible any more, but it still has the same pixel count as a 45 inch 4k display, just spread over a much larger area. 8k will do even better.
The limit
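Concretely, a sketch of the pixel-density arithmetic behind that comparison (16:9 panels assumed):

```python
import math

def ppi(h_px, v_px, diagonal_in):
    """Pixels per inch for a panel of the given resolution and diagonal."""
    return math.hypot(h_px, v_px) / diagonal_in

for name, h, v in [("1080p", 1920, 1080), ("4K", 3840, 2160), ("8K", 7680, 4320)]:
    print(f'{name}: 80-inch = {ppi(h, v, 80):.0f} PPI, '
          f'45-inch = {ppi(h, v, 45):.0f} PPI')
```

An 80-inch 1080p panel comes out around 28 PPI, versus roughly 55 PPI at 4K, which is the "doesn't look absolutely terrible" threshold the post describes.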
Re: Human vision limit (Score:5, Interesting)
> So what exactly Is the limit of human vision?
"It depends". Let's take framerate.
Suppose I use a program like Maya to render a high-contrast scene with rotating objects and motion behind an overlaid grid at 50, 100, 200, 400, 800, and 1600fps without attempting to synthesize motion blur.
Anyone can tell the difference between 50 and 200fps, especially if the rotating objects aren't timed to minimize artifacts (like wheels appearing to strobe, jerk, or turn backwards). It's night-and-day.
At 400fps vs 1600fps, you'll see the difference mainly as the presence or absence of judder & strobing as the on-screen wheels rotate at different rates. If you know what to look for, you can probably recognize 400fps vs 1600fps without needing a direct A/B comparison.
400 vs 800, or 800 vs 1600, probably requires direct A/B comparison to reliably distinguish, and simultaneous side-by-side to distinguish easily. There are definitely diminishing returns, but you won't really encounter them until you're WELL above 200fps, and can't quite ignore them until you're above 400fps.
For interactive content, double or quadruple everything. If you're dragging the tip of an active stylus over a screen & rendering virtual ink, you need a MINIMUM of 400fps under FLAWLESS circumstances to mostly eliminate perceived lag. 800 Hz sampling with 400fps framerate and 1 frame of latency (read stylus & update framebuffer in one frame, display it in the next) is pretty much the bare minimum before you'll mostly stop noticing any lag. Double both to get to the point where it becomes academic under all but the most contrived perceptual test conditions.
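A sketch of the worst-case latency budget those numbers imply (the breakdown into sampling, render, and scanout delays is a simplification):

```python
# Worst-case motion-to-photon latency for the stylus example, in ms.
sample_hz = 800        # stylus sampling rate
render_fps = 400       # framebuffer update rate
latency_frames = 1     # frame is shown one refresh after it's drawn

wait_for_sample = 1000 / sample_hz              # up to 1.25 ms
draw_frame = 1000 / render_fps                  # 2.5 ms
scanout = latency_frames * 1000 / render_fps    # 2.5 ms

total = wait_for_sample + draw_frame + scanout
print(f"worst case: {total:.2f} ms")            # ~6.25 ms
# Doubling both rates (1600 Hz / 800 fps) roughly halves this, landing
# in the "academic under contrived test conditions" regime above.
```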
Now... pass-through augmented reality (capturing the scene in front of someone & directly displaying it on screens in front of their eyes) is where things get ugly. 800fps with next-frame output is the MINIMUM to have any hope of eliminating perceived "swim" as you turn your head & look around. 1600fps, if you want to eliminate it from peripheral vision as well (peripheral vision is VERY sensitive to motion anomalies... it's an evolutionary survival mechanism).
This is why AR generally focuses on overlaying video on top of relatively clear glasses, instead of using cameras and video displays. Anything less than 400fps will suck, 800fps is barely tolerable, and 1000-1600fps is pretty much the minimum acceptable point for extended use without getting sick. Lower framerates might be OK for an adrenaline-rush videogame, but will have you heaving if you're using it for hours at a time doing CAD or giving virtual tours.
8k w/120-degree FOV at 400-500fps is approximately the point where you can comfortably use a VR headset to emulate the experience of sitting at a desk with three 27" 2560x1440 monitors in an arc approximately 18" away from your face... one filling ~60% of your FOV, with the other two partially visible out of the corners of your eyes. 800-1000fps, if you need to be able to comfortably look around the room, read a book in your lap, etc. for extended periods of time without risk of stress and/or VR sickness. 1600-3200fps with 1-frame latency if you want to be able to leave it on all day as you walk around & go about your day using pass-through video AR.
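The raw link bandwidth those HMD numbers imply is worth a sketch (uncompressed 24-bit RGB, blanking ignored; the frame rates are the ones quoted above):

```python
# Uncompressed bandwidth implied by the high-framerate HMD scenarios.
def gbps(h, v, fps, bits_per_px=24):
    return h * v * fps * bits_per_px / 1e9

print(f"8K @ 450 fps:  ~{gbps(7680, 4320, 450):.0f} Gbps")     # ~358
print(f"8K @ 1000 fps: ~{gbps(7680, 4320, 1000):.0f} Gbps")    # ~796
print(f"DP 2.0 budget:   77.37 Gbps")
# Even DP 2.0's tripled bandwidth is an order of magnitude short, which
# is why the follow-up below argues for smarter displays and compression.
```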
Just to emphasize, your eyes and brain are HARDWIRED to continually detect peripheral-vision motion and "anything weird" going on. Even if you aren't consciously aware of it, your body reacts... heart rate, blood pressure, cortisol, and stress level all increase... and if you're basically living in a pass-through AR environment, it takes a physical toll on you.
Part of the reason why Magic Leap & Hololens settle for relatively limited FOV is due to hardware limits, but part of the reason why expanding FOV well beyond ~100 degrees is a low priority is that both are intended for long-term daily wear (say, CAD). Constraining the synthetic video to foveal vision minimizes the long-term stress you'd experience if content in your peripheral vision kept jittering...
Re: (Score:2)
Score: -1, Don't be a moran.
Re: Do we have to wait for another 13 years for D (Score:2)
I suspect that beyond 8k @ 1kfps, we're going to have to stop trying to shovel raw bitmaps in real time, and move more buffering & intelligence to the display itself. Or do things like have a display with 16k resolution, but selectively compress areas in peripheral vision. Or frame-by-frame JPEG-like compression. Because I really don't see much need for 16k in any context besides VR/AR HMDs. And with a personal HMD, you don't NEED full-range 32-bit color for every pixel out to the edge of peripheral vision...
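A toy sketch of the savings from that selective-compression idea, assuming full detail in a central foveal region and 4:1 downsampling outside it (both angle figures are assumptions):

```python
# Fraction of the raw bitmap you'd ship if only a central foveal region
# keeps full detail and the periphery is downsampled 4:1.
fov_deg = 120          # assumed headset field of view
fovea_deg = 20         # assumed full-detail central region

foveal_area = (fovea_deg / fov_deg) ** 2       # rough area fraction
data_fraction = foveal_area * 1.0 + (1 - foveal_area) * 0.25
print(f"foveated stream: ~{data_fraction:.0%} of the raw bitmap")  # ~27%
```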
Those individual cells are hairy (Score:2)
beyond 8K resolutions
I haven't even jerked off to 4k pr0n yet! >:-(
Re: (Score:2)
Dude just skip it and go straight to 5K stuff.
And no I'm not kidding. VR porn has moved beyond 4K already.
And yes a "friend told me about it" ;-)
Re: (Score:2)
It's hard enough to jerk off to 1080p porn... you can already see every zit
Re: Opportunity missed (Score:2)
DisplayPort directly encapsulates HDMI. HDMI encapsulates Ethernet and I2C.
Thunderbolt directly encapsulates both USB and 4-lane (more now?) PCI Express.
So, take your pick. Connect the touchscreen as USB via Thunderbolt, as slow(ish) Ethernet via HDMI, or directly as a 1x PCIe controller via TB. Or as I2C via HDMI.
Re: (Score:2)
I know, right? That's why no new graphics card comes with DisplayPort 1.4 or better. Oh wait, they all do, moron.