YouTube Premium Adds 256kbps Audio 'Experiment' For Music Videos
YouTube Premium subscribers can now stream music videos at 256kbps bitrate audio quality, matching the high-fidelity standard previously exclusive to YouTube Music, the video platform said.
The audio upgrade comes as part of an expanded experimental features program that allows Premium users to test multiple new features simultaneously, departing from its previous single-experiment limitation, the company said.
I'm Curious If They'll Upgrade Existing Videos (Score:3)
I'll have to activate this and check the one video I uploaded with full-rate PCM audio. It might even qualify as a music video since it's a concert.
Since when is 256kbps considered high fidelity? (Score:5, Insightful)
I can hear the metallic, underwater-sounding oscillation that lower bit rates introduce when using my headphones, which is why I use lossless, or just leave the audio in an uncompressed format.
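For anyone who wants to check where their own ears stop noticing those artifacts, here's a minimal sketch (assuming ffmpeg with libmp3lame is installed; "source.wav" is just a placeholder filename) that encodes one lossless source at a few bitrates for a quick blind comparison:

```python
# Minimal sketch: encode one lossless source at several bitrates so you can
# blind-test where the "underwater" artifacts become audible for you.
# Assumes ffmpeg with libmp3lame is on PATH; "source.wav" is a hypothetical input.
import subprocess

SOURCE = "source.wav"          # hypothetical lossless rip
BITRATES = ["96k", "128k", "192k", "256k"]

for rate in BITRATES:
    out = f"test_{rate}.mp3"
    # -b:a sets a constant audio bitrate for the MP3 encoder
    subprocess.run(
        ["ffmpeg", "-y", "-i", SOURCE, "-c:a", "libmp3lame", "-b:a", rate, out],
        check=True,
    )
    print("wrote", out)
```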
Re:Since when is 256kbps considered high fidelity? (Score:4, Interesting)
What's more annoying to me is that some TV channels have even worse quality. One channel that broadcasts music most of the time uses 128kbps MP2. Other channels pair that bitrate with HD video, while 192kbps MP2 (reasonable quality) comes with lower-quality video, so you basically have to choose between better video and better audio, because you're not getting both.
For me 256kbps MP3 sounds good enough, but I still prefer lossless or uncompressed. I have enough HDD space, so I might as well keep my CD rips as WAV. I would rather listen to a damaged, worn-out analog tape than to a lossy file at too low a bitrate; if I can hear the artifacts, they annoy me much more than a simple loss of high frequencies.
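For the curious, the storage numbers behind "I have enough HDD space" work out roughly like this (the ~60% FLAC ratio is just an illustrative assumption; real sizes vary by material):

```python
# Back-of-the-envelope numbers: uncompressed Red Book PCM vs. an assumed ~60% FLAC ratio.
SAMPLE_RATE = 44_100      # Hz
CHANNELS = 2
BYTES_PER_SAMPLE = 2      # 16-bit

bytes_per_second = SAMPLE_RATE * CHANNELS * BYTES_PER_SAMPLE   # 176,400 B/s
album_minutes = 74
wav_bytes = bytes_per_second * album_minutes * 60
flac_bytes = wav_bytes * 0.6                                   # assumed ratio

print(f"PCM bitrate: {bytes_per_second * 8 / 1000:.0f} kbps")   # ~1411 kbps
print(f"74-minute album as WAV:  {wav_bytes / 1e6:.0f} MB")     # ~783 MB
print(f"74-minute album as FLAC: {flac_bytes / 1e6:.0f} MB")    # ~470 MB
```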
Re: (Score:3)
CD is only 16-bit.
If they are doing 24-bit with perceptual coding down to 256kbps from analog masters, you might hear a difference.
I can only tell with my studio headphones or my stereo system with studio monitors and a 15" subwoofer.
I use those *so rarely* anymore that I almost don't care. And only some of my '70s music was recorded with enough dynamic range to notice. Most of the remasters of those use heavy compression to make them sound better in the car. The old versions are often unavailable.
Re: (Score:3)
16-bit is already at the maximum of human hearing; Philips and Sony weren't a bunch of amateurs when they concocted the CD format. Higher bit depths are for the audiophile believers.
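As a rough sketch of the arithmetic behind that, using the standard 6.02N + 1.76 dB rule of thumb for an N-bit quantizer driven by a full-scale sine (the plain 6.02N version gives the ~96 dB figure usually quoted for CD):

```python
# Quantization SNR for an N-bit full-scale sine: about 6.02*N + 1.76 dB.
def quantization_snr_db(bits: int) -> float:
    return 6.02 * bits + 1.76

for bits in (16, 24):
    print(f"{bits}-bit: ~{quantization_snr_db(bits):.0f} dB")   # 16 -> ~98 dB, 24 -> ~146 dB
```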
What's more important is a clean, minimally tampered-with recording that captures the acoustics of the instruments and the recording venue, in the case of classical, jazz, rock, and anything else that isn't synthesised like trance (don't misunderstand, I have music I like in just about every genre).
I ripped all my CDs to Ogg Vorbis quality
Re: (Score:2)
"16 bit is already at the maximum of human hearing, Philips and Sony weren't a bunch of amateurs when they concocted the CD format. Higher bitrates is for the audiophile believers."
Redbook CD audio is a good compromise, not the "maximum of human hearing". If you can't personally hear any better, that's you, not everyone else. I can notice the difference between 16-bit/44.1kHz and 24-bit/96kHz on my Hi-Fi set-up, but only when I know by heart every speck of sound from the CD (favourite songs). Still, the difference exists.
Re:Since when is 256kbps considered high fidelity? (Score:4, Informative)
I can notice the difference between 16-bit/44.1kHz and 24-bit/96kHz on my Hi-Fi set-up, but only when I know by heart every speck of sound from the CD (favourite songs). Still, the difference exists.
Noticing a difference doesn't necessarily mean the higher-resolution format provides increased fidelity. Obligatory in-depth article about high-resolution audio [xiph.org]. The article is mainly about 24/192 audio, but the same reasoning applies.
In particular, about hearing differences between CD quality and higher resolutions:
192kHz digital music files offer no benefits. They're not quite neutral either; practical fidelity is slightly worse. The ultrasonics are a liability during playback.
Neither audio transducers nor power amplifiers are free of distortion, and distortion tends to increase rapidly at the lowest and highest frequencies. If the same transducer reproduces ultrasonics along with audible content, any nonlinearity will shift some of the ultrasonic content down into the audible range as an uncontrolled spray of intermodulation distortion products covering the entire audible spectrum. Nonlinearity in a power amplifier will produce the same effect. The effect is very slight, but listening tests have confirmed that both effects can be audible.
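As a toy illustration of that intermodulation argument (my own sketch, not anything from the article): feed two purely ultrasonic tones through a slightly nonlinear stage and see what lands back in the audible band.

```python
# Two ultrasonic tones through a mildly nonlinear "speaker": second-order
# distortion puts a difference tone at |f1 - f2| = 3 kHz, squarely in the audible band.
import numpy as np

fs = 192_000                      # high sample rate so 30/33 kHz tones are representable
t = np.arange(fs) / fs            # one second
signal = 0.5 * np.sin(2 * np.pi * 30_000 * t) + 0.5 * np.sin(2 * np.pi * 33_000 * t)

# Mild second-order nonlinearity standing in for transducer/amp distortion
distorted = signal + 0.05 * signal**2

spectrum = np.abs(np.fft.rfft(distorted)) / len(t)
freqs = np.fft.rfftfreq(len(t), 1 / fs)

# Restrict to the audible band and find the strongest component
audible = (freqs > 20) & (freqs < 20_000)
peak = freqs[audible][np.argmax(spectrum[audible])]
print(f"strongest in-band component: {peak:.0f} Hz")   # ~3000 Hz difference tone
```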
About 16 vs 24 bits:
Thus, 16 bit audio can go considerably deeper than 96dB. With use of shaped dither, which moves quantization noise energy into frequencies where it's harder to hear, the effective dynamic range of 16 bit audio reaches 120dB in practice, more than fifteen times deeper than the 96dB claim.
120dB is greater than the difference between a mosquito somewhere in the same room and a jackhammer a foot away.... or the difference between a deserted 'soundproof' room and a sound loud enough to cause hearing damage in seconds.
16 bits is enough to store all we can hear, and will be enough forever.
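The dither claim is also easy to check numerically. A small sketch (my own, using plain TPDF dither rather than the shaped dither the article describes; the tone level is arbitrary): a tone sitting well below one 16-bit step disappears under plain rounding but survives quantization once dither is added.

```python
# A tone at ~0.25 LSB (roughly -108 dBFS) vanishes under plain rounding to 16 bits,
# but is preserved (buried in noise) when TPDF dither is added before quantization.
import numpy as np

rng = np.random.default_rng(0)
fs, f = 44_100, 1_000
t = np.arange(fs) / fs
lsb = 1 / 32768                                  # one 16-bit step, full scale = 1.0
tone = 0.25 * lsb * np.sin(2 * np.pi * f * t)

def quantize(x):
    return np.round(x / lsb) * lsb

plain = quantize(tone)
tpdf = (rng.random(fs) - rng.random(fs)) * lsb   # triangular dither, +/- 1 LSB
dithered = quantize(tone + tpdf)

def tone_level(x):
    # Correlate against the known sine to estimate how much of it survived
    ref = np.sin(2 * np.pi * f * t)
    return 2 * np.dot(x, ref) / len(x)

print("undithered tone amplitude:", tone_level(plain) / lsb, "LSB")      # ~0: gone
print("dithered tone amplitude:  ", tone_level(dithered) / lsb, "LSB")   # ~0.25: still there
```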
Oh great (Score:5, Funny)
Now I'll need to save up for a tube amplifier with Monster cables so I can pretend to hear high-quality audio that I haven't been able to hear since I was a kid.
Audiophool here... (Score:4, Funny)
I'm not interested unless the individual bits are cryogenically treated and the datacenter cables are woven from virgin yak hair by blonde-headed maidens.
Not seeing it (Score:2)