Is 24 bit better than 16 bit?



Kellen
2009-11-06, 09:56
All of this talk about the new Touch being able to play 24 bit files and the new Beatles remasters being offered in 24 bit FLAC files has me wondering why all the fuss over the higher bit depth offerings.

As I had it explained to me, the only thing 24 bit gives you over 16 bit is an increase in the dynamic range that can be encoded. In this case that's an increase of 48dB, from 96dB (16 bits) to 144dB (24 bits). But given that, in the case of the Beatles anyhow, the music originally recorded doesn't have a dynamic range anywhere approaching the 96dB of 16 bit audio, why is the extra 48dB even needed?
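For what it's worth, those dB figures fall straight out of the bit depth (roughly 6dB per bit). A quick Python sketch, just to show where the numbers come from:

```python
import math

def dynamic_range_db(bits: int) -> float:
    """Theoretical dynamic range of an N-bit quantiser: 20*log10(2**N)."""
    return 20 * math.log10(2 ** bits)

print(f"16 bit: {dynamic_range_db(16):.1f} dB")  # about 96 dB
print(f"24 bit: {dynamic_range_db(24):.1f} dB")  # about 144 dB
```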

I also read that the added bit depth resolves the music more accurately in the dynamic range where the music actually lies since you will be quantizing the amplitude at more sample points in the process.

Anyone technical enough to explain this so that I might be better informed whether or not to replace my Classic with a Touch simply to play higher bit depth music?

Thanks in advance

pfarrell
2009-11-06, 10:03
Kellen wrote:
> Anyone technical enough to explain this so that I might be better
> informed whether or not to replace my Classic with a Touch simply to
> play higher bit depth music?

The answer is simple: don't. Get a new Touch because you are tired of
looking at the Classic. Or because you want a touch screen, a full
color display, or...

There is a small benefit to 24 bit music, but only on music that has
either explicitly been remastered for 24 bits, or better, recorded,
mixed, processed, and produced in 24 bits. Better still, if it was done
all in 32 or more bits up until the 24 bit version was output and sold.

If a crooked record company takes the 16 bit files, converts them to 24
bits and sells them to you again (this is what, the fifth time? single,
LP, cassette, CD, now 24 bit), then there is no information in the extra
8 bits.

Acoustic music has a much higher chance of sounding better in 24 bits,
most pop/rock/hiphop/etc is so processed that it makes zero difference.

--
Pat Farrell
http://www.pfarrell.com/

funkstar
2009-11-06, 10:10
If a crooked record company takes the 16 bit files, converts them to 24
bits and sells them to you again (this is what, the fifth time? single,
LP, cassette, CD, now 24 bit), then there is no information in the extra
8 bits.
Surely if they go back to the original analogue masters to produce the 24bit recordings there would be a benefit though?

pfarrell
2009-11-06, 10:17
funkstar wrote:
> Surely if they go back to the original analogue masters to produce the
> 24bit recordings there would be a benefit though?

Not convinced. If you take the Beatles reissues as an example, they were
recorded a long time ago, and designed for mono.

16 bit signals have 96 dB of signal/noise ratio, and the analog tapes of
the 60s simply didn't have that much. Maybe they had 70 to 75 dB of
signal over noise. And they sure didn't have 20-20kHz bandwidth.

So the answer is "maybe": if they were pulled off with 14 bits or so of
real signal, there could be some value in processing them at 24 bits,
and thus a chance that 18 bits of value is there.

All you can know for sure is that properly done 24 bit signals have less
worry about dither.

--
Pat Farrell
http://www.pfarrell.com/

gdpeck
2009-11-06, 10:30
...
...
...If a crooked record company takes the 16 bit files, converts them to 24
bits and sells them to you again (this is what, the fifth time? single,
LP, cassette, CD, now 24 bit), then there is no information in the extra
8 bits...


You left out 8-track. Also the loudness-war victim CD remasters.

mudlark
2009-11-06, 10:43
The 24bit B&W music sounds very good.

Phil Leigh
2009-11-06, 10:58
The Beatles remasters were done by copying the original tape masters to 24/192 and all operations were done at that depth/rate until they were downsampled/dithered to redbook. Working at these rates/depths gives more technical "wiggle room" and reduces "errors" in the process (in theory).
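As an aside, the "dithered" step can be illustrated with a toy example. This is a generic TPDF-dither sketch of my own, not the actual tool chain used on the remasters:

```python
import random

random.seed(1)
BITS = 16
STEP = 2.0 / (2 ** BITS)   # quantisation step for a full-scale range of [-1, 1]

def quantize_dithered(x: float) -> float:
    """Quantise with TPDF dither: add triangular noise of ~1 LSB peak before rounding."""
    dither = (random.random() - random.random()) * STEP
    return round((x + dither) / STEP) * STEP

# Dither turns quantisation distortion into benign noise: the *average*
# output converges on the true input, even for values between levels.
x = 0.1234567
avg = sum(quantize_dithered(x) for _ in range(20000)) / 20000
print(abs(avg - x) < STEP)  # True
```

Averaging away the error like this is why properly dithered conversions down to 16 bit are considered safe: the quantisation distortion becomes uncorrelated noise rather than artefacts tied to the signal.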

I live in hope that one day they will be available as at least 24/48 downloads.

All the 24/48 (or higher) files I have sound universally better than their redbook versions via the Touch. Now of course this may be because the masters are better. I don't care about the technicalities, I care about the music and these versions sound better for whatever reason.

eq72521
2009-11-06, 15:48
The Beatles remasters were done by copying the original tape masters to 24/192 and all operations were done at that depth/rate until they were downsampled/dithered to redbook. Working at these rates/depths gives more technical "wiggle room" and reduces "errors" in the process (in theory).

I live in hope that one day they will be available as at least 24/48 downloads.

All the 24/48 (or higher) files I have sound universally better than their redbook versions via the Touch. Now of course this may be because the masters are better. I don't care about the technicalities, I care about the music and these versions sound better for whatever reason.

They're probably just mastered louder. ;P

Phil Leigh
2009-11-07, 01:16
They're probably just mastered louder. ;P

Actually, they are mostly slightly quieter.

cliveb
2009-11-07, 01:43
All of this talk about the new Touch being able to play 24 bit files
[snip]
Anyone technical enough to explain this so that I might be better informed whether or not to replace my Classic with a Touch simply to play higher bit depth music?
Remarkably, in all of the replies so far, nobody has pointed out that your SB Classic can play 24 bit files anyway. The Touch can do 96kHz sample *rate* (while the Classic only does 48kHz), but they both support the same bit depth.


As I had it explained to me the only thing 24 bit gives you over 16 bit is an increase in the dynamic range that can be encoded.
True.


I also read that the added bit depth resolves the music more accurately in the dynamic range where the music actually lies since you will be quantizing the amplitude at more sample points in the process.
Unless the music you're sampling has signal below -96dB, this is false...

Suppose you have an original analogue signal with a S/N ratio of 80dB. What this means is that the noise in that signal makes it impossible to know what the instantaneous voltage of the signal really should be beyond a certain degree of accuracy. For an 80dB S/N ratio, the degree of that uncertainty is about 1 part in 10,000 (or thereabouts). If you digitally sample it at 16 bit accuracy (where the degree of uncertainty in measuring is about 1 part in 65,000), then all that happens is that you are unable to accurately sample the noise below the -96dB point. So you end up with a recording where all of the wanted signal and some of the noise (between -96dB and -80dB) is captured perfectly, and the rest of the noise (below -96dB) is lost and replaced by other noise (also below -96dB) that is generated by the digital sampling process.

If you sample the same analogue signal at 24 bit accuracy, you also perfectly capture the wanted signal, and this time get to accurately capture the noise down to -144dB. So by recording at 24 bit, all you do is more faithfully record the lower level noise.

(Note that all the figures used are theoretical maxima - in practice a typical 16 bit sampler will probably only get down to about -90 or -93dB, and a good 24 bit sampler will get down to about -120dB).
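(The "1 part in N" figures above come straight from the dB definition - an amplitude ratio is 10^(dB/20) - which is easy to check:)

```python
# Amplitude ratio implied by a dB figure: 10 ** (dB / 20)
print(10 ** (80 / 20))    # 10000.0 -> "1 part in 10,000" for an 80 dB S/N signal
print(10 ** (96 / 20))    # ~63096  -> roughly the 16-bit "1 part in 65,000"
print(2 ** 16, 2 ** 24)   # 65536 and 16777216 quantisation levels
```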

Kellen
2009-11-07, 13:44
Remarkably, in all of the replies so far, nobody has pointed out that your SB Classic can play 24 bit files anyway. The Touch can do 96kHz sample *rate* (while the Classic only does 48kHz), but they both support the same bit depth.

Ah, thanks Clive. That's a relief.



Unless the music you're sampling has signal below -96dB, this is false...

Suppose you have an original analogue signal with a S/N ratio of 80dB. What this means is that the noise in that signal makes it impossible to know what the instantaneous voltage of the signal really should be beyond a certain degree of accuracy. For an 80dB S/N ratio, the degree of that uncertainty is about 1 part in 10,000 (or thereabouts). If you digitally sample it at 16 bit accuracy (where the degree of uncertainty in measuring is about 1 part in 65,000), then all that happens is that you are unable to accurately sample the noise below the -96dB point. So you end up with a recording where all of the wanted signal and some of the noise (between -96dB and -80dB) is captured perfectly, and the rest of the noise (below -96dB) is lost and replaced by other noise (also below -96dB) that is generated by the digital sampling process.

If you sample the same analogue signal at 24 bit accuracy, you also perfectly capture the wanted signal, and this time get to accurately capture the noise down to -144dB. So by recording at 24 bit, all you do is more faithfully record the lower level noise.

(Note that all the figures used are theoretical maxima - in practice a typical 16 bit sampler will probably only get down to about -90 or -93dB, and a good 24 bit sampler will get down to about -120dB).
I'm curious, but technically challenged it seems, so I searched for additional info on these 24 bit Beatles remasters and happened upon this thread, which debates the technical benefits of 24 bits versus 16 bits for these Beatles releases.

http://www.stevehoffman.tv/forums/showthread.php?t=198394&page=11

It seems that the principals involved in this debate believe the benefit of 24 bit over 16 bit extends beyond simply going from 96dB to 144dB but rather involves capturing a "finer" value of amplitude at each sample in the range where the music actually lies.

Is this wrong or right?

It's still all confusing to me.

cliveb
2009-11-08, 03:04
It seems that the principals involved in this debate believe the benefit of 24 bit over 16 bit extends beyond simply going from 96dB to 144dB but rather involves capturing a "finer" value of amplitude at each sample in the range where the music actually lies.

Is this wrong or right?
Perhaps a thought experiment may help. Consider an analogue waveform that you're going to sample digitally. At every instant in time when a sample is taken, you measure its height. If you're sampling at 16 bit, the accuracy with which you can measure that height is 1 part in 65,536. If you're sampling at 24 bit, then the accuracy is 1 part in 16,777,216. Clearly, you get a much more accurate measure of the waveform's value at 24 bit.

BUT... the above refers to sampling a *perfect* waveform. In the real world, the analogue signal you're sampling has got noise in it. That noise means that the height of the analogue waveform at any instant in time is only an approximation of what it should be. (And if the analogue waveform has an S/N ratio of 80dB, its height at any instant in time is potentially in error to the tune of about 1 part in 10,000.) So measuring it at extremely high accuracy simply records the analogue noise better. Who cares about that? Noise is, by definition, random unwanted "stuff". It contains zero information, and capturing it faithfully does not improve the fidelity of the recording.
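To put some rough numbers on that, here's a small simulation of my own (an illustration, not anything from the thread): sample a sine wave with -80dBFS Gaussian noise added, quantise it at 16 and at 24 bits, and compare the quantisation error against the analogue noise floor:

```python
import math
import random

random.seed(0)

def quantize(x: float, bits: int) -> float:
    """Round x (assumed within [-1, 1]) to the nearest of 2**bits levels."""
    step = 2.0 / (2 ** bits)
    return round(x / step) * step

NOISE_RMS = 10 ** (-80 / 20)   # analogue noise floor at -80 dBFS
N = 10000
err16 = err24 = 0.0
for i in range(N):
    clean = 0.5 * math.sin(2 * math.pi * 440 * i / 44100)
    noisy = clean + random.gauss(0, NOISE_RMS)   # "analogue" signal with noise
    err16 += (quantize(noisy, 16) - noisy) ** 2
    err24 += (quantize(noisy, 24) - noisy) ** 2

q16_db = 20 * math.log10(math.sqrt(err16 / N))
q24_db = 20 * math.log10(math.sqrt(err24 / N))
print(f"16-bit quantisation error: {q16_db:.0f} dBFS")
print(f"24-bit quantisation error: {q24_db:.0f} dBFS")
```

Both quantisation errors come out well below the -80dB analogue noise (around -101dBFS for 16 bit, lower still for 24 bit), which is exactly the point above: once the source noise dominates, the extra 8 bits only describe that noise more faithfully.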