
"Money For Nothing" - supplemental: CD v. SACD

A.S.

Administrator
Staff member
The problems with the remaster are well-known. It's a real shame...
What I find really interesting in Mark's post #15 (to which I added the screen recording etc.) is that when we compare the original CD sound with the squashed and then loudness reduced remastered sound, there is a marked difference in subjective sound quality once the sound quantity (loudness) is approximately the same. The sort of subjective difference in quality that could so easily lead a home listener/reviewer to attribute adjectives to that difference. Did you notice that?

To make it clear, I've taken Mark's original and followed it with his compressed and then loudness-reduced version. Assume you had not read anything about compression, clipping, headroom or amplifier power supplies - in fact, that your technical knowledge and understanding were zero (and you were perfectly content with that situation). By just listening to these Dire Straits clips below - representing not two different masterings but two different amplifiers, turntables, cartridges, cables, speakers or whatever - would you say, for example, that one was more clinical and "digital" sounding, and the other more "analogue" sounding?



See how the loudness of a sound stream, and the musical context and spaciousness around it (all detected as loudness variations by the ear and converted into subjective impressions in the brain), absolutely and unequivocally define its perceived quality? Loudness - level - defines everything in audio, because we have no other way of detecting changes in the local air pressure around our heads, i.e. what humans call sound.

Remember! Loudness, a scientific quantity, defines subjective (non-scientific) quality. If there is no change in loudness, there can be no change in event detection and hence no true change in quality.
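If you want to try the level-matching step yourself before doing this kind of listening comparison, here is a minimal sketch of one way to do it, assuming Python with the numpy and soundfile packages and two hypothetical files, original.wav and remaster.wav: measure the average RMS of each clip and scale one so that both play back at approximately the same loudness.

    # Level-match two clips by average RMS before comparing them by ear.
    # Minimal sketch: assumes numpy and soundfile are installed, and two
    # hypothetical stereo files "original.wav" and "remaster.wav".
    import numpy as np
    import soundfile as sf

    def rms_db(x):
        # Average RMS of the whole clip, in dB relative to full scale.
        return 20 * np.log10(np.sqrt(np.mean(x ** 2)))

    original, sr_orig = sf.read("original.wav")   # float samples in -1.0..+1.0
    remaster, sr_rem = sf.read("remaster.wav")

    diff_db = rms_db(remaster) - rms_db(original)
    print(f"Average RMS difference (remaster minus original): {diff_db:+.2f} dB")

    # Scale the remaster so both clips share the same average RMS level.
    matched = remaster * 10 ** (-diff_db / 20)
    sf.write("remaster_matched.wav", matched, sr_rem)

Only once the loudness is matched like this does any remaining difference tell you something about quality rather than quantity.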
 

Pluto

New member
Statistics from discs


My copy is a Super Bit Mapping version done in 1996 by Bob Ludwig of Masterdisk. I looked at the stats for the whole of track 2 "Money for Nothing": the left channel peaks at -8.52 dB with an average RMS of -17.92 dB; the right channel peak is -9.1 dB with an average RMS of -18.22 dB.

I would be curious to know the equivalent numbers for the original CD and SACD.
Here are the numbers from analysing the 1985 Red Book CD:

Peak Amplitude (per channel): -0.01 / 0.00 dB
Average RMS Amplitude (per channel): -27.18 / -27.00 dB

The same numbers from the 20th Anniversary SACD (DSD layer):

Peak Amplitude (per channel): -4.41 / -4.55 dB
Average RMS Amplitude (per channel): -22.91 / -22.83 dB

When comparing these figures, bear in mind that when converting between DSD and PCM, the loudest signal that is supposed to be recorded in DSD converts to -6 dBFS in PCM. This is because, unlike PCM, in which the loudest possible signal is defined unambiguously (all bits set to 1), the DSD world is not so precise, so there is a mandatory headroom of 6 dB to allow for the possibility that some DSD material might excurse above -6 dB when converted to PCM.
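If you would like to pull the same statistics from your own rip, here is a minimal sketch of how it can be done, assuming Python with the numpy and soundfile packages and a hypothetical rip of the track called track02.wav; it prints the per-channel peak and average RMS in dB relative to full scale, which is what the figures above show.

    # Per-channel peak amplitude and average RMS for a ripped track.
    # Minimal sketch: assumes numpy and soundfile, and a hypothetical
    # stereo rip called "track02.wav".
    import numpy as np
    import soundfile as sf

    audio, samplerate = sf.read("track02.wav")            # shape: (samples, channels)

    for ch in range(audio.shape[1]):
        x = audio[:, ch]
        peak_db = 20 * np.log10(np.max(np.abs(x)))         # peak amplitude, dBFS
        rms_db = 20 * np.log10(np.sqrt(np.mean(x ** 2)))   # average RMS, dBFS
        print(f"Channel {ch}: peak {peak_db:6.2f} dB, average RMS {rms_db:6.2f} dB")

When setting such numbers against figures taken from a DSD transfer, remember to allow for the roughly 6 dB of mandatory headroom described above.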

It might be interesting for you to post a screen shot of the entire track on your copy of the disc, as in post #5. After all, every picture tells a story!
 

Don Leman

Member
Clarification of the measurements


It might be interesting for you to post a screen shot of the entire track on your copy of the disc, as in post #5. After all, every picture tells a story!
Thanks for posting your numbers. Perhaps one of the statements I made was misleading. When I said peak I was referring to the Maximum RMS Power. Here are the actual numbers in full.

http://www.flickr.com/photos/lemanfamily/9562457235/

Here is the screen shot of the whole track.

http://www.flickr.com/photos/lemanfamily/9565248516/in/photostream/
 

Pharos

Member
Implications for level variability...


I agree wholeheartedly with what Alan says in post 21, but the implications for the subjective evaluation of Hi-Fi are enormous.

Perhaps even relatively subtle changes in recording dynamics can influence - and may well have influenced in the past - evaluations of equipment, perhaps even by reviewers, and this may in turn have influenced sales of new equipment.

Perturbing.
 

EricW

Active member
Digital v. analogue?


Ummm. This is interesting. It's a pity nobody has addressed my question (in the other thread), because to my ears, Mark's audio examples of current remastering comprehensively explain the "digital" v. "analogue" sound ...
OK, I don't know if "digital" or "analogue" are the descriptors I'd use, but to my ears, the non-remastered track clearly sounds better. It sounds more like real instruments in a real space. The leading edges are all present. The remastered version sounds, to me, dull and foggy - detail is lost, and everything sounds somewhat alike. Maybe some would describe the former as "analog(ue)" and the latter as "digital", but that doesn't seem to make much sense.
 

A.S.

Administrator
Staff member
... so much 'nicer'


OK, I don't know if "digital" or "analogue" are the descriptors I'd use, but to my ears, the non-remastered track clearly sounds better. It sounds more like real instruments in a real space. The leading edges are all present. The remastered version sounds, to me, dull and foggy - detail is lost, and everything sounds somewhat alike. Maybe some would describe the former as "analog(ue)" and the latter as "digital", but that doesn't seem to make much sense.
[Referring to my post 21 above]

... But don't you think that the second clip in the example (repeated below) sounds so much "warmer" and 'nicer', and that those are the very attributes which we so often read that listeners seek out in their home audio? In contrast, the opening clip sounds so much "harsher", "colder", "analytical", "thin", all summarised by the dreaded words . . . "digital sounding".


If you acclimatised to the softer second example in the video above, you would almost certainly reject the opening example as being unnatural.

Hence, at a stroke, the explanation of the "digital" v. "analogue" sound?
 

Pluto

New member
A challenge - send me your high res recordings for downsampling ...


Hi Pluto (or anyone else) -
I personally need some clarification here, as I am somewhat confused.
A high resolution disc replayed on a suitable player will output a certain sound quality; when this same disc is then 'downgraded' (in effect) to normal Red Book standard, how can you expect to hear the original high resolution sound?
What needs to be explained here is that most uncompressed digital standards are capable of better resolution than the human ear can perceive.

The earliest digital standard to which most people (over 30) were knowingly exposed is 16 bit at a sampling rate of 44.1kHz, abbreviated to 16/44 and known as “Red Book”, after the official Philips/Sony (CD inventors) publication which detailed everything you needed to know to create CDs and build a player using only a bag of sand and a blowtorch.

Back in the 1970s (when optical storage and digital audio were really cutting edge stuff) 16/44 was thought to be the lowest resolution consistent with the joint goals of adequate sound quality, manageable amounts of data and compatibility with the storage technology available at that time for mastering and manufacture.

These days, those who would sell you hardware and software to deal with recordings of greater resolution than Red Book will invoke all kinds of secrets and lies to convince prospective purchasers just how poor 16/44 is, and how much you need their higher resolution solutions, even to improve upon your gramophone turntable. HOWEVER, the fact remains that the jury is still out on the question of whether or not 16/44 can offer everything the human ear needs, any additional data accomplishing little in terms of audible satisfaction.

In an attempt to prove or disprove this contention, I have a standing challenge open to anybody. Send me whatever high resolution material you wish, and I will convert it to Red Book standard, 16/44. The idea is then to see if you can tell the two apart by listening alone*. So far, my record on this is rather close to 100%, insufficient statistical data preventing me from claiming 100%.

* This is vital if the test is to have real meaning. It is easy to determine the sampling rate & bit depth of a recording by simple inspection. The point is to determine whether or not Red Book offers everything the normal human ear needs.
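For the curious, the conversion itself involves nothing exotic. As an illustration only (not a description of any particular mastering chain), here is a minimal sketch assuming Python with the soundfile and soxr packages and a hypothetical 24-bit/96kHz source file called hires.flac: resample to 44.1kHz, add a little flat TPDF dither, and write out a 16-bit file.

    # Convert a high resolution file to Red Book resolution (16-bit / 44.1 kHz).
    # Illustrative sketch only: assumes the soundfile and soxr packages and a
    # hypothetical 24/96 source file called "hires.flac".
    import numpy as np
    import soundfile as sf
    import soxr

    audio, rate = sf.read("hires.flac")              # float samples, original rate
    if rate != 44100:
        audio = soxr.resample(audio, rate, 44100)    # high quality rate conversion

    # Flat TPDF dither of about one 16-bit LSB, added before requantisation.
    lsb = 1.0 / 32768
    dither = (np.random.uniform(-0.5, 0.5, audio.shape)
              + np.random.uniform(-0.5, 0.5, audio.shape)) * lsb
    sf.write("redbook.wav", audio + dither, 44100, subtype="PCM_16")

Even after this step the file still carries a theoretical dynamic range of around 96 dB and a frequency response out to about 22 kHz, which is the basis of the contention that 16/44 may already offer everything the ear needs.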
 

Pluto

New member
Creeping limiter malaise


So it would appear that your 1996 specially remastered version shows clear signs of the limiter malaise that has since become the noose around the neck of popular music recording.

It is interesting to note that the early technical studio guidelines for CD mastering advised that the upper 10 dB should be reserved for handling occasional transient peaks.

It is truly daft that we have arrived where we now find ourselves.
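One quick way to see the squeeze at work is to look at the gap between peak and average RMS level - the crest factor - for each version. A small sketch in Python, using the left-channel peak and average RMS figures for the 1985 CD and the SACD quoted earlier in the thread (the 1996 figures above are RMS-based, so they are left out):

    # Crest factor (peak minus average RMS, in dB) from the figures posted above.
    # The wider the gap, the more of the old "upper 10 dB" transient headroom survives.
    figures = {
        "1985 Red Book CD": {"peak": -0.01, "rms": -27.18},                    # left channel
        "20th Anniversary SACD (DSD layer)": {"peak": -4.41, "rms": -22.91},   # left channel
    }
    for name, f in figures.items():
        print(f"{name}: crest factor {f['peak'] - f['rms']:.2f} dB")

The later the mastering, the smaller that number tends to become, which is exactly the creeping limiter malaise described above.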
 

PhilN

New member
A marketeer's 'better' is not necessarily 'better'


What needs to be explained here is that most uncompressed digital standards are capable of better resolution than the human ear can perceive.

The earliest digital standard to which most people (over 30) were knowingly exposed is 16 bit at a sampling rate of 44.1kHz, abbreviated to 16/44 and known as “Red Book”, after the official Philips/Sony (CD inventors) publication which detailed everything you needed to know to create CDs and build a player using only a bag of sand and a blowtorch.

Back in the 1970s (when optical storage and digital audio were really cutting edge stuff) 16/44 was thought to be the lowest resolution consistent with the joint goals of adequate sound quality, manageable amounts of data and compatibility with the storage technology available at that time for mastering and manufacture.

These days, those who would sell you hardware and software to deal with recordings of greater resolution than Red Book will invoke all kinds of secrets and lies to convince prospective purchasers just how poor 16/44 is, and how much you need their higher resolution solutions, even to improve upon your gramophone turntable. HOWEVER, the fact remains that the jury is still out on the question of whether or not 16/44 can offer everything the human ear needs, any additional data accomplishing little in terms of audible satisfaction.

In an attempt to prove or disprove this contention, I have a standing challenge open to anybody. Send me whatever high resolution material you wish, and I will convert it to Red Book standard, 16/44. The idea is then to see if you can tell the two apart by listening alone*. So far, my record on this is rather close to 100%, insufficient statistical data preventing me from claiming 100%.

* This is vital if the test is to have real meaning. It is easy to determine the sampling rate & bit depth of a recording by simple inspection. The point is to determine whether or not Red Book offers everything the normal human ear needs.
The penny's just dropped - I wasn't thinking it through properly. I see what you were getting at in your original post.

Must admit to believing the marketing hype (having never done any comparison between the two) and thinking that the higher sampling rate etc. must be better!

Thanks for the extra information Pluto.
 