I tried comparing the receiver to a CD player playing the same track. They synced pretty easily, so it was easy to switch back and forth between them. At 128K, there was a noticeable difference. I wouldn't rule out the encoder (an older MusicMatch), but it bothered me.

At 128k, depending on the music, you might indeed hear a difference between the MP3 and the original. 128k is the bare-minimum level of encoding where Joe Six-pack won't know the difference. I know that I can clearly hear compression artifacts on some 128k files, and it's not subtle if you know what to listen for.

However, there are some other factors besides the MP3 encoding itself that you need to take into account before doing an A/B test like the one you described:

1) The CD player and the MP3s are going through different audio output circuitry, and as such, will have different equalization curves. So to begin with, you're not comparing apples to apples.

2) I don't know whether or not the MP3 was ripped correctly (i.e., no errors in the rip). That could make it sound different or bad, but wouldn't necessarily be the fault of the encoder. Remember that ripping and encoding are two different steps. (One way to sanity-check a rip is sketched below.)

Both of those can be taken care of by the one-burn audio testing scheme described earlier in this thread.
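As a side note on point 2: one rough way to rule out rip errors (this is not the one-burn scheme itself, just an illustration) is to rip the same track twice and confirm that the two WAV files carry identical PCM data. Here's a quick Python sketch; the file names are placeholders and assume you've already made two rips of the track:

import hashlib
import wave

def pcm_digest(path, chunk_frames=65536):
    # Hash only the PCM frames so differing WAV headers don't matter.
    digest = hashlib.sha256()
    wav = wave.open(path, "rb")
    try:
        while True:
            frames = wav.readframes(chunk_frames)
            if not frames:
                break
            digest.update(frames)
    finally:
        wav.close()
    return digest.hexdigest()

rip_a = pcm_digest("track01_rip_a.wav")   # first rip (placeholder name)
rip_b = pcm_digest("track01_rip_b.wav")   # second rip of the same track
print("Rips match" if rip_a == rip_b else "Rips differ -- suspect a bad rip")

If the two rips match, any difference you hear is almost certainly down to the encoder or the playback hardware, not the rip.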

_________________________
Tony Fabris