In receiver brochures, some list 0.01% distortion and others 0.05%. Can we really hear the difference?
Some people certainly think they can, and manufacturers can get a little tricky with those measurements. Is that figure with all channels driven? At what power output does the receiver hit 0.01%? What about 0.05%? It gets harder and harder to maintain clean sound as more channels play and as output levels rise. That's where you can really separate the quality components.
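For a rough sense of scale, here is a back-of-envelope sketch (purely illustrative, using the standard conversion of an amplitude ratio to decibels) showing how far below the music those distortion figures sit:

```python
import math

def thd_to_db(thd_percent: float) -> float:
    """Convert a THD figure given in percent to decibels
    relative to the fundamental signal."""
    ratio = thd_percent / 100.0          # e.g. 0.05% -> 0.0005
    return 20.0 * math.log10(ratio)      # amplitude ratio expressed in dB

for spec in (0.01, 0.05):
    print(f"{spec}% THD is about {thd_to_db(spec):.0f} dB below the signal")
```

That works out to roughly -80 dB for 0.01% and -66 dB for 0.05%, both far below the level of the music itself, which is why the conditions behind the spec (channels driven, power level) often tell you more than the headline number.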