Ken
New member
I am trying to understand the significance of source output impedance versus load input impedance, and how it relates to perceived sound quality. A general rule of thumb I have come across on various internet sites advises that the output impedance of the source (DAC, preamp) should be about 10% or less of the input impedance of the load (amp). This raises a few questions:
1. Is this rule of thumb a generally reliable guide?
2. What audible effect, if any, does substantial deviation from the 10% guide cause?
As a hypothetical example, if the source output impedance is 1 kΩ and the amp input impedance is 100 kΩ (a 1% ratio), is that much deviation from the 10% guide going to be audible? If so, in what ways?
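For concreteness, here is my back-of-envelope math, assuming the connection behaves as a simple resistive voltage divider (output impedance in series with the amp's input impedance). The numbers are the hypothetical 1 kΩ / 100 kΩ case above, plus a case sitting right at the 10% rule:

```python
import math

def divider_loss_db(zout_ohms: float, zin_ohms: float) -> float:
    """Level drop (dB) when a source with output impedance Zout
    drives a load with input impedance Zin: Vin = V * Zin / (Zout + Zin)."""
    ratio = zin_ohms / (zout_ohms + zin_ohms)
    return 20 * math.log10(ratio)

# The hypothetical example: 1 kOhm source into a 100 kOhm amp input.
print(divider_loss_db(1_000, 100_000))   # ~ -0.086 dB

# Right at the 10% rule of thumb: 10 kOhm source into 100 kOhm.
print(divider_loss_db(10_000, 100_000))  # ~ -0.83 dB
```

If I have that right, the loss in either case is a flat, broadband level drop; whether there are frequency-dependent effects on top of that (e.g., if the amp's input impedance varies with frequency) is part of what I am asking.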