dbishopbliss
New member
From what I understand, CD players have a nominal output voltage of 2.5V.
Does this mean that preamps and line stages designed before CD players WILL get overloaded by the signal from a CD player, or is the correct term MIGHT? In other words, do CD players regularly output this much voltage, or only on very loud source material?
Does this mean that if I choose an operating point for a line stage (or the driver tube in an integrated amp) with a grid bias of less than 2.5V, it will (or might) get overdriven by the CD player?
The reason I'm asking is that the tube I'm looking at is more linear near the -2V grid line than near the -3V grid line. However, that linearity won't do me much good if the tube is overdriven.
Could I simply add a padding resistor at the input to attenuate the incoming signal to less than 2V?
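To show what I'm picturing, here's a quick sketch of the numbers (Python; the resistor values are made up for illustration, and the 2.5V and 2V figures are just the ones from above, not anything from a spec):

```python
# Quick sanity check of the numbers in the question above.
# Assumptions (from the post, not from any spec): CD output ~2.5 V peak,
# grid bias of -2 V on the line stage tube, and made-up pad resistor values.

def divider_out(v_in, r_series, r_shunt):
    """Peak voltage at the grid for a simple series/shunt resistive pad."""
    return v_in * r_shunt / (r_series + r_shunt)

v_cd_peak = 2.5      # peak signal from the CD player (figure used in the question)
grid_bias = 2.0      # magnitude of the grid bias at the proposed operating point

# Without padding: the signal swing exceeds the bias, so the grid gets driven positive
print(f"Headroom without pad: {grid_bias - v_cd_peak:+.2f} V")   # -0.50 V -> overdriven

# With a pad: e.g. 27k series into a 100k grid resistor (hypothetical values)
v_grid = divider_out(v_cd_peak, 27e3, 100e3)
print(f"Peak at grid with pad: {v_grid:.2f} V")                  # ~1.97 V, just under 2 V
```

Obviously the series resistor would interact with the volume pot and the stage's input impedance, so the actual values would have to take that into account.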