How to attenuate a digital signal?

alanbeeb

Grumpy young fogey
Folks - any advice appreciated.... The problem is that when feeding a digital signal into my Behringer 2496, I'm getting digital clipping occurring fairly regularly, and I can't see a way to lower the digital input sensitivity. I'm using the AES/EBU input, coming from a Monarchy 48/96 upsampler, so the input signal is at 96 kHz.

Any ideas? thanks
 
Digital clipping can occur when the input to an ADC is too high (or perhaps this is actually analogue clipping...) or when the processing of data overflows numerically.

You have two devices processing digitally, an 'upsampler' and the Behringer. You need to establish whether the input to the Behringer is clipped or whether the Behringer processing is clipping.
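Not from the thread, but a minimal Python sketch of the second case (numerical overflow), with made-up sample values and gain: if the processor applies digital gain to 16-bit samples that are already near full scale, the result exceeds the representable range and has to be clamped, which is what a digital clip indicator reports.

# Illustrative only: digital gain pushing signed 16-bit samples past full scale.
FULL_SCALE = 32767                 # maximum positive value of a signed 16-bit sample
samples = [20000, 28000, 32000]    # hypothetical input samples, already close to full scale
gain = 1.4                         # roughly +3 dB of digital boost (assumed value)

for s in samples:
    boosted = int(s * gain)
    clipped = boosted > FULL_SCALE
    out = min(boosted, FULL_SCALE)  # the processor must clamp here; this is digital clipping
    print(s, "->", out, "(clipped)" if clipped else "")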

Paul
 
What music are you listening to? A lot of recordings clip anyway! Classical is unlikely to.

Put the DEQ in complete bypass mode and see if it still clips. If it does, try adjusting the 'Gain Offset' in the utility menu.
 
Tried the gain offset.... the clipping is much worse if I turn it up, but it doesn't reduce if I turn it down. Quite possibly the clipping is present on the recordings, but I'm fairly sure I'm getting distortion that isn't happening when playing through the analogue inputs. Most of the rock stuff I've tried seems to be clipping most of the time; as you say, classical recordings only do it at the really loud bits.

To match levels with the analogue input I'm having to run the gain offset at -6.0 dB when using the digital input.
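For reference, a bit of arithmetic of my own (not from the manual): a -6.0 dB offset roughly halves the sample values, which is why it matches the analogue level, but it only scales the DEQ's internal processing; if the incoming data is already at full scale, the input clip indication won't change.

# Rough arithmetic only: converting a dB gain offset to a linear factor.
def db_to_linear(db):
    return 10 ** (db / 20.0)

print(round(db_to_linear(-6.0), 3))   # ~0.501, i.e. sample values roughly halved
print(round(db_to_linear(+6.0), 3))   # ~1.995, which fits with clipping getting worse when turned up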
 
Or simply add a couple of resistors to the signal, say 10k in series and 1k to ground. This should provide a simple way to see if that's the problem.
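If it helps, here is my own working for that divider, using the values suggested above (10k series, 1k shunt): the receiver chip would see roughly one eleventh of the voltage, about -21 dB on the electrical level of the stream (not on the encoded audio level).

# Quick check, assumed component values from the post: attenuation of a 10k / 1k resistive divider.
import math

r_series = 10_000.0   # ohms, in series with the signal
r_shunt = 1_000.0     # ohms, from the receiver input to ground

ratio = r_shunt / (r_series + r_shunt)       # fraction of the voltage the receiver chip sees
print(round(ratio, 3))                        # ~0.091
print(round(20 * math.log10(ratio), 1))       # ~ -20.8 dB of electrical attenuation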
 
I have been in contact with Monarchy's agents and they have offered to send parts and instructions on how to fit them; their proposed solution sounds like the one you are suggesting. Thanks.

Now...anyone handy with soldering nearby? :D
 
How can a resistor change the level of a digital signal? The level is encoded into the 16-bit word.
 
You're just reducing the voltage of the signal, not the content, i.e. reducing the voltage the receiver chip sees.
 
So the level the digital signal is sent at can cause clipping? Even though it has nothing to do with the music? o.O I would have thought that as long as the level is not so high that the data can't be decoded, it will be either on or off, and you will have the data just the same.

Please explain?
 
The way I see it... which could be too simplistic... is that the signal overwhelms the receiver chip...
 
Well I guess we will find out :)

I can see that it might cause some sort of distortion, but I'm not sure what type, or whether you would get any sound at all. The level might be either too high, alright, or too low. Thorsten would know surely! Where are ya mister?!
 
The signal level won't affect digital clipping - this only happens when the 16-bit word (or whatever word size is used) reaches its maximum possible value (i.e. 1111111111111111 in the simplest possible terms - it's most likely more complicated than that, and I don't know enough about PCM to be sure). As Paul says, it sounds like there's an ADC in the chain somewhere that's receiving too high an input...
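Not gospel, but a Python sketch of how clip detection is often described: rather than one full-scale sample, a run of consecutive samples stuck at full scale is flagged. The run threshold and sample values here are assumptions for illustration only.

# Illustrative sketch: flag a "clip" when several consecutive 16-bit samples sit at full scale.
FULL_SCALE = 32767
RUN_THRESHOLD = 3    # assumed: how many consecutive stuck samples count as one clip event

def count_clips(samples):
    clips = 0
    run = 0
    for s in samples:
        if abs(s) >= FULL_SCALE:
            run += 1
            if run == RUN_THRESHOLD:
                clips += 1
        else:
            run = 0
    return clips

print(count_clips([1000, 32767, 32767, 32767, -500]))   # 1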

Alan, what's the complete equipment chain you're using?

Dunc
 
Yes, you're right...... I'd have thought any audible clipping must be produced outside the decoding. It's possible that the DAC output is too high for the following analogue stages? The cure would still be the same, just with different positioning of the attenuation components. It's still worth trying to alter the input signal level to see if that cures the problem.
 