Unknown noise in the spectral density of an ideal sine wave

There is something I don’t understand: I compute the spectral density of a signal (by taking its FFT) and it seems to work correctly, but there is always some background noise, even though I run it on a perfect sine wave with two frequencies (10 and 30 Hz) that I generate myself.

Admittedly the noise is not really a problem, since it is only visible on a logarithmic scale, but where does it come from? Is this normal? Is there an error in my signal or somewhere else?
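The original post does not include code, but the setup described can be sketched roughly as follows (a minimal reconstruction using NumPy; the sampling rate of 1000 Hz and the 1-second duration are my assumptions, not from the question):

```python
import numpy as np

fs = 1000                 # assumed sampling rate, Hz
N = 1000                  # 1 s of signal -> 1 Hz bin spacing
t = np.arange(N) / fs

# "Perfect" two-tone sine wave: 10 Hz + 30 Hz
x = np.sin(2 * np.pi * 10 * t) + np.sin(2 * np.pi * 30 * t)

# One-sided energy spectral density: squared magnitude of the FFT
X = np.fft.rfft(x)
esd = np.abs(X) ** 2
freqs = np.fft.rfftfreq(N, d=1 / fs)
```

On a linear scale only the two peaks at 10 and 30 Hz are visible; on a log scale (`10 * np.log10(esd)`) the remaining bins show a very low but nonzero floor, which is the effect being asked about.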

[Figure: energy spectral density of a 10 Hz + 30 Hz sine wave]

1 answer

Mostly quantization noise, but there may also be a little noise from floating-point rounding errors, etc., in the FFT itself.

Your “perfect sine wave” cannot be represented perfectly in digital form, because you always have finite precision. The difference between the theoretical value of the waveform at the sampling instant and the actual sample value is called “quantization error”. For N-bit integer data, the error is usually approximately uniformly distributed over a range of ±0.5 LSB and appears “white”, i.e. it has an approximately flat spectrum. Obviously, the higher the sample resolution (the larger N), the smaller the quantization error, but since N cannot be infinite, there will always be a finite amount of quantization noise. For N = 16 bits, as used for example in CD-quality digital audio, the quantization noise floor is typically about 96 dB below full scale.
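The ±0.5 LSB model above can be checked numerically: quantize a full-scale sine to 16-bit levels and compare the measured signal-to-quantization-noise ratio with the textbook estimate of roughly 6.02·N + 1.76 dB (≈ 98 dB for a full-scale sine, consistent with a noise floor about 96 dB below full scale). This is a sketch, not the answerer’s code; the sampling rate and tone frequency are arbitrary choices:

```python
import numpy as np

fs = 1000
N = 4096
t = np.arange(N) / fs
x = np.sin(2 * np.pi * 9.7 * t)     # full-scale sine (9.7 Hz chosen arbitrarily)

bits = 16
scale = 2 ** (bits - 1) - 1          # 32767 levels per polarity for 16-bit
xq = np.round(x * scale) / scale     # quantize to 16-bit integer levels

err = xq - x                         # quantization error, bounded by +/- 0.5 LSB
rms_err = np.sqrt(np.mean(err ** 2))     # uniform model predicts LSB / sqrt(12)
rms_sig = 1 / np.sqrt(2)                 # RMS of a unit-amplitude sine

snr_db = 20 * np.log10(rms_sig / rms_err)  # theory: ~6.02*bits + 1.76 dB
```

The measured `snr_db` lands close to 98 dB, matching the “about 96 dB below full scale” figure quoted in the answer for the noise floor itself.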


Source: https://habr.com/ru/post/1403454/
