Hi, my question relates to this topic as well, so I didn't create a new one:
How would you explain that the 'level one' sigma is systematically and substantially higher than the 'level zero' sigma? This is visible in the table in your last reply, and also in the first image of the post. I would expect similar values for both levels. I observe the same phenomenon in an OOK simulation using a measured MZM:
This behavior seems to be present regardless of the bias point, amplitude, MZM loss, position of the RF amplifiers in the transceiver, or bitrate.
In my simulations, the oscilloscope placed after the PD shows the expected amount of noise on the 'one' amplitude level, but almost no noise on the 'zero' level. I know this is what gives my eye diagram its appearance, but I can't seem to normalize it (i.e., obtain similar noise standard deviations for both amplitude levels).
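To make the observation concrete, here is a toy NumPy sketch (all the numbers and the shot-noise-only assumption are mine, not taken from my actual simulation) in which the only noise source has a variance proportional to the instantaneous photocurrent; it reproduces the same kind of level-dependent sigma:

```python
import numpy as np

# Hypothetical parameters -- placeholders, not values from my simulation.
rng = np.random.default_rng(0)
n_bits = 100_000
responsivity = 0.8          # PD responsivity, A/W
p_one, p_zero = 1e-3, 1e-5  # received power for '1' and '0', W (finite ER)
q = 1.602e-19               # electron charge, C
bandwidth = 10e9            # receiver electrical bandwidth, Hz

bits = rng.integers(0, 2, n_bits)
power = np.where(bits == 1, p_one, p_zero)
photocurrent = responsivity * power

# Shot noise: variance 2*q*I*B is proportional to the instantaneous
# photocurrent, so the 'one' level is noisier than the 'zero' level
# by construction.
sigma_shot = np.sqrt(2 * q * photocurrent * bandwidth)
samples = photocurrent + rng.normal(0.0, sigma_shot)

print("sigma_1 =", samples[bits == 1].std())
print("sigma_0 =", samples[bits == 0].std())
# sigma_1 / sigma_0 ~ sqrt(p_one / p_zero) = 10 in this toy model
```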
If I may attempt an explanation: for every component of the system, the noise is specified per unit of bandwidth (A/√Hz, etc.), and according to an RFSA at the end of my transceiver the spectral content is broader for ones than for zeros. Could that be a potential cause, or am I misunderstanding something?
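To put rough numbers on that hypothesis (both bandwidths below are invented placeholders, not measurements from my RFSA), here is a back-of-the-envelope check of how much sigma difference a bandwidth difference alone would produce:

```python
import numpy as np

# With a flat (white) noise density S, the integrated noise is
# sigma = S * sqrt(B), so the sigma ratio between the two levels
# depends only on the ratio of the occupied bandwidths.
S = 10e-12     # hypothetical noise current density, A/sqrt(Hz) (placeholder)
B_zero = 5e9   # effective bandwidth occupied by zeros, Hz (placeholder)
B_one = 15e9   # effective bandwidth occupied by ones, Hz (placeholder)

sigma_zero = S * np.sqrt(B_zero)
sigma_one = S * np.sqrt(B_one)
print(f"sigma_1/sigma_0 = {sigma_one / sigma_zero:.2f}")  # sqrt(3) ~ 1.73
```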
Thank you very much,