The Designer's Guide Community Forum
https://designers-guide.org/forum/YaBB.pl
Design >> RF Design >> Phase Noise in a chain of buffers/nand gates get improved?
https://designers-guide.org/forum/YaBB.pl?num=1391020636

Message started by john_ana on Jan 29th, 2014, 10:37am

Title: Phase Noise in a chain of buffers/nand gates get improved?
Post by john_ana on Jan 29th, 2014, 10:37am

Hi,

I am simulating a clock-driven divider plus 25% duty-cycle generation circuit. The divider's output is 50% duty cycle with I/Q phases. These clocks go through inverters and NAND gates to produce the 25% duty cycle.

I tried a couple of PSS+PNOISE methods: PNOISE with noise type=source, noise type=jitter, and noise type=timedomain.

From all simulation results, it seems the buffered I/Q has the worst phase noise/jitter, while the NANDed output gets better phase noise/jitter.

As shown in the attached noise type=source PNOISE sim result, node out1 is the divider output, node x2 is one buffer later, nodes I/Q are after multiple buffers, and node I25 is after the NAND gate.

I tried noise type=jitter with a 0.5 threshold, and find the integrated (100 kHz to 200 MHz) Jee is 6.7 fs at out1, about 24 fs at I, and only 4 fs at I25.

So phase noise/jitter gets improved along the chain of buffers/NAND gates? For a simple chain of inverters, I think phase noise/jitter degradation is cumulative, i.e. the jitters added at different stages are uncorrelated, and the jitter contributed in one stage cannot be corrected or compensated by later stages. How can the improved jitter/phase noise be explained?
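For reference, the accumulation model I have in mind (uncorrelated stage jitters adding in quadrature) can be sketched in a few lines of Python; the per-stage numbers below are invented for illustration, not taken from my simulation:

```python
import math

def accumulate_jitter(stage_jitters):
    """RMS jitter after a chain of stages whose added jitters are
    uncorrelated: contributions add in quadrature (root-sum-square)."""
    return math.sqrt(sum(j ** 2 for j in stage_jitters))

# Hypothetical per-stage RMS jitter contributions, in seconds:
stages = [6.7e-15, 5.0e-15, 4.0e-15]
total = accumulate_jitter(stages)
# Under this model the total can only grow as stages are added,
# which is why a *smaller* jitter at a later node is surprising.
```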

Title: Re: Phase Noise in a chain of buffers/nand gates get improved?
Post by Shady Adly on Feb 5th, 2014, 3:22pm

It depends on how you get the 25% duty cycle!

If you are doing it by NANDing the I/Q outputs of the divider with the undivided clock, then these results are very reasonable, because the zero crossings are determined by the less noisy undivided clock rather than by the noisy divided I/Q outputs.

But if you are doing it by NANDing I with Q' and so on, then the results are not reasonable, because, as you already mentioned, jitter and phase noise accumulate through the chain.

I am not a real expert in this, but I have experienced similar issues with the latter case. It is not very rational for driven circuits, but it can sometimes happen when the signal slope is not sharp and the swing is not rail-to-rail, when the buffer chain has an AC-coupled buffer, or when there are different supply voltages in the chain (a different LDO, perhaps, with a different noise spectrum).

Besides, from your plot, I don't think it is reasonable to take a 12 dB hit from just a few, or even several, buffers!

Maybe you can try the last pnoise method, noise type=modulated, to separate AM and PM noise, because you may have a high AM component that gets suppressed later.


Regards,
Shady

Title: Re: Phase Noise in a chain of buffers/nand gates get improved?
Post by loose-electron on Feb 17th, 2014, 1:54pm

Seeing a block diagram of the circuit would be useful.

Generally, a cascade of circuits will lead to progressively worse phase noise, with some exceptions.

Exceptions: filters or spectral shaping that remove noise components.

Title: Re: Phase Noise in a chain of buffers/nand gates get improved?
Post by deba on Mar 1st, 2018, 7:43am

Hi

It is a good thread. I ran into a similar problem recently. Please see the attached schematic. It is a simple sine-to-square buffer circuit. To simulate its phase noise, I use the SpectreRF tool from Cadence, with the noisetype=sources (timeaverage) method. The circuit is driven with an ideal sinusoid. To limit the AM noise, nodes inv1 and inv3 are clipped around their mid-points, as shown, and the phase noise analysis is run on the clipped outputs. I find that inv1_clip has higher noise than inv3_clip. Is this phenomenon real? My understanding is that in a cascade of inverters, phase noise should progressively degrade. Is my setup correct? Is there anything wrong?

Title: Re: Phase Noise in a chain of buffers/nand gates get improved?
Post by R.kumar on Apr 23rd, 2018, 6:06am

Noise-to-jitter conversion happens only around the sampling instant. So if you have probe points after each of your buffers, the jitter number obtained for any node assumes you do the thresholding/sampling at that node. The noise accumulates as you go from one stage to the next, but the noise-to-jitter transformation depends on both the amount of noise and the signal slope. So even though the output of the last stage carries the largest noise, if its slope is very sharp, only a small part of that noise is translated into jitter.
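A rough way to see this (my own first-order sketch, not from the simulation): the RMS jitter at a threshold crossing is approximately the RMS voltage noise divided by the signal slope at the crossing, so a later stage can carry more voltage noise yet show less jitter if its edges are much steeper. The numbers below are invented for illustration:

```python
def jitter_rms(v_noise_rms, slew_rate):
    """First-order noise-to-jitter conversion at a threshold crossing:
    sigma_t ~= sigma_v / (dV/dt at the crossing)."""
    return v_noise_rms / slew_rate

# Early stage: less accumulated noise, but a slow edge.
j_early = jitter_rms(v_noise_rms=1e-3, slew_rate=1e9)    # 1 mV, 1 V/ns  -> 1 ps
# Later stage: twice the accumulated noise, but a 20x sharper edge.
j_late = jitter_rms(v_noise_rms=2e-3, slew_rate=20e9)    # 2 mV, 20 V/ns -> 0.1 ps
```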

Title: Re: Phase Noise in a chain of buffers/nand gates get improved?
Post by R.kumar on Apr 23rd, 2018, 6:18am

You will see that this is not the case if you have a chain of ADCs. There, the noise-to-jitter transformation physically happens in the circuit and cannot be removed by succeeding stages. So if there is a change in frequency along the signal path, the noise-to-jitter conversion is permanent.

Title: Re: Phase Noise in a chain of buffers/nand gates get improved?
Post by shico on Jun 6th, 2018, 6:27am

Hi,

I believe you should use jitter, not sources, as the noise type. See here:
http://www.designers-guide.org/Theory/cyclo-paper.pdf

Shico


Title: Re: Phase Noise in a chain of buffers/nand gates get improved?
Post by Horror Vacui on Jul 22nd, 2018, 9:22am

Adding more stages in cascade will improve your phase noise only if the signal transitions get steeper and therefore reduce the uncertainty around the threshold point. This is called "aperture jitter". If you are not plagued by aperture jitter, though, then I doubt that more stages could help you.

It is impossible to reduce noise by adding more stages; it would be against the laws of physics. The only exceptions might be filters, which reduce the noise bandwidth to the signal bandwidth, or oversampling in ADCs/DACs, which reduces the spectral density of the quantization noise. Another seeming exception might be noise cancellation in LNAs, where the sign of the noise gain and the signal gain differs between two points. But even in that case, more noise is added into the system; the architecture just cancels some of it thanks to the higher signal gain of the added devices.
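The oversampling exception can be illustrated with the usual textbook model (a sketch under the white-quantization-noise assumption, not something measured in this thread): the total quantization noise power Δ²/12 is spread uniformly over [0, fs/2], so the portion falling inside a fixed signal band shrinks as fs grows:

```python
def inband_quantization_noise(delta, fs, bandwidth):
    """In-band quantization noise power under the white-noise model:
    total power delta^2/12 spread uniformly over [0, fs/2]."""
    total_power = delta ** 2 / 12.0
    return total_power * (bandwidth / (fs / 2.0))

# Same quantizer step, same 1 MHz signal band; 4x oversampling
# cuts the in-band quantization noise power by 4x (about 6 dB).
n_nominal = inband_quantization_noise(delta=1e-3, fs=10e6, bandwidth=1e6)
n_oversampled = inband_quantization_noise(delta=1e-3, fs=40e6, bandwidth=1e6)
```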

Title: Re: Phase Noise in a chain of buffers/nand gates get improved?
Post by rf-design on Aug 20th, 2018, 8:28am

Seems to be the same old problem as 16 years ago:

http://www.designers-guide.org/Forum/YaBB.pl?num=1036411197

The Designer's Guide Community Forum » Powered by YaBB 2.2.2!
YaBB © 2000-2008. All Rights Reserved.