I have simulated a TSPC D flip-flop whose output Q is logic '1' at time zero+. When I run a Monte Carlo simulation at certain corners (FF or SF, say), Q at time zero+ comes out as logic '0' in some runs. This produces junk values for the delay expression I am evaluating in the Monte Carlo simulation. For example, the delay for a single FF corner run is about 200p, but at certain data points in a 2000-point Monte Carlo run the same measurement returns -5p, because of the initial value of Q at time zero+. These points corrupt the mean and standard deviation in my final result. Is there any way I can delete certain points from my 2000-run Monte Carlo simulation? Thank you.
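To illustrate the kind of filtering I have in mind, here is a rough post-processing sketch (assuming I export the per-run delay values to a hypothetical CSV file, mc_delays.csv, one delay per line); it just drops the runs with negative delays before recomputing the mean and standard deviation:

```python
# Rough post-processing sketch (not a simulator feature): discard runs whose
# delay came out negative because of the wrong initial value of Q, then
# recompute the statistics on the remaining Monte Carlo points.
# Assumes a hypothetical file "mc_delays.csv" with one delay value (in seconds) per line.
import csv
import statistics

with open("mc_delays.csv") as f:
    delays = [float(row[0]) for row in csv.reader(f) if row]

# Keep only physically meaningful (positive) delays.
valid = [d for d in delays if d > 0]

print(f"kept {len(valid)} of {len(delays)} Monte Carlo points")
print(f"mean  = {statistics.mean(valid):.3e} s")
print(f"stdev = {statistics.stdev(valid):.3e} s")
```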