Hi,

I have a complex simulation where we expect both flicker noise AND thermal noise to cause problems. For a useful simulation I therefore need the noise spectrum from roughly 0.3 Hz to 10 MHz. The interesting timespan for the simulation is around 10us. So my sim parameters would be Fmax=10M, Fmin=0.3, tstop=3s+10us, and I only save the last 10us. But with that I surely get run times that are far too long.

So I tried to break the sim into two parts:

1. Simulate with flicker noise only (<10 kHz) up to 2.999s, then save that timepoint (intention: let the noise sources build up).
2. Simulate with full noise, restarting from the saved timepoint at 2.999s, and finish the sim (intention: the low-frequency noise sources keep evolving and thermal noise is added on top).

As I need statistics, I used a bunch of shell scripts to run 100 sims, removing the noiseseed parameter from each (so the seeds are random every time).

The results look quite(1) OK, but is this the way to go? Better would be to use a dynamic parameter, but that isn't featured (yet) for multiple transient runs, and I think Fmax isn't a parameter anyway...

(1) "quite OK" means: it fits the lab measurements better than other approaches, but in the 2nd sim all noise collapses to 0 V about 50 ns after the start and restarts quickly after that.
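For reference, the batch part can be sketched roughly like this. This is only a hedged sketch of the "100 runs with random seeds" step: the netlist name sim.net, the seed line noiseseed=..., and the simulator command mysim are placeholders for whatever your actual setup uses (the real call is left commented out).

```shell
#!/bin/sh
# Sketch: run N transient-noise sims, each with a fresh random seed.
# sim.net / noiseseed / mysim are hypothetical stand-ins for the real setup.
N=3   # bump to 100 for real statistics

# Toy netlist stand-in so the loop below is self-contained:
printf 'tran 1n 10u\nnoiseseed=12345\n' > sim.net

for i in $(seq 1 "$N"); do
    # Drop the noiseseed line so the simulator picks a random seed per run
    sed '/noiseseed/d' sim.net > "run_$i.net"
    # mysim "run_$i.net" -o "run_$i.raw"   # placeholder for the actual simulator call
done
```

The per-run results would then be post-processed together to get the noise statistics.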