Hi Folks!
To obtain the derivative of an analog signal, I use the ideal time-derivative operator, `V(out) <+ ddt(V(in));`, but I observe a lot of noise at the output. I believe this is because the block has no continuous states, so the solver cannot take small enough time steps and the output changes rapidly between steps.
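For context, the underlying effect is easy to reproduce numerically: differentiation multiplies each frequency component by its frequency, so even tiny wideband noise on the input dominates the derivative. Below is a minimal Python sketch (illustrative signal and noise values, not the actual circuit) comparing the finite-difference derivative of a clean and a slightly noisy sine wave.

```python
import numpy as np

fs = 1e6                      # sample rate (illustrative)
t = np.arange(0, 1e-3, 1/fs)  # 1 ms of data
rng = np.random.default_rng(0)

clean = np.sin(2*np.pi*1e3*t)                       # 1 kHz sine
noisy = clean + 1e-3*rng.standard_normal(t.size)    # ~0.1% wideband noise

# Finite-difference derivative: noise is amplified by ~1/dt,
# so a negligible input perturbation becomes large at the output.
d_clean = np.gradient(clean, t)
d_noisy = np.gradient(noisy, t)

rel_input_noise = np.max(np.abs(noisy - clean))           # tiny
rel_deriv_error = np.max(np.abs(d_noisy - d_clean)) / np.max(np.abs(d_clean))
print(rel_input_noise, rel_deriv_error)
```

The input noise is well under 1% of the signal amplitude, yet the resulting error in the derivative is a large fraction of the derivative's peak, which is the same amplification a raw `ddt` exhibits when the solver's step size varies.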
Could anyone please suggest a convenient way to eliminate this issue and obtain a clean, noiseless signal at the derivative output with adequate accuracy?
Thanks in advance!