- #1
optrix
If I have the unit impulse response, h[n], of a discrete-time LTI system (is this also called the unit sample response?), how can I calculate the time taken for the output to fall below 1% of its initial value after a unit impulse is applied to the input?
In particular, I have:
[tex]h[n]=(\alpha^{-1}-\alpha)\,u[n-1]-\alpha\,\delta[n-1][/tex]
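As a rough numerical sanity check, here is a minimal Python sketch (not a derivation) that evaluates h[n] as written above for an assumed value of α (α = 0.5 here, purely for illustration) and looks for the first sample whose magnitude drops below 1% of the initial (first nonzero) value:

[code]
import numpy as np

# Assumed value of alpha, chosen only for illustration.
alpha = 0.5
N = 200  # number of samples to inspect

n = np.arange(N)
u_shift = (n >= 1).astype(float)      # u[n-1]
delta_shift = (n == 1).astype(float)  # delta[n-1]

# h[n] exactly as given in the post.
h = (alpha**-1 - alpha) * u_shift - alpha * delta_shift

i0 = np.flatnonzero(h)[0]  # index of the initial (first nonzero) value
idx = np.flatnonzero(np.abs(h[i0 + 1:]) < 0.01 * abs(h[i0]))
if idx.size:
    print("falls below 1% at n =", i0 + 1 + idx[0])
else:
    print("does not fall below 1% within the first", N, "samples")
[/code]

This only checks the samples numerically; it doesn't answer the analytical question of how to express that time in terms of α.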