Suppose we wish to estimate a probability density given the points {x_1, ..., x_n} using a histogram [tex]\hat{f}(x)[/tex].
I have a book that says [tex]Bias(\hat{f}(x))=E_f(\hat{f}(x))-f(x)=\frac{1}{2}f'(x)(h-2(x-b_j))+O(h^2)[/tex] for [tex]x\in(b_j,b_{j+1}][/tex].
Can someone explain where the second equality comes from? I'm pretty sure it's a Taylor expansion, but I'm not sure how to Taylor expand the expected value.
The notation is as follows:
[tex]h[/tex] is the width of the histogram bins.
[tex]b_j[/tex] and [tex]b_{j+1}[/tex] are the boundaries of the j-th bin.
[tex]\hat{f}(x)=n_j/(nh)[/tex] for [tex]x\in(b_j,b_{j+1}][/tex], where [tex]n_j[/tex] is the number of x points in the j-th bin, and [tex]n[/tex] is the total number of x points.
Any help is appreciated.
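For context, here is my attempt so far, using only the definitions above (I'm not certain this is the route the book intends). Since each point lands in the j-th bin with probability [tex]p_j=\int_{b_j}^{b_{j+1}}f(t)\,dt[/tex], we have [tex]n_j\sim\text{Binomial}(n,p_j)[/tex], so [tex]E_f(\hat{f}(x))=E_f(n_j)/(nh)=p_j/h[/tex]. Expanding [tex]f(t)=f(x)+f'(x)(t-x)+O(h^2)[/tex] for [tex]t[/tex] in the bin and integrating over [tex](b_j,b_j+h][/tex] gives [tex]p_j=hf(x)+f'(x)\,h\!\left(b_j-x+\tfrac{h}{2}\right)+O(h^3)[/tex], and dividing by [tex]h[/tex] yields [tex]E_f(\hat{f}(x))=f(x)+\tfrac{1}{2}f'(x)(h-2(x-b_j))+O(h^2)[/tex], which matches the book's expression. Is the expansion of the integrand inside [tex]p_j[/tex] the step I was missing?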