Finding the Minimum Mean Square Estimator for Scalar Parameter w

AI Thread Summary
The discussion focuses on finding the minimum mean square estimator (MMSE) for the scalar parameter w from the observation z = ln w + n. The probability density functions of w and n are given: f(w) is uniform on [0, 1] and f(n) is exponential. A participant attempts to derive the estimator via the relationship f(z|w) = f(n)/g'(n) evaluated at n = z - ln w. The forum response asks for the specific estimator obtained and the mean square error (MSE) calculations needed to verify that it is indeed the MMSE, emphasizing the need for clear calculations and a proof of the estimator's properties.
sant142
I am not able to understand how to go about this problem:

Find the minimum mean square estimator for the scalar parameter w based on the scalar observation z = ln w + n, where

f(w) = 1 if 0 <= w <= 1; 0 otherwise

and

f(n) = e^(-n) if n >= 0; 0 otherwise

I did f(z|w) = f(n)/g'(n) evaluated at n = z - ln w.

Am I wrong?
 
Hey sant142 and welcome to the forums.

Can you show us the estimator you got (as a function of the observation) and the calculations you obtained for the MSE (and a proof that it is the MMSE)?
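For what it's worth, here is a short Monte Carlo sketch (my own working, not verified in this thread) of what the posterior-mean answer should look like. Since g'(n) = 1 here, the likelihood comes out as f(z|w) = e^(-(z - ln w)) = w e^(-z) for w <= e^z, so the posterior of w given z is proportional to w on [0, min(1, e^z)], which gives the candidate estimator E[w|z] = (2/3) min(1, e^z). The simulation below checks that candidate against empirical conditional means; please rederive it rather than take it on faith:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate the model: w ~ Uniform(0,1), n ~ Exp(1), z = ln(w) + n
N = 1_000_000
w = rng.uniform(0.0, 1.0, N)
n = rng.exponential(1.0, N)
z = np.log(w) + n

def mmse(zval):
    # Candidate posterior mean: posterior f(w|z) is proportional to w
    # on [0, c] with c = min(1, e^z), so E[w|z] = (c^3/3)/(c^2/2) = 2c/3.
    c = np.minimum(1.0, np.exp(zval))
    return 2.0 * c / 3.0

# Empirical check: condition on z near z0 and compare the sample mean
# of w in that slice against the closed-form candidate.
z0 = -0.5
mask = np.abs(z - z0) < 0.01
print(w[mask].mean(), mmse(z0))
```

The two printed numbers should agree to a couple of decimal places; if they do for several values of z0, that is good evidence the derived posterior mean is the MMSE estimator.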
 