Recent content by rayge
Iterative Minimization lemma proof
Good to know, thanks. I applied the chain rule as suggested and got \frac{d}{d\alpha} f(x^{k-1} + \alpha d^k) = \nabla f(x^{k-1} + \alpha d^k) \cdot d^k. From here, it seems reasonable that the \alpha satisfying \frac{d}{d\alpha} f(x^{k-1} + \alpha d^k) = 0 is our optimal value, but I don't know how to prove that from our assumptions. For...- rayge
- Post #6
- Forum: Calculus and Beyond Homework Help
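A worked version of this step, assuming the quadratic f(x) = \frac{1}{2}\|Ax - b\|^2_2 from Post #4 below, with the shorthand r = b - Ax^{k-1} (introduced here, not in the thread): \varphi(\alpha) = f(x^{k-1} + \alpha d^k) = \frac{1}{2}\|\alpha A d^k - r\|^2_2, so \varphi'(\alpha) = \alpha \|A d^k\|^2_2 - (A d^k)^T r. This is a convex quadratic in \alpha, so \varphi'(\alpha) = 0 really does characterize the minimizer, giving \alpha_k = \frac{(A d^k)^T r}{\|A d^k\|^2_2} whenever A d^k \neq 0.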
Iterative Minimization lemma proof
Actually, it is made clear at the beginning of the chapter that we're dealing with quadratic functions of the form f(x) = \frac{1}{2}\|Ax - b\|^2_2. My fault for not reading closely enough.- rayge
- Post #4
- Forum: Calculus and Beyond Homework Help
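For that quadratic, the gradient needed in the chain-rule step above is a standard computation (not spelled out in the thread): \nabla f(x) = A^T(Ax - b), with Hessian \nabla^2 f(x) = A^T A positive semidefinite, so f is convex and every stationary point is a global minimizer.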
Iterative Minimization lemma proof
Homework Statement: f(x) is the function we want to minimize. Beyond being real-valued, there are no other conditions on it. (I'm surprised it's not at least continuous, but the book doesn't say that's a condition.) We choose the next x^k through the relation x^k = x^{k-1} + \alpha_k d^k. We...- rayge
- Thread
- Iterative Minimization Proof
- Replies: 7
- Forum: Calculus and Beyond Homework Help
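A minimal numeric sketch of the iteration x^k = x^{k-1} + \alpha_k d^k, assuming the quadratic f from Post #4 and taking the steepest-descent direction d^k = -\nabla f(x^{k-1}) purely for illustration (the thread leaves d^k general; the function name below is mine):

```python
import numpy as np

def iterative_minimize(A, b, x0, iters=50):
    """Minimize f(x) = 0.5*||Ax - b||^2 via x^k = x^{k-1} + alpha_k d^k."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        grad = A.T @ (A @ x - b)   # gradient of 0.5*||Ax - b||^2
        d = -grad                  # steepest descent (an illustrative choice)
        Ad = A @ d
        denom = Ad @ Ad
        if denom == 0:             # A d = 0: f cannot decrease along d
            break
        # Exact line search: the alpha solving d/dalpha f(x + alpha d) = 0
        alpha = (Ad @ (b - A @ x)) / denom
        x = x + alpha * d
    return x

A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(iterative_minimize(A, b, np.zeros(2)))
print(np.linalg.lstsq(A, b, rcond=None)[0])  # least-squares answer, for comparison
```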
Optimization problem: minimization
Sorry about that. MP(z) is the infimum of the set of all values of f(x,y) satisfying x + y \leq z, i.e. the minimum value of f(x,y) if it exists, or otherwise its greatest lower bound. The domain of MP(z) is the set of all z such that there are points (x,y) satisfying x + y \leq z. Formally: MP(z)...- rayge
- Post #4
- Forum: Calculus and Beyond Homework Help
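For the f in this thread the closed form of MP(z) can be written out; it is just the distance from the origin to the half-plane x + y \le z: MP(z) = \inf\{\sqrt{x^2 + y^2} : x + y \le z\} = 0 for z \ge 0 (the origin is feasible), and MP(z) = -z/\sqrt{2} for z < 0 (the constraint binds, and the distance from the origin to the line x + y = z is |z|/\sqrt{2}). The one-sided derivatives at z = 0 are 0 from the right and -1/\sqrt{2} from the left, which is exactly the non-differentiability the problem asks about.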
Optimization problem: minimization
Homework Statement: Minimize the function f(x,y) = \sqrt{x^2 + y^2} subject to x + y \leq 0. Show that the function MP(z) is not differentiable at z = 0. The Attempt at a Solution: I haven't gotten anywhere because I don't understand why the solution isn't trivial, i.e. (0,0)...- rayge
- Thread
- Minimization Optimization
- Replies: 5
- Forum: Calculus and Beyond Homework Help
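A quick brute-force check of MP(z) against the closed form above (the grid limits and resolution are arbitrary choices of mine):

```python
import numpy as np

def MP(z, n=801, lim=3.0):
    """Brute force: minimize sqrt(x^2 + y^2) over grid points with x + y <= z."""
    xs = np.linspace(-lim, lim, n)
    X, Y = np.meshgrid(xs, xs)
    feasible = X + Y <= z
    return np.sqrt(X**2 + Y**2)[feasible].min()

for z in (-1.0, -0.5, -0.1, 0.0, 0.5):
    print(z, MP(z), max(-z, 0.0) / np.sqrt(2))  # grid value vs closed form
```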
Graduate If g is diff'able at x_0, g(x) <= mx + b, show g'(x_0) = m
Pretty much as in the title, except one major condition: g(x_0) = mx_0 + b. So, the conditions are: g is defined on the reals and differentiable at x_0; g(x_0) = mx_0 + b; and mx + b \leq g(x) for all x. Then show m = g'(x_0). I would love to use the intermediate value theorem, or extreme value theorem...
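One route that avoids the value theorems entirely, using only the three conditions listed (the auxiliary h is introduced here, not in the post): set h(x) = g(x) - (mx + b), so h \ge 0 everywhere, h(x_0) = 0, and h is differentiable at x_0. The difference quotient \frac{h(x) - h(x_0)}{x - x_0} is \ge 0 for x > x_0 and \le 0 for x < x_0, so letting x \to x_0 from each side gives h'(x_0) \ge 0 and h'(x_0) \le 0. Hence h'(x_0) = 0, i.e. g'(x_0) = m.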
How Does Bayesian Estimation Determine Point Estimates with Prior Distributions?
I wrote a whole reply here while totally missing what you were saying. Thanks for the response! I'll check it out.- rayge
- Post #3
- Forum: Calculus and Beyond Homework Help
How Does Bayesian Estimation Determine Point Estimates with Prior Distributions?
Homework Statement: Let Y_n be the nth order statistic of a random sample of size n from a distribution with pdf f(x|\theta) = 1/\theta for 0 < x < \theta, zero elsewhere. Take the loss function to be L(\theta, \delta(y_n)) = [\theta - \delta(y_n)]^2. Let \theta be an observed value of the random variable...- rayge
- Thread
- Bayesian Estimate Point
- Replies: 3
- Forum: Calculus and Beyond Homework Help
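The excerpt cuts off before the prior, but the fact the problem turns on is standard: under squared-error loss the Bayes estimator is the posterior mean, \delta(y_n) = E[\theta \mid y_n] = \int \theta \, k(\theta \mid y_n) \, d\theta, writing k(\theta \mid y_n) for the posterior density of \theta given Y_n = y_n.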
Graduate Finding the MVUE of a two-sided interval of a normal
What if we construct two MVUEs, one for P(X \le c) and one for P(X \le -c), and then subtract one from the other? It still seems like we have the same problem, where the MVUE is not one-to-one...- rayge
- Post #3
- Forum: Set Theory, Logic, Probability, Statistics
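The subtraction idea does work, and no one-to-one condition is needed: Lehmann-Scheffé only requires an unbiased estimator that is a function of a complete sufficient statistic. If \delta_1(\bar{X}) and \delta_2(\bar{X}) are unbiased for the two one-sided probabilities (the names \delta_1, \delta_2 are mine), then E[\delta_1(\bar{X}) - \delta_2(\bar{X})] = \Phi(c - \theta) - \Phi(-c - \theta) = P(-c \le X \le c), and \delta_1 - \delta_2 is still a function of \bar{X}, hence the MVUE.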
Graduate Finding the MVUE of a two-sided interval of a normal
Our task is to determine if P(-c \le X \le c) has a minimum variance unbiased estimator for a sample from a distribution that is N(\theta,1). The one-sided probability P(X \le c) = \Phi(c - \theta) is unique, so constructing an MVUE is just a matter of applying Rao-Blackwell and Lehmann-Scheffé...- rayge
- Thread
- Interval Normal
- Replies: 3
- Forum: Set Theory, Logic, Probability, Statistics
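For reference, the one-sided construction the post alludes to, assuming a sample X_1, \dots, X_n with complete sufficient statistic \bar{X} (a standard computation): conditioning the unbiased indicator 1\{X_1 \le c\} on \bar{X} and using X_1 \mid \bar{X} \sim N(\bar{X}, (n-1)/n) gives E[1\{X_1 \le c\} \mid \bar{X}] = \Phi\big((c - \bar{X})\sqrt{n/(n-1)}\big), the MVUE of P(X \le c) = \Phi(c - \theta).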
Mean, variance of non-parametric estimator
Homework Statement: For the nonparametric estimator \hat{f}(x) = \frac{1}{2hn}\sum\limits_{i=1}^n I_i(x) of a pdf, (a) obtain its mean and determine the bias of the estimator, and (b) obtain its variance. The Attempt at a Solution: For (a), I think it goes like this...- rayge
- Thread
- Mean Variance
- Replies: 1
- Forum: Calculus and Beyond Homework Help
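Assuming the usual definition I_i(x) = 1\{x - h < X_i \le x + h\} (the excerpt does not spell it out), the count \sum_{i=1}^n I_i(x) is binomial with success probability p = F(x+h) - F(x-h), so E[\hat{f}(x)] = \frac{F(x+h) - F(x-h)}{2h}, the bias is this minus f(x), and \operatorname{Var}[\hat{f}(x)] = \frac{p(1-p)}{4h^2 n}.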
Graduate Transformation of pmf; bivariate to single-variate
Indeed! Anyone curious about how to get the answer from this should check out the Chu-Vandermonde Identity. (I ended up using the moment generating function E(e^{t(X_1 - X_2 + n_2)}), which was easier than the transformation I had been doing.)- rayge
- Post #5
- Forum: Set Theory, Logic, Probability, Statistics
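The MGF route in full, using independence and the binomial MGF M(t) = (\frac{1}{2} + \frac{1}{2}e^t)^n: M_Y(t) = E[e^{t(X_1 - X_2 + n_2)}] = e^{tn_2}\big(\frac{1}{2} + \frac{1}{2}e^t\big)^{n_1}\big(\frac{1}{2} + \frac{1}{2}e^{-t}\big)^{n_2} = \big(\frac{1}{2} + \frac{1}{2}e^t\big)^{n_1 + n_2}, since e^{tn_2}\big(\frac{1}{2} + \frac{1}{2}e^{-t}\big)^{n_2} = \big(\frac{1}{2}e^t + \frac{1}{2}\big)^{n_2}. This is the MGF of a binomial(n_1 + n_2, 1/2), as required.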
Graduate Transformation of pmf; bivariate to single-variate
Thanks for the suggestion. From this I get f(y) = \sum_{z=0}^{n_2} \binom{n_1}{y-z}\binom{n_2}{n_2-z}\Big(\frac{1}{2}\Big)^{n_1+n_2}. What I want eventually is f(y) = \binom{n_1+n_2}{y}\Big(\frac{1}{2}\Big)^{n_1+n_2}. I want very much to snap my fingers and call these equal, but I don't see it...- rayge
- Post #3
- Forum: Set Theory, Logic, Probability, Statistics
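A quick sanity check that the two expressions agree (the helper names lhs and rhs are mine, not the thread's):

```python
from math import comb

def lhs(y, n1, n2):
    # sum over z of C(n1, y-z) C(n2, n2-z) (1/2)^(n1+n2), as in the post;
    # terms with y - z < 0 are zero and are skipped to keep comb() happy
    return sum(comb(n1, y - z) * comb(n2, n2 - z)
               for z in range(n2 + 1) if y - z >= 0) * 0.5 ** (n1 + n2)

def rhs(y, n1, n2):
    # the binomial(n1 + n2, 1/2) pmf the thread wants to reach
    return comb(n1 + n2, y) * 0.5 ** (n1 + n2)

n1, n2 = 5, 7
print(all(lhs(y, n1, n2) == rhs(y, n1, n2) for y in range(n1 + n2 + 1)))  # True
```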
Graduate Transformation of pmf; bivariate to single-variate
Transformations always give me trouble, but this one in particular. Assume X_1, X_2 are independent with binomial distributions of parameters n_1, n_2, and p = 1/2 for each. Show that Y = X_1 - X_2 + n_2 has a binomial distribution with parameters n = n_1 + n_2, p = 1/2. My first instinct was...- rayge
- Thread
- Transformation
- Replies: 4
- Forum: Set Theory, Logic, Probability, Statistics
Chebyshev's inequality for two random variables
(I wasn't sure how to title this; it's just that the statement resembles Chebyshev's but with two RVs.) Homework Statement: Let \sigma_1^2 = \sigma_2^2 = \sigma^2 be the common variance of X_1 and X_2, and let \rho be the correlation coefficient of X_1 and X_2...- rayge
- Thread
- Inequality, Random, Random variables, Variables
- Replies: 1
- Forum: Calculus and Beyond Homework Help
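The excerpt cuts off before the inequality itself, so what follows is an assumption based on the common textbook version of this exercise, with a shared mean \mu also assumed: Var\big(\frac{X_1 + X_2}{2}\big) = \frac{\sigma^2 + \sigma^2 + 2\rho\sigma^2}{4} = \frac{(1+\rho)\sigma^2}{2}, and Chebyshev's inequality then gives P\big(\big|\frac{X_1 + X_2}{2} - \mu\big| \ge k\sigma\big) \le \frac{1+\rho}{2k^2}.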