Lower Bound on Weighted Sum of Auto Correlation

In summary, the thread asks for a lower bound on a weighted sum of autocorrelation values: given a sequence ##v##, prove that ##\sum_{k} 2^{-\left| k \right|} \left\langle v^{(0)}, v^{(k)} \right\rangle \geq \alpha \left\| v \right\|^{2}## for some ##\alpha > 0##, i.e. that the negatively correlated lags can never cancel the energy term at lag zero. The answer below passes to the frequency domain: by the convolution theorem, the weighted sum equals an integral of the (nonnegative) power spectrum against the DTFT of the weights, and since that DTFT is bounded below by ##\alpha = \frac{1}{3}##, the bound follows.
  • #1
Drazick

Homework Statement



Given ##v = {\left\{ {v}_{i} \right\}}_{i = 1}^{\infty} \in {\ell}^{2}## (extended by ##{v}_{n} = 0## for ##n \leq 0##) and defining ##{v}_{n}^{\left( k \right)} = {v}_{n - k}## (Shifting Operator).

Prove that there exists ##\alpha > 0## such that

$$ \sum_{k = - \infty}^{\infty} {2}^{- \left| k \right|} \left \langle {v}^{\left ( 0 \right )}, {v}^{\left ( k \right )} \right \rangle \geq \alpha {\left\| v \right\|}^{2} $$

Basically, a weighted sum of the autocorrelation function must stay above a fixed positive fraction of its value at zero.
Namely, the negative values cannot sum up to cancel the positive value at lag zero.
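
Before attempting a proof, a quick numerical sanity check of the claimed inequality is reassuring. This is only a sketch, assuming a real finite-support sequence and the constant ##\alpha = \frac{1}{3}## that the answer below derives:

```python
import numpy as np

rng = np.random.default_rng(0)
v = rng.standard_normal(64)           # finite-support stand-in for the sequence

# Autocorrelation r_vv[k] = sum_n v[n] v[n - k], lags k = -(N-1) .. N-1.
r = np.correlate(v, v, mode="full")
k = np.arange(-(len(v) - 1), len(v))

weighted_sum = np.sum(2.0 ** (-np.abs(k)) * r)
alpha = 1.0 / 3.0                     # constant (1 - c) / (1 + c) with c = 1/2
print(weighted_sum >= alpha * np.dot(v, v))   # expected: True
```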

Homework Equations


Probably some variant of the Cauchy-Schwarz inequality.
Maybe taking advantage of the autocorrelation function being symmetric (see the estimate below).
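
For what it's worth, Cauchy-Schwarz alone seems too weak: it only gives ##\left| \left\langle {v}^{\left( 0 \right)}, {v}^{\left( k \right)} \right\rangle \right| \leq {\left\| v \right\|}^{2}##, hence

$$ \sum_{k = -\infty}^{\infty} {2}^{- \left| k \right|} \left\langle {v}^{\left( 0 \right)}, {v}^{\left( k \right)} \right\rangle \geq {\left\| v \right\|}^{2} - \sum_{k \neq 0} {2}^{- \left| k \right|} {\left\| v \right\|}^{2} = \left( 1 - 2 \right) {\left\| v \right\|}^{2} = - {\left\| v \right\|}^{2}, $$

which is not positive, so a finer argument is needed.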

The Attempt at a Solution


I understand that, behind the scenes, the negative values of the correlations of the shifted versions cannot sum to more than a fixed fraction of the squared norm of the vector ##v##.
Yet I couldn't use this to prove the above.

Thank You.
 
  • #2
Defining ## a \left[ k \right] = {2}^{- \left| k \right|} ##.
Moreover, the autocorrelation function of ##v## is defined as ##{r}_{vv} \left[ k \right] = \left \langle {v}^{\left( 0 \right)}, {v}^{\left( k \right)} \right \rangle = \sum_{n = -\infty}^{\infty} {v}_{n} {v}_{n - k}##.
Pay attention that the [autocorrelation][1] is a [Hermitian function][2].

Using the definition of [Convolution][3] one could write:

$$ \left( {r}_{vv} \ast a \right) \left[ 0 \right] = \sum_{k = -\infty}^{\infty} {2}^{- \left| k \right| } \left \langle {v}^{\left( 0 \right)}, {v}^{\left( k \right)} \right \rangle $$
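
This identity is easy to check numerically for a truncated weight sequence. A minimal sketch, under the same finite-support assumption as before:

```python
import numpy as np

rng = np.random.default_rng(0)
v = rng.standard_normal(64)
r = np.correlate(v, v, mode="full")   # r_vv[k], lags k = -(N-1) .. N-1
k = np.arange(-(len(v) - 1), len(v))
a = 2.0 ** (-np.abs(k))               # weights truncated to the same lag grid

# Full discrete convolution; the centre index corresponds to lag 0.
conv = np.convolve(r, a)
mid = len(conv) // 2
print(np.isclose(conv[mid], np.sum(a * r)))   # expected: True
```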

Using the [Convolution Theorem][4] one could write that:

$$ \left( {r}_{vv} \ast a \right) \left[ 0 \right] = \frac{1}{2 \pi} \int_{- \pi}^{\pi} {R}_{vv} \left( \omega \right) A \left( \omega \right) d \omega $$

Where ##{R}_{vv} \left( \omega \right)## and ##A \left( \omega \right)## are the DTFT of ##{r}_{vv} \left[ k \right]## and ##a \left[ k \right]## respectively (note the ##\frac{1}{2 \pi}## normalization of the inverse DTFT).
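
This Parseval-style identity can also be verified numerically by evaluating both sides on a dense frequency grid. A sketch of my own, reusing the finite-support setup from above and the closed form of ##A \left( \omega \right)## derived next:

```python
import numpy as np

rng = np.random.default_rng(0)
v = rng.standard_normal(64)
r = np.correlate(v, v, mode="full")   # r_vv[k], lags k = -(N-1) .. N-1
k = np.arange(-(len(v) - 1), len(v))

# Time-domain side: the weighted sum (r_vv * a)[0].
time_side = np.sum(2.0 ** (-np.abs(k)) * r)

# Frequency-domain side: (1 / 2pi) * integral of R_vv(w) A(w) dw,
# approximated by the mean over a dense uniform grid.
w = np.linspace(-np.pi, np.pi, 4096, endpoint=False)
R = (r[:, None] * np.exp(-1j * np.outer(k, w))).sum(axis=0)   # DTFT of r_vv
A = 0.75 / (1.25 - np.cos(w))         # closed form of A(w) for c = 1/2
freq_side = np.mean(R * A).real

print(np.isclose(time_side, freq_side))   # expected: True
```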

One should notice that the geometric-series formula for the DTFT of ##a \left[ k \right]## applies only one-sided. Yet since ##a \left[ k \right]## is symmetric, it can easily be calculated by splitting the sum:

$$
\begin{align*}
A \left( \omega \right) & = DTFT \left\{ a \left[ k \right] \right\} = \sum_{k = -\infty}^{\infty} a \left[ k \right] {e}^{-j \omega k} = \sum_{k = 0}^{\infty} {2}^{-k} {e}^{-j \omega k} + \sum_{k = 0}^{\infty} {2}^{-k} {e}^{j \omega k} - 1 \\
& = \frac{1}{1 - 0.5 {e}^{-j \omega}} + \frac{1}{1 - 0.5 {e}^{j \omega}} - 1 = \frac{1 - {c}^{2}}{1 - 2 c \cos \left( \omega \right) + {c}^{2}} \geq \frac{1 - c}{1 + c} = \alpha > 0
\end{align*}
$$

In the above ##c = {2}^{-1} = 0.5##, so ##\alpha = \frac{1 - c}{1 + c} = \frac{1}{3}##; since ##\cos \left( \omega \right) \geq -1##, the denominator is at most ##{\left( 1 + c \right)}^{2}##, and the minimum is attained at ##\omega = \pm \pi##. Actually this will hold for any ##0 < c < 1##.
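
A one-line numerical check of this minimum (my own sketch):

```python
import numpy as np

c = 0.5
w = np.linspace(-np.pi, np.pi, 100001)
A = (1 - c**2) / (1 - 2 * c * np.cos(w) + c**2)

# The minimum should match (1 - c) / (1 + c) = 1/3, attained at w = +/- pi.
print(A.min(), (1 - c) / (1 + c))
```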

Since ##{r}_{vv}## is an autocorrelation, its DTFT is the power spectrum, ##{R}_{vv} \left( \omega \right) = {\left| V \left( \omega \right) \right|}^{2} \geq 0##, so the integral is bounded by:

$$
\begin{align*}
\frac{1}{2 \pi} \int_{- \pi}^{\pi} {R}_{vv} \left( \omega \right) A \left( \omega \right) d \omega & = \frac{1}{2 \pi} \int_{- \pi}^{\pi} {R}_{vv} \left( \omega \right) \frac{1 - {c}^{2}}{1 - 2 c \cos \left( \omega \right) + {c}^{2}} d \omega \\
& \geq \alpha \frac{1}{2 \pi} \int_{- \pi}^{\pi} {R}_{vv} \left( \omega \right) d \omega = \alpha \, {r}_{vv} \left[ 0 \right] = \alpha {\left\| v \right\|}^{2}
\end{align*}
$$

As requested.

By the way, the result must be real, since ##a \left[ k \right]## is symmetric and ##{r}_{vv}## is a Hermitian function, hence both transforms are real.
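
As a side note on sharpness: for the alternating sequence ##{v}_{n} = {\left( -1 \right)}^{n}## the spectrum concentrates at ##\omega = \pi##, where ##A \left( \omega \right)## attains its minimum, so the ratio approaches ##\alpha = \frac{1}{3}## and the constant cannot be improved. A small experiment of my own illustrating this:

```python
import numpy as np

# Alternating sequence: spectrum concentrated at w = pi, the minimiser of A(w).
v = (-1.0) ** np.arange(512)
r = np.correlate(v, v, mode="full")
k = np.arange(-(len(v) - 1), len(v))

ratio = np.sum(2.0 ** (-np.abs(k)) * r) / np.dot(v, v)
print(ratio)   # approaches 1/3 as the length grows
```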

[1]: https://en.wikipedia.org/wiki/Autocorrelation
[2]: https://en.wikipedia.org/wiki/Hermitian_function
[3]: https://en.wikipedia.org/wiki/Convolution
[4]: https://en.wikipedia.org/wiki/Convolution_theorem
 

1. What is a lower bound on a weighted sum of autocorrelation?

It is a guarantee that a weighted sum of autocorrelation values, such as ##\sum_{k} {2}^{- \left| k \right|} {r}_{vv} \left[ k \right]##, can never fall below a fixed positive multiple of the signal energy ##{\left\| v \right\|}^{2}##, no matter how the negative lags line up.

2. How is a lower bound on a weighted sum of autocorrelation calculated?

As in the thread above: pass to the frequency domain. By the convolution theorem the weighted sum equals ##\frac{1}{2 \pi} \int_{- \pi}^{\pi} {R}_{vv} \left( \omega \right) A \left( \omega \right) d \omega##, where ##{R}_{vv} \left( \omega \right) \geq 0## is the power spectrum and ##A \left( \omega \right)## is the DTFT of the weights; the minimum of ##A \left( \omega \right)## over ##\omega## then serves as the constant ##\alpha##.

3. What is the significance of a lower bound on a weighted sum of autocorrelation?

It shows that the negatively correlated lags can never cancel the energy term ##{r}_{vv} \left[ 0 \right] = {\left\| v \right\|}^{2}##. Equivalently, it expresses that the weight sequence is positive definite, i.e. its power spectrum is strictly positive.

4. How is a lower bound on a weighted sum of autocorrelation used?

Such bounds appear in signal processing and harmonic analysis, for instance to show that a correlation-type quadratic form defines an equivalent norm, or to guarantee that a filter whose spectrum stays bounded away from zero is stably invertible.

5. Can the lower bound be improved?

The best possible constant is ##\alpha = \min_{\omega} A \left( \omega \right)##. For the weights ##{2}^{- \left| k \right|}## this gives ##\alpha = \frac{1 - c}{1 + c} = \frac{1}{3}## with ##c = \frac{1}{2}##, and sequences concentrated near ##\omega = \pi## (e.g. ##{v}_{n} = {\left( -1 \right)}^{n}##) come arbitrarily close to it, so no larger constant works.
