Obelus
Let \mathbb X be a set of mean-shifted vectors \mathbf X_1, \mathbf X_2, \ldots, \mathbf X_k in \mathbb R^n. By mean-shifted, I mean that \text{Mean}(\mathbf X_i) = 0 \; \forall i \in \{1, \ldots, k\}. I want to find a vector \mathbf T \in \mathbb R^n that maximizes the function
f(\mathbf T) = \sum \limits_{\mathbf X \in \mathbb X} \big ( \cos(\theta_{\mathbf X, \mathbf T}) \big )^2 = \sum \limits_{\mathbf X \in \mathbb X} \Big ( \frac{\mathbf X \cdot \mathbf T}{\|\mathbf X\| \, \|\mathbf T\|} \Big )^2
(Edit: maximizing this f does not actually get what I want. Instead, I want to maximize the geometric mean of the correlations - see my followup posts).
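For concreteness, here is a minimal sketch (illustrative data and names, not from the post) of evaluating f. It also checks the correlation reading of the problem: because each \mathbf X is mean-shifted, the cosine of the angle to a mean-shifted \mathbf T equals their Pearson correlation coefficient.

```python
import numpy as np

def f(T, X_rows):
    # f(T) = sum over X of cos(theta_{X,T})^2, with one vector X per row.
    T_hat = T / np.linalg.norm(T)
    X_unit = X_rows / np.linalg.norm(X_rows, axis=1, keepdims=True)
    cosines = X_unit @ T_hat          # cos(theta) between T and each row
    return np.sum(cosines ** 2)

rng = np.random.default_rng(0)
k, n = 4, 50
X = rng.standard_normal((k, n))
X -= X.mean(axis=1, keepdims=True)    # mean-shift each vector

T = rng.standard_normal(n)
T -= T.mean()                         # center T so that cosine = Pearson r

print(f(T, X))
# Sanity check: for mean-shifted vectors, cosine similarity equals Pearson r.
print(np.corrcoef(X[0], T)[0, 1],
      X[0] @ T / (np.linalg.norm(X[0]) * np.linalg.norm(T)))
```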
In other words, I want to find a vector \mathbf T that has a small angle \theta to each \mathbf X \in \mathbb X. (Actually there must be infinitely many solutions, since f depends only on the direction of \mathbf T, so any nonzero scalar multiple of a solution is also a solution.)
My real motivation is statistical: this is equivalent to finding a vector that has a high Pearson correlation coefficient with all the vectors in \mathbb X.
I can use numerical algorithms to optimize f, but I would prefer a closed-form solution. This problem is part of a larger algorithm that I am trying to speed up, and it spends most of its time trying to find \mathbf T. Is a closed-form solution possible? Or is it similar to an existing problem whose solution I can adapt?
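For reference, this is the kind of numerical baseline I mean (one possible sketch using a general-purpose optimizer, scipy.optimize.minimize; the starting point and data are made up): maximize f by minimizing -f over an unnormalized \mathbf T.

```python
import numpy as np
from scipy.optimize import minimize

def neg_f(T, X_unit):
    # Negative of f(T). X_unit holds the mean-shifted vectors, one per row,
    # already scaled to unit length, so X_unit @ T_hat gives the cosines.
    T_hat = T / np.linalg.norm(T)
    return -np.sum((X_unit @ T_hat) ** 2)

rng = np.random.default_rng(1)
k, n = 4, 50
X = rng.standard_normal((k, n))
X -= X.mean(axis=1, keepdims=True)
X_unit = X / np.linalg.norm(X, axis=1, keepdims=True)

# Start from the plain average direction of the unit vectors.
result = minimize(neg_f, x0=X_unit.mean(axis=0), args=(X_unit,))
T_opt = result.x / np.linalg.norm(result.x)
print(T_opt, -result.fun)
```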
If k=2, the solution should just be to rotate \mathbf X_1 halfway towards \mathbf X_2, using a multidimensional rotation as discussed in http://forums.xkcd.com/viewtopic.php?f=17&t=29603. Perhaps this rotation concept could be extended to find the "rotational average" of \mathbb X, if that makes sense?
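For that k=2 case, one shortcut (mine, not from the linked thread) avoids building an explicit rotation matrix: normalizing both vectors and adding them gives the angle bisector, i.e. \mathbf X_1 rotated halfway towards \mathbf X_2, up to scale, assuming the two vectors are not antiparallel.

```python
import numpy as np

def halfway(x1, x2):
    # Bisector of the angle between x1 and x2: the sum of the unit vectors.
    # Assumes x1 and x2 are nonzero and not exactly antiparallel.
    u1 = x1 / np.linalg.norm(x1)
    u2 = x2 / np.linalg.norm(x2)
    return u1 + u2

rng = np.random.default_rng(2)
x1, x2 = rng.standard_normal(5), rng.standard_normal(5)
t = halfway(x1, x2)
# The bisector makes equal angles with x1 and x2.
cos1 = x1 @ t / (np.linalg.norm(x1) * np.linalg.norm(t))
cos2 = x2 @ t / (np.linalg.norm(x2) * np.linalg.norm(t))
print(cos1, cos2)   # these should match
```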
Any help towards finding a closed-form solution to this problem would be appreciated.