Hoeffding inequality for the difference of two sample means?

JanO
In W. Hoeffding's 1963 paper* he gives the well-known inequality:

P(\bar{x}-\mathrm{E}[x_i] \geq t) \leq \exp(-2t^2n) \ \ \ \ \ \ (1),

where \bar{x} = \frac{1}{n}\sum_{i=1}^n x_i, x_i \in [0,1], and the x_i's are independent.

Following this theorem, he gives a corollary for the difference of two sample means:

P(\bar{x}-\bar{y}-(\mathrm{E}[x_i] - \mathrm{E}[y_k]) \geq t) \leq \exp(\frac{-2t^2}{m^{-1}+n^{-1}}) \ \ \ \ \ \ (2),

where \bar{x} = \frac{1}{n}\sum_{i=1}^n x_i, \bar{y} = \frac{1}{m}\sum_{k=1}^m y_k, x_i, y_k \in [0,1], and all the x_i's and y_k's are independent.
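
(A quick sanity check of my own, not from the paper: with equal sample sizes m = n, the denominator is m^{-1}+n^{-1} = 2/n, so the right-hand side of (2) becomes

\exp\left(\frac{-2t^2}{2/n}\right) = \exp(-nt^2),

i.e. half the exponent of (1) for a single sample of size n.)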


My question is: How does (2) follow from (1)?

-Jan

*http://www.csee.umbc.edu/~lomonaco/f08/643/hwk643/Hoeffding.pdf (equations (2.6) and (2.7))
 
Hey JanO and welcome to the forums.

One idea I have is to let Z = X + Y and use Z instead of X in the definition.
 
Thanks Chiro for your response.

However, I still do not understand how the term (m^{-1} + n^{-1}) comes into the bound. Isn't z=\bar{x}-\bar{y} still bounded in [0,1]?

-Jan
 
JanO said:
Thanks Chiro for your response.

However, I still do not understand how the term (m^{-1} + n^{-1}) comes into the bound. Isn't z=\bar{x}-\bar{y} still bounded in [0,1]?

-Jan

Think about what happens to the variances.
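
In more detail, here is a sketch of how I would get (2); if I remember the paper right, it uses the general bounded-range form (Theorem 2 there) rather than (1) directly. Write the difference of the sample means as one sum of n+m independent bounded terms,

\bar{x}-\bar{y} = \sum_{i=1}^n \frac{x_i}{n} + \sum_{k=1}^m \frac{-y_k}{m},

where each x_i/n has a range of length 1/n and each -y_k/m has a range of length 1/m. For a sum S of independent terms whose ranges have lengths c_j, the general bound is

P(S-\mathrm{E}[S] \geq t) \leq \exp\left(\frac{-2t^2}{\sum_j c_j^2}\right),

and here

\sum_j c_j^2 = n\cdot\frac{1}{n^2} + m\cdot\frac{1}{m^2} = \frac{1}{n}+\frac{1}{m},

which gives exactly (2). Taking only the x sample (summands x_i/n with ranges of length 1/n) gives \sum_j c_j^2 = 1/n and recovers (1).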
 
It seems like bounded here means almost surely bounded; at least that is how Hoeffding's inequality seems to be stated elsewhere. I guess it then means that z=\bar{x}-\bar{y} is bounded a.s. within [\mu_x-\mu_y-\frac{1}{2}\sqrt{m^{-1}+n^{-1}}, \ \mu_x-\mu_y+\frac{1}{2}\sqrt{m^{-1}+n^{-1}}]?

Thanks again for your help!
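
If I understand the theorem correctly, the almost-sure bound is only needed for the individual summands, not for \bar{x}-\bar{y} itself. The difference of the means is only guaranteed to satisfy

\bar{x}-\bar{y} \in [-1,1] \ \ \text{a.s.} \ \ (\text{e.g. } x_i \equiv 1, \ y_k \equiv 0 \text{ gives } \bar{x}-\bar{y}=1),

and the quantity m^{-1}+n^{-1} comes from the sum of the squared range lengths of the scaled summands x_i/n and -y_k/m in the sketch above, not from an almost-sure interval for \bar{x}-\bar{y}.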
 
Namaste & G'day Postulate: A strongly-knit team wins on average over a less knit one Fundamentals: - Two teams face off with 4 players each - A polo team consists of players that each have assigned to them a measure of their ability (called a "Handicap" - 10 is highest, -2 lowest) I attempted to measure close-knitness of a team in terms of standard deviation (SD) of handicaps of the players. Failure: It turns out that, more often than, a team with a higher SD wins. In my language, that...
Hi all, I've been a roulette player for more than 10 years (although I took time off here and there) and it's only now that I'm trying to understand the physics of the game. Basically my strategy in roulette is to divide the wheel roughly into two halves (let's call them A and B). My theory is that in roulette there will invariably be variance. In other words, if A comes up 5 times in a row, B will be due to come up soon. However I have been proven wrong many times, and I have seen some...
Back
Top