MHB Minimum Degree of a Random Graph (Probabilistic Method)

joypav
Problem:
Suppose that the function $p : \mathbb{N} \rightarrow [0, 1]$ satisfies $p \gg n^{-1}\ln(n)$ (i.e. $n^{-1}\ln(n) = o(p)$).

(a) Prove that as $n \rightarrow \infty$, the random graph $G(n, p)$ has minimum degree at least $\frac{np}{2}$ almost surely.
Idea: Look at the degree of each individual vertex $v \in V(G)$. Try to find a bound on the probability of this fixed vertex having a small degree (using Chernoff's Inequality?).

I am not getting to where I think I should with this idea.
Let $X_{u,v}$ be $1$ if $uv \in E(G)$ and $0$ if $uv \notin E(G)$.
Then, $X_v = \sum_{u \in V(G), u \neq v} X_{u,v}$ will give us the degree of $v$.
Also, the $X_{u,v}$'s are identically distributed and independent (as $G$ is a random graph and each edge is included independently with probability $p$).
Now can we use Chernoff's inequality? How do I use the first thing we assumed, that $p \gg n^{-1}\ln(n)$?
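
As a numerical sanity check on the claim itself (not a proof), here is a small simulation sketch in Python with numpy; the helper name min_degree_gnp and the particular choice $p = \ln(n)^2/n$ are just for illustration. It samples the indicators $X_{u,v}$ directly and compares the minimum degree with $np/2$:

```python
import numpy as np

rng = np.random.default_rng(0)

def min_degree_gnp(n, p):
    """Sample G(n, p) from independent edge indicators and return its minimum degree."""
    # One Bernoulli(p) indicator X_{u,v} per unordered pair {u, v} (upper triangle, no loops)
    upper = np.triu(rng.random((n, n)) < p, k=1)
    adj = upper | upper.T        # symmetric adjacency matrix of G(n, p)
    degrees = adj.sum(axis=1)    # deg(v) = sum over u != v of X_{u,v}
    return degrees.min()

for n in (1000, 2000, 4000):
    p = np.log(n) ** 2 / n       # one illustrative choice with p >> ln(n)/n
    print(f"n = {n:4d}   min degree = {min_degree_gnp(n, p)}   np/2 = {n * p / 2:.1f}")
```

Once $np/\ln(n)$ is large, the minimum degree should typically land above $np/2$, which is what I want to prove.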


(b)
Conclude that for any fixed positive integer $k$, the random graph $G(n, p)$ has minimum degree at least $k$ almost surely as $n \rightarrow \infty$.
 
For completeness, I will post a "proof" I came up with. As you can see, at the end I skipped over some much-needed justification that I couldn't seem to figure out. Some help with this would be appreciated!

However, I do think that my general idea is correct.

Proof: (a) Let $G \sim G(n,p)$. Fix $v \in V(G)$. For each $u \neq v$, let $A_{v,u}$ be the event that $vu \in E(G)$ and let $X_{v,u}$ be its indicator. Then $X_{v,u}=1$ with probability $p$ and $X_{v,u}=0$ with probability $1-p$. Because each possible edge is inserted independently, the $X_{v,u}$'s are independent. Let $X_v = \sum_{u \in V(G), u \neq v} X_{v,u}$, which is the degree of $v$. Now we may use Chernoff's inequality with:
$$\sigma^2 = Var[X_v] = (n-1)p(1-p)$$
$$E[X_v] = \sum_{u \in V(G), u \neq v} E[X_{v,u}] = (n-1)p$$
Then, applying the Bernstein form of Chernoff's inequality, $P[X_v \leq E[X_v] - t] \leq e^{-\frac{t^2}{2(\sigma^2 + t/3)}}$, with $t = \frac{(n-2)p}{2}$ and $\sigma^2 \leq np$:
$$P[\text{degree of } v \text{ is less than } \tfrac{np}{2}] \leq P[X_v \leq \tfrac{np}{2}] = P[X_v \leq E[X_v] - \tfrac{(n-2)p}{2}]$$
$$ \leq e^{\frac{-\left(\frac{(n-2)p}{2}\right)^2}{2\left(np+\frac{(n-2)p}{6}\right)}} = e^{-\frac{3p(n-2)^2}{28n-8}}$$
Let $A$ be the event that some vertex has degree less than $\frac{np}{2}$. Then, by the union bound,
$$P[A] \leq \sum_{v \in V(G)}P\left[X_v \leq \frac{np}{2}\right] \leq n\cdot e^{-\frac{3p(n-2)^2}{28n-8}} = e^{\ln(n)-\frac{3p(n-2)^2}{28n-8}}$$
By assumption $p \gg n^{-1}\ln(n)$, i.e. $\frac{np}{\ln(n)} \rightarrow \infty$. Since $\frac{3p(n-2)^2}{28n-8} \sim \frac{3}{28}np$, for all sufficiently large $n$ we have $\frac{3p(n-2)^2}{28n-8} \geq \frac{np}{10}$, and $np \gg \ln(n)$ gives
$$\ln(n)-\frac{3p(n-2)^2}{28n-8} \leq \ln(n) - \frac{np}{10} \rightarrow -\infty $$
$$\Rightarrow P[A]\rightarrow 0$$
Hence, almost surely as $n \rightarrow \infty$, every vertex of $G$ has degree at least $\frac{np}{2}$, i.e. $\delta(G) \geq \frac{np}{2}$ a.s.

(b) Apply part (a): $\delta(G) \geq \frac{np}{2}$ almost surely as $n \rightarrow \infty$. Since $np \gg \ln(n) \rightarrow \infty$, we have $\frac{np}{2} \geq k$ for all sufficiently large $n$, so $\delta(G) \geq k$ almost surely as $n \rightarrow \infty$.
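
For the record, here is a quick numerical check of the final limit and of (b), just a Python/numpy sketch (the choice $p = \ln(n)^2/n$ is only one example with $p \gg n^{-1}\ln(n)$): the exponent $\ln(n)-\frac{3p(n-2)^2}{28n-8}$ drifts to $-\infty$ (slowly for this $p$), so the union bound tends to $0$, while $\frac{np}{2}$ grows past any fixed $k$.

```python
import numpy as np

# Not part of the argument: evaluate the exponent ln(n) - 3p(n-2)^2/(28n-8)
# from part (a) for the illustrative choice p = ln(n)^2 / n.
for n in (10**4, 10**5, 10**6, 10**7, 10**8):
    p = np.log(n) ** 2 / n
    exponent = np.log(n) - 3 * p * (n - 2) ** 2 / (28 * n - 8)
    print(f"n = {n:>9}   exponent = {exponent:8.1f}   "
          f"union bound = {np.exp(exponent):.2e}   np/2 = {n * p / 2:.0f}")
```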
 