MHB Minimum Degree of a Random Graph (Probabilistic Method)

Summary:
The discussion focuses on proving that in a random graph $G(n, p)$ with a probability function $p$ satisfying $p >> n^{-1} \ln(n)$, the minimum degree is at least $\frac{np}{2}$ almost surely as $n$ approaches infinity. The approach involves analyzing the degree of a fixed vertex using Chernoff's Inequality, where the degree is represented as a sum of independent indicator variables for edges. It is shown that the probability of a vertex having a degree less than $\frac{np}{2}$ decreases exponentially, leading to the conclusion that all vertices must have a degree of at least $\frac{np}{2}$. Consequently, it is established that for any fixed positive integer $k$, the minimum degree of the random graph is at least $k$ almost surely as $n$ increases. This conclusion is supported by the earlier findings in part (a).
joypav
Problem:
Suppose that the function $p : \mathbb{N} \rightarrow [0, 1]$ satisfies $p >> n^{-1}\ln(n)$ (i.e. $n^{-1}\ln(n) = o(p)$).

(a) Prove that as $n \rightarrow \infty$, the random graph $G(n, p)$ has minimum degree at least $\frac{np}{2}$ almost surely.
Idea: Look at the degree of each individual vertex $v \in V(G)$. Try to find a bound on the probability of this fixed vertex having a small degree (using Chernoff's Inequality?).

I am not getting to where I think I should with this idea.
Let $X_{u,v}$ be $1$ if $uv \in E(G)$ and $0$ if $uv \notin E(G)$.
Then, $X_v = \sum_{u \in V(G), u \neq v} X_{u,v}$ will give us the degree of $v$.
Also, the $X_{u,v}$'s are i.i.d. Bernoulli($p$) random variables (as $G$ is a random graph and edges are picked independently with probability $p$).
Now we may use Chernoff's? How do I use the first thing we assumed... that $p >> n^{-1}\ln(n)$?
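As a quick sanity check on this setup, here is a short simulation (an illustrative sketch, not part of any proof; the parameters $n = 2000$, $p = 0.1$ and the sample count are arbitrary choices) of the degree of one fixed vertex as a sum of independent Bernoulli($p$) indicators:

```python
import random

def degree_of_fixed_vertex(n, p, rng):
    # Degree of a fixed vertex v in G(n, p): a sum of n - 1
    # independent Bernoulli(p) edge indicators X_{u,v}.
    return sum(1 for _ in range(n - 1) if rng.random() < p)

rng = random.Random(0)
n, p = 2000, 0.1
samples = [degree_of_fixed_vertex(n, p, rng) for _ in range(500)]
mean_deg = sum(samples) / len(samples)

# E[X_v] = (n-1)p = 199.9, while np/2 = 100; no sample should
# come anywhere near falling below np/2.
print(mean_deg, min(samples))
```

The sample mean should land near $(n-1)p = 199.9$, while $np/2 = 100$ sits more than seven standard deviations below the mean, which is why degrees below $np/2$ are essentially never observed.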


(b)
Conclude that for any fixed positive integer k, the random graph $G(n, p)$ has minimum
degree at least k almost surely as $n \rightarrow \infty$.
 
For completeness, I will post a "proof" I came up with. As you can see, at the end I skipped over some much needed justification that I couldn't seem to figure out. Some help with this would be appreciated!

However, I do think that my general idea is correct.

Proof: (a) Let $G \sim G(n,p)$. Fix $v \in V(G)$. For each $u \neq v$, let $A_{u,v}$ be the event that $uv \in E(G)$ and let $X_{u,v}$ be its indicator. Then, $X_{u,v}=1$ with probability $p$ and $X_{u,v}=0$ with probability $1-p$. Because each possible edge is inserted independently, the $X_{u,v}$'s are independent. Let $X_v = \sum_{u \in V(G), u \neq v} X_{u,v}$ (giving the degree of $v$). Now we may use Chernoff's Inequality (in Bernstein's form) with:
$$\sigma^2 = Var[X_v] = (n-1)p(1-p) \leq np$$
$$E[X_v] = \sum_{u \in V(G), u \neq v} E[X_{u,v}] = (n-1)p$$
Then,
with $t = E[X_v] - \frac{np}{2} = (n-1)p - \frac{np}{2} = \frac{(n-2)p}{2}$:
$$P\left[\text{degree of } v \text{ is less than } \frac{np}{2}\right] = P\left[X_v \leq E[X_v] - t\right] \leq e^{-\frac{t^2}{2(\sigma^2 + \frac{t}{3})}} \leq e^{-\frac{3(n-2)^2p}{28n}}$$
using $\sigma^2 + \frac{t}{3} \leq np + \frac{np}{6} = \frac{7np}{6}$ in the denominator.
Let $A$ be the event that some vertex has degree less than $\frac{np}{2}$. By the union bound,
$$P[A] \leq \sum_{v \in V(G)} P\left[X_v \leq \frac{np}{2}\right] \leq n \cdot e^{-\frac{3(n-2)^2p}{28n}} = e^{\ln(n) - \frac{3(n-2)^2p}{28n}}$$
By assumption $p >> n^{-1}\ln(n)$, i.e. $np >> \ln(n)$, and $\frac{(n-2)^2p}{n} = \Theta(np)$.
$$\Rightarrow \ln(n) - \frac{3(n-2)^2p}{28n} \rightarrow -\infty $$
$$\Rightarrow P[A] \rightarrow 0$$
Hence, almost surely, every vertex of $G$ has degree at least $\frac{np}{2}$.
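The conclusion of part (a) can be illustrated numerically. The sketch below samples one graph from $G(n, p)$ with the hypothetical choice $p(n) = \ln(n)^3/n$ (just one function satisfying $n^{-1}\ln(n) = o(p)$; $n = 1000$ and the seed are arbitrary) and compares the minimum degree to $np/2$:

```python
import math
import random

def min_degree_gnp(n, p, rng):
    # Sample G(n, p) by flipping a p-coin for each of the
    # C(n, 2) potential edges, then return the minimum degree.
    deg = [0] * n
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p:
                deg[u] += 1
                deg[v] += 1
    return min(deg)

rng = random.Random(1)
n = 1000
p = math.log(n) ** 3 / n   # one choice with n^{-1} ln(n) = o(p)
mindeg = min_degree_gnp(n, p, rng)
print(mindeg, n * p / 2)   # minimum degree vs. the np/2 threshold
```

Here $np/2 \approx 165$ sits roughly eleven standard deviations below the mean degree $\approx 329$, so even after a union bound over all $1000$ vertices the minimum degree clears the threshold comfortably.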

(b) Apply part (a): almost surely $\delta(G) \geq \frac{np}{2}$ as $n \rightarrow \infty$. Since $np >> \ln(n)$, we have $np \rightarrow \infty$, so $\frac{np}{2} \geq k$ for all sufficiently large $n$; hence $\delta(G) \geq k$ almost surely.
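To see the reduction in part (b) concretely, here is a small numeric sketch (using the hypothetical choice $p(n) = \ln(n)^2/n$, which satisfies the hypothesis) showing that the threshold $\frac{np}{2} = \frac{\ln(n)^2}{2}$ grows without bound, so it eventually exceeds any fixed $k$:

```python
import math

def half_np(n):
    # np/2 for the (hypothetical) choice p(n) = ln(n)^2 / n,
    # which satisfies n^{-1} ln(n) = o(p).
    p = math.log(n) ** 2 / n
    return n * p / 2

# half_np(n) = ln(n)^2 / 2 is increasing in n, so for any fixed k
# there is an n_0 with half_np(n) >= k for all n >= n_0.
for n in (10, 100, 10_000, 10**6):
    print(n, round(half_np(n), 1))
```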
 