MHB Are Events in Random Graphs from G(n, 1/2) Independent?

  • Thread starter: joypav
  • Tags: Graph, Random
joypav
Let $n$ be a positive integer, and let $G$ be a random graph from $G(n, 1/2)$. Let $e_1, \dots, e_{\binom{n}{2}}$ be the
possible edges on the vertex set $\{1, \dots, n\}$, and for each $i$, let $A_i$ be the event that $e_i \in E(G)$.
Prove that the events $A_1, \dots, A_{\binom{n}{2}}$ are independent.

Can I just have some help understanding what details I should be including here?
It's so trivial that I don't know how to write it down as a proof.
There is a probability of $\frac{1}{2}$ that any given possible edge is an edge of the graph, and it seems obvious that whether one edge is in the graph has nothing at all to do with whether another edge is in the graph.

So if we choose an arbitrary subset of $A_1, \dots, A_{\binom{n}{2}}$, the probability that all of the events in the subset occur should be the product of their individual probabilities, which is exactly the definition of (mutual) independence.
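To be concrete, here is roughly what I have in mind for the write-up, using the "uniform over all labelled graphs" description of $G(n, 1/2)$ (please tell me if this is enough detail):

Let $m = \binom{n}{2}$ and fix any $S \subseteq \{1, \dots, m\}$ with $|S| = k$. A graph lies in $\bigcap_{i \in S} A_i$ exactly when it contains every edge $e_i$ with $i \in S$; the remaining $m - k$ possible edges are unrestricted, so $2^{m-k}$ of the $2^m$ equally likely graphs belong to this event. Hence
$$P\left(\bigcap_{i \in S} A_i\right) = \frac{2^{m-k}}{2^m} = \left(\frac{1}{2}\right)^{k} = \prod_{i \in S} P(A_i),$$
and since $S$ was arbitrary, the events $A_1, \dots, A_m$ are mutually independent.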
 
joypav said:
It's so trivial that I don't know how to write it down as a proof.

In $G(n, p)$, the $\binom{n}{2}$ possible edges are included independently, each with probability $p$; that is part of the definition of $G(n, p)$. So the question is indeed asking you to prove something that is immediate from the definition: for any set $S$ of indices, $P\left(\bigcap_{i \in S} A_i\right) = p^{|S|} = \prod_{i \in S} P(A_i)$, which with $p = \frac{1}{2}$ is exactly the condition you stated.
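If you want a numerical sanity check (not a proof) that the "uniform over all graphs" and "independent coin flip per edge" points of view agree, here is a small sketch in Python; the variable names and the choice of testing the first three edges are mine:

```python
import random
from itertools import combinations

# Sanity check (not a proof): draw G uniformly from all 2^C(n,2) labelled
# graphs on {1, ..., n} by picking a uniform random bitmask with one bit per
# possible edge, then compare the empirical frequency of A_1 ∩ A_2 ∩ A_3
# with (1/2)^3 = P(A_1) P(A_2) P(A_3).

n = 5
edges = list(combinations(range(1, n + 1), 2))  # e_1, ..., e_{C(n,2)}
m = len(edges)                                  # C(n,2) = 10 when n = 5

trials = 200_000
joint_hits = 0
for _ in range(trials):
    mask = random.getrandbits(m)                # uniform over all 2^m graphs
    if all((mask >> i) & 1 for i in range(3)):  # the event A_1 ∩ A_2 ∩ A_3
        joint_hits += 1

print(joint_hits / trials, "should be close to", 0.5 ** 3)
```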
 