I have a few questions regarding the derivation of the superficial degree of divergence for Feynman diagrams. The result is $$D = [g_E] - \sum_{n=3}^{\infty} V_n [g_n]$$ (following the notation in Srednicki, p. 118).

I am trying to understand what ##[g_E]## is here. Since in this setup we are summing over all possible scalar interactions ##n \in \{3, 4, \dots\}##, we will have a tree-level diagram with a single local interaction corresponding to the case ##E = i##, where ##i## is some element of that set. So is it right to say that ##[g_E]## refers to a particular element ##E## of the set ##\{3, 4, \dots\}##, i.e. the dimension the coupling of an ##E##-point vertex would carry?

If that is correct, then in the theory $$\mathcal L = \frac{1}{2} Z_{\phi} \partial_{\mu} \phi \partial^{\mu} \phi - \frac{1}{2}Z_m m^2 \phi^2 - \frac{1}{k!} Z_g g \phi^k$$ (i.e. a theory where we now include only one interaction from the above set), the formula for ##D## reduces to ##D = [g_E] - V_k [g]##, and in this case ##[g_E] = [g]## whenever ##E = k##, with ##V_k## the number of times the vertex appears in a given diagram? Is that a correct understanding?
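As a quick numerical sanity check (my own sketch, not from Srednicki's text), the two ingredients can be coded directly: the coupling dimension ##[g_n] = d - n(d-2)/2## for a ##\phi^n## vertex in ##d## spacetime dimensions, and the formula ##D = [g_E] - \sum_n V_n [g_n]##. The function names and the `vertices` dictionary are my own illustrative choices:

```python
def coupling_dim(n, d):
    """Mass dimension [g_n] of a phi^n coupling in d spacetime dimensions."""
    return d - n * (d - 2) / 2

def degree_of_divergence(E, vertices, d):
    """D = [g_E] - sum_n V_n [g_n].

    E        : number of external legs; [g_E] is the dimension a coupling
               for an E-point vertex would carry.
    vertices : dict mapping n -> V_n (how many phi^n vertices the diagram has).
    """
    return coupling_dim(E, d) - sum(
        V * coupling_dim(n, d) for n, V in vertices.items()
    )

# phi^3 theory in d = 6: [g_3] = 0, so D = [g_E] = 6 - 2E for every diagram.
print(degree_of_divergence(E=2, vertices={3: 2}, d=6))  # 2.0 (propagator correction diverges)
print(degree_of_divergence(E=4, vertices={3: 4}, d=6))  # -2.0 (superficially finite)
```

In ##d = 6## ##\phi^3## theory the coupling is dimensionless, so ##D## depends only on ##E##, matching the familiar result that only the 1-, 2-, and 3-point functions are superficially divergent there.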

**Physics Forums | Science Articles, Homework Help, Discussion**


# I Superficial degree of divergence for scalar theories

