michael879
Ok, so my main question here is about the standard Bekenstein bound, but I'll also go into why I'm asking, in case anyone has any comments on that.
The way I understand the derivation of the Bekenstein bound is:
If you have some closed system with energy E bounded by radius R, you can derive the bound by "lowering" it into a black hole with mass M (radius 2M, in units G = c = 1) and enforcing the 2nd law of thermodynamics: [itex]S_{bh}(M) + S_{system}(E,R) \leq S_{bh}(M+E-E_{rad}) + S_{rad}[/itex]. Neglecting the radiation when M >> E, you get [itex]S_{system}(E,R) \leq S_{bh}(M+E) - S_{bh}(M)[/itex]. If you then reduce M to the same order as R, while still keeping M >> E, you arrive at the Bekenstein bound up to some constant. As far as I can tell, this constant is fixed by requiring that a black hole bounded by its event horizon have the maximum possible entropy.
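For concreteness (assuming units G = c = [itex]\hbar[/itex] = k = 1, so [itex]S_{bh}(M) = 4\pi M^2[/itex]), the right-hand side expands as:

[tex]S_{bh}(M+E) - S_{bh}(M) = 4\pi\left[(M+E)^2 - M^2\right] = 8\pi M E + 4\pi E^2 \approx 8\pi M E,[/tex]

so taking [itex]2M \sim R[/itex] gives [itex]S_{system} \lesssim 4\pi E R[/itex], which is the Bekenstein bound [itex]2\pi E R[/itex] up to a factor of order unity, consistent with the "up to some constant" above.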
Now my question is: why, and how, do you eliminate the entropy of the radiation from this inequality? Clearly any system falling into a black hole will produce radiation, which carries away both energy and entropy. I understand that if M >> E the energy of the radiation is negligible, but why is its entropy contribution negligible too?
The reason I'm asking is that I'm trying to derive a generalized Bekenstein bound for a system with parameters (m, q, j) by "lowering" it into a black hole with parameters (M, Q, J). By adjusting M, Q, and J you should be able to come up with a tighter maximum entropy for the system, since the bound takes more of its parameters into account. The issue is that a charged system falling into a black hole emits much more radiation than an uncharged one, and if the black hole is rotating the situation is even worse! So I'd like to know what restrictions I need to place on the setup to justify dropping the radiative terms from the inequality.