Is an infinite series of random numbers possible?

In summary: The set of random numbers exhausts integral order and corresponding numerical magnitude. Random numbers may be generated by exchanging orders with magnitudes. I know it sounds philosophical, but how would you generate a random sequence of numbers? If you want to 'generate' something, you'll have to use some apparatus, like a coin toss for example.
  • #71
Loren Booda said:
I'm back up. Would you like to proceed at a more leisurely pace?

I've got a lot of classes today, but after classes tomorrow I'll share my thoughts.
 
  • #72
I wonder how come moderators haven't closed this thread loooooong ago. Not only has it drifted away from the OP a lot, but

it also does not deal mainly with mathematics anymore but physics, and it seems to be a social two-convo...

DonAntonio
 
  • #73
DonAntonio said:
I wonder how come moderators haven't closed this thread loooooong ago. Not only has it drifted away from the OP a lot, but

it also does not deal mainly with mathematics anymore but physics, and it seems to be a social two-convo...

DonAntonio

Well, this has branched into a discussion of physics, and entropy relates to the idea of randomness in a very natural way. It also relates to mathematics, and not specifically physics, because the conversation deals with the topic of entropy in a general information-theoretic context, which is probabilistic and statistical and not just physical.

Also now that you have popped in, it is now a social three-convo ;).
 
  • #74
[Speculation]Sequences that most closely describe a set of random numbers are those representable by a minimum number of algorithms.
 
  • #75
Loren Booda said:
[Speculation]Sequences that most closely describe a set of random numbers are those representable by a minimum number of algorithms.

Ok, I'm back from classes so I'll go over your previous post shortly. I'll answer this one now though.

What you need to clarify is whether the sequence itself is finite or infinite.

If the sequence is finite, the answer is a definite yes, since there will be a finite number of mappings for every combination of values (we also assume the range is a finite set; if it isn't, we basically have a variation of the infinite case).

If the sequence is infinite, then we have an infinite number of possible mappings if there are no constraints placed on the sequence.

If you place a constraint on the sequence (even in a stochastic sense), then you will get constraints on the types of mappings you can get but it depends specifically on the constraints.

One way to clarify this idea is to look at the entropy of the actual mapping itself. The mapping is a form of information, and you can calculate its entropy in a similar kind of way, by calculating the information content of the sequence. If the sequence is finite, the range finite, and we assume maximum entropy by saying each sequence element has maximal uncertainty relative to every other element, then we should get a maximum entropy that depends on both the cardinality of the range set and the length of the sequence.

If there are given constraints, then they will most likely lower the entropy depending on what the constraints actually are.

With the right constraints you could have infinite sequences with bounded entropy, but in the general case you can't assume this.

For an intuitive way to picture this, think of our range being [0,1] or [Tails, Heads], and then think about a generalized sequence of length n that can take on these values. If we have no constraints, the entropy (in bits) is going to be equal to n, and if n is unbounded, then so is the entropy.
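To put rough numbers on this (a toy sketch; the function names are my own), the maximum entropy of an unconstrained length-n sequence over a finite range is just n times the per-element entropy:

```python
import math

def shannon_entropy_bits(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

def max_sequence_entropy(n, range_size):
    """Maximum entropy (bits) of a length-n sequence over a finite range,
    assuming each element is independent and uniformly distributed."""
    # Each unconstrained element contributes log2(range_size) bits.
    return n * math.log2(range_size)

print(shannon_entropy_bits([0.5, 0.5]))  # one fair coin flip: 1.0 bit
print(max_sequence_entropy(10, 2))       # 10 unconstrained flips: 10.0 bits
```

A constraint on the sequence would show up as non-uniform probabilities, and hence a smaller sum than the unconstrained maximum.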

However again, if we have the right constraints, then the entropy may be bounded for an infinite sequence under those constraints.

It would actually be interesting to come up with the classes of all sequences, with general domain and codomain, that are bounded in entropy.
 
  • #76
Loren Booda said:
[Speculation]A black hole emits thermally, but has lost all of its infalling information except for "No-Hair" quantities. Thus Hawking radiation is a function of mass, angular momentum and charge (i.e., temperatures of a black body). All quantum numbers have been reprocessed into those three. It would be difficult to differentiate between any order from the black hole horizon itself and anomalies near it.

http://en.wikipedia.org/wiki/Cosmic_censorship_hypothesis "The weak cosmic censorship hypothesis asserts there can be no singularity visible from future null infinity. In other words, singularities need to be hidden from an observer at infinity by the event horizon of a black hole." If there were naked singularities, perhaps they would interact and share relative entropy between themselves, point-to-point. The calculated value of entropy for the black hole is actually the relative entropy bounded by the hole's event horizon and its singularity.

As far as my limited understanding of physics goes, it is thought that since light cannot escape from a black hole, EM information under this paradigm ought not to either, if this model is valid. I don't know about other kinds of information, but at least the implication (and please correct me if I am wrong about this) is that photons, under the conditions that have been observed, are not able to escape a black hole, which is where I think all of these ideas stem from.

Now from a general point of view we have to consider all the information. In physics we usually associate the information with particles of certain kinds, and we have forces for these as well as fields, which are accounted for using the modern field theories of physics.

Now here's the kicker: what if we haven't found all the information yet? What if there is another particle or something similar with its own force-carrier and field, or even one that doesn't have a force-carrier and just works completely non-locally?

If you wanted to model this kind of non-local interaction, one way that I see it visually is that you can model this kind of information exchange in a situation where the distance between any two points is zero. Mathematically, in any metric space, if you have two distinct points x and y, then d(x,y) must be positive; but consider for the moment that you have a space where d(x,y) can be zero even when x is not y. What are the implications of this?
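For a concrete (toy) version of such a space: a 'distance' that is allowed to be zero between distinct points is what is usually called a pseudometric. A minimal sketch, where the labelled points are purely illustrative:

```python
def pseudometric(x, y):
    """A 'distance' on (label, value) pairs that ignores the label:
    d(x, y) = |value_x - value_y|.  Distinct points sharing a value sit
    at distance zero, violating the metric axiom d(x, y) > 0 for x != y,
    while symmetry and the triangle inequality still hold."""
    return abs(x[1] - y[1])

a = ("here", 3.0)
b = ("far_away", 3.0)       # a different point with the same value
print(pseudometric(a, b))   # 0.0 -- distinct points, zero 'distance'
```

One implication: points at mutual distance zero become indistinguishable to the geometry, which is one way to picture instantaneous (non-local) information exchange between them.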

So to answer the question specifically it will depend on whether all the known particles that we have are actually a representation of all the information in the system and also with regard to the interactions that are bound on these bits of information.

If the only information is the information contained in electrons, photons, protons, neutrons and all that other jazz and the assumptions for the constraints we have are also right, then mathematically it seems sound.

I'm skeptical, though, that we have discovered all the 'fields' as you would put it. The real answer to this is currently unknown, but I imagine that if there are new information quantities and mechanisms to communicate the information, then they will be found in something like the LHC.

However if you have to rely on mathematical arguments and existing data without having access to a particle accelerator with massive energies, you could look at any experimental situation where you get entropy anomalies.

Also, the thing is that we don't have black holes in the lab or nearby (at least to my knowledge :)), which means that we can't get the actual data. But then again, if (and this is an IMO hypothesis) you can create a black-hole-type scenario by inducing a situation of enough entropy that this mechanism is created (using the ideas talked about earlier in this very thread), then you could create such an element and study what happens.

In the RHIC experiment, they had what they called a 'fireball' when they smashed gold-ions together. If this was representative of 'entropy-control' or 'stability-enforcement', then it could give a bit of insight as to how a 'black-hole like mechanism' should act in an information theoretic context.

Again I would stress that in this situation people would really be playing with fire, but it would ultimately help us understand energy in a way that is not currently understood.

I actually think that the idea of deriving physical laws from a stability argument or through a minimization problem of functionals is actually a better way to do physics.

Now, a lot of physicists will say that this is just a mathematical exercise, and they are right when they say this. But to me, it makes sense that the best way to understand a really complex system, when you have a really small subset of data, is not only to look at the data and extrapolate the entire system from it, but also to do the reverse.

In other words, you start off with an argument that makes sense on a level that is consistent with both observation and mental intuition or reasoning and then from the space of all possibilities that can exist, introduce the right constraints and then come up with the potential solutions.

This is what I see in string theory and for the above reason I think that this line of thinking is much much better than trying to look at the tiniest sub-set of data and trying to extrapolate an entire system based off this data.

I'm not saying that we don't need experiments, because that would be absurd. What I'm saying is that doing physics from a derivational standpoint, at least conceptually, the way I have seen in some of my reading on string theory, makes more sense than just taking data and fitting it to a model: we need both, and the derivational method IMO provides more understanding of what is really going on.

In terms of energy in a general context, you need to think about the conditions for the system in layers. The first one I would impose is that things don't blow up but also don't end up static. The idea of blowing up means that you will need to understand all the possibilities where things can literally blow up, and this means incorporating chaos theory, dynamics and that kind of thing.

You also want the system to make sense: this means you incorporate logic. If a system doesn't make sense logically, then you should probably get rid of it. Although this is far removed from what physicists would deem useful, the idea of logic should be considered, since it helps identify systems that can easily be discarded from consideration. Remember, you want the minimal number of constraints, but not so few that you are missing key information: again, as Einstein said, make it as simple as possible, but no simpler.

Once you have these situations, you get possibilities that make sense from an intuitive viewpoint. Although this is very general, what you then do is consider other constraints that narrow things down. One might be the requirements for life: this will introduce constraints which will narrow things down. You need to also think of stability and other requirements for living things which will add more constraints and reduce the solution space.

This is why I see things like string theory as the better alternative for understanding something like physics and its child sciences like chemistry, biology, psychology and so on, over just collecting data and trying to fit it.

[Speculation]Statistics of quanta in black holes relies on a supersymmetry there between fermions and bosons:

Conventional black hole physics has sole extensive measurable quantities charge, mass, and angular momentum (the "No Hair" theorem). From these, the Hawking temperature, T, can be found. The statistical distribution n[B. H.] is a function of T, and predicts the occupation of the hole's internal quantum states with unobservable quanta:

n[B. H.]=n[F. D.]+n[B. E.]=csch(ε/κT)

where it is assumed that T is much greater than the Fermi temperature T_F for this black hole.
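As a numerical sanity check of the identity above (writing x = ε/κT, and assuming zero chemical potential in both distributions), the Fermi-Dirac and Bose-Einstein occupations really do sum to csch(x):

```python
import math

def n_fermi_dirac(x):
    """Fermi-Dirac occupation at x = energy / (k T)."""
    return 1.0 / (math.exp(x) + 1.0)

def n_bose_einstein(x):
    """Bose-Einstein occupation at x = energy / (k T)."""
    return 1.0 / (math.exp(x) - 1.0)

def csch(x):
    """Hyperbolic cosecant, 1 / sinh(x)."""
    return 1.0 / math.sinh(x)

for x in (0.1, 1.0, 3.0):
    total = n_fermi_dirac(x) + n_bose_einstein(x)
    print(x, total, csch(x))  # the last two columns agree
```

Algebraically: 1/(e^x + 1) + 1/(e^x - 1) = 2e^x/(e^(2x) - 1) = 2/(e^x - e^(-x)) = csch(x).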

The quantum within that normally designates Fermi-Dirac or Bose-Einstein statistics by its half- or whole-integer spin values has "lost its hair."

Note: Black hole equilibrium above requires the constraints put forth by Stephen Hawking in his seminal paper, Black Holes and Thermodynamics (Phys Rev D, 15 Jan 1976, p. 191-197).

http://en.wikipedia.org/wiki/Hidden_variable_theory -- (regarding encoded information), Bell's theorem would suggest (in the opinion of most physicists and contrary to Einstein's assertion) that local hidden variables are impossible. Some have tried to apply this to entangled states straddling the black hole horizon.

Pair production at the black hole horizon entangles an infalling virtual particle with its infalling (or escaping) antiparticle. Is this the only instance of either two entangled virtual particles annihilating each other or an escaped particle divulging the quantum numbers (other than "No Hair" quantities) of the fallen partner? The pair production creates opposite spins which do not measurably correlate for either the infalling-infalling particles or the infalling-escaping particles. Spin is macroscopically conserved by the hole in either case.

I will take a look at this later.

[Speculation]The black hole acts as a stabilizer by virtue of its great symmetry. If you have a mass of "intermediate" symmetry (of "No Hair" variables) and collide it with a black hole, the symmetry of the black hole would at least temporarily decrease. If a "high" symmetry mass collides with another of "low" symmetry, their resultant symmetry would be "intermediate." Pure mass, angular momentum and charge are of "high" symmetry, whereas other quantum numbers would be of "intermediate" symmetry. So only the "No Hairs" impose their symmetry on the geometry of the hole, while others "can't get out of the hole." Thus a Schwarzschild black hole becomes more massive, rotating or charged.

This is the thing about the black-hole.

I view that a black-hole type mechanism would basically be a way to control energy if it could be utilized.

But interestingly enough, this is kind of paradoxical in a way: if a black hole's role is to create a situation of stability, then how could you create a situation where you 'change' this stability?

The thing I see happening is that if you merge two of these objects together in some way and can control the process, then you will effectively be controlling the entropy and thus controlling energy in the context of the limits of entropy restrictions within the black-hole mechanism itself.

However, in saying this, I imagine that there will be limits on how effectively this can be done 'practically', and to me (this is my opinion), the idea of a process that allows one to create a black hole the size of the Earth from an initial Planck-scale black hole just doesn't make sense for some reason.

It might be done if we were able to harness every bit of energy in the absolute sense, but I get the feeling that it's just not going to happen considering how limited we are in harnessing even the most basic levels of energy (we still boil water to produce electricity, and we live in a world full of gigahertz computers!).

For this reason, even if it were possible, with the ways we harness energy now I'm not holding my breath. If we were able to harness energy in an absolute way, we would effectively be what most people would call 'God', and for some reason I am thankful that, currently on this planet, this is not the case for anyone.

[Speculation](Referring to the paper you cited)Loschmidt's Paradox would apply to Newtonian dynamics, statistical mechanics, quantum mechanics and general relativity, all being time reversible. Thus the paradox seems trivial, and as stated "one cannot prove" it.

The Fluctuation Theorem appears more plausible. In the manner of familiar statistical mechanics, two simple probabilities (of entropies representing antisymmetric processes) in the system limit yield the Second Law. It remains unvalidated.

Asymmetric time seems to be the sticking point with establishing violation of the second law. Simply put, we need a universal theory which incorporates time asymmetry to begin with. Building from limited theories, I believe, is putting the cart before the horse.

Staticity or chaos? First assume an Anthropic Principle. Next to the big bang, possibly the most powerful, turbulent entity of the universe is a supernova -- which leaves behind a black hole remnant! The black hole rebounds the might of the supernova. There is a point at which the supernova and black hole are sharing physics, Hawking radiation counteracting free quarks. Mass is fed into the nascent black hole, compressing even more the horizon, which most likely started as a plurality of such surfaces. As black holes merged from Planck to stellar, their entropy, and thus their temperature, accelerated as the sum of their radii squared. Where there once was a fluid of black holes and extreme turbulence is now a relatively cold gravitationally collapsed object within a ghostly nebula.

The funny thing with time is that it is only one kind of order in a general information-theoretic system.

Time is an order between successive events, and although it is a good way of understanding things, it's not the only order that is available.

In the general case, you can think about everything with respect to everything else, but for us this is just too hard to fathom and comprehend, let alone actually do, even with the help of a computer whose fast processing power leaves humans far behind.

I'm not going to speculate on the conjectures in your last paragraph and the answer has been said before and it's in two forms: 1) I don't know if we have discovered all the types of information that we can get access to and 2) We don't know how to harness even the most basic of energy.

I would wait until we see what happens when we look at situations of high energy concentration and high entropy of a given kind. Again, I'm kind of glad that the way the world is at the moment, that if the current status is any indication of energy development, then it's probably a good thing we are boiling water to drive our turbines and our TV's, microwaves and computers as well as our factories.

In terms of cosmology, my view is that if things become too disordered then we will get a black-hole scenario like you would see with a collapsing star but if we don't, then I don't think that it will necessarily happen.

It seems, at least from observation, that things were intended at several levels to be ordered and not chaotic. If you don't believe me, look outside: look at the order in the living ecosystem, the way that things just get done without having to really think about them, and that kind of thing. Every scientist in every field IMO will tell you this, and I think they will all admit that it's amazing that everything just 'works' the way it does.

Also, with the time-asymmetry, I can see reasons why you would have this in terms of the evolution of a general system. If systems evolve, then that tells me there are going to be criteria to follow for something to evolve, which would mean that things have to progress in some way. That's the shortest way I can put what I am thinking right now.

But in terms of the information, and the manipulation of energy, this is again not easy to answer because we don't actually know the limits of this. If we knew how to manipulate energy in an absolute sense we would be what most people refer to as 'God', because God in many ways (which I think is unfair) is synonymous with control. The idea of controlling things unfortunately is why I think it is again probably a good idea that we still boil water to power our TV sets.
 
  • #77
I find it interesting that Hawking wanted to know the mind of God: if God really does control energy, then Hawking has certainly understood that mind better than a lot of other people ;).
 
  • #78
[Speculation]Statistics of quanta in black holes relies on a supersymmetry there between fermions and bosons:

Conventional black hole physics has sole extensive measurable quantities charge, mass, and angular momentum (the "No Hair" theorem). From these, the Hawking temperature, T, can be found. The statistical distribution n[B. H.] is a function of T, and predicts the occupation of the hole's internal quantum states with unobservable quanta:

n[B. H.]=n[F. D.]+n[B. E.]=csch(ε/κT)

where it is assumed that T is much greater than the Fermi temperature T_F for this black hole.

The quantum within that normally designates Fermi-Dirac or Bose-Einstein statistics by its half- or whole-integer spin values has "lost its hair."

Note: Black hole equilibrium above requires the constraints put forth by Stephen Hawking in his seminal paper, Black Holes and Thermodynamics (Phys Rev D, 15 Jan 1976, p. 191-197).

http://en.wikipedia.org/wiki/Hidden_variable_theory -- (regarding encoded information), Bell's theorem would suggest (in the opinion of most physicists and contrary to Einstein's assertion) that local hidden variables are impossible. Some have tried to apply this to entangled states straddling the black hole horizon.

Pair production at the black hole horizon entangles an infalling virtual particle with its infalling (or escaping) antiparticle. Is this the only instance of either two entangled virtual particles annihilating each other or an escaped particle divulging the quantum numbers (other than "No Hair" quantities) of the fallen partner? The pair production creates opposite spins which do not measurably correlate for either the infalling-infalling particles or the infalling-escaping particles. Spin is macroscopically conserved by the hole in either case.

I think that Bell's theorem is the right way to go, but it has to be adapted for generic information-theoretic systems with generic information and communication properties. In other words, you use the ideas in Bell's theorem but extend them to find inequalities that could be used to verify any kind of non-local communication phenomenon, as well as a mathematical method of determining statistically whether more information exists to account for such anomalies.

You don't have to know what that information actually is and what it relates to, but just to say that 'according to the data, there is a statistical chance that we don't have all the information to explain what is going on': that kind of thing.
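To make the idea concrete, here is the standard CHSH form of a Bell inequality evaluated against the quantum singlet-state correlation E(a, b) = -cos(a - b). The angle choices are the textbook ones, not anything specific to this discussion, and any extension to 'generic information systems' would replace E with an empirically measured correlation:

```python
import math

def E(a, b):
    """Quantum correlation for singlet-state spin measurements at angles a, b."""
    return -math.cos(a - b)

# Standard CHSH angle choices (radians).
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

# Any local-hidden-variable model must satisfy |S| <= 2.
S = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))
print(S)       # ~2.828, i.e. 2*sqrt(2)
print(S > 2)   # True: quantum correlations violate the local bound
```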

Personally I think that Einstein was bound by the idea that information must travel like a ball travels through the air when you throw it, and I imagine to this very day that most people still think like this.

To answer your question specifically about the fermions and bosons: again, I know this might seem like a cop-out, but the answer is that I would need more information.

You will need to collect data and using the kind of techniques I touched on above when maturely developed, you would ascertain firstly whether there is an information exchange going on that does not involve the data and then based on this statistical analysis move to then focus on the conjecture. The short answer is that I would be speculating and that currently I just don't know.

You see the thing is that all of these analyses are done on the premise that we know both the communication mechanisms and the information content explicitly for the whole system.

What I am proposing is that you develop techniques to say the following:

'Ok, I've got this data, I've identified the information as blah blah blah, I've got this model for the communication mechanism. Let's see if there is a statistical possibility given the data that a) the information model that we have is not complete and b) the communication model that we have is not complete. If either of these seem likely then we're missing something big-time which means we find out where the anomaly that caused this is and take it from there. If there is a low chance of an anomaly, then we need to consider the nature of the data and the kinds of energies involved. If we are dealing with strictly low energies or largely a spectrum of low-energy scales then it might be a wise idea to keep that in mind when doing an analysis of some sort.'
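A minimal, hypothetical stand-in for that kind of 'is my model complete?' check is an ordinary Pearson chi-square goodness-of-fit test; the counts below are invented purely for illustration:

```python
def chi_square_statistic(observed, expected):
    """Pearson chi-square statistic comparing observed counts with the
    counts a model predicts; a large value is statistical evidence that
    the model is missing something."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical example: a model predicts a 50/50 split over 1000 trials.
expected = [500.0, 500.0]

well_fit = chi_square_statistic([510, 490], expected)  # 0.4: model looks fine
anomaly = chi_square_statistic([580, 420], expected)   # 25.6: something missing?

CRITICAL_5PCT_DF1 = 3.841  # chi-square critical value, 1 d.o.f., 5% level
print(well_fit > CRITICAL_5PCT_DF1)  # False: no evidence of an anomaly
print(anomaly > CRITICAL_5PCT_DF1)   # True: reject the model as stated
```

A real version of the test chiro describes would of course need a model of both the information content and the communication mechanism, but the logical shape is the same: quantify the mismatch between data and model, and flag it when it exceeds what chance allows.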

You then take the above idea and mix it with the 'derivation approach'. In other words: given the data, are there any 'guiding principles', like the stability and staticity constraints, that can be extrapolated from it?

The thing is, you need both. If you have shown some statistical confidence that you don't have all the information, it's really hard to proceed with an analysis that assumes you do. It's also useful to acknowledge an information gap, because when you do an analysis you can take it into account, especially when making some kind of conjecture in mathematical physics. It strengthens the analysis because it allows one to consider not only what one doesn't know, but also the specifics of it, if the analysis can be narrowed down to which information caused the gap.

Also for these kinds of questions, we are conjecturing about situations which involve extremely high energies and if we lose account of this, then we might be 'putting the cart before the horse' so to speak.

So to conclude: the first thing is to extend Bell's theorem as outlined above and statistically determine whether there is an anomaly in the information and communication model being used, to see if there is statistical evidence that it is 'incomplete', and then take it from there before making conjectures that are too inductive.

I do think, though, that studying black holes, even when we can't do this kind of experimentation, is useful, because the black-hole mechanism will generate a lot of fruitful discussion on the limits and handling of energy in stable systems. For this reason alone it is extraordinarily useful: it allows us to formulate the kind of 'guiding principles' we can use to come up with the right constraints, ones that are intuitive from deductive rather than inductive reasoning. Although I recall Hawking saying that 'all his work might have been for nothing' (or something to that effect), I actually think his life was spent in a very fruitful endeavor when you consider the consequences in this context.
 
  • #79
As far as my limited understanding of physics goes, it is thought that since light cannot escape from a black hole, EM information under this paradigm ought not to either, if this model is valid. I don't know about other kinds of information, but at least the implication (and please correct me if I am wrong about this) is that photons, under the conditions that have been observed, are not able to escape a black hole, which is where I think all of these ideas stem from.

Now from a general point of view we have to consider all the information. In physics we usually associate the information with particles of certain kinds, and we have forces for these as well as fields, which are accounted for using the modern field theories of physics.

Now here's the kicker: what if we haven't found all the information yet? What if there is another particle or something similar with its own force-carrier and field, or even one that doesn't have a force-carrier and just works completely non-locally?


[Speculation]I understand that information which otherwise has the potential to "reach" infinity (the spin and mass effects of gravity, through gravitons, and the charge effects of E-M radiation, through photons) has the potential to escape the black hole's event horizon through Hawking radiation. The photons or gravitons which escape a black hole do so obeying a black body spectrum.

What if such particles, according to their specific energies, together fulfill black body states so that such a spectrum is indistinguishable, part photonic and part gravitonic? That is, black body in energy yet anomalous in particle species.

The Higgs seems a candidate for an entity of greater information. See http://en.wikipedia.org/wiki/Higgs_boson : The Higgs boson is a hypothetical elementary particle predicted by the Standard Model (SM) of particle physics. It belongs to a class of particles known as bosons, characterized by an integer value of their spin quantum number. The Higgs field is a quantum field with a non-zero value that fills all of space, and explains why fundamental particles such as quarks and electrons have mass. The Higgs boson is an excitation of the Higgs field above its ground state.

The existence of the Higgs boson is predicted by the Standard Model to explain how spontaneous breaking of electroweak symmetry (the Higgs mechanism) takes place in nature, which in turn explains why other elementary particles have mass. Its discovery would further validate the Standard Model as essentially correct, as it is the only elementary particle predicted by the Standard Model that has not yet been observed in particle physics experiments. The Standard Model completely fixes the properties of the Higgs boson, except for its mass. It is expected to have no spin and no electric or color charge, and it interacts with other particles through weak interaction and Yukawa interactions. Alternative sources of the Higgs mechanism that do not need the Higgs boson are also possible and would be considered if the existence of the Higgs boson were ruled out. They are known as Higgsless models.


__________


If you wanted to model this kind of non-local interaction, one way that I see it visually is that you can model this kind of information exchange in a situation where the distance between any two points is zero. Mathematically, in any metric space, if you have two distinct points x and y, then d(x,y) must be positive; but consider for the moment that you have a space where d(x,y) can be zero even when x is not y. What are the implications of this?

So to answer the question specifically it will depend on whether all the known particles that we have are actually a representation of all the information in the system and also with regard to the interactions that are bound on these bits of information.

If the only information is the information contained in electrons, photons, protons, neutrons and all that other jazz and the assumptions for the constraints we have are also right, then mathematically it seems sound.

I'm skeptical, though, that we have discovered all the 'fields' as you would put it. The real answer to this is currently unknown, but I imagine that if there are new information quantities and mechanisms to communicate the information, then they will be found in something like the LHC.

However if you have to rely on mathematical arguments and existing data without having access to a particle accelerator with massive energies, you could look at any experimental situation where you get entropy anomalies.

Also, the thing is that we don't have black holes in the lab or nearby (at least to my knowledge :)), which means that we can't get the actual data. But then again, if (and this is an IMO hypothesis) you can create a black-hole-type scenario by inducing a situation of enough entropy that this mechanism is created (using the ideas talked about earlier in this very thread), then you could create such an element and study what happens.

In the RHIC experiment, they had what they called a 'fireball' when they smashed gold-ions together. If this was representative of 'entropy-control' or 'stability-enforcement', then it could give a bit of insight as to how a 'black-hole like mechanism' should act in an information theoretic context.


[Aside]Non-local interactions

To dramatize what's happening in this EPR experiment, imagine that the Green detector is on Earth, and the Blue detector is on Betelgeuse (540 light-years away), while twin-state correlated light is coming from a spaceship parked halfway in between. Although in its laboratory versions the EPR experiment spans only a room-size distance, the immense dimensions of this thought experiment remind us that, in principle, photon correlations don't depend on distance.

The spaceship acts as a kind of interstellar lighthouse, directing a Green light beam to Earth and a Blue light beam to Betelgeuse in the opposite direction. Forget for the moment that the Green and Blue detectors are measuring something called "polarization" and regard their outputs as coded messages from the spaceship. Two synchronized binary message sequences composed of ups and downs emerge from calcite crystals 500 light-years apart. How these two messages are connected is the concern of Bell's proof.

When both calcites are set at the same angle (say, twelve o'clock), then PC = 1. Green polarization matches perfectly with Blue. Two typical synchronized sequences of distant P measurements might look like this:

GREEN:UDUDDUDDDUUDUDDU
BLUE: UDUDDUDDDUUDUDDU

If we construe these polarization measurements as binary message sequences,
then whenever the calcites are lined up, the Blue observer on Betelgeuse gets
the same message as the Green observer on Earth.
Since PC varies from 1 to 0 as we change the relative calcite angle,
there will be some angle α at which PC = 3/4. At this angle, for every four
photon pairs, the number of matches (on the average), is three while the
number of misses is one. At this particular calcite separation, a sequence
of P measurements might look like this:

GREEN:UDDDDUDDDUDDUDDU
BLUE: UDUDDDUDDUUDUDDU

At angle α, the messages received by Green and Blue are not the same but
contain "errors"—G's message differs from B's message by one miss in every
four marks.
Now we are ready to demonstrate Bell's proof. Watch closely; this proof is so short
that it goes by fast. Align the calcites at twelve o'clock. Observe that the messages are
identical. Move the Green calcite by α degrees. Note that the messages are no longer
the same but contain "errors"—one miss out of every four marks. Move the Green calcite
back to twelve and these errors disappear, the messages are the same again. Whenever
Green moves his calcite by α degrees in either direction, we see the messages differ
by one character out of four. Moving the Green calcite back to twelve noon restores
the identity of the two messages.
The same thing happens on Betelgeuse. With both calcites set at twelve noon,
messages are identical. When Blue moves her calcite by α degrees in either direction, we
see the messages differ by one part in four. Moving the Blue calcite back to twelve noon
restores the identity of the two messages.
Everything described so far concerns the results of certain correlation experiments
which can be verified in the laboratory. Now we make an assumption about what might
actually be going on—a supposition which cannot be directly verified: the locality
assumption, which is the core of Bell's proof.
We assume that turning the Blue calcite can change only the Blue message; likewise
turning the Green calcite can change only the Green message. This is Bell's famous
locality assumption. It is identical to the assumption Einstein made in his EPR paradox:
that Blue observer's acts cannot affect Green observer's results. The locality
assumption—that Blue's acts don't change Green's code—seems entirely reasonable:
how could an action on Betelgeuse change what's happening right now on Earth?
However, as we shall see, this "reasonable" assumption leads immediately to an
experimental prediction which is contrary to fact. Let's see what this locality
assumption forces us to conclude about the outcome of possible experiments.
With both calcites originally set at twelve noon, turn Blue calcite by α degrees, and at
the same time turn Green calcite in the opposite direction by α degrees. Now the
calcites are misaligned by 2α degrees. What is the new error rate?
Since turning Blue calcite α degrees puts one miss in the Blue sequence (for every
four marks) and turning the Green calcite α degrees puts one miss in the Green
sequence, we might naively guess that when we turn both calcites we will get exactly
two misses per four marks. However, this guess ignores the possibility that a "Blue
error" might fall on the same mark as a "Green error"—a coincidence which produces
an apparent match and restores character identity. Taking into account the possibility of
such "error-correcting overlaps," we revise our error estimate and predict that whenever
the calcites are misaligned by 2α degrees, the error rate will be two misses—or less.
This prediction is an example of a Bell inequality. This Bell inequality says: If the
error rate at angle α is 1/4, then the error rate at twice this angle cannot be greater
than 2/4.
This Bell inequality follows from the locality assumption and makes a definite
prediction concerning the value of the PC attribute at a certain angle for photon pairs in
the twin state. It predicts that when the calcites are misaligned by 2α degrees the
difference between the Green and Blue polarization sequences will not exceed
two misses out of four marks. The quantum facts, however, say otherwise. John
Clauser and Stuart Freedman carried out this EPR experiment at Berkeley and
showed that a calcite separation of 2α degrees gives three misses for every four
marks - a quite substantial violation of the Bell inequality.
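The arithmetic in this excerpt can be reproduced from the standard quantum rule for twin-state photons, PC(θ) = cos²θ (this rule is my addition; the excerpt only quotes its consequences). Taking α = 30° gives exactly one miss in four, and doubling the misalignment to 60° gives three misses in four, past the locality bound of two in four:

```python
import math

def match_prob(theta_deg):
    """Quantum-mechanical match probability PC for twin-state photons
    with calcites misaligned by theta degrees: PC = cos^2(theta)."""
    return math.cos(math.radians(theta_deg)) ** 2

alpha = 30.0                             # angle where PC = 3/4, one miss in four
err_alpha = 1 - match_prob(alpha)        # 0.25, as in the excerpt
err_2alpha = 1 - match_prob(2 * alpha)   # 0.75: three misses in four

bell_bound = 2 * err_alpha               # locality allows at most two misses in four
print("Bell inequality violated:", err_2alpha > bell_bound)  # True
```

The "error-correcting overlap" argument in the excerpt only ever lowers the locally predicted error rate, which is why any observed rate above 2/4 at 2α is fatal for the locality assumption.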
Clauser's experiment conclusively violates the Bell inequality. Hence one of
the assumptions that went into its derivation must be false. But Bell's argument
uses mainly facts that can be verified - photon PCs at particular angles. The only
assumption not experimentally accessible is the locality assumption. Since it
leads to a prediction that strongly disagrees with experimental results, this
locality assumption must be wrong. To save the appearances, we must deny
locality.
Denying locality means accepting the conclusion that when Blue observer
turns her calcite on Betelgeuse she instantly changes some of Green's
code on Earth. In other words, locations B and G, some five hundred
light-years apart, are linked somehow by a non-local interaction.
This experimental refutation of the locality assumption is the factual basis
of Bell's theorem: no local reality can underlie the results of the EPR
experiment.
Nick Herbert, Quantum Reality: Beyond the New Physics (Anchor, 1987, ISBN 0-385-23569-0)

[Speculation]Does the violation of the probabilistic Bell inequality relate to something like the second law of thermodynamics? Would black hole Hawking radiation obey a "Bell equality"?

The best way to detect a black hole may be to seek its spectrum of annihilation. This may be relatively thermal at first but also discretized -- as the hole diminishes, so does the number of constituent particles available to radiate and fill out the Planck curve. The upper limit on such spectra may determine the upper limit on black hole density.
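To put a number on "relatively thermal": the standard Hawking temperature formula (my addition, not from the post) is T = ħc³/(8πG M k_B), so the spectrum gets hotter as the hole diminishes.

```python
import math

# CODATA-style SI constants (rounded).
hbar  = 1.054571817e-34  # J s
c     = 2.99792458e8     # m / s
G     = 6.67430e-11      # m^3 / (kg s^2)
k_B   = 1.380649e-23     # J / K
M_sun = 1.989e30         # kg

def hawking_temperature(M):
    """Black-body temperature of a black hole of mass M (kg):
    T = hbar * c**3 / (8 * pi * G * M * k_B)."""
    return hbar * c**3 / (8 * math.pi * G * M * k_B)

print(hawking_temperature(M_sun))        # ~6e-8 K: far colder than the CMB
print(hawking_temperature(M_sun / 1e6))  # a million times lighter -> a million times hotter
```

The inverse dependence on M is the quantitative version of "as the hole diminishes, the spectrum fills out at higher temperature."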

Given that a "Planck datum" is the smallest unit of information, how many would be necessary to describe our physical universe? Maybe a myriad of identical cosmological, intersecting black holes would similarly suffice.

Since the highly symmetric black hole requires high energy to create, we will gradually produce entities of closer and closer approximations in symmetric reactions. On the other hand, we may assemble a pocket watch.
 
  • #80
In the RHIC experiment, they had what they called a 'fireball' when they smashed gold-ions together. If this was representative of 'entropy-control' or 'stability-enforcement', then it could give a bit of insight as to how a 'black-hole like mechanism' should act in an information theoretic context.

Again I would stress that in this situation, people would really be playing with fire, but it would ultimately help us understand energy in a way that is not currently understood.

I actually think that the idea of deriving physical laws from a stability argument or through a minimization problem of functionals is actually a better way to do physics.

Now a lot of physicists will say that this is just a mathematical exercise, and they are right when they say this. But to me, the best way to understand a really complex system when you have only a really, really small subset of data is not just to look at the data and extrapolate the entire system from it, but to do the reverse.

In other words, you start off with an argument that makes sense on a level that is consistent with both observation and mental intuition or reasoning and then from the space of all possibilities that can exist, introduce the right constraints and then come up with the potential solutions.

This is what I see in string theory and for the above reason I think that this line of thinking is much much better than trying to look at the tiniest sub-set of data and trying to extrapolate an entire system based off this data.

I'm not saying that we don't need to experiment, because that would be absurd. What I'm saying is that doing physics from a derivational standpoint, at least conceptually, as I have seen in some of my reading on string theory, makes more sense than just taking data and fitting it to a model: we need both, and the derivational method IMO provides more understanding of what is really going on.

In terms of energy in a general context, you need to think about the conditions for the system in layers. The first one I would impose is that things don't blow up but also don't end up static. The idea of blowing up means that you will need to understand all the possibilities where things can literally blow up, and this means incorporating chaos theory, dynamics and that kind of thing.

You also want the system to make sense: this means you incorporate logic. If a system doesn't make sense logically, then you should probably get rid of it. Although this is far removed from what physicists would deem useful, the idea of logic should be considered since it helps identify systems that can easily be discarded from consideration. Remember, you want the minimal number of constraints, but not so few that you are missing key information: again, as Einstein said, make it as simple as possible, but no simpler.

Once you have these situations, you get possibilities that make sense from an intuitive viewpoint. Although this is very general, what you then do is consider other constraints that narrow things down. One might be the requirements for life: this will introduce constraints which will narrow things down. You need to also think of stability and other requirements for living things which will add more constraints and reduce the solution space.

This is why I see things like string theory as the better alternative for understanding something like physics and its child sciences like chemistry, biology, psychology and so on, over just collecting data and trying to fit it.

__________


[Speculation]Indeed, hypothesis is the precursor of experiment, but experiment itself spurs on hypotheses. Without experimental apparatus, hypothesis is more intuitive than physical, but experiment probably preceded hypothesis in much of ancient history.

Reductionism is a widely accepted way to do science, although my botanist friend pokes fun at me for that approach. Stability might be approached through the constancy of the most sensible variables. I have found that comparing limits (as one might compare entropy relative to macroscopic and microscopic horizons) yields simplified mathematical answers to physics.

String theory has an input of experience (in the Standard Model) but not output (predictions or computability). Strings make a good starting place, though, because of their stability (reproducibility) and universality (symmetries).

As John Archibald Wheeler coined it, physics is "It from Bit." In other words, all physics could be derived from a binary code.


__________


This is the thing about the black-hole.

I view that a black-hole type mechanism would basically be a way to control energy if it could be utilized.

But interestingly enough, this is kind of paradoxical, because if a black hole's role is to create a situation of stability, then how could you create a situation where you 'change' this stability?

The thing I see happening is that if you merge two of these objects together in some way and can control the process, then you will effectively be controlling the entropy and thus controlling energy in the context of the limits of entropy restrictions within the black-hole mechanism itself.

However, in saying this, I imagine that there will be limits on how effectively this can be done 'practically', and to me (this is my opinion), the idea of just creating a process that grows a black hole the size of the Earth from an initial Planck-scale black hole doesn't make sense for some reason.

It might be done if we were able to harness every bit of energy in the absolute sense, but I get a feeling that it's just not going to happen considering how limited we are in harnessing even the most basic levels of energy (we still boil water to produce electricity, and we live in a world full of gigahertz computers!).

For this reason, even if it were possible, with the ways we harness energy now I'm not holding my breath. If we were able to harness energy in an absolute way, we would effectively be what most people would call 'God', and for some reason I am thankful that currently on this planet this is not the case for anyone.


http://en.wikipedia.org/wiki/Penrose_process The Penrose process (also called Penrose mechanism) is a process theorized by Roger Penrose wherein energy can be extracted from a rotating black hole. That extraction is made possible because the rotational energy of the black hole is located, not inside the event horizon of the black hole, but on the outside of it in a region of the Kerr spacetime called the ergosphere, a region in which a particle is necessarily propelled in locomotive concurrence with the rotating spacetime. All objects in the ergosphere become dragged by a rotating spacetime. In the process, a lump of matter enters into the ergosphere of the black hole, and once it enters the ergosphere, it is split into two. The momentum of the two pieces of matter can be arranged so that one piece escapes to infinity, whilst the other falls past the outer event horizon into the hole. The escaping piece of matter can possibly have greater mass-energy than the original infalling piece of matter, whereas the infalling piece has negative mass-energy. In summary, the process results in a decrease in the angular momentum of the black hole, and that reduction corresponds to a transference of energy whereby the momentum lost is converted to energy extracted.
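The Wikipedia excerpt can be made quantitative with the irreducible-mass formula for a Kerr black hole (standard general relativity, my addition, not from the excerpt): the Penrose process can reduce the hole's mass only down to M_irr = M·sqrt((1 + sqrt(1 − a*²))/2), where a* is the dimensionless spin, so at most 1 − 1/√2 ≈ 29% of an extremal hole's mass-energy is extractable.

```python
import math

def extractable_fraction(a_star):
    """Maximum fraction of a Kerr black hole's mass-energy that the
    Penrose process can extract, for dimensionless spin a* in [0, 1].
    Uses the irreducible mass M_irr = M * sqrt((1 + sqrt(1 - a*^2)) / 2)."""
    m_irr_over_m = math.sqrt((1 + math.sqrt(1 - a_star**2)) / 2)
    return 1 - m_irr_over_m

print(extractable_fraction(0.0))  # 0.0: a non-rotating hole yields nothing
print(extractable_fraction(1.0))  # ~0.293: the famous 29% for an extremal hole
```

This is exactly the "decrease in angular momentum converted to extracted energy" described in the excerpt, with the irreducible mass playing the role of the entropy-like quantity that can never decrease.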


[Speculation]Paradoxically, the black hole has an upper universal limit of its luminosity at its least mass-energy, but a lower universal limit of its luminosity at its greatest mass-energy.

Physics is a great invention in that it has the potential for peace, as mathematics has the potential for truth. Energy, tempered by truth, has the potential for peace.

[Aside]Recall that a black hole is the equivalent of a white hole according to Hawking.

__________

The funny thing with time is that it is only one kind of order in a general information-theoretic system.

Time is an order between successive events and although it is a good way to understanding things, it's not the only order that is available.

In the general case, you can think about everything with respect to everything else, but for us this is just too hard to fathom and comprehend, let alone actually do, even with the help of a computer whose fast processing power leaves humans a long way behind.

I'm not going to speculate on the conjectures in your last paragraph and the answer has been said before and it's in two forms: 1) I don't know if we have discovered all the types of information that we can get access to and 2) We don't know how to harness even the most basic of energy.

I would wait until we see what happens when we look at situations of high energy concentration and high entropy of a given kind. Again, I'm kind of glad that the way the world is at the moment, that if the current status is any indication of energy development, then it's probably a good thing we are boiling water to drive our turbines and our TV's, microwaves and computers as well as our factories.

In terms of cosmology, my view is that if things become too disordered then we will get a black-hole scenario like you would see with a collapsing star but if we don't, then I don't think that it will necessarily happen.

It seems, at least from observation, that things were intended at a few levels to be ordered and not chaotic, and if you don't believe me, look outside: look at the order in the living ecosystem, the way that things just get done without having to really think about them, and that kind of thing. Every scientist in every field IMO will tell you this, and I think they will all admit that it's amazing that everything just 'works' the way it does.

Also with the time-asymmetry, I can see reasons why you would have this in terms of the evolution of a system for a general system. If systems were to evolve, then the fact that they would evolve tells me that there is going to be criterion to follow for something to evolve which would mean that things have to progress in some way. That's the shortest way I can put what I am thinking right now.

But in terms of the information, and the manipulation of energy, this is again not easy to answer because we don't actually know the limits of this. If we knew how to manipulate energy in an absolute sense we would be what most people refer to as 'God', because God in many ways (which I think is unfair) is synonymous with control. The idea of controlling things unfortunately is why I think it is again probably a good idea that we still boil water to power our TV sets.


[Speculation]Suppose all orders in a physical information-theoretic system are finitely related. Perhaps all physical information is directly connected to all other physical information. Would most probable universes tend to be "overly simplistic" or "overly chaotic," or are orders with life favored due to the Anthropic principle? It may be deceptive to compare our universe with universes seemingly of the first two types.

Relative measurements, like those of time, involve only two points out of a whole universe.

"Time is nature's way of keeping everything from happening at once." -- Woody Allen

__________


I think that Bell's theorem is the right way to go, but it has to be adapted for generic information-theoretic systems with generic information and communication properties. In other words, you use the ideas in Bell's theorem but you extend it to find inequalities that could verify any kind of non-local communication phenomenon, as well as a mathematical method of determining statistically whether more information exists to account for such anomalies.

You don't have to know what that information actually is and what it relates to, but just to say that 'according to the data, there is a statistical chance that we don't have all the information to explain what is going on': that kind of thing.

Personally I think that Einstein was bound by the idea that information must travel like a ball travels through the air when you throw it, and I imagine to this very day that most people still think like this.

To answer your question specifically for the fermion and bosons, again I know this might seem like a cop-out, but the answer is that I would need more information.

You will need to collect data and using the kind of techniques I touched on above when maturely developed, you would ascertain firstly whether there is an information exchange going on that does not involve the data and then based on this statistical analysis move to then focus on the conjecture. The short answer is that I would be speculating and that currently I just don't know.

You see the thing is that all of these analyses are done on the premise that we know both the communication mechanisms and the information content explicitly for the whole system.

What I am proposing is that you develop techniques to say the following:

'Ok, I've got this data, I've identified the information as blah blah blah, I've got this model for the communication mechanism. Let's see if there is a statistical possibility given the data that a) the information model that we have is not complete and b) the communication model that we have is not complete. If either of these seem likely then we're missing something big-time which means we find out where the anomaly that caused this is and take it from there. If there is a low chance of an anomaly, then we need to consider the nature of the data and the kinds of energies involved. If we are dealing with strictly low energies or largely a spectrum of low-energy scales then it might be a wise idea to keep that in mind when doing an analysis of some sort.'
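A toy version of the completeness check described above (entirely my sketch; the numbers are hypothetical) is an exact binomial tail test: if the assumed information/communication model predicts a miss rate p, compute how surprising the observed miss count would be were the model complete.

```python
import math

def binom_tail(n, k, p):
    """P(X >= k) for X ~ Binomial(n, p): the chance of seeing k or more
    mismatches in n trials if the assumed model is complete."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical data: the model predicts a 25% miss rate, but we observe
# 300 misses in 400 photon pairs.
p_value = binom_tail(400, 300, 0.25)
print(p_value)  # astronomically small -> the assumed model is suspect
```

A tiny p-value is exactly the statement "according to the data, there is a statistical chance that we don't have all the information": the data are wildly improbable under the assumed information and communication model.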

You then take the above idea and mix it with the 'derivation approach'. In other words given the data, are there any 'guiding principles' like the stability and staticity constraints that can be extrapolated from the data.

The thing is you need both. If you show some kind of confidence that you don't have all the information, it's really hard to proceed with an analysis that assumes you do. It's also useful to acknowledge an information gap because when you do an analysis, you can take this into account especially when you do some kind of conjecture especially in mathematical physics. It strengthens the analysis because it allows one to consider not only what they don't know, but also the specifics of it if you narrow the analysis down to what information caused it.

Also for these kinds of questions, we are conjecturing about situations which involve extremely high energies and if we lose account of this, then we might be 'putting the cart before the horse' so to speak.

So to conclude, first thing is to extend Bell's theorem as outlined above and statistically find whether there is an anomaly corresponding to the information and communication model that is being used to see if there is statistical evidence to show it is 'incomplete' and then take it from there before conjectures are made that are too inductive.

I do think, though, that studying black holes is useful even when we can't do this kind of experimentation, because the black-hole mechanism will generate a lot of fruitful discussion on the limits and handling of energy in stable systems. For this reason alone it is extraordinarily useful: it allows us to formulate the kind of 'guiding principles' we can use to come up with the right constraints, ones that are intuitive from deductive rather than inductive reasoning. Although I recall Hawking saying that 'all his work might have been for nothing' (or something to that extent), I actually think his life was spent in a very fruitful endeavor when you consider the consequences in this context.

[Speculation]My derivation regarding statistics of quanta in black holes says simply that particles (bosons and fermions), having entered a black hole, now can be said to obey a unique random distribution, derived from their statistics, which agrees with the "No Hair" theorem. The derivation is much like Bell's theorem in that a forbidden region is not allowed information exchange, but statistics short of information exchange.

http://en.wikipedia.org/wiki/Bell's_theorem In theoretical physics, Bell's theorem (a.k.a. Bell's inequality) is a no-go theorem, loosely stating that:

"No physical theory of local hidden variables can reproduce all of the predictions of quantum mechanics."
 
  • #81
Loren Booda said:
[Speculation]I understand that information which otherwise has the potential to "reach" infinity (the spin and mass effects of gravity, through gravitons, and the charge effects of E-M radiation, through photons) has the potential to escape the black hole's event horizon through Hawking radiation. The photons or gravitons which escape a black hole do so obeying a black body spectrum.

What if such particles, according to their specific energies, together fulfill black body states so that such a spectrum is indistinguishable, part photonic and part gravitonic? That is, black body in energy yet anomalous in particle species.

The Higgs seems a candidate for an entity of greater information. See http://en.wikipedia.org/wiki/Higgs_boson The Higgs boson is a hypothetical elementary particle predicted by the Standard Model (SM) of particle physics. It belongs to a class of particles known as bosons, characterized by an integer value of their spin quantum number. The Higgs field is a quantum field with a non-zero value that fills all of space, and explains why fundamental particles such as quarks and electrons have mass. The Higgs boson is an excitation of the Higgs field above its ground state.

The existence of the Higgs boson is predicted by the Standard Model to explain how spontaneous breaking of electroweak symmetry (the Higgs mechanism) takes place in nature, which in turn explains why other elementary particles have mass. Its discovery would further validate the Standard Model as essentially correct, as it is the only elementary particle predicted by the Standard Model that has not yet been observed in particle physics experiments. The Standard Model completely fixes the properties of the Higgs boson, except for its mass. It is expected to have no spin and no electric or color charge, and it interacts with other particles through weak interaction and Yukawa interactions. Alternative sources of the Higgs mechanism that do not need the Higgs boson are also possible and would be considered if the existence of the Higgs boson were ruled out. They are known as Higgsless models.

First I want to say something about this in terms of my own thoughts.

The first thing I would need to know is which specific particle types are bound by the forces of a black hole. Again, I am not a physicist. This has to do not only with communication between different information elements, but also with the elements themselves.

I understand that photons should be constrained in that they should not be able to escape the boundary of the black hole itself, but I'm curious whether there are any arguments or experimental evidence that suggest similar constraints on the other particles, like neutrons and protons, and on the force carriers like gluons and the other bosons.

To really give a better answer to your question, I would prefer if you could point me to a page that shows any assumed constraints for these particles, as well as discussion and experiments on this. I imagine the experiments would come from particle accelerators given the kinds of energies involved, and I wouldn't be surprised if these kinds of experiments are still in progress currently.

If any force carrier is allowed either to escape 'physically' or to communicate with other information bits outside the horizon, then this, with some context, should answer your question.

The thing is though that because of the energies we are talking about are very high, then it's not going to be easy to speculate on this given that it is not going to be intuitive in terms of the normal physical intuition.

In terms of your question about photonic and gravitonic spectra, again the first thing I would want to test experimentally is the kind of Bell analogue I mentioned earlier: whether there are effects that statistically demonstrate that communication is going on. Whether you get a situation where information bits can actually 'escape' from the hole (as predicted by Hawking), and what form that takes, needs to be considered in the context of communication and the physical location of information.

If the information itself can for lack of a better word 'move' through space-time and cross the boundaries of a black-hole, then this is a different scenario to whether there is communication between bits of information.

The ability for information (or what we call particles, whether they are force-carriers or just plain old particles like an electron) to travel means that you should just consider this in a normal physical context, and this kind of analysis is well established.

The communication problem is different because it means that you need to consider that space-time itself plays no role in the ability for information to communicate and also to change. The change might be realized in a decay of some sort or another transformation of that information into something else (but it's pretty much a decay analogy). When you have this then one needs to start with communication constraints not only for intuitively thinking about what can happen, but also to test it.

My guidelines for a communication mechanism

With regards to energy, I would like to know how you reference different particles together for some kind of relative energy. If energy is just the standard definition of applying a force of some kind on a 'thing' (like a particle), then to me that indicates that in terms of the information if there is some kind of conservation going on, the information before and after will correspond to the energy in that there will be an equilibrium in terms of 'information content'.

If there is an equilibrium of 'information content', and a theory of such conservation, then you can actually test in the accelerator whether a transition of some sort takes place, whether it's an interaction (like what you would expect physically with, say, force-carriers interacting with particles and causing a force) or something like a particle decay.

Based on this transition, you can also test whether you expect more or less information to be generated in terms of its content.

Now I know I am vague about the idea of information content, but I would probably start by looking at the current rules for the standard particles in the Standard Model, as well as string theory, by seeing how you relate the particles not only in terms of charge, but also in terms of the group structures, and subsequently the algebraic representation of the particles themselves as well as the entropy of the structures.

This is just an IMO suggestion, but I would imagine that the entropy of the structures themselves would answer this because all this actually reflects is the information itself. So instead of seeing say your standard model as blah blah quarks, see the whole structure as an entropic definition and come up with conservation laws regarding the entropies of the structures themselves rather than seeing this as isolated things 'glued' together.

With this, theorems regarding entropies can be established and predictions made that can be tested in the particle accelerators themselves.

You would have to incorporate all known properties (including things like spin) to make the mathematics more specific and hence the predictions more specific. If you want I could make this suggestion more specific as well, but I just need to get some context for the structures and the interconnections themselves.

With regard to black-body comments for photons and gravitons, the above framework would actually help answer your question more specifically since you would get a better understanding of the entropic value of not only the particles and force-carriers as isolated entities, but also as structures which are built from other structures. From this entropy theorems can be deduced and ideas be built that would also help people like string theorists understand the entropic properties of strings at particular vibrations of frequencies, which again could be used for predictive purposes.

In short, a framework of entropic conservation for information packets at different scales (super-atomic, atomic, and sub-atomic) would be established that would replace energy with entropy.

[Aside]Non-local interactions
To dramatize what's happening in this EPR experiment, imagine that Green
detector is on Earth, and Blue detector is on Betelgeuse (540 light-years away)
while twin-state correlated light is coming from a spaceship parked halfway in
between. Although in its laboratory versions the EPR experiment spans only a
room-size distance, the immense dimensions of this thought experiment remind
us that, in principle, photon correlations don't depend on distance.
The spaceship acts as a kind of interstellar lighthouse directing a Green light
beam to earth, a Blue light beam to Betelgeuse in the opposite direction.
Forget for the moment that Green and Blue detectors are measuring something
called "polarization" and regard their outputs as coded messages from the
spaceship. Two synchronized binary message sequences composed of ups and
downs emerge from calcite crystals 500 light-years apart. How these two
messages are connected is the concern of Bell's proof.
When both calcites are set at the same angle (say, twelve o'clock), then PC =
1. Green polarization matches perfectly with Blue. Two typical synchronized
sequences of distant P measurements might look like this:

GREEN:UDUDDUDDDUUDUDDU
BLUE: UDUDDUDDDUUDUDDU

If we construe these polarization measurements as binary message sequences,
then whenever the calcites are lined up, the Blue observer on Betelgeuse gets
the same message as the Green observer on Earth.
Since PC varies from 1 to 0 as we change the relative calcite angle,
there will be some angle α at which PC = 3/4. At this angle, for every four
photon pairs, the number of matches (on the average), is three while the
number of misses is one. At this particular calcite separation, a sequence
of P measurements might look like this:

GREEN:UDDDDUDDDUDDUDDU
BLUE: UDUDDDUDDUUDUDDU

At angle α, the messages received by Green and Blue are not the same but
contain "errors"—G's message differs from B's message by one miss in every
four marks.
Now we are ready to demonstrate Bell's proof. Watch closely; this proof is so short
that it goes by fast. Align the calcites at twelve o'clock. Observe that the messages are
identical. Move the Green calcite by α degrees. Note that the messages are no longer
the same but contain "errors"—one miss out of every four marks. Move the Green calcite
back to twelve and these errors disappear, the messages are the same again. Whenever
Green moves his calcite by α degrees in either direction, we see the messages differ
by one character out of four. Moving the Green calcite back to twelve noon restores
the identity of the two messages.
The same thing happens on Betelgeuse. With both calcites set at twelve noon,
messages are identical. When Blue moves her calcite by α degrees in either direction, we
see the messages differ by one part in four. Moving the Blue calcite back to twelve noon
restores the identity of the two messages.
Everything described so far concerns the results of certain correlation experiments
which can be verified in the laboratory. Now we make an assumption about what might
actually be going on—a supposition which cannot be directly verified: the locality
assumption, which is the core of Bell's proof.
We assume that turning the Blue calcite can change only the Blue message; likewise
turning the Green calcite can change only the Green message. This is Bell's famous
locality assumption. It is identical to the assumption Einstein made in his EPR paradox:
that Blue observer's acts cannot affect Green observer's results. The locality
assumption—that Blue's acts don't change Green's code—seems entirely reasonable:
how could an action on Betelgeuse change what's happening right now on Earth?
However, as we shall see, this "reasonable" assumption leads immediately to an
experimental prediction which is contrary to fact. Let's see what this locality
assumption forces us to conclude about the outcome of possible experiments.
With both calcites originally set at twelve noon, turn Blue calcite by α degrees, and at
the same time turn Green calcite in the opposite direction by α degrees. Now the
calcites are misaligned by 2α degrees. What is the new error rate?
Since turning Blue calcite α degrees puts one miss in the Blue sequence (for every
four marks) and turning the Green calcite α degrees puts one miss in the Green
sequence, we might naively guess that when we turn both calcites we will get exactly
two misses per four marks. However, this guess ignores the possibility that a "Blue
error" might fall on the same mark as a "Green error"—a coincidence which produces
an apparent match and restores character identity. Taking into account the possibility of
such "error-correcting overlaps," we revise our error estimate and predict that whenever
the calcites are misaligned by 2α degrees, the error rate will be two misses—or less.
This prediction is an example of a Bell inequality. This Bell inequality says: If the
error rate at angle α is 1/4, then the error rate at twice this angle cannot be greater
than 2/4.
This Bell inequality follows from the locality assumption and makes a definite
prediction concerning the value of the PC attribute at a certain angle for photon pairs in
the twin state. It predicts that when the calcites are misaligned by 2α degrees the
difference between the Green and Blue polarization sequences will not exceed
two misses out of four marks. The quantum facts, however, say otherwise. John
Clauser and Stuart Freedman carried out this EPR experiment at Berkeley and
showed that a calcite separation of 2α degrees gives three misses for every four
marks - a quite substantial violation of the Bell inequality.
Clauser's experiment conclusively violates the Bell inequality. Hence one of
the assumptions that went into its derivation must be false. But Bell's argument
uses mainly facts that can be verified - photon PCs at particular angles. The only
assumption not experimentally accessible is the locality assumption. Since it
leads to a prediction that strongly disagrees with experimental results, this
locality assumption must be wrong. To save the appearances, we must deny
locality.
Denying locality means accepting the conclusion that when Blue observer
turns her calcite on Betelgeuse she instantly changes some of Green's code
on Earth. In other words, locations B and G, some five hundred light-years
apart, are linked somehow by a non-local interaction.
This experimental refutation of the locality assumption is the factual basis
of Bell's theorem: no local reality can underlie the results of the EPR
experiment.
Nick Herbert, Quantum Reality: Beyond the New Physics (Anchor, 1987, ISBN 0-385-23569-0)
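As a quick sanity check of the arithmetic in the quoted passage: the numbers work out if one assumes the standard quantum prediction that the miss (error) rate between the two sequences is sin²θ for calcites misaligned by θ (so the match rate is PC = cos²θ). A minimal sketch in Python (the function name is mine, not from the book):

```python
import math

def quantum_error_rate(theta_deg):
    """Quantum miss rate for twin-state photons with calcites
    misaligned by theta degrees, assuming PC = cos^2(theta)."""
    return math.sin(math.radians(theta_deg)) ** 2

alpha = 30.0  # sin^2(30°) = 1/4, the angle where one mark in four is a miss
e_alpha = quantum_error_rate(alpha)       # ≈ 0.25
local_bound = 2 * e_alpha                 # Bell inequality: at most 2/4 at 2α
e_2alpha = quantum_error_rate(2 * alpha)  # sin^2(60°) ≈ 0.75

print(e_2alpha > local_bound)  # True: the quantum prediction violates the local bound
```

This reproduces the Clauser-Freedman figure quoted above: three misses per four marks at 2α, against the locality-derived ceiling of two.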

[Speculation]Does the violation of the probabilistic Bell inequality relate to a like second law of thermodynamics? Would black hole Hawking radiation obey a "Bell equality"?

The best way to detect a black hole may be to seek its spectrum of annihilation. This may be relatively thermal at first but also discretized -- as the hole diminishes, so does the number of constituent particles available to radiate and fill out the Planck curve. The upper limit on such spectra may determine the upper limit on black hole density.

Given that a "Planck datum" is the smallest unit of information, how many would be necessary to describe our physical universe? Maybe a myriad of identical cosmological, intersecting black holes would similarly suffice.

Since the highly symmetric black hole requires high energy to create, we will gradually produce entities of closer and closer approximations in symmetric reactions. On the other hand, we may assemble a pocket watch.

I will answer this shortly.
 
  • #82
Loren Booda said:
__________

[Speculation]Indeed, hypothesis is the precursor of experiment, but experiment itself spurs on hypotheses. Without experimental apparatus, hypothesis is more intuitive than physical, but experiment probably preceded hypothesis in much of ancient history.

Reductionism is a widely accepted way to do science, although my botanist friend pokes fun at me for that approach. Stability might be approached through the constancy of the most sensible variables. I have found that comparing limits (as one might compare entropy relative to macroscopic and microscopic horizons) yields simplified mathematical answers to physics.

String theory has an input of experience (in the Standard Model) but not output (predictions or computability). Strings make a good starting place, though, because of their stability (reproducibility) and universality (symmetries).

As John Archibald Wheeler coined it, physics is "It from Bit." In other words, all physics could be derived from a binary code.

I think Wheeler was right on the money to think about physics in terms of information and I did read about this in the past and it has influenced me quite a bit.

Just following from the prior response, thinking in terms of information helps quantify physics in a more absolute sense because informational entropy is all in the same currency.

Instead of Euros, Dollars, Yen, and Sterling we have Gold. I know it's not the best analogy, but the point I'm making is that entropy is a standard quantity regardless of particle type and other properties, and the move to think of physical systems in terms of entropy is, IMO, a step forward: not just for describing properties like velocity, temperature, and other things which are a product of particles or physical systems, but also the things that define the particles themselves and, more importantly, the interconnections and structure of the particles themselves.
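To make the "common currency" point concrete, here is a minimal sketch of Shannon entropy in Python: it assigns the same bits-based measure to any sequence of symbols, regardless of what the symbols physically stand for (the function name is mine):

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Shannon entropy in bits per symbol: the same 'currency'
    whether the symbols label particles, spins, or coin flips."""
    counts = Counter(symbols)
    n = len(symbols)
    # the trailing + 0.0 normalizes the float -0.0 case to 0.0
    return -sum((c / n) * math.log2(c / n) for c in counts.values()) + 0.0

print(shannon_entropy("UDUDUDUD"))  # 1.0 bit per symbol: maximally uncertain binary source
print(shannon_entropy("UUUUUUUU"))  # 0.0 bits: a constant sequence carries no information
```

The measure is indifferent to what "U" and "D" mean, which is exactly the unification being argued for here.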

http://en.wikipedia.org/wiki/Penrose_process The Penrose process (also called Penrose mechanism) is a process theorized by Roger Penrose wherein energy can be extracted from a rotating black hole. That extraction is made possible because the rotational energy of the black hole is located, not inside the event horizon of the black hole, but on the outside of it in a region of the Kerr spacetime called the ergosphere, a region in which a particle is necessarily propelled in locomotive concurrence with the rotating spacetime. All objects in the ergosphere become dragged by a rotating spacetime. In the process, a lump of matter enters into the ergosphere of the black hole, and once it enters the ergosphere, it is split into two. The momentum of the two pieces of matter can be arranged so that one piece escapes to infinity, whilst the other falls past the outer event horizon into the hole. The escaping piece of matter can possibly have greater mass-energy than the original infalling piece of matter, whereas the infalling piece has negative mass-energy. In summary, the process results in a decrease in the angular momentum of the black hole, and that reduction corresponds to a transference of energy whereby the momentum lost is converted to energy extracted.

This is a problem I see with using things like energy.

If you standardize all information, including the structures and definitions of the particles and any other thing in terms of entropy, then you have a consistent framework to deal with.

It started with time, in that we could only consider situations with a global arrow: it didn't make sense to think of everything having its own clock, but that's what happened. It didn't intuitively make sense to have spaces with non-zero curvature, but it was necessary.

Now with information: information is just information. It has no type, because it's just information. It has no magnitude: that is just something we made up to describe our constrained information. You can't have negative information: you can have the absence of information, sure, but you either have information or you don't. So in short, it needs no type or classification (unless you specifically impose one yourself), and in terms of entropy it is universal as a measure, since all information in a universal alphabet is treated the same.

In fact you can even have metrics with entropy, just like you have metrics for space-time, and the best thing to do would be to establish this idea from the ground up, getting to a point where you can predict entropies or bounds of some sort, which would help you make inferences with which you can adjust the models and repeat the process over and over.
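One concrete example of a "metric with entropy" (my choice of illustration, not something from the thread) is the Jensen-Shannon distance: the square root of the Jensen-Shannon divergence, which is a genuine metric on probability distributions built entirely from entropy:

```python
import math

def entropy(p):
    """Shannon entropy (bits) of a probability distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def js_distance(p, q):
    """Jensen-Shannon distance: sqrt of the JS divergence,
    a true metric on probability distributions."""
    m = [(a + b) / 2 for a, b in zip(p, q)]
    jsd = entropy(m) - (entropy(p) + entropy(q)) / 2
    return math.sqrt(max(jsd, 0.0))  # clamp tiny negative round-off

p = [0.5, 0.5, 0.0]
q = [0.1, 0.1, 0.8]
print(js_distance(p, q))  # some value between 0 and 1
print(js_distance(p, p))  # 0.0: identical distributions are at distance zero
```

It is symmetric, zero only for identical distributions, and obeys the triangle inequality, so it really does behave like a distance on "information space."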

[Speculation]Paradoxically, the black hole has an upper universal limit of its luminosity at its least mass-energy, but a lower universal limit of its luminosity at its greatest mass-energy.

Physics is a great invention in that it has the potential for peace, as mathematics has the potential for truth. Energy, tempered by truth, has the potential for peace.

[Aside]Recall that a black hole is the equivalent of a white hole according to Hawking.

That's the thing I worry about in terms of energy. Call me cynical, but I'm afraid that being able to manipulate that kind of energy without some real solid discussion between the scientists, the politicians, and basically the people at large is an extraordinarily dangerous thing.

Energy is the lifeblood of a modern civilization and, like you implied, it needs to be taken very seriously in terms of how it is applied. I personally don't want myself, nor any other living creature, being sent back to the stone age all because of some imbecile not taking a minute to think about the consequences of their actions: it just frightens me.

Not only this, think of every single government that would want access to such a discovery and think hard: a discovery that would generate this kind of energy would be not only a wonder but also a weapon. People are amazed (I am saddened) by atomic bombs which only release a fraction of their potential power.

Can you imagine what will happen if not only can you direct, but control such energy?

Like I said before: currently I am unfortunately glad that we still boil water to power our factories and our computers and TV sets.

[Speculation]Suppose all orders in a physical information-theoretic system are finitely related. Perhaps all physical information is directly connected to all other physical information. Would most probable universes tend to be "overly simplistic" or "overly chaotic," or are orders with life favored due to the Anthropic principle? It may be deceptive to compare our universe with universes seemingly of the first two types.

Relative measurements, like those of time, involve only two points out of a whole universe.

"Time is nature's way of keeping everything from happening at once." -- Woody Allen

Interesting question.

I don't know if any possible universe that could be realized is realized somewhere, but I would tend to think that there are some kind of constraints (although I can't be certain).

The stability and staticity constraints seem intuitive enough and although these cut things down a lot, they are still variable enough so that they offer the kinds of scenarios of both extremely chaotic and extremely orderly systems.

So as an example with this line of thinking you could rule out systems where say the G constant (gravitational constant) just randomly changes value from +1 to -10000 and then back to 1 whenever it wants.

By the same token, it means you could have universes with very peaceful and orderly behaviours as well as ones without, though the latter would probably blow up in part, at least, rather than have the whole system blow up entirely.

With regard to information being connected to other information, I would say this is an emphatic yes, IMO: the task for us is figuring this out and understanding what it means not only for us, but for everything else as well, especially if we are all connected.

I imagine understanding this will bring us together, not just as human beings but as beings in general.

[Speculation]My derivation regarding statistics of quanta in black holes says simply that particles (bosons and fermions), having entered a black hole, now can be said to obey a unique random distribution, derived from their statistics, which agrees with the "No Hair" theorem. The derivation is much like Bell's theorem in that a forbidden region is not allowed information exchange, but statistics short of information exchange.

http://en.wikipedia.org/wiki/Bell's_theorem In theoretical physics, Bell's theorem (a.k.a. Bell's inequality) is a no-go theorem, loosely stating that:

"No physical theory of local hidden variables can reproduce all of the predictions of quantum mechanics."

Maybe you could show me this (possibly as an attachment, or maybe through latex).
 
  • #83
I just realized I put my own responses in the quotes, so just know I meant them not to be quotes but replies.

Also for the last one, I meant to say if you could show your derivation either through attachment, latex or PM if at all possible.
 
  • #84
http://en.wikipedia.org/wiki/Planck_mass The minimum energy needed to produce the (smallest) Planck black hole is 2 × 10^-5 g = 2 × 10^16 erg ≈ 10^28 electron volts.
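The quoted conversion is easy to check with a few lines (CGS values; the constants are standard, the variable names are mine):

```python
# Checking the quoted Planck-mass energy conversion in CGS units.
c = 2.998e10              # speed of light, cm/s
m_planck = 2.18e-5        # Planck mass, g
erg_per_eV = 1.602e-12    # 1 eV in erg

E_erg = m_planck * c**2   # rest energy, ~2e16 erg
E_eV = E_erg / erg_per_eV # ~1.2e28 eV, consistent with the quoted ~10^28 eV

print(f"E ≈ {E_erg:.2e} erg ≈ {E_eV:.2e} eV")
```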

__________

Regarding "information content"

http://en.wikipedia.org/wiki/No-hair_theorem
". . . More generally, every unstable black hole decays rapidly to a stable black hole; and (modulo quantum fluctuations) stable black holes can be completely described at any moment in time by these eleven numbers:

mass-energy M,
linear momentum P (three components),
angular momentum J (three components),
position X (three components),
electric charge Q.

These numbers represent the conserved attributes of an object which can be determined from a distance by examining its gravitational and electromagnetic fields. All other variations in the black hole will either escape to infinity or be swallowed up by the black hole."

__________

[Speculation]A black hole, in the classical sense, obeys

GM_EH²/R_EH = M_BH c²

That is, M_EH = R_EH c²/G

Where G is Newton's gravitational constant, R_EH is the event horizon radius, M_BH is the black hole mass, and c the speed of light.

A quantized approximation yields the number of quanta on the event horizon: R_EH²/L*², where L* is the Planck length. R_EH/L* is the root mean square.

M_EH ± Δm_q(R_EH/L*) = R_EH c²/G ± (h/cΔr_q)(R_EH/L*),

where Δr_q is the quantum uncertainty of length, h is Planck's constant, and Δm_q is the quantum uncertainty of mass.

The order of the quantum tunneling distance to the event horizon radius for a solar-mass black hole is approximately (R_EH/Δr_q) = (hc/G)^(1/2)/L* = 1
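For orientation, here is a rough numerical evaluation of the horizon quanta count R_EH²/L*² for a solar-mass black hole (using SI constants and the usual Schwarzschild radius R_EH = 2GM/c², which differs from the factor-free expression above by a factor of 2; the variable names are mine):

```python
import math

# Rough numbers for a solar-mass black hole, SI units.
G = 6.674e-11        # m^3 kg^-1 s^-2
c = 2.998e8          # m/s
hbar = 1.055e-34     # J s
M_sun = 1.989e30     # kg

L_planck = math.sqrt(hbar * G / c**3)   # Planck length, ~1.6e-35 m
R_EH = 2 * G * M_sun / c**2             # Schwarzschild radius, ~3 km

N_quanta = (R_EH / L_planck) ** 2       # quanta on the horizon per the estimate above

print(f"R_EH ≈ {R_EH:.2e} m, L* ≈ {L_planck:.2e} m, N ≈ {N_quanta:.1e}")
```

The count comes out around 10^76, the same order as the familiar Bekenstein-Hawking entropy (in Planck areas) for a solar-mass hole.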

__________

[Speculation]The Bell experiment implies that there are three types of "communication" -- censored, probabilistic and informative. The first allows no exchange of radiation, the second allows exchange of thermal energy, and the third, communication exchange.

An "internal observer," relative to its black hole, sees numerous particles of many kinds, as we see the microwave background radiation. A "surface observer" sees many particles, but all as a Hawking blackbody. They are produced in pairs across the finite energy barrier of the event horizon, some infalling and fewer infalling/outgoing. The latter are like local pairs produced in radioactive decay for an EPR experiment.

The Bell EPR "non-local observer" experiment does not allow faster-than-light signaling. It does, however, allow patterns of data to be transmitted faster than light speed, but effectively only one-way. My own explanation is that signaling from the original decay travels both in our non-local macroverse and in local microverses central everywhere to the macroverse. This allows a relatively sub-light (macro) signal to reinterfere with a relatively super-light (micro) signal.

__________http://en.wikipedia.org/wiki/File:Standard_Model_of_Elementary_Particles.svg [Speculation] Enjoy the profound symmetry. That particle matrix may have a fundamental information content associated with it. There also might be a limit to the number of resonances (excited particles) possible.

Energy is a conserved quantity, corresponding by symmetry to invariance under time translations. (See http://en.wikipedia.org/wiki/Noether's_theorem#Example_1:_Conservation_of_energy .) As I mentioned before, the second law of thermodynamics may not be a law at all. Will the universe eventually approach heat death or entropy death?

________

http://en.wikipedia.org/wiki/Geon_(physics) "In theoretical general relativity, a geon is an electromagnetic or gravitational wave which is held together in a confined region by the gravitational attraction of its own field energy. They were first investigated theoretically in 1955 by J. A. Wheeler, who coined the term as a contraction of "gravitational electromagnetic entity".

Since general relativity is a classical field theory, Wheeler's concept of a geon does not treat them as quantum-mechanical entities, and this generally remains true today. Nonetheless, Wheeler speculated that there might be a relationship between microscopic geons and elementary particles. This idea continues to attract some attention among physicists, but in the absence of a viable theory of quantum gravity, the accuracy of this speculative idea cannot be tested."

[Speculation]I call the former an Electronic Black Hole (EBH). A gravitational black hole abhors a 'naked' mass singularity, but allows it the observable property of charge, with correspondent electromagnetic field. Similarly, the horizon radius r for "electronic black holes" (where m_e c² = e²/4πε₀r, r = 2.81 × 10^-13 cm) limits what we may eventually know about the electromagnetic structure of a charged particle. An electronic black hole (E.B.H.), typically below the scale of a proton, has a particular charge whose electrical potential magnitude equals its associated rest mass-energy. E.B.H.'s are entities so gravitationally bound against electric repulsion at a given radius as to be reproduced by the energy of attempted E-M measurement. As with strong force quark isolation, charge singularity (i.e., E.B.H.) observation itself denies direct ("naked") E-M information.

Might the E.B.H.s have less entropy than similar particles from the Standard Model?

__________

[Speculation]Given the wrong circumstances, most people would be obedient enough to carry out the Milgram experiment to its completion. http://en.wikipedia.org/wiki/Milgram_experiment

_________[Speculation]Stability and staticity exist relative to chaos and astronomically accelerating dynamics.

__________

[Speculation]My derivation regarding statistics of quanta in black holes says simply that particles (bosons and fermions), having entered a black hole, now can be said to obey a unique random distribution, derived from their statistics, which agrees with the "No Hair" theorem. The derivation is much like Bell's theorem in that a forbidden region is not allowed information exchange, but statistics short of information exchange.

http://en.wikipedia.org/wiki/Bell's_theorem In theoretical physics, Bell's theorem (a.k.a. Bell's inequality) is a no-go theorem, loosely stating that:

"No physical theory of local hidden variables can reproduce all of the predictions of quantum mechanics."
[Speculation (from post #67)]
Statistics of quanta in black holes relies on a supersymmetry there between fermions and bosons:

Conventional black hole physics has sole extensive measurable quantities charge, mass, and angular momentum (the "No Hair" theorem). From these, the Hawking temperature, T, can be found. The statistical distribution n[B. H.] is a function of T, and predicts the occupation of the hole's internal quantum states with unobservable quanta:

n[B. H.]=n[F. D.]+n[B. E.]=csch(ε/κT)

where it is assumed that T is much greater than the T_F for this black hole.

The quantum within that normally designates Fermi-Dirac (n[F. D.]) or Bose-Einstein (n[B. E.]) statistics by its half- or whole-integer spin values has "lost its hair."

Note: Black hole equilibrium above requires the constraints put forth by Stephen Hawking in his seminal paper, Black Holes and Thermodynamics (Phys Rev D, 15 Jan 1976, p. 191-197).
http://en.wikipedia.org/wiki/Hidden_variable_theory -- (regarding encoded information) Bell's theorem would suggest (in the opinion of most physicists, and contrary to Einstein's assertion) that local hidden variables are impossible. Some have tried to apply this to entangled states straddling the black hole horizon, that is, to pair production of particles and to whether entangled states can remain so once one partner has infallen. Communication of this third kind is based primarily on entanglement, with the "non-local observer" (above). They are more constrained with observation than a "surface observer." Thus if the "non-local observer" respects entanglement, so does the straddling "surface observer."
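The csch identity quoted above is easy to verify numerically: the Fermi-Dirac and Bose-Einstein occupation numbers really do sum to csch(x), with x = ε/κT. A minimal check (function names are mine):

```python
import math

def n_fermi_dirac(x):
    """Fermi-Dirac occupation number, x = energy / (k T)."""
    return 1.0 / (math.exp(x) + 1.0)

def n_bose_einstein(x):
    """Bose-Einstein occupation number, x = energy / (k T)."""
    return 1.0 / (math.exp(x) - 1.0)

# 1/(e^x + 1) + 1/(e^x - 1) = 2/(e^x - e^-x) = csch(x)
for x in (0.5, 1.0, 2.0, 5.0):
    lhs = n_fermi_dirac(x) + n_bose_einstein(x)
    rhs = 1.0 / math.sinh(x)  # csch(x)
    assert abs(lhs - rhs) < 1e-12

print("identity holds for sampled x")
```

Algebraically, the two denominators combine to 2e^x/(e^(2x) - 1) = 2/(e^x - e^(-x)), which is exactly csch(x).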
 
  • #85
I've got to fly (classes very soon), so I'll try to look at this later tonight and give my thoughts tomorrow.
 
  • #86
Cantor already did this.
 
  • #87
chiro said:
I've got to fly (classes very soon), so I'll try to look at this later tonight and give my thoughts tomorrow.

If you have to, just wing it. Hoping you find relationships between classes.
 
  • #88
Loren Booda said:
[Speculation]The Bell experiment implies that there are three types of "communication" -- censored, probabilistic and informative. The first allows no exchange of radiation, the second allows exchange of thermal energy, and the third, communication exchange.

An "internal observer," relative to its black hole, sees numerous particles of many kinds, as we see the microwave background radiation. A "surface observer" sees many particles, but all as a Hawking blackbody. They are produced in pairs across the finite energy barrier of the event horizon, some infalling and fewer infalling/outgoing. The latter are like local pairs produced in radioactive decay for an EPR experiment.

The Bell EPR "non-local observer" experiment does not allow faster-than-light signaling. It does, however, allow patterns of data to be transmitted faster than light speed, but effectively only one-way. My own explanation is that signaling from the original decay travels both in our non-local macroverse and in local microverses central everywhere to the macroverse. This allows a relatively sub-light (macro) signal to reinterfere with a relatively super-light (micro) signal.

This is the thing I was hoping to imply with regard to experimental testing of a general Bell-theorem scenario: you would statistically test, firstly, (a) that some non-local communication exchange is going on, and (b) whether it is bound by known constraints (the obvious one being the speed of light).

Again the thing is that our understanding of physics is unsurprisingly physical and this is not meant as a derogatory statement, but rather as a statement to allow for the possibility that we have communication happening that is not locally spatial.

If you wanted to incorporate this physical way of thinking, you could simply use space-time structures where points can join at will when they need to, and then model this using the standard calculus techniques we use to model phenomena in terms of local changes (as represented by the derivatives and partial derivatives of physical equations in the classical sense).

Having a dynamic manifold that allows this is of course not a new concept and has been studied extensively in gravitational theory for quite a while. All I'm suggesting is that, instead of interpreting it in that context, you treat it more or less as a general information system with general communication exchange and then place the constraints on the information and the communication mechanism in that context.

What this ends up doing is that you don't try and think about communication in terms of particles and force-carriers in a physical sense like you would when you think about the situation where you have two billiard balls on a snooker table where you hit one and it hits the other and the communication exchange is basically a 'physical' thing.

Again this is just my perspective and I don't really think about communication requiring a local mechanism like you would expect if you thought about it in a physically intuitive context.

The short answer is that I would only be speculating with regard to your question, but if you wanted to know what I would do personally: I would develop the theory in a context where you don't use specific local models of physics, but rather ones of a non-local nature, and then find either contradictions or support for non-local behaviour of any sort. You could almost think of this as a kind of cellular automaton, but with even fewer restrictions on the communication mechanism itself.

What this means is essentially looking at generalized models that don't rely on differentials but on something broader. I know I would cop a lot of flak for this, especially from the physics community, because it seems overly complex and perhaps unnecessary. The point is that with a framework like this, in combination with statistical analysis tailored for inference in this type of problem, you can get the data and rule out (at least with some measure of confidence) whether this happens or not, and instead of putting your theorems in terms of purely local properties, you put them in non-local ones.

Another reason this is hard mathematically is that we have to introduce analyses that correspond to this: with dx/dt or dy/dx we only think of local changes, but with a non-local framework the scope is a lot broader.

Like I said, I am not a physicist, but if I put the physical laws in this context, with a lot of effort (I know that physics is not an easy endeavor), I would see them in a way that makes sense to me computationally, statistically, and information-wise. This would take a lot of effort, but then again the results could be fruitful.
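As a toy illustration of the "cellular automaton with fewer restrictions on communication" idea mentioned above, here is a hypothetical sketch in which every cell updates from its two spatial neighbours plus one fixed, arbitrarily distant partner cell, so information exchange is not restricted to adjacent sites (the rule and all names are my own invention, purely for illustration):

```python
import random

def step(cells, partners):
    """One update of a 1D binary automaton: each cell XORs its two
    spatial neighbours with one fixed non-local partner cell."""
    n = len(cells)
    return [
        (cells[(i - 1) % n] ^ cells[(i + 1) % n]) ^ cells[partners[i]]
        for i in range(n)
    ]

random.seed(0)
n = 16
cells = [random.randint(0, 1) for _ in range(n)]
partners = [random.randrange(n) for _ in range(n)]  # fixed non-local wiring

for _ in range(5):
    cells = step(cells, partners)
print(cells)
```

The point of the sketch is only structural: the update rule is no longer expressible as a local differential or difference operator, since each site's future depends on a site arbitrarily far away.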

http://en.wikipedia.org/wiki/Planck_mass The minimum energy needed to produce the (smallest) Planck black hole is 2 × 10^-5 g = 2 × 10^16 erg ≈ 10^28 electron volts.
__________

Regarding "information content"

http://en.wikipedia.org/wiki/No-hair_theorem
". . . More generally, every unstable black hole decays rapidly to a stable black hole; and (modulo quantum fluctuations) stable black holes can be completely described at any moment in time by these eleven numbers:

mass-energy M,
linear momentum P (three components),
angular momentum J (three components),
position X (three components),
electric charge Q.

These numbers represent the conserved attributes of an object which can be determined from a distance by examining its gravitational and electromagnetic fields. All other variations in the black hole will either escape to infinity or be swallowed up by the black hole."

The next thing you would do is convert this to an entropy measure itself. We have entropy in terms of the states of the 'stuff' itself (like the particles), but the further step would be to express the 'structure' in terms of an entropy as well.

The reason for this is that you would then have something you can deal with universally. Once you have the entropy of both the information and the structure for a system, you can treat it in a common way. The entropy of the structure will depend on the information content of the structure itself.

This is why I think information theory is important: most people, if they ever consider entropy, consider only the entropy of the realization of bits of information that have a particular structure, class, or classification, and because of this you can't, if you have 100 particles (bosons, whatever), treat them in a truly unified way.

What typically happens, in my own reading, is that the theories kind of 'glue stuff' together using, for example, group structures. In a setting where you treat any structure in the same context, you overcome this shortcoming.

Of course, it's not that easy. First you have to be able to move back and forth between entropy, algebra, and the realizations of your information in a fluid manner.

What currently happens is that in mathematics we have numbers, and for the most part the information content of the numbers, let alone of the algebras associated with system descriptions, is completely left out. We don't think about this, and as a result, when it comes to understanding the real information (and thus entropy) of the entire system, we have two frameworks that are not compatible with each other.

Like the previous question, this again would require a lot of mathematical development that would incorporate again computer science, information theory, mathematics and statistics in a highly developed way.

http://en.wikipedia.org/wiki/File:Standard_Model_of_Elementary_Particles.svg [Speculation] Enjoy the profound symmetry. That particle matrix may have a fundamental information content associated with it. There might also be a limit to the number of resonances (excited particles) possible.

Energy is a conserved quantity, corresponding via Noether's theorem to invariance under time translations. (See http://en.wikipedia.org/wiki/Noether's_theorem#Example_1:_Conservation_of_energy .) As I mentioned before, the second law of thermodynamics may not be a law at all. Will the universe eventually approach heat death or entropy death?

This is just in line with my thoughts earlier in this thread, but I think that the universe as a whole will reach neither total chaos nor total staticity, but will globally remain in a state between those two extremes. If I'm wrong, I'm wrong, but I will put this forward as a prediction (I know there is no data or mathematics behind it). My reasoning is the same as before: with any chance of staticity you get your entropy death, which means any kind of dynamics of the system is destroyed and it cannot evolve. Conversely, too much chaos results in a system that grows ever more unstable until order can never be restored, taking the system past a point of 'no return', so to speak.

While you can reach these situations locally, for the same reasons above I predict that you will not be able to globally create a situation of entropy death or heat death.

As for sub-regions, this would have to be investigated theoretically and mathematically and I can't really comment on the specifics of this because frankly I don't know.

http://en.wikipedia.org/wiki/Geon_(physics) "In theoretical general relativity, a geon is an electromagnetic or gravitational wave which is held together in a confined region by the gravitational attraction of its own field energy. They were first investigated theoretically in 1955 by J. A. Wheeler, who coined the term as a contraction of "gravitational electromagnetic entity".

Since general relativity is a classical field theory, Wheeler's concept of a geon does not treat them as quantum-mechanical entities, and this generally remains true today. Nonetheless, Wheeler speculated that there might be a relationship between microscopic geons and elementary particles. This idea continues to attract some attention among physicists, but in the absence of a viable theory of quantum gravity, the accuracy of this speculative idea cannot be tested."

[Speculation]I call the former an Electronic Black Hole (EBH). "A gravitational black hole abhors a 'naked' mass singularity, but allows it the observable property of charge, with a correspondent electromagnetic field. Similarly, the horizon radius r for "electronic black holes" (where m_e c^2 = e^2/(4πε_0 r), r = 2.81 × 10^-13 cm) limits what we may eventually know about the electromagnetic structure of a charged particle. An electronic black hole (E.B.H.), typically below the scale of a proton, has a particular charge whose electrical potential magnitude equals its associated rest mass-energy. E.B.H.s are entities so gravitationally bound against electric repulsion at a given radius as to be reproduced by the energy of attempted E-M measurement. As with strong-force quark isolation, charge singularity (i.e., E.B.H.) observation itself denies direct ("naked") E-M information."
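The radius quoted here is the classical electron radius; as a sanity check, a minimal sketch solving m_e c^2 = e^2/(4πε_0 r) for r (the CODATA constant values are assumptions, not from the thread):

```python
import math

# Standard constants in SI units (CODATA values; assumed, not from the thread).
e = 1.602176634e-19      # elementary charge, C
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m
m_e = 9.1093837015e-31   # electron mass, kg
c = 2.99792458e8         # speed of light, m/s

# Set the rest energy equal to the electrostatic potential energy,
# m_e c^2 = e^2 / (4*pi*eps0*r), and solve for the radius r.
r = e**2 / (4 * math.pi * eps0 * m_e * c**2)

print(f"r = {r * 100:.3e} cm")  # ~2.818e-13 cm, matching the figure above
```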

Might the E.B.H.s have less entropy than similar particles from the Standard Model?

The thing about the speculation of these black-hole scenarios is the same as what I wrote previously in the context of things like the Penrose process.

In the Penrose process that you mentioned earlier, about extracting energy from black holes, you are able to explicitly control the process of energy (and hence information) distribution in a very controlled manner, and this question reminds me of the same kind of scenario.

If it turns out that you have the Penrose process, processes involving naked singularities, or similar, I don't think exploiting them will be easy, because doing these kinds of things is equivalent to controlling energy. If the black-hole scenario represents the maximum-entropy situation and the process itself is just an energy-distribution mechanism, a form of stabilization, then to me this suggests the process happens precisely so that things don't go wrong. Given all the effects at play in this situation, the only way you could achieve these scenarios is if you could control them in some semi-certain way.

Like I said before, for the most part we are still boiling water with coal, and we use nuclear energy, which in my mind is ridiculous; but at the same time I am, unfortunately, glad, because if we had the ability to control energy the way we would with something like a black hole, the fact that human beings would be behind it terrifies me.

Figuring out the black-hole scenario in absolute detail is, to me, the equivalent of being, for lack of a better word, 'God'.

In terms of your question though, again it depends on the information and any communication that is happening (potentially) between it and anything else.

Again, with black holes we assume that just because light can't theoretically escape, then, Hawking radiation aside, there must be no communication going on.

This is an assumption based on the classical intuition of billiard balls, and from a scientific point of view I would rather test it with a general non-local statistical inferential analysis than with a local one.

I tend to think that it's best to start with the idea that everything is potentially talking to everything else because from that you can be sure that at least from the statistical point of view that there either is evidence for this to be a general principle or for it to not be general.

If it wasn't general and the data was reliable, then ok that's how it is but I would want to see data from a high energy environment that is similar to the characteristics of a black-hole mechanism.


[Speculation]Given the wrong circumstances, most people would be obedient enough to carry out the Milgram experiment to its completion. http://en.wikipedia.org/wiki/Milgram_experiment

Personal responsibility, or more properly put, the lack of it, is the thing that lets evil breed. People lie to themselves every day, thinking everything is OK, and when you have a situation with group or social reinforcement, it becomes a lot harder.

When people take personal responsibility for themselves it means they think long and hard of what they are doing. It also means that people will acknowledge their faults, their wrongdoings, and their ugly side.

It's unfortunately a lot easier for people to just lie to themselves even though they know better, and it's no better when everyone thinks the same way, which ends up establishing the social norms that create the chaos we have.

Anyone who chooses to deny personal responsibility at any level will become the perfect Milgram-experiment participant, and in a situation where the participant perceives a 'norm', resisting becomes a lot harder, due to the characteristics of our social makeup and how social situations affect us.

Most people call this peer pressure, among other things, but it usually boils down to a personal-security issue of some sort, and to the fact that choosing to be the Milgram candidate seems to guarantee that security.

It's hard to think by yourself and it's hard to act that way when you see the rest of the world acting in the complete opposite manner.

[Speculation]Stability and staticity exist relative to chaos and astronomically accelerating dynamics.

Chaos theory, I think, would be the best way to study this formally, in terms of staticity and also through chaotic bounding.

It would be interesting to take into account the acceleration of the universe (is this what you're asking?) with respect to what it does for chaos in any finite subregion.

If you had things shrinking, then I see chaos becoming more imminent, by the kind of argument you get in a standard statistical-mechanics problem where you put matter in a box and the box shrinks while the matter itself is conserved (I know I'm using physical intuition here, so forgive me ;)), assuming we are talking about situations with this pattern (which is a lot of situations).

By having acceleration you do the reverse: you create a situation where it becomes harder to produce an unnecessarily chaotic situation, which means a greater chance of things becoming a lot more ordered, and I think it's a good thing to favor ordered scenarios over chaotic ones.
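The matter-in-a-box analogy can be made concrete with the volume term of the ideal-gas entropy, S(V) = N·k_B·ln(V) + const (from the standard Sackur-Tetrode equation); a minimal sketch, assuming a fixed particle number and temperature, just to show how the accessible-state count tracks the box size:

```python
import math

# Volume term of the monatomic ideal-gas entropy (Sackur-Tetrode equation):
# at fixed particle number N and temperature, S(V) = N*k_B*ln(V) + const.
k_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy_volume_term(N, V):
    """Volume-dependent piece of the ideal-gas entropy, in J/K."""
    return N * k_B * math.log(V)

N = 6.022e23  # roughly one mole of particles
S_small = entropy_volume_term(N, 1.0)  # gas confined to a 1 m^3 box
S_large = entropy_volume_term(N, 8.0)  # same gas after the box expands 8x

# Expanding the box raises this entropy term by N*k_B*ln(8);
# shrinking the box does the reverse, squeezing the accessible states.
dS = S_large - S_small
print(f"dS = {dS:.2f} J/K")  # ~17.29 J/K
```

This is only the textbook equilibrium picture, not the cosmological claim above; it just illustrates the direction in which available phase space moves as the box grows or shrinks.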

[Speculation (from post #67)]
http://en.wikipedia.org/wiki/Hidden_variable_theory -- (regarding encoded information), Bell's theorem would suggest (in the opinion of most physicists, and contrary to Einstein's assertion) that local hidden variables are impossible. Some have tried to apply this to entangled states straddling the black hole horizon, that is, to pair production of particles and to whether entangled states can remain entangled after one partner has infallen. Communication of this third kind is based primarily on entanglement, with the "non-local observer" (above) more constrained in observation than a "surface observer." Thus if the "non-local observer" respects entanglement, so does the straddling "surface observer."
[Speculation]A black hole, in the classical sense, obeys

GM_EH^2/R_EH = M_BH c^2

That is, M_EH = R_EH c^2/G

Where G is Newton's gravitational constant, REH is the event horizon radius, MBH is the black hole mass, and c the speed of light.

A quantized approximation yields the number of quanta on the event horizon: R_EH^2/L*^2, where L* is the Planck length. R_EH/L* is the root mean square.

M_EH ± ∆m_q(R_EH/L*) = R_EH c^2/G ± (h/(c∆r_q))(R_EH/L*),

where ∆rq is the quantum uncertainty of length, h is Planck's constant, and ∆mq is the quantum uncertainty of mass.

The order of the quantum tunneling distance to the event horizon radius for a solar-mass black hole is approximately (R_EH/∆r_q) = (hc/G)^(1/2)/L* ≈ 1
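For scale, a sketch of the quanta estimate R_EH^2/L*^2 for a solar-mass black hole. It assumes the standard Schwarzschild radius R_EH = 2GM/c^2 (which differs from the relation above by a factor of order one) and standard constant values, so it is an order-of-magnitude illustration only:

```python
import math

# Standard constants in SI units (assumed, not from the thread).
G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8         # speed of light, m/s
hbar = 1.054571817e-34   # reduced Planck constant, J*s
M_sun = 1.98892e30       # solar mass, kg

# Schwarzschild radius of a solar-mass black hole, R_EH = 2GM/c^2.
R_EH = 2 * G * M_sun / c**2

# Planck length L* = sqrt(hbar*G/c^3).
L_star = math.sqrt(hbar * G / c**3)

# Quanta estimate from the post: N ~ (R_EH/L*)^2.
N_quanta = (R_EH / L_star)**2

print(f"R_EH = {R_EH:.3e} m")    # ~2.95e+03 m
print(f"L*   = {L_star:.3e} m")  # ~1.62e-35 m
print(f"N    = {N_quanta:.3e}")  # ~3.3e+76 quanta
```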

I will answer this in the next post.
 
  • #89
Also just thinking about your acceleration question, the entropy constraint that I would test would be based on isotropic ideas.

In other words, the idea is that you would assume isotropic properties through space for the chaos and staticity constraints as a first basis for a model, and then see how the forces affect this and adjust accordingly.

If the universe really did 'stretch' as a function of time, then this expansion would make the staticity and chaos requirements a lot easier to satisfy (tending to favor order over chaos).

In terms of specifics, this would require analyzing how combinations of things affect entropy and thus chaos and staticity, but again just using the statistical mechanics analogy above, if you apply the idea isotropically through space then it makes this a hell of a lot easier.
 
  • #90
This thread has gone waaay off topic. It now deals with physics and not with mathematics anymore, so it is not suitable for this forum. Furthermore, I can see lots of speculation happening, which is not allowed here.

Thread locked.

If you two want to keep talking, you should probably PM each other.
 

1. Can an infinite series of random numbers truly be random?

This is a complex question that has been debated among scientists and mathematicians for years. Some argue that true randomness is impossible to achieve, while others believe that certain mathematical formulas can generate truly random numbers.

2. How can we test if an infinite series of random numbers is truly random?

There are various statistical tests that can be used to determine the randomness of a series of numbers. These tests look for patterns or biases in the data that could indicate non-randomness.
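One of the simplest such tests is the frequency (monobit) test from the NIST SP 800-22 suite; a minimal sketch (note it only detects bias toward 0 or 1, not subtler patterns, so passing it is necessary but far from sufficient):

```python
import math
import random

def monobit_test(bits):
    """NIST SP 800-22 frequency (monobit) test: returns a p-value.

    Under the hypothesis that the bits are i.i.d. fair coin flips, the
    normalized sum s_obs is approximately standard normal, and
    p = erfc(s_obs / sqrt(2)).  A small p (say < 0.01) is evidence of bias.
    """
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)
    s_obs = abs(s) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))

random.seed(42)
good = [random.getrandbits(1) for _ in range(10_000)]  # PRNG output
bad = [1] * 10_000                                     # all ones: heavily biased

print(monobit_test(good))  # usually well above 0.01 for a decent PRNG
print(monobit_test(bad))   # ~0.0: fails decisively
```

A strictly alternating sequence 0101... would also pass this particular test, which is why real test batteries run many complementary statistics.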

3. Is there a limit to how long an infinite series of random numbers can be?

Technically, an infinite series has no limit. However, in practical terms, there are limitations to how long a series of random numbers can be generated. This is due to computational constraints and the fact that truly random numbers cannot be generated by a computer algorithm.

4. Are there any real-life applications for an infinite series of random numbers?

Yes, infinite series of random numbers are used in various fields such as cryptography, statistical analysis, and simulations. They are also used in computer programming for tasks such as generating unique IDs or creating randomized elements in games.

5. Can an infinite series of random numbers be predicted or controlled?

No, the whole point of a random series is that it cannot be predicted or controlled. However, some algorithms claim to generate "pseudo-random" numbers that may appear random but can be predicted with certain knowledge of the algorithm used.
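A tiny linear congruential generator illustrates the pseudo-random point: given the seed and the constants, the whole "random" sequence is reproducible. (The constants below are the well-known Numerical Recipes choices; this is an illustration, not a recommendation for serious use.)

```python
def lcg(seed, n, a=1664525, c=1013904223, m=2**32):
    """Minimal linear congruential generator (Numerical Recipes constants)."""
    x = seed
    out = []
    for _ in range(n):
        x = (a * x + c) % m
        out.append(x)
    return out

# The same seed always reproduces the same "random" sequence:
run1 = lcg(12345, 5)
run2 = lcg(12345, 5)
print(run1 == run2)  # True: fully predictable once the seed and constants leak
```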
