
#19
Mar29-12, 03:21 AM

P: 4,570

The best way to describe this is to have all of these distributions be uniform, because the uniform distribution is the one that maximizes entropy. If you do this for all possible conditional probabilities, then you will get a distribution that is purely random. From this distribution you will get hints about the kinds of processes that you could construct. If you want something random, but not purely random, then you don't have to do anywhere near as much work; but if you want a process that is random in the best way possible, you need to construct the above system and from there decide what kind of process would really emulate this distribution. 
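As a quick numerical illustration of the claim above (that the uniform distribution maximizes entropy), here is a minimal Python sketch; the specific biased distribution is just an invented example:

```python
import math

def shannon_entropy(p):
    """Shannon entropy (in bits) of a discrete distribution given as probabilities."""
    return -sum(x * math.log2(x) for x in p if x > 0)

n = 6
uniform = [1 / n] * n                         # the maximum-entropy distribution on n states
biased = [0.5, 0.3, 0.1, 0.05, 0.03, 0.02]    # any non-uniform distribution on the same states

print(shannon_entropy(uniform))  # log2(6), about 2.585 bits: the maximum for 6 states
print(shannon_entropy(biased))   # strictly smaller
```

Any deviation from uniformity strictly lowers the entropy, which is why "purely random" in the sense described corresponds to every conditional distribution being uniform.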



#20
Apr1-12, 12:01 AM

P: 3,408

S = κ ln Ω, where S = entropy, κ = Boltzmann's constant, ln = natural logarithm, Ω = number of states.
__________
1. Does true randomness accompany a transfinite number of states?
2. Is information about states restricted by a finite speed of light?
3. Can multiple states interfere, e.g. achieve minimum entropy? 
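To make the logarithmic dependence in S = κ ln Ω concrete, a tiny numeric sketch: doubling the number of states adds only a constant κ ln 2 to the entropy, however large Ω already is.

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant in J/K

def boltzmann_entropy(omega):
    """Statistical entropy S = k ln(Omega) for Omega microstates."""
    return k_B * math.log(omega)

# doubling the state count adds the same constant k*ln(2), no matter the scale
print(boltzmann_entropy(2) - boltzmann_entropy(1))            # k*ln 2
print(boltzmann_entropy(2**100) - boltzmann_entropy(2**99))   # also k*ln 2
```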



#21
Apr1-12, 12:18 AM

P: 4,570

For number 2, I have to reiterate that the above talks about randomness with respect to the distributions of the states that are related to each other, independent of the actual process itself. If you want to answer your question, you need to consider further constraints that relate to a specific process; the above considers a general process with a particular property. I am not a physicist, but what you need to do is specifically outline how a law or a relationship between variables modifies the distributional information of the various joint distributions, which in turn modifies the entropy, and from that you can get an idea of the real measure of the random nature of the process. You should note that physical systems are not purely random in the sense I have described above, because there are a lot of known deterministic features that we utilize and exploit every day for various purposes. If the world were absolutely, truly random and completely unpredictable, then the order that we observe and make use of every day would not be present.

For 3, minimization of entropy gives an indication of order, while maximization of entropy gives an indication of disorder. Physicists and natural scientists usually have the goal of finding order, and this relates directly to entropy. The other thing is that you don't just want to consider the entropy characteristics of the original distributions, but also those of possible transformations of the data, and hence of the distributions, that 'make sense'. What makes sense depends on both mathematical ideas and domain ideas, but for the mathematical ones you want to consider at the very minimum convergence, and probably topology and differentiability as well. If you want an example of entropy minimization, think of entanglement.
Instead of having two objects that would be classified as purely random, what we see is a reduction of entropy from that case, since one has a direct effect on the other; this shows a form of order that would otherwise not be seen in a purely random system. In fact it is this property of minimum entropy in a variety of circumstances that has allowed us to obtain formulas like the ones you find in your science textbooks. It is this ability to quantify order accurately enough that allows us to even understand this system we call reality or the 'universe', and I imagine that as time goes by we will find transformations of our distributional representations that reveal even more order, and subsequently a way to quantify it, just as Newton quantified gravity. 



#22
Apr2-12, 09:15 PM

P: 3,408

For number 2, I should have asked whether pure randomness can occur in a finite universe. The relations within the equation for statistical entropy I gave are standard, and only simplistic logarithmic information derives from them. States approaching infinity on a microscopic level are just as likely on a macroscopic level. For number 3, quantum entropy would either involve destructive interference with change to the negative, or would involve constructive interference with change to the positive. I believe statistical mechanics gives classical transformations of our distributional representations. For entropy to have a true "law," I guess that entanglements must transfer their statistics instantaneously (rather than at a finite speed of c) and thermally. 



#23
Apr2-12, 10:37 PM

P: 4,570

I think the answer is going to be yes, because the state space does not have to define the nature of the process. The thing you have to remember is that a process can take a finite state space and map it to a finite state space, like, say, relating the history of die rolls to the next one in terms of probability. The state space itself is fixed, but the process could go on forever, and although you have a process with a finite state space, it doesn't mean that the properties of the underlying process itself are not purely random.

Think of the problem of whether you could define a coin-toss process so that every new toss has no 'arbitrage' chance of being biased in any way given the entire history of the process. If you think you can define a process that is unpredictable in this way, your answer is yes. If you think you can't, your answer is no. I haven't shown a proof, but if I were to give one I would try to show that there always exists a process such that, given N observations, the complete conditional distribution of the (N+1)th observation given all the prior observations has maximal entropy (i.e. all values are equally likely). If this is shown, then you have proven that your answer is a resounding yes.

To give you an example, imagine that you have a process corresponding to an infinite periodic sequence where one period consists of {0,1,2,3,4,5} in that order and repeats forever. Now if you try to calculate P(X = a) for a in {0,1,2,3,4,5}, you will always get 1/6, which implies maximal entropy. But for this process we know that P(X_(n+1) = a | X_(n) = (a-1) mod 6) = 1 because of the periodicity. The entropy of this conditional distribution is zero, which implies absolute determinism. So although the non-conditional entropy is maximal, the conditional ones are completely the opposite, and through this we have found complete order which is exhausted at this level; thus the process is deterministic.
The order of the process, if it exists, will be hidden somewhere in the conditional joint distributions, not in the non-conditional probability distribution. Again, it needs to be defined what the macroscopic space is and, if possible, the mapping between states in different spaces. I think you'll find that because of the way things are seen macroscopically in comparison to microscopically, there will be some kind of significant reduction in the state space for the macroscopic classifications in contrast to the microscopic ones, which will look more like a projection than a bijection. 
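The periodic {0,1,2,3,4,5} example above can be checked numerically. A minimal Python sketch estimates both the marginal distribution (maximal entropy) and the conditional entropy H(X_(n+1) | X_(n)) (zero) from a sample of the process:

```python
import math
from collections import Counter, defaultdict

def entropy_bits(counts):
    """Shannon entropy (bits) of an empirical distribution given as a Counter."""
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values() if c)

# the deterministic periodic process 0,1,2,3,4,5,0,1,...
seq = [n % 6 for n in range(6000)]

marginal = Counter(seq)                  # P(X = a): uniform over the six states
H_marginal = entropy_bits(marginal)      # = log2(6), about 2.585 bits: maximal

# conditional entropy H(X_(n+1) | X_(n)): each state has exactly one successor
successors = defaultdict(Counter)
for a, b in zip(seq, seq[1:]):
    successors[a][b] += 1
total_pairs = len(seq) - 1
H_cond = sum(sum(c.values()) / total_pairs * entropy_bits(c)
             for c in successors.values())

print(H_marginal)  # maximal: marginally, the process "looks" purely random
print(H_cond)      # 0.0: conditionally, the process is fully deterministic
```

The marginal entropy alone cannot distinguish this deterministic process from a fair die; the conditional entropy exposes the hidden order immediately.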



#24
Apr3-12, 11:08 PM

P: 3,408

chiro,
Please explain the general meaning of the symbols and variables used in P(X_(n+1) = a | X_(n) = (a-1) mod 6) = 1. Thanks for your dedication. 



#25
Apr3-12, 11:34 PM

P: 4,570

I'll expand it out for all states and then you should see how I used the mod function. Here we go:
P(X_(n+1) = 0 | X_(n) = 5) = 1
P(X_(n+1) = 1 | X_(n) = 0) = 1
P(X_(n+1) = 2 | X_(n) = 1) = 1
P(X_(n+1) = 3 | X_(n) = 2) = 1
P(X_(n+1) = 4 | X_(n) = 3) = 1
P(X_(n+1) = 5 | X_(n) = 4) = 1
All other probabilities for all other first-order conditional combinations are zero, and you can show this by various probability identities and exhaustion of the probability space. X_(n+1) refers to the (n+1)th observation of the process and X_(n) refers to the nth observation. For this example you could associate n with a time parameter of a one-dimensional process. The process can only take on the values {0,1,2,3,4,5}, which means we only have to consider going from one value in this list to another value in this list. 
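The same table can be generated mechanically from the mod rule rather than written out by hand; a minimal sketch:

```python
# conditional probability table P(X_(n+1) = b | X_(n) = a) for the periodic process:
# the successor of state a is (a + 1) mod 6, equivalently a = (b - 1) mod 6
states = list(range(6))
P = {(b, a): 1.0 if b == (a + 1) % 6 else 0.0 for a in states for b in states}

# every column is a proper conditional distribution (sums to 1 over b)
for a in states:
    assert sum(P[(b, a)] for b in states) == 1.0

print(P[(0, 5)], P[(1, 0)])  # 1.0 1.0, matching the expanded list
```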



#26
Apr4-12, 11:46 PM

P: 3,408

Gedankenexperiments
__________
Please consider whether each of the following pairs is relatively entropic:
1. The big bang singularity and its imminent non-singularity
2. Electron self-energy at a point and a spatial perspective
3. A cosmologist observing his self-inclusive universe
4. A closed universe and the black holes within
5. A quantum measurement and its measuring device
6. A vacuum of virtual particles
7. Turbulence at temperature T→∞
8. Black bodies at temperature T
__________
How many conditional entropies would there be given N non-conditional entropies?
__________
ΔS = ∫ dQ/T
The article at http://en.wikipedia.org/wiki/Negativ...ature#Examples reads: "Since we started with over half the atoms in the spin-down state, initially this drives the system towards a 50/50 mixture, so the entropy is increasing, corresponding to a positive temperature. However, at some point more than half of the spins are in the spin-up position. In this case, adding additional energy reduces the entropy, since it moves the system further from a 50/50 mixture. This reduction in entropy with the addition of energy corresponds to a negative temperature." 
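The quoted spin example can be reproduced with a two-state mixing-entropy sketch in Python (entropy per spin, in units of k, for a fraction p of spin-up atoms): entropy rises toward the 50/50 mixture and falls beyond it, which is the negative-temperature regime.

```python
import math

def mixing_entropy(p):
    """Entropy per spin, in units of k_B, for a two-state system with spin-up fraction p."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log(p) + (1 - p) * math.log(1 - p))

# adding energy flips spins up: entropy increases until p = 0.5, then decreases
for p in (0.2, 0.4, 0.5, 0.6, 0.8):
    print(p, round(mixing_entropy(p), 4))
```

The maximum is at p = 0.5 (value ln 2 per spin); past that point, adding energy lowers the entropy, which via 1/T = dS/dE is exactly what a negative temperature means.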



#27
Apr5-12, 02:28 AM

P: 4,570

In terms of how many conditional entropies there are, this depends on the state space. Usually if we were to study the system classically, then we probably would have considered things in terms of spatial locality. What I mean is that in a classical context, matter (or whatever components make up what we call matter or energy) would be treated in the way that things that are closer spatially have more of an effect on the physical properties of that matter, and thus affect everything from temperature onwards, and thus affect its entropy.

The easiest way to think about this from the local analytic viewpoint is to consider modelling a dynamical system like a fluid. Although you get all kinds of chaotic effects, typically everything is modelled as a continuum in which all the local effects add up to produce the final global behaviour. In fact anything modelled with standard calculus uses this idea: by knowing local changes (usually in the form of a derivative), the global changes can be found from how the local changes accumulate, and this is why, when systems can be modelled this way, calculus is so useful: it gives us a framework for doing exactly this.

But now consider the situation when you are analyzing things that are not spatially 'close' or local (in the sense above). When this happens, we need to consider not only spatially local effects but things that are 'non-local'. This is the kind of thing that needs to be considered in quantum mechanics, and a lot of experiments are working on trying to understand this very thing. From a classical way of analysis (physically, that is), this is not only completely foreign to our intuitive understanding and experience, it is a lot harder to deal with mathematically.

In terms of your questions, I can only answer them without using physical constraints: in other words, I will attack these from a mathematical viewpoint and not from one a physicist would take. This might be a little disappointing, but in my opinion it will still give you some more understanding. The thing for all your problems is that, as a general rule, order is found when entropy is minimized. Now, what we call 'time' is only one kind of order. Depending on the system, there are most likely going to be 'many' kinds of orders. In classical physics, time itself has a very good order to it, in the sense that the conditional entropies in this context are very highly minimized, so the models give us something highly predictable: a result of a very low conditional entropy with regard to various conditional measures.

Intuitively, with calculus, the way we order things is always in terms of locality with respect to some variable that is usually temporal or spatial in nature (often a mix of the two). In temporal terms, this has grown slowly from observation, first at a macroscopic level and then at a microscopic level, but the idea is the same: relate local changes in space and time to a process and use calculus to model some form of global behaviour of the physical world. But a general, highly complex process with a high state space may have many different kinds of orders, and one order will typically hide a lot of information about the system in general, so that although the mathematical conclusions are correct, the interpretation may be very limited and in some ways detract from an otherwise higher understanding.

There is no problem with finding orders, but it needs to be considered that there might be other orders, either in the raw system itself or in a transformed variant, that give an insight that cannot be seen from the order that has been chosen or subsequently discovered. I will have to look up some of these things specifically later on to see what they correspond to mathematically, but the key issue in the above is to first define the states and then slowly describe the conditional distributions that are derived mostly from how these states interact. You may find that interactions are constrained between specific parts of the system in the same kind of manner that you get local spatial interactions in classical physics, but the constraints may not be spatially, or even temporally (in the way that we see it), local, and in this context you need a different way of analyzing the system.

Mathematically, the way to describe the conditional entropies would be first to define the collection of all possible conditional probability distributions and then define an entropy for each distribution. You could also define things like relative entropies, and all of this can be found in an information theory book; I recommend Cover and Thomas's Elements of Information Theory (2nd edition), which you can buy on Amazon if you are really interested (also look on Wikipedia if you just want definitions and not something as formal). What you'll find in highly ordered systems (no matter what the order) is that the majority of joint distributions have entropies of zero (or close enough to zero) that the order is easily determinable. If this is not the case, then it takes a lot more work (and subsequently 'appears' random). I will look at the article in a little while: for now I hope the above has helped you. 
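For the definitions mentioned, a minimal sketch of conditional entropy H(Y|X) computed from a joint distribution; the two toy joint distributions are invented examples:

```python
import math

def conditional_entropy(joint):
    """H(Y|X) in bits, where joint maps (x, y) -> P(X = x, Y = y)."""
    # marginal P(X = x)
    px = {}
    for (x, _), p in joint.items():
        px[x] = px.get(x, 0.0) + p
    # H(Y|X) = sum_{x,y} p(x,y) * log2( p(x) / p(x,y) )
    return sum(p * math.log2(px[x] / p) for (x, _), p in joint.items() if p > 0)

independent = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}  # knowing X tells us nothing
deterministic = {(0, 0): 0.5, (1, 1): 0.5}                    # Y is a function of X

print(conditional_entropy(independent))    # 1.0 bit: Y fully uncertain given X
print(conditional_entropy(deterministic))  # 0.0: complete order, as in the text
```

The deterministic case is exactly the "entropies of zero" situation described above: the order is immediately determinable from the conditional distributions.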



#28
Apr5-12, 02:42 AM

P: 4,570

It should be noted that this framework of physics grew out of the exercise of studying 'heat', and unsurprisingly much of the energy we generate (later converted to electrical energy) is, to this very day, still generated by heat. With coal we generate heat and convert that to energy; with nuclear we do the same thing. The same goes for petroleum-based forms of energy. It's all based on heating stuff up and converting it to energy. There are exceptions, for example hydroelectric power stations or wind power amongst other things, but for most energy generation we just create enough heat to make steam and drive a turbine, which in my mind is absolutely ridiculous, but that's the way it is. Creating more 'order' out of a system has been done in the lab (specifically look for experiments done previously at the Australian National University, and there are probably others), but not to the extent where we would do it on a 'large scale'. I'll have a look at the actual definitions of how heat is defined in detail later on if you like, but it is true that a decrease in the entropy characteristics, with respect to how 'heat' is defined in terms of that particular entropy measure (which is going to be implicitly defined by the physical model, which I need to read), will result in a decrease of the associated quantity as it's defined. 



#29
Apr5-12, 03:14 AM

P: 4,570

For 1, this is a very good question. Here are my thoughts: many people have advocated that the 2nd law of thermodynamics should hold in the context of describing the physical universe with respect to the order that we call 'time', in that entropy should always be increasing, or at least staying the same. I would say that this is only 'half right' and in some ways misleading, because an ever-increasing entropy for a system means the system gets 'more chaotic' only if this happens for every form of entropy. People talk about plates breaking, experiments with heat and other things that make a good argument for the entropy-increase scenario, and unsurprisingly time itself is defined via the 2nd law of thermodynamics (it's one way, but it's a very important definition in physics). But if you consider all the different kinds of entropies that exist, I see evidence that the above is clearly not true. We have a lot of order in terms of some known approximations in physics and other scientific systems. Look around and just see the order that exists on our planet, in terms of lifeforms interacting with one another and of any phenomena that have a high amount of stability with respect to their environment. In other words, some things in some contexts are producing situations where things become 'more ordered' rather than 'more disordered'. This leads me to infer that applying the 2nd law of thermodynamics across the board to represent the entire universe is faulty reasoning, because if this were the case the universe would be, in every respect, in a complete and utter state of chaos, and this is not the case. So with respect to entropies, again I have to state that in a complex system there are going to be many different kinds of orders, and I imagine that when comparing and contrasting the different entropies of the initial big bang and other states, the same argument needs to be applied.

We currently do not have many different orders, and when we start to get more insight, and hence more orders, we will start to explore this idea more clearly and more deeply. To answer your question, you have to specify the kinds of orders used, and because of this I can't give a decent answer: it's too broad.

For this to be answered we would need to know how information is exchanged between things inside a black hole (within the event horizon) and things beyond the horizon. If it turns out that information is exchanged (I think this is currently under debate), then that will make a huge difference to how we form constraints for the joint distributions and entropies, and it also means we have to consider a system that is much larger and more complex. If things are completely isolated, then this simplifies things dramatically; but with Hawking's idea of evaporation from black holes, I have a feeling that if the theory is correct, or even if the idea is correct in terms of some form of radiation, then essentially there is 'communication' (information exchange) going on, and this needs to be taken into account. Also, if there is some kind of entanglement that is not spatiotemporally local (i.e. action at a distance between two different spacetime boundaries), then this would make it even broader.

In terms of the measurement problem, again this is going to relate to any analysis of the joint distributions with respect to anything associated with the device. It doesn't make our problem any easier, because we will need to consider orders that are much harder to extrapolate from the properties of our system than we currently do, but the idea of finding orders is the same, just in a different context. Note that you will need to look at orders other than the standard ones mentioned, if they do indeed exist.

One thing I will mention, though, is that if there is some kind of arbitrage mechanism that exists to keep things stable, then this could be used to formulate the properties of the various distributions and test it experimentally. I am not a physicist though. The idea behind arbitrage, in the way I am describing it, is that the system would have to account, in whatever way it can, so that there is not enough determinism in the system to produce a particular point of instability. If the system were weak enough that it could be exploited to create instability detrimental to the function of the system itself, then this would cause a kind of 'system-wide turbulence' that would be utterly destructive. It's my opinion, but it's based on the idea of creating a system that doesn't essentially 'blow up inadvertently'. 



#30
Apr7-12, 03:21 AM

P: 3,408

Black bodies at temperature T - http://en.wikipedia.org/wiki/Black_body
A black body is an idealized physical body that absorbs all incident electromagnetic radiation, regardless of frequency or angle of incidence. A black body in thermal equilibrium (that is, at a constant temperature) emits electromagnetic radiation called black-body radiation. The radiation is emitted according to Planck's law, meaning that it has a spectrum that is determined by the temperature alone, not by the body's shape or composition. A black body in thermal equilibrium has two notable properties: it is an ideal emitter, emitting as much or more energy at every frequency than any other body at the same temperature; and it is a diffuse emitter, radiating energy isotropically, independent of direction. An approximate realization of a black body is a hole in the wall of a large enclosure. Any light entering the hole is reflected indefinitely or absorbed inside and is unlikely to re-emerge, making the hole a nearly perfect absorber. The radiation confined in such an enclosure may or may not be in thermal equilibrium, depending upon the nature of the walls and the other contents of the enclosure.

A vacuum of virtual particles - http://en.wikipedia.org/wiki/Vacuum_state
In quantum field theory, the vacuum state (also called the vacuum) is the quantum state with the lowest possible energy. Generally, it contains no physical particles. Zero-point field is sometimes used as a synonym for the vacuum state of an individual quantized field. According to present-day understanding of what is called the vacuum state or the quantum vacuum, it is "by no means a simple empty space", and "it is a mistake to think of any physical vacuum as some absolutely empty void." According to quantum mechanics, the vacuum state is not truly empty but instead contains fleeting electromagnetic waves and particles that pop into and out of existence. The presence of virtual particles can be rigorously based upon the non-commutation of the quantized electromagnetic fields. Non-commutation means that although the average values of the fields vanish in a quantum vacuum, their variances do not. The term "vacuum fluctuations" refers to the variance of the field strength in the minimal energy state, and is described picturesquely as evidence of "virtual particles". It is sometimes attempted to provide an intuitive picture of virtual particles based upon the Heisenberg energy-time uncertainty principle, ΔE Δt ≥ ħ (with ΔE and Δt the energy and time variations, and ħ the Planck constant divided by 2π), arguing along the lines that the short lifetime of virtual particles allows the "borrowing" of large energies from the vacuum and thus permits particle generation for short times.

Anthropic entropic principle - I hypothesize that, rather than observers (life) being where entropy density is high, they exist where entropy density is low. The act of observation itself could rely on semi-coherent radiative interaction, and so tend participants toward lower entropy density.

A cosmologist observing his self-inclusive universe - I believe this could be modeled by your staircase algorithm, chiro, where observation cycles to the event horizon and back, and, as speculated by early relativists, observers could see themselves gravitationally imaged about the circumference.

Anentropy - I think that entropy depends not only on the states of a configuration, but also on the network of interconnections (entanglement) between states. "Anentropic" by nature of retrospection, this latter "pattern memory" potentially surpasses entropy's information exponentially in magnitude.

Reciprocity of entropy - In practice, the inequality in the second "law" of thermodynamics may be the crux of the argument against it being a true law. This law may be violated for a non-isolated system. But might it also not hold in general timelike spacetime? 



#31
Apr7-12, 04:58 AM

P: 4,570

I'm going to give my thoughts on a topic-by-topic basis since there is a lot in this post. Again, these are just my opinions, and I welcome any feedback you may have, whether it's mathematical or in the non-technical spoken manner I will prefer to use in these posts.
So far it seems that the current idea is that every known force has a mechanism represented by carrier particles in the Standard Model, and that some people are still looking for a similar mechanism for gravity, which they call a 'graviton'. In other words, for forces to act there is a physical exchange of these 'carriers' with other particles, which initiates a force and thus changes the properties of a physical system or a particle. With regard to some kind of localness, this makes intuitive sense for analyzing physical changes (which include subsequent changes in physical states that quantify energy characteristics), because it gets rid of the thing Einstein referred to as 'spooky action at a distance', which, if it did exist, would be hard for most scientists to grasp, since the world is viewed in terms of local spatiotemporal changes, in the way that we use derivatives in calculus to represent local properties of a function. I've diverted a bit from the question, so I'll get back on track, but I stress that it is important to consider that if anything has a hint of being non-local, or plainly and simply is non-local, then new analyses are needed. I have said it above, but I think it's important to reiterate.

Now let's think about this in terms of entropy for the black body. We know that entropy relies not only on the nature (shape) of the distribution itself, but also on the number of states, and I wish to talk about this now. If the number of states is indeed finite, then any associated relative entropy of that system will also be finite. The question then remains: how do we identify the states if they are finite? The evaluation of the states is probably the most important part of understanding physical laws, because it not only gives predictive power, it also allows a better understanding.

The methods currently used include different forms of quantization. The quantization schemes differ from theory to theory, but the idea is the same: there are not going to be an uncountable number of states within some finite representation. We might for example take the idea of quantizing spacetime in a variety of ways, and this is something that is being worked on. The quantization might say, for example, that all physical elements can only occupy certain states individually, like a lattice. Another theory might argue that only specific 'combinations' can exist for something to be called a state. This would be analogous to phenomena found in the Standard Model, with, say, the requirement for quark configurations in various atomic particles. It might also be even more complex: a non-local and more intricate version of the quark phenomenon.

The point of the above is that once we can show, one way or another, that some finite region (which might be everything contained within a spacetime boundary, or even a subset) always has a bounded entropy for all relative joint distributions, then you know that there is a quantization of states, and the relative entropies will give 'hints' about what the quantization scheme actually is, depending on the nature of the conditional distributions and the complex of those distributions. So with the above said, even for something like a black body with those properties, if there really exists a proper quantization within some finite region of some sort, then any kind of entropy in this space will always be bounded, even for a black body. Remember, I'm talking about the state space of the system. 
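The claim that a finite state space bounds every entropy can be sanity-checked numerically: for any distribution on Ω states, H ≤ log2 Ω, with equality only for the uniform distribution. A minimal sketch with random distributions:

```python
import math
import random

def entropy_bits(p):
    """Shannon entropy (bits) of a probability vector."""
    return -sum(x * math.log2(x) for x in p if x > 0)

random.seed(0)
omega = 8                    # a finite state space
bound = math.log2(omega)     # supremum of the entropy over all distributions on it

for _ in range(1000):
    weights = [random.random() for _ in range(omega)]
    total = sum(weights)
    p = [w / total for w in weights]
    # finite state space => entropy can never exceed log2(omega)
    assert entropy_bits(p) <= bound + 1e-9

print("all 1000 random distributions respect the log2(omega) bound")
```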



#32
Apr7-12, 05:07 AM

P: 4,570

The thing about observation is that it is not an isolated incident, even when you only consider observations of one particular instrument. Observation is not a single-event phenomenon but a multi-event phenomenon. Observations are not isolated: they rely on other observations as well. If you expected something, with respect to a set of observations, to get more ordered with respect to some ordered set of observations, then the entropy would in this context decrease with respect to that particular sequence of observations. Mathematically, if our ordered set of observations was {S1, S2, S3, ...} = S, then P(SU) represents the distribution, and we would expect a resultant entropy decrease for a respective measure of order in this context. I know this might seem like a cop-out, but correlation does not imply causation. Intuitively though, it would seem that entities would have some kind of impetus to minimize various conditional entropy measures so as to create order, rather than attempt to increase entropy to create more chaos. 



#33
Apr7-12, 05:15 AM

P: 4,570

The response I have for this is the same as what I said above: if all measures of entropy were increasing, then we would expect systems to get more chaotic, not less chaotic. I'm not saying that different entropy measures will always violate the 2nd law; what I'm saying is that the idea of continually increasing chaos is not what we experience. We can talk about plates breaking and all the things that support it, but again, there is a huge amount of order in our universe in so many ways, and this tells me that not everything gets more chaotic: some things get a hell of a lot more ordered. Following this thread, I'm inclined to review current theories and their mathematical constraints later on, but for now I can say that, as a whole, I do not know enough about the constraints at this time to give a qualitative and specific answer. 



#34
Apr7-12, 05:18 AM

P: 4,570





#35
Apr7-12, 05:29 AM

P: 4,570

Just as a thought experiment, imagine if you could, in some region, nullify the energy for that region. This would mean that everything in this region would be completely static and there would be no possibility of any kind of dynamic behaviour. With regard to virtual particles being used to 'borrow' energy, again in terms of the state space I would consider this as part of the system and not something that is isolated from it. The fact that it exists, or at least that the mechanism exists in some form, means that it should be included in whatever way is appropriate. The big thing, at least in my mind, is this: how does the quantization of energy (as given by E = hf) relate to the quantization of the medium used to represent it? Moreover, how does all of this affect the supremum of the entropy measures in some finite space that I was talking about earlier? Personally, I think the nature and quantities of the lowest states tell us a lot about the nature of the system. I'm going to have to take a closer look at these kinds of things at some point: you've got me interested now, damnit! 



#36
Apr7-12, 05:32 AM

P: 4,570

I've heard about the nature of cyclic structures in physics like cyclic time and so on, but I can't really comment on the specifics. 

