What creates information in a 2D particle system and how can it be measured?

AI Thread Summary
The discussion centers on measuring information in a 2D particle system and the concept of phase space. A participant suggests that understanding all possible states of the system could equate to knowing its information content, but questions arise about how specific arrangements of particles (like circular or rectangular formations) represent information. The conversation explores the relationship between data, mathematical structures, and information creation, emphasizing the role of entropy and state-space dimensions in quantifying information. Additionally, the concept of metadata is introduced, suggesting that tracking state variables and their histories can provide deeper insights into the system's behavior. Overall, the dialogue highlights the complexity of defining and measuring information within mathematical and programming contexts.
hddd123456789
Hey folks, so the extent of my knowledge in math is basic concepts from Calc I (I'm more of a programmer). Now, I was working on a 2D particle system and had a random thought: if I wanted to gauge the total amount of information contained in the system, how would I do it? The first idea I had was to somehow build a list of every possible state in the system, thinking that if you could comprehensively know all the states of a system, this should equate to knowing all the information about the system.

But then I thought, what if in such a state in a 2D particle system you had all the particles lined up along the perimeter of a circle. Doesn't this qualify as information? Or if all the particles lay along the perimeter of a rectangle? Or if a particle moved through time in a path that resembled a parabola? Wouldn't we all consider this to be some form of information? But none of this would be obvious from a comprehensive list of possible states, which can be thought of as the full set of data for the particle system.

Assuming that the above would qualify as information, I got to thinking, what "creates" the information? Anyway, I got to this statement which I know is not a theory, or postulate, or hypothesis, or conjecture and certainly not scientific, but just a statement that seems to make sense: information exists when data represents mathematical structures, or conversely, mathematics is the set of axioms that, when applied to data, produces information.

Does it make any sense? I'd love some feedback on this.
 
What you have described is the concept of phase space. Your first idea is a phase space with many dimensions (this is a very important concept, as it is the foundation of the quantification of entropy, which determines the ultimate fate of the (open) universe). When particles are constrained to lie on a circle, the phase space collapses so that the new phase space has far fewer dimensions. With particles constrained to lie on the perimeter of a rectangle, you have a different phase space with a similar number of dimensions.
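Since you're a programmer, a toy Python sketch may make the dimension counting concrete. This is just illustrative bookkeeping: I'm assuming the state of each particle is its position plus velocity (4 numbers in 2D), collapsing to one arc-length coordinate and one speed along the curve when it is constrained to a fixed curve. The function name and signature are made up for this example.

```python
def phase_space_dims(n_particles, constrained_to_curve=False):
    """Count phase-space dimensions for n particles in 2D.

    Unconstrained: each particle contributes x, y, vx, vy -> 4 dims.
    Constrained to a fixed curve (circle, rectangle perimeter): the
    position reduces to one parameter along the curve, the velocity
    to one speed along it -> 2 dims.
    """
    dims_per_particle = 2 if constrained_to_curve else 4
    return n_particles * dims_per_particle

print(phase_space_dims(100))                            # 400
print(phase_space_dims(100, constrained_to_curve=True)) # 200
```

The point is only that a constraint like "all particles on a circle" collapses the accessible phase space to a much smaller-dimensional subset of the full one.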

I am not sure what you are getting at with your statements about information, and in any case these don't seem to be mathematical statements but philosophy (yes, I am dodging the question!), so I don't really have any feedback there other than to present my own description of mathematics adapted from Wikipedia: Mathematics is searching for patterns in abstract data, formulating conjectures regarding those patterns, and resolving the truth or falsity of conjectures by proofs built upon axioms.

I hope there is something there to continue your interest?
 
hddd123456789 said:
Hey folks, so the extent of my knowledge in math is basic concepts from Calc I (I'm more of a programmer). Now, I was working on a 2D particle system and had a random thought: if I wanted to gauge the total amount of information contained in the system, how would I do it? The first idea I had was to somehow build a list of every possible state in the system, thinking that if you could comprehensively know all the states of a system, this should equate to knowing all the information about the system.

But then I thought, what if in such a state in a 2D particle system you had all the particles lined up along the perimeter of a circle. Doesn't this qualify as information? Or if all the particles lay along the perimeter of a rectangle? Or if a particle moved through time in a path that resembled a parabola? Wouldn't we all consider this to be some form of information? But none of this would be obvious from a comprehensive list of possible states, which can be thought of as the full set of data for the particle system.

Assuming that the above would qualify as information, I got to thinking, what "creates" the information? Anyway, I got to this statement which I know is not a theory, or postulate, or hypothesis, or conjecture and certainly not scientific, but just a statement that seems to make sense: information exists when data represents mathematical structures, or conversely, mathematics is the set of axioms that, when applied to data, produces information.

Does it make any sense? I'd love some feedback on this.

Hey hddd123456789 and welcome to the forums.

This is an interesting question.

If you have a state-space and the relationships between those states, then the first thing to do would be to determine the dimension of the system.

The dimension corresponds to the number of independent 'degrees of freedom' of the system. This depends on the nature of the system and how it is defined, but whether it's a linear system or not, this has to be done.

Then in conjunction with this you may have to resort to finding various measures of entropy. Again the specifics depend on the exact type of system you are dealing with, but essentially the measures of entropy will give you an idea of the information content needed to describe the system and its data/state space.
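As a rough illustration of the simplest such measure (not the full story, and purely a sketch I'm adding here), the Shannon entropy of an observed sequence of discrete states tells you the average number of bits needed per observation:

```python
import math
from collections import Counter

def shannon_entropy(states):
    """Shannon entropy (in bits) of a sequence of discrete states:
    H = -sum(p * log2(p)) over the empirical state frequencies."""
    counts = Counter(states)
    n = len(states)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A system stuck in one state needs 0 bits per observation;
# a uniform spread over 4 states needs 2 bits per observation.
low = shannon_entropy(["a"] * 8)
high = shannon_entropy(["a", "b", "c", "d"] * 2)
```

The more states the system visits, and the more evenly it visits them, the higher the entropy, i.e. the more information is needed to describe which state it is in.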

In terms of representing the data, this is very tricky.

There are different ways of creating the right information to do the kind of thing you are thinking of (like a parabola to represent the data) and each has its own perspective.

Kolmogorov complexity is based on the idea that you could 'find a computer program' that generates the information: given some information, find the minimal computer program that will generate it back for you. This idea is grounded in Turing machines and their associated theory.

The other idea is to choose an appropriate language construct that is optimized for this particular representation. You could use this in conjunction with the above by creating a Turing machine that unpacks information from a certain class of data types.

In this situation you need to have basically some expert knowledge of the data/information and its domain to get rid of anything redundant.
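As a toy illustration of such a domain-specific representation (the function name and parameters are made up for this sketch): if you know the particles lie on a circle, the whole configuration can be stored as four numbers and 'unpacked' back into positions, instead of storing 2n coordinates:

```python
import math

def unpack_circle(cx, cy, r, n):
    """Domain-specific 'data type': a circle configuration stored as
    (center x, center y, radius, particle count) unpacks into n
    evenly spaced particle positions on that circle."""
    return [(cx + r * math.cos(2 * math.pi * k / n),
             cy + r * math.sin(2 * math.pi * k / n))
            for k in range(n)]

pts = unpack_circle(0.0, 0.0, 2.0, 100)
print(len(pts))  # 100
```

This is exactly where the expert knowledge comes in: the compact representation only works because you already know (or have detected) the redundant structure in the data.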

In terms of "information creation", once you specify the state-space and system itself, then you can begin to look at this after you have done the above things (and no doubt more!). The reason why you have to do this is because once you get a real idea of the process, the creation of information becomes a function more or less of both of these. You have to consider what you describe as information.

The classic way to measure information is through 'information theory' which basically uses different types of entropy measurement. However this is a highly quantitative procedure that gives little context.

What this information really represents is a more qualitative question. For example, your data could fit some complex function exactly, in a way that describes the system and its associated data perfectly, but that may not give you anything useful. This is the hardest part, because in many definitions information is what you get when you 'make sense of data', and as your situation defines things, your state-space is just data.
 
Hey, hddd:

Not much insight on what you are trying to do, and I am not even sure I understand what it is; I just wanted to say that, noticing you say you are more of a programmer, it seems to me that what you are wondering about is additional information about your data... or, as it is typically put, "data about data"... starting to sound familiar? Yes, this is typically referred to as metadata.

You know, you have your set of state variables and their corresponding states... then you start to wonder whether such data complies with specific alignments, like parabolic or circular, etc. It seems to me that you are talking about metadata, and that this kind of information could be found out after the fact, knowing the state of every state variable... or, if you keep track of history (the states each state variable has gone through), then you could also investigate past states, trajectories, behavior, etc.
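A sketch of that kind of after-the-fact check in Python (function name and tolerance are arbitrary choices of mine): given a snapshot of positions, test whether they all lie on a circle about their centroid.

```python
import math

def looks_like_circle(points, tol=1e-6):
    """Metadata check: do the recorded positions all lie (within tol)
    on a circle centered at their centroid?"""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    radii = [math.hypot(x - cx, y - cy) for x, y in points]
    return max(radii) - min(radii) < tol

# 12 points evenly spaced on the unit circle vs. 3 arbitrary points.
on_circle = [(math.cos(2 * math.pi * k / 12),
              math.sin(2 * math.pi * k / 12)) for k in range(12)]
print(looks_like_circle(on_circle))                 # True
print(looks_like_circle([(0, 0), (1, 0), (2, 5)]))  # False
```

The same idea extends to your parabola example: keep the history of each state variable and fit a candidate curve to it, flagging the structure as metadata when the residual is small.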

Anyway, just thinking aloud...
 
MrAnchovy said:
What you have described is the concept of phase space. Your first idea is a phase space with many dimensions (this is a very important concept, as it is the foundation of the quantification of entropy, which determines the ultimate fate of the (open) universe). When particles are constrained to lie on a circle, the phase space collapses so that the new phase space has far fewer dimensions. With particles constrained to lie on the perimeter of a rectangle, you have a different phase space with a similar number of dimensions.
While my technical vocabulary is lacking, I think I get what you're saying. Actually I got the idea of building a list of every possible state in the system from a book which happened to mention this idea of a phase space with as many dimensions needed to hold every possible variable in the system.
MrAnchovy said:
I am not sure what you are getting at with your statements about information, and in any case these don't seem to be mathematical statements but philosophy (yes, I am dodging the question!), so I don't really have any feedback there other than to present my own description of mathematics adapted from Wikipedia: Mathematics is searching for patterns in abstract data, formulating conjectures regarding those patterns, and resolving the truth or falsity of conjectures by proofs built upon axioms.

I hope there is something there to continue your interest?
Well I agree that it isn't a strictly mathematical statement (dealing with interdisciplinary ideas as such) but as they say, you have to have some reasoned belief in the truth of an idea to be able to explore it. And, well...let's just say I don't approach this line of thinking as philosophy. And with plenty of interest to boot!
 