What is Information? - Exploring the Unseen

In summary: 200 years ago, "energy" was being discovered, and since then it has been a commonly used term, yet few truly understand its nature. Scientists, however, have a clear understanding of what energy is. This is in contrast to the concept of "information", which has been explored through mathematical theories but still lacks a physical, biological, or psychological theory. The concept of "information" may hold the key to understanding the subconscious and consciousness. In scientific and non-scientific terms, information can be defined as a declarative statement, or as a trace of something that has happened in the past, with a close relation to memory and time.
  • #1
adi
200 years ago, "energy" was being discovered, and ever since, people have pronounced the word every day without really being conscious of it. Everybody talks about energy, yet few understand it. Nevertheless, scientists understand very well what energy is.

Nobody understands "information", despite three mathematical theories of information, such as those of Shannon, Kolmogorov and Bennett. There is no physical theory of information. There is no biological theory of information. Moreover, there is no psychological theory of information.

Maybe this concept, once well defined and quantified, would lead to an approach to the subconscious, to consciousness.

So, what is "information"? In scientific and non-scientific words ...
 
  • #2
Information is...

In an attempt to initiate the discussion...

Information is any declarative statement that gives you a clue about something...

By the way, I think this topic belongs in another forum, right?
 
  • #3
This discussion will be mostly philosophical at its very beginning. I am curious what people think information is, in their own words.

Besides that, there are also scientific theories and I would be interested in discussing them too.

I suggest creating a list of "human definitions". You have already suggested one: "a statement, an affirmation".

I suggest one too: "a trace, a proof of something that has happened in the past". A direct relation to memory, to time.
 
  • #4
I think that the concept is related to the so-called information entropy. Systems with low information entropy carry more information than systems with high information entropy.
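As a quick illustration of this idea (a minimal Python sketch added here for concreteness, using the standard Shannon formula H = -Σ p·log2(p)): a sharply peaked distribution has low entropy and pins the outcome down almost completely, while a uniform distribution has maximal entropy and tells us nothing in advance.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Low-entropy (sharply peaked) distribution: the outcome is nearly certain.
print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))  # about 0.24 bits

# High-entropy (uniform) distribution: maximal uncertainty over 4 outcomes.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # exactly 2.0 bits
```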
 
  • #5
And thus we enter the physics approach to "information". This definition of "information" is essentially the opposite of the notion of entropy in statistical physics.

A system has a great number of micro-states that lead to the same macro-state. As it tends towards equilibrium, the system moves to the macro-state with the largest possible number of micro-states under the constraints imposed on it. Entropy is proportional to the natural logarithm of the number of micro-states (S = k·ln Ω). Thus, at equilibrium, the entropy is maximum.

Moreover, at equilibrium, every micro-state has the same probability, namely the inverse of the number of micro-states. So if one tries to guess the exact micro-state of a system at equilibrium, the greater the entropy, the greater the number of micro-states, and the smaller the chance of getting it right.

The greater the entropy of a system at equilibrium, the poorer one's ability to guess its micro-state. One usually expresses that by saying: "when the entropy of a system increases, its information decreases".
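To make the counting argument above concrete, here is a small Python sketch (an illustration added here, assuming Boltzmann's relation S = k·ln Ω and equal micro-state probabilities at equilibrium): as the number of micro-states Ω grows, the entropy grows and the chance of guessing the actual micro-state shrinks.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, in J/K

def boltzmann_entropy(num_microstates: int) -> float:
    """Entropy of a macro-state realized by Omega equally probable
    micro-states: S = k_B * ln(Omega)."""
    return K_B * math.log(num_microstates)

def guess_probability(num_microstates: int) -> float:
    """At equilibrium every micro-state is equally likely, so the chance
    of guessing the actual one is 1 / Omega."""
    return 1.0 / num_microstates

for omega in (10, 10**6, 10**23):
    print(f"Omega = {omega:9.1e}   S = {boltzmann_entropy(omega):.3e} J/K   "
          f"P(correct guess) = {guess_probability(omega):.1e}")
```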
 
  • #6
A system has a great number of micro-states that lead to the same macro-state. As it tends towards equilibrium, the system moves to the macro-state with the largest possible number of micro-states under the constraints imposed on it. Entropy is proportional to the natural logarithm of the number of micro-states (S = k·ln Ω). Thus, at equilibrium, the entropy is maximum.

Moreover, at equilibrium, every micro-state has the same probability, namely the inverse of the number of micro-states. So if one tries to guess the exact micro-state of a system at equilibrium, the greater the entropy, the greater the number of micro-states, and the smaller the chance of getting it right.
You're talking here about thermodynamic entropy.

An interesting measure of information is Kolmogorov complexity.
In it, the information content of an object is the minimal length of a computer program that outputs the object. However, the exact Kolmogorov complexity cannot be computed; only upper bounds can.
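To illustrate the "only upper bounds" point (a toy Python sketch added here, not from the original post): the size of any compressed encoding of an object, plus a fixed decompressor, gives an upper bound on its Kolmogorov complexity, while the true minimum program length is uncomputable.

```python
import os
import zlib

def kolmogorov_upper_bound(data: bytes) -> int:
    """Bytes needed for a zlib-compressed copy of `data`. Since a fixed
    decompressor plus this string reproduces the object, this is (up to an
    additive constant) an upper bound on its Kolmogorov complexity; the
    exact complexity cannot be computed."""
    return len(zlib.compress(data, 9))

patterned = b"ab" * 500        # 1000 bytes with an obvious short description
random_ish = os.urandom(1000)  # 1000 bytes with (almost surely) no structure

print(kolmogorov_upper_bound(patterned))   # small: the pattern compresses well
print(kolmogorov_upper_bound(random_ish))  # near 1000: nothing to exploit
```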
 
  • #7
My definition

Any ensemble of data that helps you to predict a physical occurrence.
 
  • #8
Hello,
IMHO, and I can be wrong:

When you relate information to the inverse of entropy, that's a qualitative relation. When I evaluate the entropy of an information system, I refer to the information quality, not the information quantity.

For example, take the "telephone" game played by very young children. You could estimate the information entropy from the number of persons the "phrase" goes through. Each one, going from the listening phase to the speaking phase, contributes to increasing the total entropy. This evaluation is independent of the "phrase" contents, except when the "phrase" is in another "language".

"Ceci esta von informatiom.", four languages and one typo!

You could interpret entropy as energy related to the system's uncertainty (estimated, and always increasing under transformation).
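To make the telephone-game estimate concrete, here is a toy simulation (an illustration added here; the 5% per-character error rate is an arbitrary assumption): each hop corrupts a few characters at random, so the phrase degrades with the number of players regardless of what it says.

```python
import random

ALPHABET = "abcdefghijklmnopqrstuvwxyz "

def relay(phrase: str, error_rate: float = 0.05) -> str:
    """One listener-to-speaker hop: each character has a small chance of
    being misheard and replaced by a random one."""
    return "".join(random.choice(ALPHABET) if random.random() < error_rate else c
                   for c in phrase)

def play_telephone(phrase: str, n_players: int) -> str:
    """Pass the phrase through n_players successive relays."""
    for _ in range(n_players):
        phrase = relay(phrase)
    return phrase

random.seed(0)
original = "this phrase is a piece of information"
for n in (1, 5, 20):
    print(f"after {n:2d} players: {play_telephone(original, n)}")
```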

Maybe your question should be "What is entropy?", but in that case I am playing the telephone game!
 
  • #9
Hello Imagine,
nice to see a post from you again.
I remember from "decision theory" that one property of information (as modeled in that context) is that it has some chance of affecting one's behavior.

There is a kind of behavioral definition. I look out of the window into the garden and see a cat lying in a patch of sunlight and I ask "what is information for that cat?"

The answer is that there is a lot of data present in the garden which, for the cat, are not information. The only data that qualify are those which evoke a response. If I tap on the window, it will move its ear. Or if a bird calls.

Likewise with a consumer making economic decisions. If he cannot decide which thing to buy then he must "get more information". What qualifies as information depends on whether it has some chance of affecting his decision.

Making a prediction is one kind of behavior. So this definition of information appears to include what Tyger just said----info is data that affects predictions. Indeed it must be data which affects any kind of decision (whether or not mediated by a prediction)----or so it seems in the context of "decision theory".

If one insists on limiting the concept of info to physics, or to astronomy, then the physicist takes the place of the cat. The consumer deciding on which cheese to buy is replaced by the astronomer judiciously and prudently selecting which model of the universe he will believe in.
 
  • #10
Hello marcus,
Don't forget the piano. Yes, a piano is falling onto the poor cat.

What I mean is, in your decision process, you are gathering and filtering information. These are decisions. You decide which information is pertinent, or not, for you (the cat). In fact, a "delta" of something is information. The bird, the bug, the piano, the rain, the mouse: as soon as your senses pick up these "deltas". Think about camouflage, which aims at producing no "delta" at all.

Your undecided customer has to get more "delta" (the status quo is not a solution) to lower his decision uncertainty up to a point where he is subjectively comfortable with that level of entropy. Business as usual.

Let's talk about Mexican jumping beans! When you observe a Mexican jumping bean, a "delta" appears, but you interpret it against your background, your previously recorded "deltas". In fact, Mexican beans jump because they like Mexican music.
 
  • #11
Also, in possibility theory, nonspecificity can serve as a measure of information.

Just for your information, there are three theories that interpret information in creationist terms:
Gitt information
Spetner information
Dembski information

I'm not defending these three theories (really, I don't know what they claim).
 
  • #12
I think that the simplest definition of INFORMATION may be:

An awareness of an association between at least two different physical and/or non-physical processes and/or states.
 
  • #13
Information doesn't exist!

I think there is a problem with 'information' as a concept -- basically, it doesn't exist, in the same sense that the laws of physics don't 'exist': both are claims we thinking beings make in trying to make sense of our world.
Here's why many definitions of information are problematic, IMHO.

1. Defining information as an occurrence in the world that affects a thinking being (like a cat!) in some way begs the question: who judges what qualifies as 'affecting' the cat? That little bug hidden behind the plant in the garden is generating events in the cat's realm and thus at some level affecting the cat, as is my typing here today (assuming the cat exists - and that I'm not some bot just spewing information) -- ahem, sorry. So we're all affecting each other to some degree, but someone has to claim 'hey, that matters to me, or to my cat'. So we don't have much to stand on here but claims about thresholds of when things 'affect' other things. (This does not make the definition meaningless, just subjective to individual preferences for the word 'affects'.)

2. Defining information as any kind of length or amount measured in a physical occurrence (i.e. Shannon's information theory and all its variants) is simply building a gauge of some sort. If such a definition of information is argued, then it cannot really be 'falsified', but it is also useless for understanding much beyond comparing how different phenomena stack up against each other on that gauge of so-called 'information'.

3. Stating that information is something learned or noticed, or a thought of some kind, like a mental reaction that comes from some phenomenon (note: not the phenomenon itself), seems very popular in common understandings of the term, but such definitions all run up against the issue of equating 'information' with brain activity (all brain activity or any subset of it). Again, I don't argue that such a definition is false, but it doesn't add much, since brain activity is already a well-termed thing in science --- notably unlike 'energy' a couple of centuries ago.

There exists no "information" as an important, exclusive 'thing' in the world, as opposed to what Wiener says: 'there is matter, energy, and information.'

Then again, 'important' and 'exclusive' are claims as well, so this is just my opinion...
 
  • #14
Speaking physically, I would say information is order versus chaos, or non-order. This would include QM entanglement: the information one particle has that implies information about an entangled particle.
I just came from the Philosophy section, where we are discussing information in terms of materialism. Is information real, material, or unreal, subjective?
An EM wave can contain static (nothing) or contain order (information), just like photons, electrons, etc., and it can transport that information from one location to another as well as from one medium to another. It is by the ordering of the characteristics of the wave or particle that information is carried. Does this make it real, physical? What about the image or thought that the information conveys: is it real, does it exist? I don't know, and I think that is best left to the philosophy forum. But then, how much closer can we get to philosophy than Theory Development?
 
  • #15
A state of energy held in an isolated frame

What is information?
To me, information is a state of energy held in an isolated frame. Information is essentially a state of imbalance.
2. Isolation is key. Without isolation, no imbalance. One would become zero again without isolation.
Isolation means a boundary. Where does the boundary come from?
3. Next question: information about what ... coming from where?
Clearly from the event(s) that was/were 'before': the cause or causing factors, the previous parameters which were excited and caused the unbalanced state of energy.
4. Previous? So there must be history.
But: history that STILL HOLDS to found the present! [cf. how the historical combinations of atoms (Past) make a specific neurotransmitter molecule (Now)]. Thus previous information is still 'in-corporated' inside the Now-event.
5. Now the prime key, however, is: what is energy?
Here we see the heuristic paradox. A correct interpretation of Einstein leads us to conclude that energy is a specific expression of space-time. So 'sub-energy held in an isolated frame' means: spacetime incorporated in spacetime.
So information is a sub-set of spacetime in relation to a specific level of otherwise (hierarchically) expressed spacetime.

Now we come to my own theory about an unbreakable membrane that has a self-fertilizing infolding. (I will post that soon on the Theory development forum)

(see the mechanism: http://www.superstringtheory.com/forum/dualboard/messages12/672.html )

With the pelastrating membrane, such essential historical integrity is reached in an engineering approach.
Besides the internal or local interactions (inter-actors), the membranes (in essence a SINGLE membrane) are conductors of LOCAL + NON-LOCAL oscillations.

LOCAL INTERACTION: In the animated gif above: http://www.superstringtheory.com/forum/dualboard/messages13/319.html you can see how knowledge (information) knots are created. The purple event couples three layers of spacetime. In our consciousness discussion you may say that the lowest red layer represents the Jungian unconscious, the mid-layer the individual unconscious, and the blue top layer of spacetime: individual consciousness.

NON-LOCAL INTERACTIONS: Each of these three layers is just an extension, a 'different' layer network, and they are finally united again in ONE deeper membrane layer. I repeat: in essence it is a SINGLE membrane. I call the transfer of information over non-local sub-membranes 'membrane causality'.

So it is a mechanism of restructuring spacetime ... no tricks, no magic superposition ... show of hands.
That there is a mechanism doesn't mean that Deistic ideas are excluded. People who want to can say: 'That's how HE did it ... maybe'.

Historical integrity leads to memory and brings storage of experience.
The next links may give you some extra mental pictures or reflections.

About envelopes:
http://www.superstringtheory.com/forum/dualboard/messages12/602.html
http://www.superstringtheory.com/forum/dualboard/messages13/787.html
http://www.superstringtheory.com/forum/metaboard/messages18/51.html

That's for now.

Dirk
 
  • #16
Information is relative

I think information at its fundamental level is something like a coordinate in a system. It tells us something only relative to the system. And without a system, or a context, information doesn't exist.
 
  • #17
My definition of information.

Information is all that is not "nothing".
 
  • #18
Michael's definition of information.

Originally posted by Michael F. Dmitriyev
Information is all that is not "nothing".

Hi Michael,

Your post doesn't contain much information. ;-)
Can you explain more?

Thanks

dirk
 
  • #19


Originally posted by pelastration
Hi Michael,

Your post doesn't contain much information. ;-)
Can you explain more?

Thanks

dirk
Hi pelastration,
I think that my definition of information contains a maximum of information. It corresponds to the definition of the concept of an "object" and covers all objects.
I'll remind you that an "object" is something we can single out in the world around us and give some characteristic, or at least a name. It is information accessible for further processing. The existence of an object is already information. The existence of a set of objects that we do not suspect at all does not deny their existence, but it does not contain any information for us, because for us it is "nothing".
 
  • #20


Originally posted by Michael F. Dmitriyev
Hi pelastration,
I think that my definition of information contains a maximum of information. It corresponds to the definition of the concept of an "object" and covers all objects.
I'll remind you that an "object" is something we can single out in the world around us and give some characteristic, or at least a name. It is information accessible for further processing. The existence of an object is already information. The existence of a set of objects that we do not suspect at all does not deny their existence, but it does not contain any information for us, because for us it is "nothing".

An "object" contains at least two type of information:
(1) A proof of 'Existence' like you call it.
(2) Historical information about how the object came to existence. That historical information may be hidden to our observation systems but it is essentially stored inside the object.

Dirk
 
  • #21


Originally posted by pelastration
An "object" contains at least two type of information:
(1) A proof of 'Existence' like you call it.
(2) Historical information about how the object came to existence. That historical information may be hidden to our observation systems but it is essentially stored inside the object.

Dirk
Maybe, as a part of the whole. Though the main task of any object is self-preservation.
I think each object contains much more information than your list contains. Its total volume should provide the program of existence of the object. This program is not rigid, but multiple. The choice of a variant depends on signals acting as feedback. The more complex the object, the more complex the program.
 

1. What is information?

Information is a broad term that refers to knowledge, data, or facts that are communicated or received through various forms such as language, symbols, or signals. It can also be described as the meaning or significance that can be extracted from data.

2. How is information different from data?

Data is raw, unprocessed information that has not yet been organized or interpreted. On the other hand, information is data that has been analyzed and structured in a way that makes it meaningful and useful.

3. What is the role of information in science?

Information is crucial in science as it helps scientists to understand and explain the natural world. It is used to gather evidence, make predictions, and communicate findings to others. In scientific research, information is also used to build upon existing knowledge and develop new theories.

4. Can information be measured?

Yes, information can be measured using different units such as bits, bytes, or words. These units represent the amount of data or knowledge contained in a particular piece of information. However, the quality and value of information cannot be measured quantitatively.
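As a rough illustration (added here; the example message is arbitrary), one can compare the raw storage size of a short message in bits with a Shannon-style estimate based on its character frequencies.

```python
import math
from collections import Counter

message = "information can be measured"

# Raw storage cost: 8 bits per ASCII character.
raw_bits = 8 * len(message)

# Shannon-style estimate: average bits per character, computed from the
# observed character frequencies, times the message length.
counts = Counter(message)
n = len(message)
bits_per_char = -sum((c / n) * math.log2(c / n) for c in counts.values())

print(f"{raw_bits} bits stored, roughly {bits_per_char * n:.0f} bits of "
      f"Shannon information (frequency-based estimate)")
```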

5. How has technology impacted the way we access and share information?

Technology has greatly influenced the way we access and share information. With the internet and digital devices, information can be accessed and shared instantly and globally. This has increased the speed and efficiency of information dissemination, but also raises concerns about the accuracy and reliability of information in the digital age.
