Padraic: there is no simple explanation of information that is very useful. At least I have never found one. Maybe someone will surprise us!
One thing we can say: in general, information, as the answer to some question, should reduce your uncertainty about which of the possible answers is correct.
A message source should have HIGH uncertainty, otherwise you'd know what a message contains before it is sent...we say the source has high entropy. Yet once a message is received, your uncertainty should be LOW...so you can understand it!
So a stream of bits like 1,1,1,1,1,1,1,1,1,1,1,1,1... doesn't tell you much...you already know what the next bit will be...
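To make that concrete, here is a minimal Python sketch (my own illustration, not from any particular text) computing the Shannon entropy of a bit stream. The all-ones stream carries zero bits per symbol, while a balanced stream carries one:

```python
from collections import Counter
from math import log2

def shannon_entropy(bits):
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    counts = Counter(bits)
    n = len(bits)
    return -sum((c / n) * log2(c / n) for c in counts.values())

print(shannon_entropy("1111111111"))  # 0.0 -- the next bit is no surprise
print(shannon_entropy("1010110100"))  # 1.0 -- each bit is maximally uncertain
```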
The above ideas pertain most directly to communications 'information'. From an engineering [applications] perspective, that much is pretty well understood.
Let's say I want to gather some "information" about local particles: an accelerated detector will register different particle counts than an inertial detector! This is called the Unruh effect. So even how many particles I observe locally, and at what energies (temperature), is not so simple.
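For a sense of scale, here's a quick sketch using the standard Unruh temperature formula T = ħa / (2π c k_B); it shows how tiny the effect is at everyday accelerations:

```python
from math import pi

hbar = 1.054571817e-34  # reduced Planck constant, J*s
c    = 2.99792458e8     # speed of light, m/s
kB   = 1.380649e-23     # Boltzmann constant, J/K

def unruh_temperature(a):
    """Unruh temperature T = hbar*a / (2*pi*c*kB) for proper acceleration a (m/s^2)."""
    return hbar * a / (2 * pi * c * kB)

print(unruh_temperature(9.81))  # ~4e-20 K at Earth-surface acceleration
print(unruh_temperature(1e20))  # ~0.4 K -- you need enormous acceleration to notice
```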
Let's switch and talk about manipulating 'information' (bits) in a computer:
only erasure costs energy...computation itself can in principle be done reversibly, for free, but erasing a bit must dissipate energy and increase entropy...so as entropy increases, information decreases. This is called Landauer's Principle.
http://en.wikipedia.org/wiki/Landauer_Principle
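To put a number on it: Landauer's bound says erasing one bit dissipates at least kT ln 2 of energy. A quick back-of-the-envelope in Python, assuming room temperature:

```python
from math import log

kB = 1.380649e-23  # Boltzmann constant, J/K
T  = 300           # room temperature, K (assumed)

e_min = kB * T * log(2)  # Landauer limit: minimum energy to erase one bit
print(e_min)             # ~2.87e-21 J per erased bit
```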
So at the end of the universe, when entropy is at a maximum...information will be at a minimum.
Reversible processes need not dissipate energy or destroy information; irreversible ones, like erasure, must do both.
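A toy illustration of that difference (my own sketch): an AND gate throws information away, so it cannot be run backwards, while a CNOT gate is a bijection on its inputs, so it can:

```python
def and_gate(a, b):
    """Irreversible: (0,0), (0,1), (1,0) all map to 0 -- the inputs can't be recovered."""
    return a & b

def cnot(a, b):
    """Reversible: flips b when a is 1. Applying it twice restores the inputs."""
    return a, b ^ a

# Four distinct inputs collapse onto two outputs under AND...
print(sorted({and_gate(a, b) for a in (0, 1) for b in (0, 1)}))  # [0, 1]

# ...but CNOT is its own inverse: no information is lost.
for a in (0, 1):
    for b in (0, 1):
        assert cnot(*cnot(a, b)) == (a, b)
```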
As another perspective, someone posted this elsewhere and I liked it: different observers see different things based on the different information available to them, but what they see should not be inconsistent. This also applies in cosmology.
Finally, you might want to look at 'entanglement', an idea from quantum mechanics.