Hey, a few things about information.
Information represents a reduction in a space of possibilities. For instance, say I rolled an 8-sided die and tell you it's less than 3. I gave you some information: before, all sides were equally likely in your mind (probability 1/8 each), but now 1 and 2 each have probability 1/2 and the rest have probability 0. You can quantify this information via probability. Since the probability the outcome was less than 3 was 1/4, the information gained is log2(4/1) = 2 bits of information. If I instead told you it was even, that would be 1 bit, and if I told you the exact number, that would be 3 bits: 2^3 = 8, the number of sides of the die. Likewise, if there are 2^n equally likely possibilities for what can be in a file, it takes n bits (log2(2^n) = n) to represent a version of that file.
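The die examples above can be checked with a couple of lines of Python. A minimal sketch: the information (in bits) gained from learning that an event of probability p happened is log2(1/p).

```python
from math import log2

def info_bits(p):
    """Bits of information gained from learning an event of probability p occurred."""
    return log2(1 / p)

print(info_bits(2/8))  # "less than 3": 2 of 8 sides -> 2.0 bits
print(info_bits(4/8))  # "it's even":   4 of 8 sides -> 1.0 bits
print(info_bits(1/8))  # exact number:  1 of 8 sides -> 3.0 bits
```

The rarer the event, the more bits you gain when you learn it happened.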
That's a sketch. For more precise calculations, Google "Shannon information entropy". Information is about changes in the entropy of a probability distribution.
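To make "changes in entropy" concrete, here's a small sketch computing the Shannon entropy of the die distribution before and after the "less than 3" hint; the drop in entropy matches the 2 bits of information from earlier.

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return sum(p * log2(1 / p) for p in probs if p > 0)

before = [1/8] * 8             # all 8 sides equally likely -> 3 bits of uncertainty
after  = [1/2, 1/2] + [0] * 6  # after hearing "less than 3" -> 1 bit left

print(entropy(before) - entropy(after))  # 2.0 bits of information gained
```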
That's classical information. Classical computers (what we use) destroy classical information all the time, and as I understand it, the lost information gets released as heat (look up Landauer's principle). Quantum information is a different beast. Recent work shows it can be sent non-locally, without a physical carrier transmitting it:
http://phys.org/news/2015-03-quantum-scheme-states-transmitting-physical.html
My understanding is that quantum information exists in a way where, if you measure it (thus turning it into classical information), its weird magic properties are lost. You can't send classical information without some mechanism. Also, operations on quantum information are reversible; no information is lost. After you learn about logic gates and do a little matrix math, look up the Toffoli gate (a reversible gate that also shows up in quantum computing), and see how you can always get the input back from the output, in effect reversing every computation. That's impossible with gates like OR/AND.
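Here's the reversibility point as a sketch in plain Python (the classical version of the gate, not a quantum simulation). The Toffoli gate flips its third bit only when both control bits are 1, and applying it twice undoes it, whereas AND collapses different inputs to the same output:

```python
def toffoli(a, b, c):
    """Toffoli (controlled-controlled-NOT) gate: flip c when a and b are both 1."""
    return a, b, c ^ (a & b)

# Reversible: applying the gate twice recovers the original input.
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            assert toffoli(*toffoli(a, b, c)) == (a, b, c)

# With the third input fixed at 0, Toffoli even computes AND reversibly:
print(toffoli(1, 1, 0))  # (1, 1, 1) -- third output is a AND b

# By contrast, plain AND destroys information: three different inputs,
# one output, so there's no way to run it backwards.
print((0 & 0, 0 & 1, 1 & 0))
```

Because the controls are passed through unchanged, no input pattern is ever lost, which is exactly what "no information is destroyed" means here.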