If you're interested in how entropy works in information theory, these videos are quite good:
https://www.youtube.com/playlist?list=PLbg3ZX2pWlgKDVFNwn9B63UhYJVIerzHL
In physics, entropy is a measure of the amount of information (in bits) that you don't know about a system.
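To make "information you don't know" concrete (a toy example of my own, not from the videos): the Shannon entropy of a probability distribution counts how many bits you would still need, on average, to pin down the outcome. A quick Python sketch:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: the average number of yes/no
    questions you'd still need to pin down the outcome."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit unknown
print(shannon_entropy([0.9, 0.1]))  # biased coin: ~0.47 bits unknown
print(shannon_entropy([1.0]))       # certain outcome: 0 bits unknown
```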
In thermodynamics, all you know about a system are its macroscopic properties (total energy, volume, particle number, etc.). The entropy is then the amount of information (in bits) that remains unknown about the system, down to the state of the last particle. Boltzmann's constant is included so that we can keep our old units of temperature.
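To spell out the unit conversion: if the macrostate is compatible with $\Omega$ equally likely microstates, the missing information is $\log_2 \Omega$ bits, and the thermodynamic entropy is

$$ S = k_B \ln \Omega = (k_B \ln 2)\,\log_2 \Omega, $$

so one bit of missing information corresponds to $k_B \ln 2 \approx 9.57 \times 10^{-24}\ \mathrm{J/K}$. Boltzmann's constant is just the exchange rate between bits and joules per kelvin.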
In quantum physics, there are entropic uncertainty relations (as mentioned by Physics Monkey). What these relations tell you is that it is not possible to prepare a particle with a definite position and momentum (that is, so that no information about its position and momentum remains to be known).
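One standard example is the Maassen–Uffink relation for observables with discrete eigenbases $\{|a_i\rangle\}$ and $\{|b_j\rangle\}$, and for position and momentum the Białynicki-Birula–Mycielski relation:

$$ H(A) + H(B) \;\ge\; \log_2 \frac{1}{c}, \qquad c = \max_{i,j}\, |\langle a_i | b_j \rangle|^2, $$

$$ h(x) + h(p) \;\ge\; \ln(e\pi\hbar), $$

where the second relation uses differential entropies (in nats) and is saturated by Gaussian wave packets. Whenever $c < 1$, the two entropies cannot both be zero, which is precisely the "no definite position and momentum at once" statement above.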
What makes entropic uncertainty relations particularly nice is that you can use them to derive other information-based limits on quantum measurement. For example, information exclusion relations can be derived from entropic uncertainty relations.
What these exclusion relations tell us is that the more a measurement tells you about the position of a particle, the less it can tell you about its momentum. No matter how clever your measurement, if you learn everything about the position of a particle, you learn nothing about its momentum, and vice versa.
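For concreteness, here is the form of Hall's 1995 exclusion relation as I remember it (worth checking against the original paper for the precise statement): for observables $A$, $B$ with eigenbases $\{|a_i\rangle\}$, $\{|b_j\rangle\}$ in dimension $N$, the mutual informations a measurement can deliver obey

$$ I(A) + I(B) \;\le\; 2 \log_2 \!\left( N \max_{i,j}\, |\langle a_i | b_j \rangle| \right). $$

For mutually unbiased bases ($|\langle a_i | b_j \rangle| = 1/\sqrt{N}$) this reduces to $I(A) + I(B) \le \log_2 N$. Since $\log_2 N$ bits is the most any measurement can tell you about either observable alone, extracting all of it for one observable leaves zero for the other.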