## Qualitatively define entropy

Hi,
I want to know a concrete qualitative definition of entropy.
If we define it to be a measure of randomness (disorder) in a system, then intuitively a system that is less probable to be found in a given microstate should have greater entropy. But according to statistical mechanics,
$$S = -k\sum_i p_i \ln p_i.$$
Doesn't that mean a system with lesser probability will have *less* entropy?
And how is it possible to rewrite the above equation as
$$S = k\ln N?$$
First let me explain why entropy is related to probability. If you have two systems, each of them will be in some macrostate, $X_1$ and $X_2$ respectively. (For a single isolated system entropy makes no sense in this argument.) These two macrostates have some number of microstate realizations, $N_1(X_1)$ and $N_2(X_2)$, so the compound state $(X_1, X_2)$ has $N_1 N_2$ realizations, which is also proportional to the probability $p(X_1, X_2)$ of the compound state. The most likely state is therefore the one with $N_1 N_2 \to \max$.

To avoid the multiplication we introduce $S = \ln N$, and our condition becomes $S_1 + S_2 \to \max$, which is no more than saying we want the most likely compound state.

In our model each system has (energy) microstates $k_1, k_2, \dots, k_n$. One macrostate $X$ specifies how many particles are in each of these states: $X =$ ($a_1$ particles in state $k_1$; $a_2$ particles in state $k_2$; $\dots$). The number of ways to distribute $a_1 + a_2 + \dots + a_n$ particles in such a way is the multinomial coefficient
$$N = \frac{\left(\sum_i a_i\right)!}{a_1!\,a_2!\cdots a_n!}.$$
If you now define $p_i = a_i/\sum_j a_j$ and use the approximation $a! \sim a^a$ (the leading term of Stirling's formula), you get the equation for entropy,
$$S \propto -\sum_i p_i \ln p_i.$$
You see, it's all connected and derives from pure probability theory.
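To address the last part of the question directly: $S = k\ln N$ is the special case of the Gibbs formula in which all $N$ microstates of the macrostate are equally likely, i.e. $p_i = 1/N$ for every $i$:
$$S = -k\sum_{i=1}^{N} \frac{1}{N}\ln\frac{1}{N} = -k\,N\cdot\frac{1}{N}\,(-\ln N) = k\ln N.$$
Note that the result carries a plus sign: the minus in the Gibbs sum is exactly what cancels the negative $\ln(1/N)$, so the Boltzmann form is $S = k\ln N$, not $-k\ln N$.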
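If you want to see the Stirling step numerically, here is a minimal sketch in Python (the occupation numbers `a` are arbitrary illustrative values, not from any particular physical system; it assumes NumPy and SciPy are available):

```python
import numpy as np
from scipy.special import gammaln  # gammaln(n + 1) == ln(n!)

# Hypothetical occupation numbers a_i for one macrostate (illustrative values)
a = np.array([300, 500, 200])
N = a.sum()

# Exact log of the multinomial coefficient  ln[ N! / (a_1! a_2! ... a_n!) ]
log_W = gammaln(N + 1) - gammaln(a + 1).sum()

# Stirling-approximated form: with p_i = a_i / N,  ln W ≈ -N Σ p_i ln p_i
p = a / N
approx = -N * (p * np.log(p)).sum()

print(f"ln W (exact)        = {log_W:.2f}")
print(f"-N * sum p_i ln p_i = {approx:.2f}")
```

The two numbers agree to within a fraction of a percent already for $N = 1000$ particles; the residual is the sub-leading part of Stirling's formula that the $a! \sim a^a$ approximation drops, and it becomes negligible for thermodynamically large $N$.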