LTP said:
The entropy of a system is
S = k*ln W
If we obtain some information about the microstates, e.g. we learn the positions and velocities of some of the molecules, then W is decreased by an amount w, so
S1 = k*ln(W - w)
That is a decrease in entropy, i.e. S1 < S.
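To put numbers on the formulas above, here is a minimal sketch computing the entropy decrease S - S1 = k*ln(W/(W - w)); the state counts are toy values (real W is astronomically larger), chosen only to illustrate the scale:

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

def entropy_drop(W, w):
    """Entropy decrease when information rules out w of the W microstates:
    S - S1 = k*ln(W) - k*ln(W - w) = k*ln(W / (W - w))."""
    return k * math.log(W / (W - w))

# Ruling out 1 of a million states barely changes anything:
print(entropy_drop(1e6, 1.0))   # ~1.4e-29 J/K
# Ruling out half the states costs exactly one bit, k*ln(2):
print(entropy_drop(1e6, 5e5))   # ~9.6e-24 J/K
```

Note that the drop depends only on the fraction w/W of states excluded, not on w itself, which is why enormous amounts of information are needed before the entropy changes appreciably.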
Do physicists still "believe" this?
Yes, at least for the von Neumann entropy. The point is that to "know" this microstate information means that you have limited the number of states the system can be in: it is now only those microstates that are compatible with your information. The only way to know that information is by forcing the system, somehow, into that subset of states.
In practice this makes no difference, because for a statistically significant number of particles (the regime in which doing thermodynamics starts to make sense), the amount of knowledge you would need to gather before it influences the numerical value of the entropy is so huge that gathering it is not feasible.
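A rough back-of-the-envelope estimate makes this concrete. The numbers below are illustrative assumptions, not exact values: the total entropy of a mole of gas is taken as of order 10 nats per particle (the right order of magnitude for an ideal gas), and each measured molecule is assumed to remove ~50 nats:

```python
N_A = 6.02214076e23  # Avogadro's number

# Total entropy of one mole of gas in units of k (nats); ~10 nats per
# particle is the right order of magnitude for an ideal gas.
S_total_nats = 10 * N_A

# Illustrative assumption: pinning down one molecule's position and
# velocity to some finite resolution removes ~50 nats. The exact figure
# depends on the resolution, but any reasonable value is of this order.
nats_per_molecule = 50.0

def fraction_removed(n_measured):
    """Fraction of the total entropy removed by measuring n molecules."""
    return n_measured * nats_per_molecule / S_total_nats

# Even a trillion perfect measurements touch only ~1e-11 of the entropy:
print(fraction_removed(1e12))
```

So the knowledge only starts to matter when the number of measured molecules becomes comparable to the total particle number, which is what makes it infeasible in practice.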
The reason why this lowers the entropy is that you can USE that information to drive the system to a "real" lower-entropy state.
Imagine a gas in a box that is a mixture of two components A and B. You know that the entropy of this mixture is higher than if component A were in the left half of the box and component B in the right half (possibly with a wall in between). The difference is the entropy of mixing.
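For reference, the entropy of mixing mentioned above has a standard closed form for ideal gases; a minimal sketch, using mole amounts and the gas constant R:

```python
import math

R = 8.314462618  # gas constant, J/(mol*K)

def mixing_entropy(n_A, n_B):
    """Entropy of mixing two ideal gases (J/K):
    dS_mix = -R * (n_A*ln(x_A) + n_B*ln(x_B)),
    where x_A, x_B are the mole fractions."""
    n = n_A + n_B
    x_A, x_B = n_A / n, n_B / n
    return -R * (n_A * math.log(x_A) + n_B * math.log(x_B))

# One mole of each component: dS_mix = 2*R*ln(2) ~ 11.5 J/K
print(mixing_entropy(1.0, 1.0))
```

This is the amount of entropy the demon below would have to "win back" to un-mix the gas.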
But imagine that somehow we know the microstates of motion of all the A and B particles. Imagine that the box still has a wall, but this time with a tiny shutter. We could use all that information to play "Maxwell's demon": we open the shutter each time we KNOW that an A molecule is heading left, or a B molecule is heading right.
So we could return to the unmixed state without (with an ideal demon) spending any "entropy" elsewhere.
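The demon scheme above can be sketched as a toy simulation. Everything here is a deliberately crude model (two chambers instead of continuous positions, a coin flip for which way a particle approaches the shutter), but it shows the essential point: perfect knowledge of type and direction is enough to sort the mixture one favorable crossing at a time:

```python
import random

random.seed(0)

# Each particle is (type, side): side 0 = left chamber, 1 = right chamber.
# Start fully mixed: 50 A's and 50 B's scattered over both sides.
particles = [('A', random.randint(0, 1)) for _ in range(50)] + \
            [('B', random.randint(0, 1)) for _ in range(50)]

def demon_step(particles):
    """One time step: each particle randomly approaches the shutter from
    its side; the demon, knowing type and direction, opens the shutter
    only when that moves an A leftward or a B rightward."""
    new = []
    for typ, side in particles:
        heading_left = random.random() < 0.5
        if typ == 'A' and side == 1 and heading_left:
            side = 0   # let the A through to the left
        elif typ == 'B' and side == 0 and not heading_left:
            side = 1   # let the B through to the right
        # All other crossings are blocked; the shutter stays shut.
        new.append((typ, side))
    return new

for _ in range(100):
    particles = demon_step(particles)

left_A = sum(1 for typ, side in particles if typ == 'A' and side == 0)
right_B = sum(1 for typ, side in particles if typ == 'B' and side == 1)
print(left_A, right_B)  # after enough steps: 50 50, the gas is un-mixed
```

Of course, the real physics is in what the simulation leaves out: acquiring and storing all that microstate information is itself what carries the entropy cost (Landauer's argument), which is why the demon does not violate the second law.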