This is a classical model, not a quantum model. In a quantum model, the particle is described by a state vector in a Hilbert space; the x, y, and z components of position are parameters that pick out which particular state vector describes the particle. But the Hilbert space itself is not the 3-dimensional space of the x, y, z position vector.
This is a somewhat different sense of the word "information" from the one you're using when you ask about the relationship between information and entropy. When Susskind talks about information not being destroyed, he is referring to quantum unitary evolution: the claim is that unitary evolution is never violated. But that just means that, as far as the quantum state of an entire system is concerned, its evolution is deterministic: if you know the state at one instant, you know it for all time. And if you know the system's exact state for all time, its entropy is always zero, by definition.
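To make "zero by definition" concrete (the question doesn't fix a definition of entropy, so I'll use the standard von Neumann one): for a system in an exactly known, i.e. pure, state $|\psi\rangle$, the density matrix is a projector,

$$
\rho = |\psi\rangle\langle\psi| \;\Rightarrow\; \rho^2 = \rho,
$$

so the eigenvalues of $\rho$ are a single 1 and the rest 0, and

$$
S(\rho) = -\operatorname{Tr}(\rho \ln \rho) = -\Big(1 \ln 1 + \textstyle\sum 0 \ln 0\Big) = 0,
$$

using the usual convention $0 \ln 0 = 0$.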
However, if the entire system contains multiple subsystems (such as multiple particles), then it might be possible to assign a nonzero entropy to the individual subsystems, because the subsystems might not have definite states due to entanglement. This is the sort of case the professor you mentioned was talking about.

For example, suppose we have a two-electron system in the singlet spin state (i.e., total spin zero); for simplicity we'll ignore the positions of the electrons (if it matters, think of them as occupying some bound state, like an atomic orbital, with no transitions possible). The total entropy of this system is zero, because we know its exact state. But each individual electron has a positive entropy, because it doesn't have a definite state of its own; its spin could turn out to point in any direction when measured. However, there is also a negative entropy contribution due to the entanglement: the two spins must be opposite, so once we have measured one electron, we know the spin directions of both. So the total entropy is still zero for the system as a whole.
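Here is a minimal numerical sketch of that bookkeeping, assuming one common way to formalize the "negative entropy" remark: the conditional entropy $S(A|B) = S(AB) - S(B)$, which is negative precisely when the parts are entangled like this. Only numpy is used; the basis ordering and the helper name `entropy` are choices made here for illustration:

```python
import numpy as np

# Two spin-1/2 electrons in the singlet state, basis ordered |uu>, |ud>, |du>, |dd>.
psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2.0)
rho = np.outer(psi, psi.conj())  # density matrix of the whole two-electron system

def entropy(r):
    """Von Neumann entropy S = -Tr(r ln r), computed from the eigenvalues of r."""
    w = np.linalg.eigvalsh(r)
    w = w[w > 1e-12]  # drop zero eigenvalues (convention: 0 ln 0 = 0)
    return float(-np.sum(w * np.log(w)))

# Reduced state of electron A: reshape rho to indices (a, b, a', b') and trace out b.
rho_A = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)

S_AB = entropy(rho)    # 0: the whole system is in a definite (pure) state
S_A = entropy(rho_A)   # ln 2: one electron by itself has no definite spin state
S_cond = S_AB - S_A    # -ln 2: conditional entropy, negative because of entanglement

print(S_AB, S_A, S_cond)  # ~0.0, ~0.6931, ~-0.6931
```

The printed numbers combine as $S(AB) = S(A) + S(B|A) = \ln 2 - \ln 2 = 0$ (and by symmetry the same with A and B swapped), which is exactly the "positive entropy of each electron plus negative entropy from entanglement equals zero" accounting described above.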