As has previously been mentioned, "beginning", "end" and "disorder" as English words really convey no understanding of what's going on. Physics is a theoretical framework expressed in mathematics. However, since the vast majority of people do not have sufficient mathematical proficiency to follow the theory, it becomes necessary, when conversing with the lay public, to try to convey the gist of what's going on in English words, which unfortunately creates a whole lot of trouble. For example, entropy IS a measure of the number of microstates available to a given system: it is expressed as S = k*ln(omega), where k is a constant (the Boltzmann constant) and omega is the number of microstates available to the system. That's what entropy IS. Then, at some point, someone who didn't understand quantum mechanics or statistical mechanics demanded that the concept be expressed in a much simpler way, and that's where the confusion begins. I think the whole "disorder" idea came from a line of reasoning that went something like this:
-a microstate is a possible configuration of a given system; a system with low entropy has few microstates and thus can only be in a small number of configurations (for example, about the simplest system you can imagine is a single confined electron, which can only be in two states: spin up or spin down)
-therefore, a system with low entropy can kind of be thought of as confined/simple/ordered (once again, none of these words truly encapsulates the idea)
-and thus, if a low-entropy system is 'ordered', then a high-entropy system must be 'disordered', and voilà, you have the oft-misunderstood idea that entropy is a measure of disorder. However, consider a 'dirty' room with clothes on the floor, the bed unmade, etc. One might, in English, say it is disordered, and one might then say that if the same room were 'clean', with all the clothes folded and put in drawers and the bed made, it would be more 'ordered'. However, the entropy of the room is the same in both cases: it has neither gained nor lost possible states, you've just switched it between them. So you see how an English explanation gets you into trouble.
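To make the formula concrete, here's a minimal Python sketch of S = k*ln(omega) applied to the spin example from the list above. The function name and the toy spin counts are just mine for illustration, not anything standard; the only physical input is the Boltzmann constant.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def entropy(omega: int) -> float:
    """Boltzmann entropy S = k_B * ln(omega) for a system with
    omega equally likely microstates."""
    return k_B * math.log(omega)

# Single confined electron: spin up or spin down -> 2 microstates.
print(entropy(2))  # ~9.57e-24 J/K

# N independent two-state spins: omega = 2**N microstates,
# so the entropy grows linearly with N.
for n in (1, 10, 100):
    print(n, entropy(2**n))
```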
So to sum up: entropy is a measure of the number of states available to a system. That's what it is.
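And to see why tidying the room changes nothing, here's the same point as a toy model (entirely made up for illustration): a "room" of four items, each either put away or on the floor. Any particular arrangement, tidy or messy, is just one microstate out of the same set, so omega, and therefore k*ln(omega), is identical either way.

```python
import math
from itertools import product

k_B = 1.380649e-23  # Boltzmann constant in J/K

# Toy 'room': 4 items, each either put away (0) or on the floor (1).
items = ["shirt", "socks", "book", "blanket"]
microstates = list(product((0, 1), repeat=len(items)))  # all 2**4 = 16 configurations

clean_room = (0, 0, 0, 0)   # everything put away
dirty_room = (1, 1, 0, 1)   # clothes on the floor, etc.

# The count of available states doesn't depend on which state
# the room happens to be in right now.
omega = len(microstates)
assert clean_room in microstates and dirty_room in microstates

print(omega)                   # 16 either way
print(k_B * math.log(omega))   # same entropy whether the room is 'clean' or 'dirty'
```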