Entropy, without a doubt. The more I learn about it, the less I understand it. Anybody who thinks that learning the thermodynamic definition, the statistical-mechanical description, and the information-theoretic description, and being able to solve any problem involving entropy, means they then understand entropy is, I think, fooling themselves.
There's something really deep going on with entropy. No physical law distinguishes "before" from "after" in time except the second law of thermodynamics, the law in which entropy appears. Yet we all understand "before" and "after": our consciousness, our sense of time, "flows forward". This is very unscientific, but I get much the same weird feeling as when studying relativity, where our perception of the distinction between space and time turns out to be subjective, not "real" in some sense.

In quantum mechanics, what you know about a system is, in some sense, what you choose to know: the more you know about one aspect of a system, the less you can know about another. Regarded as an information-theoretic concept, entropy is a measure of what you don't know, so the more you know about a system, the less entropy it has. Only in classical physics can you completely "know" a system; there the concept of entropy evaporates, and there is no before or after. But, then, the real world is not classical.
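To make that information-theoretic reading concrete, here is a minimal sketch in Python (the function name and toy numbers are my own illustration, not from any particular text) of the Shannon entropy H = -Σ pᵢ log₂ pᵢ for two states of knowledge about the same system: total ignorance versus complete knowledge.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = sum(-p * log2(p)), with 0*log(0) taken as 0."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A toy system with 8 possible microstates.
ignorance = [1 / 8] * 8          # know nothing: every microstate equally likely
certainty = [1.0] + [0.0] * 7    # know everything: one microstate is certain

print(shannon_entropy(ignorance))  # 3.0 bits -- maximal entropy for 8 states
print(shannon_entropy(certainty))  # 0.0 bits -- full knowledge, entropy evaporates
```

The point of the toy example is just that learning something about the system sharpens the probability distribution, and the entropy collapses with it.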
I can calculate the hell out of most problems and get them right. I know where many of the elephant's legs are, but I still can't see the entropy elephant in the living room.