Entropy has two main definitions: a statistical one, which measures the randomness of a system (the spread of probability over its microstates), and a thermodynamic one, which measures the energy unavailable for doing work. Statistical entropy is the broader concept, while thermodynamic entropy applies specifically to systems at or near equilibrium. The Gibbs entropy formula shows that the two coincide under certain conditions, most cleanly within a canonical ensemble, as sketched below. For systems close to thermodynamic equilibrium, the statistical entropy therefore reduces to the thermodynamic entropy. Understanding this relationship is central to connecting the principles of thermodynamics with those of statistical mechanics.
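To make the canonical-ensemble case concrete, here is a brief sketch of the standard derivation; the symbols k_B (Boltzmann's constant), Z (partition function), U (mean internal energy), and F (Helmholtz free energy) are conventional and are not named in the text above. The Gibbs entropy of a probability distribution p_i over microstates of energy E_i is

    S = -k_B \sum_i p_i \ln p_i .

In a canonical ensemble the probabilities take the Boltzmann form

    p_i = \frac{e^{-E_i / k_B T}}{Z}, \qquad Z = \sum_i e^{-E_i / k_B T},

so that \ln p_i = -E_i / k_B T - \ln Z. Substituting this into the Gibbs formula gives

    S = \frac{1}{T} \sum_i p_i E_i + k_B \ln Z = \frac{U - F}{T}, \qquad F = -k_B T \ln Z,

which is precisely the thermodynamic relation F = U - TS. In this sense the statistical entropy, evaluated on the equilibrium (canonical) distribution, reproduces the thermodynamic entropy.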