Yes, I didn't clarify that. I always draw a quasistatic process on an indicator diagram as a solid line, and a non-quasistatic process with some state function remaining constant as a dotted line (e.g. Joule expansion, where U stays constant). Reversible is a subset of quasistatic, so yes, a reversible process can always be plotted as a solid line. A non-quasistatic process with no state function constant is just two unconnectable points. (Can't think of an example right now.)
Regarding definitions of entropy, I think Lieb and Yngvason have taken the next step beyond Caratheodory. Caratheodory's definition of classical entropy is restricted to quasistatic transformations from equilibrium state 1 to equilibrium state 2, while Lieb and Yngvason's definition removes the quasistatic constraint. Their papers are rather hairy, but Thess gives a more user-friendly description - see
https://www.amazon.com/dp/3642133487/?tag=pfamazon01-20 - but I only get the general idea of Lieb-Yngvason and Caratheodory; I still haven't mastered either. My hunch is that Lieb and Yngvason are on to something.
Yes, the "entropy of mixing" problem, where you have A particles on one side, B particles on the other, separated by a partition, both sides at the same temperature and pressure, but obviously not at the same chemical potentials. Then you quickly remove the partition. When equilibrium finally occurs, the two types are fully mixed and the total entropy is larger than the sum of the two original entropies and the chemical potentials are uniform. Usually done for two non-reacting ideal gases, so that the final entropy can actually be calculated. Unlike Joule expansion, you could assume LTE, where any small volume element is a thermodynamic system in equilibrium with a universally constant T and P, and actually solve the problem for two ideal gases as it develops in time. It would be interesting to calculate the entropy density as a function of position as it develops in time. I've been thinking of doing this, just to get a better grasp of entropy creation.