Grinkle said:
What is the dividing line between a macro object and a quantum object?
If asked, I'd have said (with no foundation for saying) a molecule is as macro an object as a chair or a table with behavior that can be accurately described by classical physics.
There is no dividing line between a macro object and a quantum object. As far as we know, all phenomena regarding matter and radiation are described by quantum theory, i.e., relativistic quantum field theory. At least no exception has been found in any system ever observed that violates the fundamental principles of QT.
The distinction between macroscopic objects and quantum objects, i.e., the apparent behavior of macroscopic objects according to classical mechanics and classical field theory, comes down to the enormous number of microscopic degrees of freedom, most of which are irrelevant for an effective description of the behavior of such systems. E.g., you understand a whole lot about the motion of the planets around the Sun by just making the very simplifying and abstract assumption that the planets and the Sun can be described as classical "point particles". As far as the motion of the planets around the Sun is concerned, it's just not so relevant that these are extended objects.
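To make that concrete, here is the usual point-particle idealization (just a schematic sketch with standard symbols, keeping the Sun fixed at the origin and neglecting the other planets): the equation of motion for a planet of mass $m$ at position $\vec{x}$ relative to the Sun of mass $M_{\odot}$ reads
$$m \ddot{\vec{x}} = -\frac{G m M_{\odot}}{|\vec{x}|^3}\,\vec{x},$$
and all the internal structure of both bodies enters (to very good approximation) only through the two numbers $m$ and $M_{\odot}$.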
It's of course different if it comes to an understanding of the Sun or a planet itself. Then it might be interesting to know how it is composed, what its intrinsic dynamics are, etc. Of course, even here we would not use quantum theory to describe every little detail, in the extreme the constitution of the system in terms of quarks and electrons together with the four fundamental interactions holding these systems together. E.g., to understand the Sun, it's enough to use (magneto-)hydrodynamics, Newtonian gravity, etc.
In other words, we use (quantum) statistical physics to reduce the zillions of zillions of microscopic degrees of freedom to an effective description in terms of a few macroscopic observables, choosing the appropriate relevant macroscopic degrees of freedom depending on the problem you want to understand. The state of the system is then described sufficiently by these relevant degrees of freedom, which can be defined as averages over many microscopic degrees of freedom.
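As a rough sketch of what such a coarse-grained observable looks like (the notation is purely illustrative, not tied to any specific system): the local particle density around a point $\vec{x}$ can be defined as
$$n(\vec{x},t) = \frac{\langle \hat{N}_{\Delta V}(\vec{x},t) \rangle}{\Delta V},$$
where $\hat{N}_{\Delta V}$ counts the particles in a small cell of volume $\Delta V$ around $\vec{x}$ that is still huge on the microscopic scale. Hydrodynamics then provides equations of motion for a handful of such fields (density, flow velocity, temperature) instead of the $\sim 10^{23}$ microscopic degrees of freedom.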
On the other hand, if you wish to explore more and more details of large objects, down to their quantum behavior, it becomes more and more complicated to prepare these systems in states where quantum effects become relevant. That's due to decoherence.

One funny example is the double-slit experiment with the rather large bucky-ball molecules, i.e., a bound state of 60 carbon atoms. That's not that many degrees of freedom yet (I'd call it a mesoscopic system), but it's already difficult to isolate the balls from the environment well enough to get sufficiently coherent matter-wave states for the double-slit experiment. This was done by Zeilinger et al. some years ago. First of all they had to cool the bucky-balls down to sufficiently low temperatures to bring their intrinsic states to low energies; otherwise the bucky-balls would emit a lot of rather soft thermal photons.

Zeilinger and his team indeed could successfully demonstrate the paradigmatic example of quantum behavior, the double-slit experiment, with C60 molecules. But they could also "heat the molecules up" in a controlled way, so that they were still pretty cold but warm enough to emit a small number of photons on their way to the double slit. As expected, emitting just a few photons randomized the state of the bucky-balls enough to reduce the contrast of the interference pattern, and only when they got warm enough to emit a few more photons did the interference pattern disappear, leaving the distribution expected from classical objects on the screen. It's not that the bucky-balls suddenly started behaving according to classical physics, invalidating the quantum description; rather, the emission of a few thermal photons in random directions was enough "decoherence" to make the outcome of the experiment look classical.
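One can make this loss of contrast quantitative with a schematic formula (an idealized two-slit sketch, not the actual analysis of the Vienna experiment): if the molecule takes path 1 or path 2 through the slits and the emitted photons end up in environment states $|E_1\rangle$, $|E_2\rangle$ correlated with the path, the pattern on the screen is
$$I(x) \propto |\psi_1(x)|^2 + |\psi_2(x)|^2 + 2\,\mathrm{Re}\!\left[\psi_1(x)\,\psi_2^*(x)\,\langle E_2|E_1\rangle\right],$$
i.e., the interference term gets multiplied by the overlap $|\langle E_2|E_1\rangle| \le 1$. A few soft (long-wavelength) photons leave this overlap close to 1, so the fringes just lose some contrast, while photons energetic enough to in principle resolve the path separation drive it towards 0, and only the classical-looking sum of the two single-slit distributions remains.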