hsdrop said:
does the Earth and the moon run on 2 different flows of time
This question is too vague as you state it, since how we compare the "flow of time" depends on how we match up "the same moment of time" between the Earth and the moon.
One possible more precise formulation would be: suppose we have two observers, one on the surface of the Earth and one on the surface of the Moon. And suppose that every time the Moon is directly overhead as seen by the observer on the Earth, the two observers record the readings on their respective clocks. (Here the passing of the Moon overhead is what defines "the same moment of time" for both observers.) How will the time intervals between two successive readings compare between the two clocks?
To answer this question, we need to take into account three contributions to "time dilation": the Earth's gravity, the Moon's gravity, and the motion of the observer relative to the center of gravity of the Earth-Moon system. The time dilation factor for each observer is then
$$
\sqrt{1 - \frac{2GM_e}{c^2 r_e} - \frac{2GM_m}{c^2 r_m} - \frac{v^2}{c^2}}
$$
where ##M_e## is the mass of the Earth and ##r_e## is the observer's distance from the Earth's center; ##M_m## is the mass of the Moon and ##r_m## is the observer's distance from the Moon's center; ##v## is the observer's velocity; ##c## is the speed of light; and ##G## is Newton's gravitational constant. Note that this is the time dilation factor relative to an observer at "infinity", i.e., one very far away from both the Earth and Moon, and at rest relative to the center of mass of the Earth-Moon system.
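As a minimal sketch (not from the original post), this factor can be written as a short Python function; the constant values below are my own rough standard figures and are assumptions on my part:

```python
import math

G = 6.674e-11    # Newton's gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8      # speed of light, m/s
M_E = 5.972e24   # mass of the Earth, kg
M_M = 7.342e22   # mass of the Moon, kg

def dilation_factor(r_e, r_m, v):
    """Time dilation factor relative to an observer at rest at infinity.

    r_e, r_m -- distances from the centers of the Earth and Moon (m)
    v        -- speed relative to the Earth-Moon center of mass (m/s)
    """
    return math.sqrt(1.0 - 2.0*G*M_E/(c**2 * r_e)
                         - 2.0*G*M_M/(c**2 * r_m)
                         - v**2 / c**2)
```

A factor of exactly 1 would mean no time dilation at all; for any observer at finite distance or nonzero speed the factor is slightly less than 1.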
For parameter values:

- velocity: 450 m/s for the Earth observer and 1022 m/s for the Moon observer (remember these are relative to the center of mass of the Earth-Moon system, so the main contribution for the Earth observer is Earth's rotation and the main contribution for the Moon observer is the Moon's orbital velocity);
- for the Earth observer, ##r_e## is the Earth's radius and ##r_m## is the Moon's distance from Earth minus the Earth's radius;
- for the Moon observer, ##r_e## is the Moon's distance from Earth minus the Moon's radius, and ##r_m## is the Moon's radius.
Plugging in values, we obtain for the Earth observer a time dilation factor of
0.9999999993038333,
and for the Moon observer a time dilation factor of
0.9999999999506316.
The Moon observer's factor is closer to 1, so his clock will show more elapsed time between two successive readings (i.e., two successive overhead passages) than the Earth observer's clock.
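Using rough standard values for the radii and the Earth-Moon distance (my own numbers, so the last digits will differ slightly from those quoted above), the two factors come out like this:

```python
import math

G, c = 6.674e-11, 2.998e8        # gravitational constant, speed of light (SI)
M_E, M_M = 5.972e24, 7.342e22    # masses of Earth and Moon, kg
R_E, R_M = 6.371e6, 1.7374e6     # radii of Earth and Moon, m
D = 3.844e8                      # mean Earth-Moon distance, m

def dilation_factor(r_e, r_m, v):
    return math.sqrt(1.0 - 2.0*G*M_E/(c**2*r_e)
                         - 2.0*G*M_M/(c**2*r_m) - v**2/c**2)

# Earth observer: on the surface, with the Moon directly overhead
f_earth = dilation_factor(R_E, D - R_E, 450.0)
# Moon observer: on the near side, with the Earth directly overhead
f_moon = dilation_factor(D - R_M, R_M, 1022.0)

print(f_earth)   # ~0.99999999930...
print(f_moon)    # ~0.99999999995...
```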
hsdrop said:
would the face that we see of the moon be different that the part that faces away from the Earth ?
From the above formula it should be clear that there would be some difference, because the distance ##r_e## will be different for a Moon observer on the far side: it will be the Earth-Moon distance plus the Moon's radius, instead of minus the Moon's radius. This gives a time dilation factor, by the above formula, of
0.9999999999507357.
rootone said:
A clock on the moon would run at the same rate as a clock on Earth, within any reasonable measure of accuracy.
It depends on what you consider "reasonable".

The ratio of the two time dilation factors above differs from 1 by a few parts in ##10^{10}##, and results in a difference in elapsed time of about 58 microseconds between successive overhead passages of the Moon (about 24 hours 50 minutes by Earth clocks). That is easily detectable with current technology.
The ratio of the two Moon observers' time dilation factors, however, differs from 1 by only a part in ##10^{13}## or so, and results in a difference in elapsed time of only about 9 nanoseconds between successive overhead passages of the Moon. That is detectable with current technology, but is much closer to the limit of accuracy of our best current atomic clocks.
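To check this arithmetic, the factor differences can be converted to elapsed-time differences over one overhead passage (again a sketch with my own rough constant values):

```python
import math

G, c = 6.674e-11, 2.998e8        # gravitational constant, speed of light (SI)
M_E, M_M = 5.972e24, 7.342e22    # masses of Earth and Moon, kg
R_E, R_M = 6.371e6, 1.7374e6     # radii of Earth and Moon, m
D = 3.844e8                      # mean Earth-Moon distance, m

def dilation_factor(r_e, r_m, v):
    return math.sqrt(1.0 - 2.0*G*M_E/(c**2*r_e)
                         - 2.0*G*M_M/(c**2*r_m) - v**2/c**2)

f_earth     = dilation_factor(R_E, D - R_E, 450.0)       # Earth surface
f_moon_near = dilation_factor(D - R_M, R_M, 1022.0)      # near side of Moon
f_moon_far  = dilation_factor(D + R_M, R_M, 1022.0)      # far side of Moon

T = 24*3600 + 50*60   # ~24 h 50 min between overhead passages, in seconds

print((f_moon_near - f_earth) * T)      # ~5.8e-5 s: tens of microseconds
print((f_moon_far - f_moon_near) * T)   # ~9e-9 s: a few nanoseconds
```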