The example is complicated for several reasons. The first is that there is a Doppler effect: as you recede from the Earth, the successive image frames reach you later and later (they have to travel further and further before reaching your ship). Some people think that this is what time dilation is about, but it isn't. Time dilation is what remains after you correct for this Doppler effect in the classical way (that is, you calculate your successive distances to the Earth and, knowing the speed of the signal, which is lightspeed in this case, you subtract the travel time). If you do this, Earth would indeed seem to "run slower", even after the travel-time correction. It would also "run slower" if you were flying towards the Earth (again, after correcting for the travel times, which work in the opposite direction in that case).
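To make that arithmetic concrete, here is a small numerical sketch (my own illustration; the 60% lightspeed value, the 20 ms frame period and the function name `apparent_intervals` are just assumptions for the example). It computes the raw Doppler-shifted interval between frame arrivals on the ship's clock, applies the classical travel-time correction described above, and checks that what remains is the time-dilation factor gamma, for both a receding and an approaching ship.

```python
import math

def apparent_intervals(beta: float, dt_emit: float = 0.020):
    """Return (received, corrected, gamma) for frames emitted every dt_emit
    seconds of Earth time, as measured on the ship's clock.

    beta > 0 means the ship recedes from Earth, beta < 0 means it approaches.
    """
    gamma = 1.0 / math.sqrt(1.0 - beta ** 2)

    # Relativistic Doppler: interval between frame arrivals on the ship's clock.
    dt_received = dt_emit * math.sqrt((1.0 + beta) / (1.0 - beta))

    # Classical travel-time correction: in the ship's frame, a frame emitted at
    # ship time tau_emit has to cross the distance Earth has reached by then,
    # so it arrives at tau = tau_emit * (1 + beta).  With a signed beta the
    # same expression also covers the approaching case.  Undo that delay:
    dt_corrected = dt_received / (1.0 + beta)

    return dt_received, dt_corrected, gamma

for beta in (+0.6, -0.6):  # receding / approaching at 60% of lightspeed
    received, corrected, gamma = apparent_intervals(beta)
    print(f"beta = {beta:+.1f}: frames received every {received * 1000:5.1f} ms, "
          f"after travel-time correction {corrected * 1000:5.1f} ms "
          f"(gamma * 20 ms = {gamma * 20:5.2f} ms)")
```

With these numbers, frames emitted every 20 ms arrive every 40 ms when receding and every 10 ms when approaching, but in both cases the travel-time-corrected interval is 25 ms, that is gamma times 20 ms: Earth appears to run slow by the same factor either way.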
However, with an actual electronic camera this wouldn't quite work, because a camera encodes a data stream (analog or digital, it doesn't matter) containing discrete frames, one every 20 milliseconds or so. On the receiving side, the only thing you can do is decode these frames and display them. If you display them with standard equipment, also at one frame every 20 milliseconds (but timed by your spaceship's clock, of course), you won't see a difference! You will see scenes which "naturally" happen every 20 milliseconds in NY, and you will see them, in your time frame, also every 20 milliseconds. Things will look normal to you. You will simply have a problem with the data flow: you won't receive enough frames to feed the display with one frame every 20 milliseconds if you are receding from Earth, and that's not only relativity's fault but also the Doppler effect's.
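Here is a toy simulation of that data-flow problem (again just a sketch under the same assumed numbers: beta = 0.6, 20 ms frames, and a simple frame buffer of my own invention). Frames arrive on the ship every 40 ms of ship time, but the display wants one every 20 ms of ship time, so roughly half the time the buffer is empty and the picture freezes.

```python
import math

beta = 0.6
dt_emit = 0.020                                           # frame period on Earth
dt_arrive = dt_emit * math.sqrt((1 + beta) / (1 - beta))  # 40 ms on the ship's clock

buffered = received = displayed = stalled = 0
next_arrival, next_display = dt_arrive, dt_emit

t, step, sim_time = 0.0, 0.001, 1.0   # simulate one second of ship time
while t < sim_time:
    t += step
    if t >= next_arrival:             # a new frame comes in from Earth
        buffered += 1
        received += 1
        next_arrival += dt_arrive
    if t >= next_display:             # the display wants the next frame
        if buffered:
            buffered -= 1
            displayed += 1
        else:
            stalled += 1              # nothing to show: the picture freezes
        next_display += dt_emit

print(f"in {sim_time:.0f} s of ship time: received {received} frames, "
      f"displayed {displayed}, display starved {stalled} times")
```

In one second of ship time you receive about 25 frames but the display asks for about 50, so it starves about 25 times: the scenes themselves look normal, you just don't get enough of them.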