As far as I understand, the problem is this: because of the oscilloscope's internal input capacitance, if we apply a rectangular signal we will see a rounded, almost sinusoidal one on the screen, and we want to compensate for this capacitance so that the scope shows the correct signal. Right?
When we use a probe, the circuit looks like a voltage divider:
With this voltage divider, as far as I can see, we reduce the input voltage to the oscilloscope by a factor of 10 (for a 10x probe).
1. How does this reduction of voltage remove the distortion shown in the first picture?
2. Division by exactly 10 will occur, as far as I understand, only at zero frequency and at infinite frequency, because in those cases we can neglect either the capacitors or the resistors. But if the frequency is somewhere in the middle, then we have to consider all the resistors and capacitors together, which, I assume, will not give an exact division by 10.
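To check my reasoning in point 2, here is a quick numerical sketch. The component values are my assumptions: a standard 10x probe with a 9 MΩ series resistor, and a scope input modeled as 1 MΩ in parallel with roughly 20 pF. It compares the divider ratio with the probe's compensation capacitor adjusted so that R1·C1 = R2·C2, versus a probe with no capacitor at all:

```python
import math

R1 = 9e6             # probe series resistor (standard 10x probe)
R2 = 1e6             # scope input resistance (typical, assumed)
C2 = 20e-12          # scope input capacitance, ~20 pF (assumed)
C1 = R2 * C2 / R1    # trimmer value satisfying R1*C1 = R2*C2

def ratio(f, c1):
    """|V_scope / V_in| of the divider: (R1 || c1) over (R2 || C2)."""
    w = 2j * math.pi * f
    z1 = R1 / (1 + w * R1 * c1)   # impedance of R1 in parallel with c1
    z2 = R2 / (1 + w * R2 * C2)   # impedance of R2 in parallel with C2
    return abs(z2 / (z1 + z2))

for f in (10, 1e4, 1e7):
    print(f"{f:>10.0f} Hz   compensated: {ratio(f, C1):.4f}"
          f"   no probe cap: {ratio(f, 0):.5f}")
```

With the compensation capacitor set correctly the ratio stays at exactly 0.1000 at every frequency (z1/z2 reduces to R1/R2 when the time constants match), while without it the ratio sags well below 0.1 at mid and high frequencies, which is the frequency dependence I was asking about.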