Just about all methods involve bouncing something off the electron (or bouncing the electron off of something, which is the same thing if you choose a frame in which the electron is at rest). That covers scattering, cloud chamber trails, spots on a photographic film, detector clicks, and pretty much everything else.
In some experimental setups you can get a very good idea of the momentum of the electron by passing it through a magnetic field on the way to the detector; the Lorentz force deflects the electron in a velocity-dependent way, so you can arrange things so that only electrons with a specific momentum make it to the detector.
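To make the momentum-selection idea concrete, here is a minimal sketch (my own illustration, not a description of any particular apparatus): a charge moving perpendicular to a uniform field follows a circular arc of radius r = p/(qB), so fixing the bending radius with collimators and choosing B selects the momentum p = qBr.

```python
# Sketch of magnetic momentum selection: a particle of charge q in a
# uniform field B bends on a circle of radius r = p / (q * B), so a
# fixed geometry (r) plus a chosen field strength (B) picks out p = q*B*r.
# The function name and the example numbers are illustrative assumptions.

Q_E = 1.602176634e-19  # elementary charge, in coulombs

def selected_momentum(B_tesla, radius_m, charge=Q_E):
    """Momentum (kg*m/s) of a charged particle that follows a circular
    arc of the given radius in a uniform magnetic field."""
    return charge * B_tesla * radius_m

# Example: a 0.01 T field with a 5 cm bending radius selects
# electrons of momentum q * B * r ~ 8e-23 kg*m/s.
p = selected_momentum(0.01, 0.05)
```

Dialing B up or down then sweeps which momentum reaches the detector, which is how the velocity-dependent deflection gets turned into a momentum measurement.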
However, I have this nagging suspicion that you're thinking about the Heisenberg uncertainty principle, and imagining that it says that a measurement of position has to change the momentum and vice versa. That's a 1920s-vintage misunderstanding that was abandoned as more was discovered about the mathematical basis of QM - but by then it had entered the popular imagination, and it's proven impossible to uproot it from there.
The uncertainty principle in the modern understanding says something different. If I prepare a system so that the electron is in a precisely known position, I can still, if I am clever enough, devise a way of measuring its momentum to whatever degree of precision I wish. However, if I perform this experiment many times, preparing the electron in the exact same position state every time, I will get a different value of the momentum on each run - and the spread of those momentum values will be at least ħ/(2Δx), exactly as the uncertainty principle demands.
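This ensemble picture is easy to simulate. The sketch below (my own illustration) assumes the standard textbook case of a minimum-uncertainty Gaussian wavepacket, for which the Born rule gives momentum outcomes spread with standard deviation σ_p = ħ/(2σ_x): each run returns one sharp number, and only the collection of runs shows the Heisenberg spread.

```python
# Sketch: repeated momentum measurements on identically prepared electrons.
# Assumes a minimum-uncertainty Gaussian wavepacket with position spread
# sigma_x, whose momentum distribution is Gaussian with standard deviation
# sigma_p = hbar / (2 * sigma_x). Function names are illustrative.
import random
import statistics

HBAR = 1.054571817e-34  # reduced Planck constant, J*s

def measure_momentum(sigma_x_m, n_runs, seed=0):
    """Simulate n_runs single-shot momentum measurements (kg*m/s) on
    identically prepared wavepackets of position spread sigma_x_m."""
    sigma_p = HBAR / (2 * sigma_x_m)
    rng = random.Random(seed)
    return [rng.gauss(0.0, sigma_p) for _ in range(n_runs)]

# Each entry of `runs` is one precise measurement result; the *spread*
# across runs is what the uncertainty principle constrains.
runs = measure_momentum(sigma_x_m=1e-10, n_runs=100_000)
spread = statistics.stdev(runs)  # close to HBAR / (2 * 1e-10)
```

Halving σ_x doubles the spread of the momentum outcomes, which is the uncertainty principle as a statement about ensembles of identically prepared systems rather than about one measurement disturbing another.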