1. The problem statement, all variables and given/known data

A grain of pollen shows Brownian motion in a solvent, such that its position x(t) on the x-axis varies with time. The displacement during one second, x(t + 1) − x(t), is measured many times and found to have a Gaussian distribution with an average of 0 and standard deviation σ. What are the average and standard deviation of the displacement x(t + 100) − x(t) during 100 seconds?

3. The attempt at a solution

I would say the average would still be 0 and the SD still σ. Is that correct?
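
One way to check a guess like this is a quick simulation: if the 100 one-second displacements are independent (independence of increments is the defining property of Brownian motion), the 100-second displacement is the sum of 100 independent Gaussian steps. A minimal sketch in Python, taking σ = 1 for concreteness (the choice of σ and the sample sizes below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 1.0  # standard deviation of the one-second displacement

# Each 100-second displacement is the sum of 100 independent
# one-second Gaussian steps with mean 0 and SD sigma.
steps = rng.normal(0.0, sigma, size=(100_000, 100))
displacements = steps.sum(axis=1)

print(displacements.mean())  # close to 0
print(displacements.std())   # close to sqrt(100) * sigma = 10 * sigma
```

The sample mean comes out near 0, but the sample SD comes out near 10σ rather than σ, because variances (not standard deviations) add for independent random variables.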