I am a biologist studying cell migration. I made some time-lapse movies to measure the speed of cell movement. Pictures were taken every 30 seconds, and I measured the distance the cell traveled in each 30-second interval (step distance). To get the speed I thought it would be very simple: (sum of step distances)/(total time). However, my collaborator wants to use the RMS (root mean square) of the step speeds to describe the speed:
square root of average (step speed squared).
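To make the comparison concrete, here is a quick numerical sketch of the two calculations side by side, using made-up step distances (the values and units are hypothetical, just for illustration):

```python
import math

# Hypothetical step distances (micrometers), one per 30-second interval
step_distances = [4.0, 6.0, 5.0, 9.0, 6.0]
dt = 30.0  # seconds per interval

# Speed in each interval
step_speeds = [d / dt for d in step_distances]

# My definition: total distance / total time
# (this equals the plain average of the step speeds)
mean_speed = sum(step_distances) / (dt * len(step_distances))

# Collaborator's definition: RMS of the step speeds
rms_speed = math.sqrt(sum(v ** 2 for v in step_speeds) / len(step_speeds))

print(mean_speed, rms_speed)
```

Note that the RMS is always at least as large as the plain average, and the two agree only when every step speed is identical, so the two definitions genuinely give different numbers for real (variable) data.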
He told me that this calculation takes statistical errors into account. However, he has not been able to give me a satisfactory explanation of why RMS fixes statistical errors. I wonder if anyone here could help me out.