This is just an experiment based on Section 6-3 of The Feynman Lectures on Physics:
In the simplest version, we imagine a "game" in which a "player" starts at x = 0 and at each "move" must take a step either forward (toward +x) or backward (toward -x). The choice is made randomly, for example by the toss of a coin.
Source: http://www.feynmanlectures.caltech.edu/I_06.html#Ch6-S3
My goal is to calculate the expected distance from the starting point, taking each step to be one unit of distance. I wrote a simple C program that simulates 30 random steps and then computes the final distance from the starting point. This is repeated a million times, and the program averages those distances to get the expected distance.
Theoretically, the expected distance should be the square root of the number of steps, which would be about sqrt(30) ≈ 5.48.
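In symbols, the relation I am relying on (as I read the cited section, which works with the root-mean-square distance) is

$$ D_{\mathrm{rms}} = \sqrt{\langle D_N^2 \rangle} = \sqrt{N}, \qquad \sqrt{30} \approx 5.48. $$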
However, I have run the program several times and it consistently returns a value of about 4.33 (more precisely, 4.33461, 4.33453, and 4.34045). Why is this not even close to the theoretical value of about 5.48?
Here is my code:
#include <time.h>
#include <stdlib.h>
#include <stdio.h>

int main ( int argc, char *argv[] )
{
    int number_of_steps = 30;
    int repetition = 1000000;
    int distance = 0;          /* signed position during one walk */
    int total_distance = 0;    /* sum of |final position| over all walks */
    double expected_distance;
    int i, j;

    srand(time(NULL));

    for ( i = 0; i < repetition; i++ ) {
        /* one walk: 30 random unit steps, each +1 or -1 */
        for ( j = 0; j < number_of_steps; j++ ) {
            distance += rand() & 1 ? -1 : 1;
        }
        total_distance += abs(distance);   /* final distance from the starting point */
        distance = 0;
    }

    expected_distance = (float) total_distance / i;   /* average over all repetitions */
    printf("%g\n", expected_distance);

    return EXIT_SUCCESS;
}   /* ---------- end of function main ---------- */
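For completeness, this is how I build and run it (assuming gcc; the file name random_walk.c is just what I happened to save it as, and the output shown is one of the runs quoted above):

    $ gcc -o random_walk random_walk.c
    $ ./random_walk
    4.33461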