I know this has been answered, but I'm adding a very simple way to simulate touch events, for the benefit of future searchers.
One easy way is blind copying!
Instead of parsing the getevent output, converting it, and feeding it to sendevent (which is really slow), just record the raw gestures from a real device running the same version of Android and replay them as-is.
You can record touch input from a real device like this:
1- At the command line, run adb shell dd if=/dev/input/event2 of=/sdcard/left
2- Make the gesture you want to simulate (e.g. a swipe), then stop the dd with Ctrl+C.
3- This creates a file (/sdcard/left) containing the raw data produced by your real touch.
4- Copy that file into your AVD (e.g. with adb push), say to /sdcard/left.
5- In the AVD's adb shell, run dd if=/sdcard/left of=/dev/input/event2 (a full end-to-end sketch follows after this list).
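If it helps, here is a rough end-to-end sketch of those five steps as adb commands. It assumes the touch node is /dev/input/event2 on both sides, that the recording is saved as /sdcard/left, and that your emulator shows up as emulator-5554; adjust all three to your setup, and note that writing to /dev/input/* usually needs root:

    # On the physical device: start recording raw touch events,
    # make the gesture, then stop with Ctrl+C
    adb -d shell "dd if=/dev/input/event2 of=/sdcard/left"

    # Pull the recording to your PC, then push it into the AVD
    adb -d pull /sdcard/left left
    adb -s emulator-5554 push left /sdcard/left

    # In the AVD: replay the recording into the same input node
    adb -s emulator-5554 shell "dd if=/sdcard/left of=/dev/input/event2"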
Voilà! The recorded touch will be replayed.
NOTE: On my device the touchscreen is /dev/input/event2, but the node that carries touch events may differ from device to device, so use a little trial and error to find the right one first.
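One way to do that (not part of the original steps, just a suggestion) is to list the input devices and then watch which node produces output while you touch the screen:

    # List every /dev/input/eventN with its name and capabilities
    adb shell getevent -p

    # Watch a candidate node; if labeled events scroll by while you
    # touch the screen, that's your touch device
    adb shell getevent -l /dev/input/event2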
In short, if you record and replay on the same device:
1- dd if=/dev/input/event2 of=/sdcard/left
2- Perform the real touch gesture.
3- dd if=/sdcard/left of=/dev/input/event2
4- Repeat step 3 as often as you need (a small helper script is sketched below).
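If you do this a lot, a tiny hypothetical wrapper script (pushed to the device and run inside adb shell, probably as root) could look like this; /dev/input/event2 and /sdcard/left are just the example values from above:

    #!/system/bin/sh
    # touchtape.sh -- record or replay a raw touch recording
    EVDEV=/dev/input/event2   # adjust to your device's touch node
    DUMP=/sdcard/left

    case "$1" in
      record) dd if="$EVDEV" of="$DUMP" ;;   # make the gesture, then Ctrl+C
      replay) dd if="$DUMP" of="$EVDEV" ;;   # run as many times as you like
      *)      echo "usage: $0 record|replay" ;;
    esac

Then run sh /sdcard/touchtape.sh record, make the gesture, press Ctrl+C, and run sh /sdcard/touchtape.sh replay whenever you want the gesture again.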
Greetings :)