Calculating the average angle like this is usually a bad idea:
    ...
            sum += Math.Atan2(yi, xi);
        }
    }
    double avg = sum / (img.Width * img.Height);
The average of a set of angles has no clear meaning: for example, the average of one angle pointing up and one pointing down is an angle pointing right (or left, depending on how you represent the angles). Is that what you want? And if "up" is ±PI, then the average of two angles that both point almost straight up will point straight down, if one angle is PI - [something small] and the other is -PI + [something small]. That is probably not what you want either. In addition, you completely ignore the strength of each edge: most of the pixels in your real images are not edges at all, so their gradient direction is mostly noise.
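To make the wrap-around problem concrete, here is a tiny illustration of my own (not taken from your code), using the usual (-PI, PI] range where "up" is ±PI:

    // two gradient directions that both point almost straight up
    double a1 = Math.PI - 0.01;   // just below +PI
    double a2 = -Math.PI + 0.01;  // just above -PI
    double avg = (a1 + a2) / 2;   // = 0.0, i.e. an angle pointing straight down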
If you want to calculate something like an "average direction", you have to add up vectors instead of angles and only call Atan2 after the loop. The problem is that this vector sum tells you nothing about the objects inside the image, because gradients pointing in opposite directions cancel each other out. It only tells you something about the difference in brightness between the first/last row and the first/last column of the image. That is probably not what you want either.
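As a rough sketch of what that would look like (my own code, assuming you already have per-pixel gradient arrays gx and gy and the image dimensions width and height; those names are just placeholders):

    // sum the gradient *vectors*, then take the angle of the sum
    double sumX = 0, sumY = 0;
    for (int y = 0; y < height; y++)
    {
        for (int x = 0; x < width; x++)
        {
            sumX += gx[x, y];
            sumY += gy[x, y];
        }
    }
    double meanDirection = Math.Atan2(sumY, sumX); // one angle for the whole image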
I think the simplest way to orient images is to build an angle histogram: create an array with (for example) 360 bins for 360° of gradient directions. Then calculate the gradient angle and magnitude for each pixel and add each gradient magnitude to the matching bin. This will not give you a single angle, but an angle histogram, which can then be used to orient two images relative to each other using a simple cyclic correlation.
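In C#, a rough sketch of building such a histogram could look like this (again assuming placeholder gradient arrays gx/gy and dimensions width/height; the 1° bin size is arbitrary):

    double[] histogram = new double[360];
    for (int y = 0; y < height; y++)
    {
        for (int x = 0; x < width; x++)
        {
            double angleDeg = Math.Atan2(gy[x, y], gx[x, y]) * 180.0 / Math.PI; // -180..180
            int bin = ((int)Math.Round(angleDeg) + 360) % 360;                  // map to 0..359
            double magnitude = Math.Sqrt(gx[x, y] * gx[x, y] + gy[x, y] * gy[x, y]);
            histogram[bin] += magnitude; // weight each direction by its gradient strength
        }
    }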
Here is a proof-of-concept Mathematica implementation I put together to see if this works:
    angleHistogram[src_] := (
      (* gradient components via Gaussian derivative filters *)
      Lx = GaussianFilter[ImageData[src], 2, {0, 1}];
      Ly = GaussianFilter[ImageData[src], 2, {1, 0}];
      (* per-pixel {angle in degrees, gradient magnitude} *)
      angleAndOrientation =
        MapThread[{Round[ArcTan[#1, #2]*180/\[Pi]], Sqrt[#1^2 + #2^2]} &, {Lx, Ly}, 2];
      angleAndOrientationFlat = Flatten[angleAndOrientation, 1];
      (* bin by angle and sum the gradient magnitudes in each bin *)
      bins = BinLists[angleAndOrientationFlat, 1, 5];
      histogram = Total /@ Flatten[bins[[All, All, All, 2]], {{1}, {2, 3}}];
      maxIndex = Position[histogram, Max[histogram]][[1, 1]];
      (* plot the histogram and mark its highest peak *)
      Labeled[
        Show[
          ListLinePlot[histogram, PlotRange -> All],
          Graphics[{Red, Point[{maxIndex, histogram[[maxIndex]]}]}]],
        "Maximum at " <> ToString[maxIndex] <> "\[Degree]"]
      )
Results with sample images:

The angle histograms also show why averaging the angles cannot work: the histogram is essentially one sharp peak on top of a roughly uniform distribution of all other angles. The average of such a histogram is always dominated by the uniform "background noise". That is why your current algorithm gives an almost identical angle (about 180°) for every one of the "real-life" images.
The tree image has one dominant angle (the horizon), so in this case you could use the mode of the histogram (the most frequent angle). But that will not work for every image:

Here you have two peaks. A cyclic correlation should still orient the two images relative to each other, but using just the mode is probably not enough.
Also note that the peak in the angle histogram is not "up": in the tree image above, the peak probably comes from the horizon, so it points up. In the Lena image, it comes from the vertical white stripe in the background, so it points to the right. Simply orienting images by their most frequent angle will not rotate every image right-side up.

This image has even more peaks: using the mode (or probably any single angle) would be unreliable for orienting it. But the angle histogram as a whole should still give you a robust orientation.
Note: I did not pre-process the images, did not try gradients at different scales, and did not post-process the resulting histogram. In a real application you would tune all of these things to get the best algorithm for a large set of test images; this is just a quick test to see whether the idea can work at all.
Added: to orient two images using these histograms, you would:
- Normalize all histograms, so that the area under the histogram is the same for every image (even if some are brighter, darker or blurrier); see the sketch at the end of this answer.
- Take the histograms of the two images and compare them for each rotation you are interested in:
For example, in C#:
    // compare the two (normalized) histograms at every candidate rotation and
    // keep the rotation with the smallest difference
    for (int rotationAngle = 0; rotationAngle < 360; rotationAngle++)
    {
        double difference = 0; // double, since the normalized histograms are fractional
        for (int i = 0; i < 360; i++)
            difference += Math.Abs(histogram1[i] - histogram2[(i + rotationAngle) % 360]);
        if (difference < bestDifferenceSoFar)
        {
            bestDifferenceSoFar = difference;
            foundRotation = rotationAngle;
        }
    }
(You could speed this up with an FFT if your histogram length is a power of two, but the code gets a lot more complicated, and for 256 bins it probably does not matter.)
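For completeness, here is a minimal sketch of the normalization step (the first bullet above), assuming each histogram is a double[] like histogram1/histogram2 in the code above:

    // scale the histogram so its total area is 1, making different images comparable
    double total = 0;
    for (int i = 0; i < histogram.Length; i++)
        total += histogram[i];
    if (total > 0)
        for (int i = 0; i < histogram.Length; i++)
            histogram[i] /= total;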