I'm experimenting with some sorting algorithms and implemented the following quicksort:
private static void quicksort(int[] list, int low, int high) {
    // Hoare-style partition with the middle element as pivot
    int pivot = list[low + (high - low) / 2];
    int i = low;
    int j = high;
    while (i <= j) {
        // advance i past elements already on the correct (left) side
        while (list[i] < pivot) {
            i++;
        }
        // move j past elements already on the correct (right) side
        while (list[j] > pivot) {
            j--;
        }
        if (i <= j) {
            // swap the out-of-place pair
            int temp = list[i];
            list[i] = list[j];
            list[j] = temp;
            i++;
            j--;
        }
    }
    // recurse into both partitions
    if (low < j) {
        quicksort(list, low, j);
    }
    if (i < high) {
        quicksort(list, i, high);
    }
}
I run this code on two integer arrays of x elements each (say, 1 billion). The first is sorted; the second is a permutation of array 1, produced by randomly selecting n pairs of positions and swapping them.
I select the middle element as the pivot, so it should be optimal for the sorted case, right?
I measure the time the algorithm takes to sort each array and count the number of swaps and recursive calls. As expected, both values are higher when sorting array 2 with the random permutations.
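For reference, here is a minimal, self-contained sketch of that measurement setup. The class name, the smaller array size, and the two counter fields (`swaps`, `calls`) are my assumptions for illustration, not part of the original code:

```java
import java.util.Random;

public class QuicksortBenchmark {
    // assumed instrumentation: count element swaps and recursive invocations
    static long swaps = 0;
    static long calls = 0;

    static void quicksort(int[] list, int low, int high) {
        calls++;
        int pivot = list[low + (high - low) / 2];
        int i = low;
        int j = high;
        while (i <= j) {
            while (list[i] < pivot) i++;
            while (list[j] > pivot) j--;
            if (i <= j) {
                int temp = list[i];
                list[i] = list[j];
                list[j] = temp;
                swaps++;
                i++;
                j--;
            }
        }
        if (low < j) quicksort(list, low, j);
        if (i < high) quicksort(list, i, high);
    }

    public static void main(String[] args) {
        int size = 1_000_000; // assumption: far smaller than 1 billion, for a quick run
        int n = 10_000;       // number of randomly swapped pairs

        // array 1: sorted
        int[] sorted = new int[size];
        for (int k = 0; k < size; k++) sorted[k] = k;

        // array 2: copy of array 1 with n random pairs swapped
        int[] perturbed = sorted.clone();
        Random rnd = new Random(42);
        for (int k = 0; k < n; k++) {
            int a = rnd.nextInt(size);
            int b = rnd.nextInt(size);
            int tmp = perturbed[a];
            perturbed[a] = perturbed[b];
            perturbed[b] = tmp;
        }

        for (int[] arr : new int[][] { sorted, perturbed }) {
            swaps = 0;
            calls = 0;
            long start = System.nanoTime();
            quicksort(arr, 0, arr.length - 1);
            long ms = (System.nanoTime() - start) / 1_000_000;
            System.out.println("time=" + ms + "ms swaps=" + swaps + " calls=" + calls);
        }
    }
}
```

Note that with the middle element as pivot, a fully sorted array already produces perfectly balanced partitions, so the baseline swap and call counts are nonzero even for array 1.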
My question: for n = 10000, the measured values differ only by about 20 to 30. Why is the difference so small?