T(n) = O(n log n) and Ω(n log n)
To prove this, by the definition of O we need to find constants c and n0 such that T(n) <= cn log n for every n >= n0.
We will use strong induction on n to prove that T(n) <= cn log n for all n >= n0.
Skip the base cases for now (we'll come back to them later).
Hypothesis: assume that T(k) <= ck log k for every k with n0 <= k < n.
Thesis: we want to prove that T(n) <= cn log n.
But T(n)=2T(n/4)+T(n/2)+n
Using the hypothesis (and taking log to be base 2, so that log(n/4) = log n - 2 and log(n/2) = log n - 1), we get:
T(n) <= 2c(n/4)log(n/4) + c(n/2)log(n/2) + n = (cn/2)(log n - 2) + (cn/2)(log n - 1) + n = cn log n + n(1 - 3c/2)
So, taking c >= 2/3 makes n(1 - 3c/2) <= 0, and therefore T(n) <= cn log n, which proves our thesis.
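The algebraic step above can be double-checked numerically. The following is a minimal sketch (in Python, which is not part of the original answer) verifying that 2c(n/4)log(n/4) + c(n/2)log(n/2) + n equals cn log n + n(1 - 3c/2), under the assumption that log means log base 2:

```python
# Numeric sanity check of the inductive-step algebra.
# Assumes log is base 2 (log(n/4) = log n - 2 and log(n/2) = log n - 1
# only hold in base 2).
from math import log2

c = 1.0  # any c > 0 works; the identity does not depend on c
for n in [8, 64, 1000, 12345]:
    lhs = 2 * (c * (n / 4) * log2(n / 4)) + c * (n / 2) * log2(n / 2) + n
    rhs = c * n * log2(n) + n * (1 - 3 * c / 2)
    assert abs(lhs - rhs) < 1e-6, (n, lhs, rhs)
print("identity holds")
```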
Now we need to prove the base cases:
We take n0 = 2, because with n0 = 1 the bound at n = 1 would read T(1) <= c·1·log 1 = 0, which no positive running time can satisfy. So our base cases are n = 2, 3, 4. We need the following statements:
T(2) <= c·2·log 2 = 2c
T(3) <= c·3·log 3
T(4) <= c·4·log 4 = 8c
So, taking c = max{2/3, T(2)/2, T(3)/(3 log 3), T(4)/8} and n0 = 2, we have found constants c and n0 such that T(n) <= cn log n for every natural n >= n0.
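As a sanity check on this conclusion, here is a small Python sketch (not part of the original answer) that evaluates the recurrence and the bound. It assumes n ranges over powers of two, so that n/4 and n/2 stay integral, and uses hypothetical base values T(2) = 1 and T(4) = 4; with powers of two the case n = 3 is never reached, so the T(3) term of the max is dropped:

```python
# Check T(n) <= c*n*log2(n) empirically, restricted to powers of two.
# T(2) = 1 and T(4) = 4 are hypothetical base values; any positive
# values would do, since c is chosen to cover them.
from functools import lru_cache
from math import log2

T_base = {2: 1, 4: 4}

@lru_cache(maxsize=None)
def T(n):
    if n in T_base:
        return T_base[n]
    return 2 * T(n // 4) + T(n // 2) + n

# c = max{2/3, T(2)/2, T(4)/8}, mirroring the proof's choice of c
c = max(2 / 3, T_base[2] / 2, T_base[4] / 8)

for k in range(1, 20):
    n = 2 ** k
    assert T(n) <= c * n * log2(n), n
print("T(n) <= c*n*log2(n) for all tested n")
```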
The proof that T(n) = Ω(n log n) is analogous.
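The Ω direction can also be spot-checked numerically. This standalone Python sketch (not part of the original answer) again assumes n is a power of two and hypothetical base values T(2) = 1 and T(4) = 4; the symmetric derivation gives T(n) >= c'n log n + n(1 - 3c'/2), so any c' <= 2/3 that also covers the base cases works, e.g. c' = min{2/3, T(2)/2, T(4)/8}:

```python
# Check T(n) >= c'*n*log2(n) empirically, restricted to powers of two.
# T(2) = 1 and T(4) = 4 are hypothetical base values.
from functools import lru_cache
from math import log2

T_base = {2: 1, 4: 4}

@lru_cache(maxsize=None)
def T(n):
    if n in T_base:
        return T_base[n]
    return 2 * T(n // 4) + T(n // 2) + n

# c' = min{2/3, T(2)/2, T(4)/8}: small enough for both the inductive
# step (c' <= 2/3) and the base cases
c_low = min(2 / 3, T_base[2] / 2, T_base[4] / 8)

for k in range(1, 20):
    n = 2 ** k
    assert T(n) >= c_low * n * log2(n), n
print("T(n) >= c'*n*log2(n) for all tested n")
```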
So, basically, in those cases where you cannot apply the Master Theorem, you "guess" the bound and prove it by induction; this is known as the substitution method.
For more information about these proofs, see "Introduction to Algorithms" by Cormen, Leiserson, Rivest, and Stein.