Calculating Time Complexity of a Recursive Algorithm

I'm trying to calculate the time complexity of a recursive algorithm and I think I've almost got it. Here's the pseudocode I've been looking at:

long pow( long x, int n ) {
    if (n == 0)
        return 1;
    if (n == 1)
        return x;
    if(isEven(n))
        return pow(x, n / 2 ) * pow(x, n / 2);
    else
        return x * pow(x * x, n / 2);
} 

isEven merely determines whether the integer passed to it is even, and for the purposes of this example, operates in constant time.

So, if n = 0 or n = 1, it's a constant-time operation: f(n) = C0. However, when n > 1, should it operate like f(n) = f(n-1) + f(n-1) + C1 when n is even and f(n) = f(n-1) + 1 when n is odd? Or should it be f(n) = f(n/2) + f(n/2) + C1 when n is even and f(n) = f(n/2) + 1 when n is odd?

I've been looking at a lot of examples. Here is one I've found very helpful. My problem stems from there being two recursive calls when n is even; I'm not fully sure what to do here. If anyone could point me in the right direction, I'd really appreciate it.


ANSWERS:


Take a look at the Master Theorem. You can treat this as a "divide and conquer" algorithm.

The end result is that with the two recursive calls in place, you end up with a worst-case O(n) runtime. E.g. pow(x, 4) calls pow(x, 2) twice, and pow(x, 1) four times; in general, when n is a power of two the function ends up being called 2n - 1 times.
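You can see this concretely with a minimal sketch (the calls counter and the PowCount wrapper class are instrumentation added here for illustration, not part of the original code):

public class PowCount {
    static long calls = 0;

    static long pow(long x, int n) {
        calls++;                   // count every invocation
        if (n == 0)
            return 1;
        if (n == 1)
            return x;
        if (n % 2 == 0)            // plays the role of isEven(n)
            return pow(x, n / 2) * pow(x, n / 2); // two recursive calls
        else
            return x * pow(x * x, n / 2);
    }

    public static void main(String[] args) {
        pow(2, 4);
        System.out.println(calls); // prints 7, i.e. 2*4 - 1
    }
}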

Also note that by just calling pow(x, n/2) once and squaring the result in that branch, the algorithm becomes O(log n).
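A sketch of that fix (just one way to write it, storing the half-result in a local variable so the recursion happens only once per level):

static long pow(long x, int n) {
    if (n == 0)
        return 1;
    if (n == 1)
        return x;
    if (n % 2 == 0) {              // plays the role of isEven(n)
        long half = pow(x, n / 2); // computed once instead of twice
        return half * half;
    } else {
        return x * pow(x * x, n / 2);
    }
}

Now each level of recursion does constant work and halves n, so there are only about log2(n) calls in total.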


Let's define f(m) as the number of operations needed to solve a problem of size m. The 'problem' here is of course exponentiation (pow), i.e. x^n or pow(x, n). The pow function does not need to do more or less work if I change x, so the size of the problem does not depend on x. It does, however, depend on n. Let's say 2^4 is of size 4 and 3^120 is of size 120. (This makes sense if you see that 2^4 = 2*2*2*2 and 3^120 = 3*3*3*...*3.) The problem size is thus equal to n, the second parameter. We could, if you want, say the problem size is 2*log(n), but that would be silly.

Now we have that f(m) is the number of operations needed to calculate pow(x, m) for any x, because pow(x, m) is exactly the problem of size m. So if we have pow(x, y), then the number of operations is, by definition, f(y). For example, pow(3, 3*m/2) takes f(3*m/2) operations.

Finally, let's count the operations:

long pow( long x, int n ) {
    if (n == 0)  //1
        return 1;
    if (n == 1)  //1
        return x;
    if(isEven(n)) //1
        return pow(x, n / 2 ) * //that is f(n/2), y=n / 2
                 pow(x, n / 2); //also f(n/2)
    else
        return x * pow(x * x, n / 2); //1+1+f(n/2)
} 

Putting it together: f(n) = 2*f(n/2) + c1 when n is even, and f(n) = f(n/2) + c2 when n is odd. If you are only interested in the worst-case scenario, note that the odd case is less work, so f(n) is bounded above by the even case: f(n) <= 2*f(n/2) + c.
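Unrolling that bound (a quick sketch, assuming n is a power of two so the halving is exact):

\begin{aligned}
f(n) &\le 2f(n/2) + c \\
     &\le 4f(n/4) + (2 + 1)c \\
     &\le \cdots \\
     &\le 2^k f(n/2^k) + (2^k - 1)c \\
     &= n\,f(1) + (n - 1)c = O(n) \quad \text{at } k = \log_2 n.
\end{aligned}

This matches the 2n - 1 call count from the first answer.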


