# Performance Analysis

• Complexity
• Definition (HSM Ch.1.6)
• Summing the first N integers
• Fibonacci numbers
• The 1st Fibonacci number is 0
• The 2nd Fibonacci number is 1
• The Nth Fibonacci number is the sum of the (N-1)th and (N-2)th Fibonacci numbers, for N > 2.
• The Fibonacci series is
0, 1, 1, 2, 3, 5, 8, 13, ...
• RecursiveFibonacci.cpp - Computing the Nth Fibonacci number using the definition directly.
• Observe that it can be done linearly. LinearFibonacci.cpp - Computing the Nth Fibonacci number linearly.
• What's the difference? How big is the difference?
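• A minimal sketch of the two approaches (the course files RecursiveFibonacci.cpp and LinearFibonacci.cpp are not reproduced here; these versions use the chapter's 1-based indexing, where the 1st Fibonacci number is 0):
```cpp
// Recursive: follows the definition directly. Each call spawns two more
// calls, so the running time grows exponentially in n.
long fibRecursive(int n) {
    if (n == 1) return 0;   // 1st Fibonacci number
    if (n == 2) return 1;   // 2nd Fibonacci number
    return fibRecursive(n - 1) + fibRecursive(n - 2);
}

// Linear: keeps only the last two values; one loop pass per index, O(n) time.
long fibLinear(int n) {
    if (n == 1) return 0;
    long prev = 0, curr = 1;            // 1st and 2nd Fibonacci numbers
    for (int i = 3; i <= n; ++i) {
        long next = prev + curr;
        prev = curr;
        curr = next;
    }
    return curr;
}
```
• Both return the same values (e.g., the 8th Fibonacci number, 13), but the recursive version recomputes the same subproblems over and over.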
• The space complexity of a program is the amount of memory it needs to run to completion.
• The time complexity of a program is the amount of CPU time it needs to run to completion.
• Performance analysis estimates space and time complexity in advance, while performance measurement measures the space and time taken in actual runs.

• Space Complexity Analysis (HSM Ch.1.6.1.1)
• Constant and variable parts
• Examples
• Arithmetic expression (HSM Program 1.12)
• Constant space array sum (HSM Program 1.13)
• ArrayPlay.cpp - Array manipulation
• Linear space array sum (HSM Program 1.14)
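• A sketch of the contrast (in the spirit of HSM Programs 1.13 and 1.14, not copies of them): the iterative sum needs a fixed amount of extra memory regardless of n, while the recursive sum needs one activation record per element, so its extra space grows linearly with n.
```cpp
// Constant extra space: just s, i, and the parameters, for any n.
float iterativeSum(const float a[], int n) {
    float s = 0;
    for (int i = 0; i < n; ++i)
        s += a[i];
    return s;
}

// Linear extra space: the recursion goes n + 1 calls deep, and each call
// keeps its parameters and return address on the stack until it returns.
float recursiveSum(const float a[], int n) {
    if (n <= 0) return 0;
    return recursiveSum(a, n - 1) + a[n - 1];
}
```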

• Time Complexity Analysis by Counting Steps (HSM Ch.1.6.1.2)
• A program step is a segment of a program whose execution time is (in principle) independent of instance characteristics.
• Measure the number of steps performed after the program is loaded in memory, with respect to the instance characteristics.
• Number of steps per statement
• Declarations = count for constructor, or 0 if none
• Expressions = sum of counts for functions invoked, or 1 if none
• Assignment = Size of lvalue + count for evaluating RHS
• Conditional loop control = count for condition
• Definite loop control = 1 if instance independent, otherwise count for initialization and condition the first time and count for reinitialization and condition thereafter.
• Switch control = count for selection expression
• Conditional control = count for condition
• Function invocation = count for instance dependent value parameters, or 1 if none, plus count for local variable declarations
• Memory management (new, delete) = count for constructor or destructor, or 1 if none
• Jumps = count for any expression evaluated, or 1 if none
• Inserting a counter
• Linear sum (HSM Program 1.13,1.15,1.16)
• Recursive sum (HSM Program 1.14,1.17)
• ArrayPlay.cpp - Array manipulation
• Matrix addition (HSM Program 1.18-20)
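• A sketch of the instrumentation idea, modeled on HSM's instrumented sum (the book's exact programs are listed above): a global counter is incremented once for every statement whose step count is nonzero, and for an n-element array the final value works out to 2n + 3.
```cpp
int count = 0;  // global step counter; reset it before each run

float sumWithCount(const float a[], int n) {
    float s = 0;
    count++;                        // for the assignment to s
    for (int i = 0; i < n; ++i) {
        count++;                    // for each successful loop test
        s += a[i];
        count++;                    // for the assignment to s
    }
    count++;                        // for the final, failing loop test
    count++;                        // for the return
    return s;
}
```
• Each iteration contributes 2 steps, plus 3 steps outside the loop: 2n + 3 in total (11 for n = 4).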
• Building a table
• Linear sum (HSM Program 1.13, Table 1.1)
• Recursive sum (HSM Program 1.14, Table 1.2)
• Matrix addition (HSM Program 1.18, Table 1.3)
• This approach can be parameterized by input characteristics, e.g., number of inputs, value of inputs.
• This approach relies on the algorithm having a determined execution path.
• If the execution path is not determined, then best case, average case, and worst case complexity may be considered.
• The `compare` function can take 1 or 2 steps, and the number of loop iterations depends on the exponents of the polynomials' terms.
• In the best case the polynomials' terms have matching exponents: the `while` loop runs once per term and the `for` loops are not entered. The count is 12N + 8, for N terms.
• In the worst case the polynomials' terms have no matching exponents: the `while` loop runs 2·min(N, M) times, and a `for` loop runs |N - M| times. The count is 8(N+M) + 5(N-M) + 8 = 13N + 3M + 8, for N and M terms with N ≥ M. If N = M, that's 16N + 8.
• Building a worst case table
• In any case, step count does not give exact data, as the size of a step is not constant.
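• A best/worst-case contrast smaller than the polynomial example (sequential search; this illustration is not from HSM): the loop body runs once in the best case, when the key is the first element, and n times in the worst case, when the key is absent.
```cpp
// Returns the index of key in a[0..n-1], or -1 if absent. The examined
// out-parameter reports how many elements were compared, to expose the
// gap between the best and worst cases.
int sequentialSearch(const int a[], int n, int key, int &examined) {
    examined = 0;
    for (int i = 0; i < n; ++i) {
        ++examined;
        if (a[i] == key) return i;  // best case: i == 0, one comparison
    }
    return -1;                      // worst case: n comparisons
}
```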

• Asymptotic Complexity Analysis (HSM Ch.1.6.1.3)
• A motivation for analysis is to compare algorithms
• Exact step count may be difficult or impossible
• Step count depends on step size
• Complexity is typically proportional to input "size"
• For small input, complexity is often small
• For large input, complexity comparison may be based on orders of magnitude
• Orders of magnitude
• Given an array of N integers, how complex is:
• Add 1 to the first element. Write down the result.
• Look at the middle element; if it is odd then repeat this operation on the top half of the array, otherwise repeat it on the bottom half of the array, until the portion of the array being used has only one element. Write down that last element.
• Add 7 to each element. Write down each result.
• For each element, do the recursive middle-element operation described above, but make the decision based on the sign of the difference between the middle element and the element under consideration.
• For each element, multiply it by each other element. Write down each result.
• Write down 0. For each element of the array, write down the sum and difference between the element and each value written down when using the previous element (starting with 0 the first time round).
• Commonly used orders of magnitude are:
• Constant: 1
• Logarithmic: log_b(N)
• Linear: N
• N log N: N · log_b(N)
• Polynomial (of order m): N^m
• Exponential (of base k): k^N
• For large N, differences due to factors of proportion, and differences due to added terms of a smaller order of magnitude, are insignificant compared to the differences caused by higher complexity. See HSM Figure 1.3 and Table 1.7.
• Example: an algorithm of complexity c1·N² + c2·N is worse than one of complexity c3·N log N, for sufficiently large values of N.
• For smaller N, the factors of proportion and smaller added terms are significant. There is some value of N after which the larger order of magnitude dominates. This is called the break-even point.
• The break-even point cannot be determined analytically; empirical evaluation is needed.
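• For instance, with the made-up step counts f(N) = N² and g(N) = 32N, a quick scan finds the break-even point: g is larger up to N = 32, and f overtakes it from N = 33 on. (For formulas this simple the point can also be read off algebraically; for real programs the coefficients are only known by measurement, hence the empirical evaluation.)
```cpp
// Finds the smallest N at which the higher-order count f(N) = N*N exceeds
// the lower-order count g(N) = 32*N. Both coefficients are invented for
// illustration; real break-even points come from measured step counts.
int breakEven() {
    for (int n = 1; n <= 1000; ++n)
        if (n * n > 32 * n) return n;
    return -1;  // not found in range
}
```
• breakEven() returns 33, since 33² = 1089 > 32·33 = 1056, while 32² = 1024 does not exceed 32·32 = 1024.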
• Big-O notation
• f(N) = O(g(N)) iff there exist positive constants c and N0 such that f(N) <= cg(N) for all N >= N0
• O(g(N)) provides an upper bound for a true complexity f(N)
• Example: 3N+2 = O(N) because 3N+2 <= 4N for all N >= 2
• Note that g(N) does not need a constant multiplier: any constant factor on the principal term can be absorbed into the constant c.
• Added terms of smaller order can be discarded by enlarging c and N0.
• Examples
• ```cpp
for (int k = 1; k <= n / 2; k++) {
    for (int j = 1; j <= n * n; j++) {
        // do something
    }
}
```
• ```cpp
for (int k = 1; k <= n / 2; k++) {
    // do something
}
for (int j = 1; j <= n * n; j++) {
    // do something
}
```
• ```cpp
int k = n;
while (k > 1) {
    // do something
    k = k / 2;
}
```
• LinearFibonacci.cpp
• RecursiveFibonacci.cpp
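• The three loop skeletons above can be checked by counting how many times the body runs. With integer division, for n = 10 the counts come to 500, 105, and 3, matching O(n³), O(n²), and O(log n):
```cpp
// Nested loops: the body runs (n/2) * n*n times, so O(n^3).
long nestedCount(int n) {
    long c = 0;
    for (int k = 1; k <= n / 2; ++k)
        for (int j = 1; j <= n * n; ++j)
            ++c;  // "do something"
    return c;
}

// Sequential loops: n/2 + n*n body runs; the larger term dominates, O(n^2).
long sequentialCount(int n) {
    long c = 0;
    for (int k = 1; k <= n / 2; ++k) ++c;
    for (int j = 1; j <= n * n; ++j) ++c;
    return c;
}

// Halving loop: k is halved each pass, so about log2(n) body runs, O(log n).
long halvingCount(int n) {
    long c = 0;
    for (int k = n; k > 1; k /= 2)
        ++c;
    return c;
}
```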
• Omega notation
• f(N) = Omega(g(N)) iff there exist positive constants c and N0 such that cg(N) <= f(N) for all N >= N0
• Omega(g(N)) provides a lower bound for a true complexity f(N)
• Example: 3N+2 = Omega(N) because 3N <= 3N+2 for all N >= 1
• Theta notation
• f(N) = Theta(g(N)) iff there exist positive constants c1, c2, and N0 such that c1g(N) <= f(N) <= c2g(N) for all N >= N0
• Theta(g(N)) provides an upper and lower bound for a true complexity f(N)
• Example: 3N+2 = Theta(N) because 3N <= 3N+2 <= 4N for all N >= 2
• Examples
• sum.cpp - Linear sum (HSM Ch.1.6.1.2)
• sum.cpp - Recursive sum (HSM Ch.1.6.1.2)
• Permutations (HSM Ch.1.5.2)
• Binary search (HSM Ch.1.5.1)
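• A sketch of binary search on a sorted array (in the spirit of HSM Ch.1.5.1, not a copy of its program): each pass discards half of the remaining range, so at most about log2(n) + 1 probes are made, giving O(log n) time.
```cpp
// Returns the index of key in the sorted array a[0..n-1], or -1 if absent.
int binarySearch(const int a[], int n, int key) {
    int low = 0, high = n - 1;
    while (low <= high) {
        int mid = low + (high - low) / 2;  // avoids overflow of low + high
        if (a[mid] == key) return mid;
        if (a[mid] < key)
            low = mid + 1;                 // key is in the upper half
        else
            high = mid - 1;                // key is in the lower half
    }
    return -1;
}
```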
• Reality Check (HSM Ch.1.6.1.4)
• Asymptotic analysis relies on "large enough N"
• For small N, and equal asymptotic complexity, factors of proportion and differences due to added terms of a smaller order do matter.
• Constant complexity is boring, logarithmic complexity is magic, linear complexity is good, N log N complexity is realistic, polynomial complexity can get bad, exponential complexity is ugly, and it could be worse. See HSM Table 1.8.

## Exercises

• Counting steps: HSM Ch.1.6, exercises 4-6.
• Asymptotic analysis: HSM Ch.1.6, exercises 8,11.

## Exam Style Questions

• Define {time complexity, space complexity, performance analysis, performance measurement}.
• What is the space complexity of the following program?
[A program]
• Copy the following program to your answer book, and show the computation of its step count.
[A program]
• What are the six orders of magnitude commonly used in asymptotic time complexity analysis? Give their big-O form and English names.
• Define the {big-O, Omega, Theta} complexity measures.
• If a program has a step count of [some step count], what is its {big-O, Omega, Theta} complexity?
• What is the big-O time complexity of the following program?
[A program]