
01 - Order of complexity

 1. In essence, Big O notation tells you how fast a function grows or declines.
Big O notation (with a capital letter O, not a zero), also called 
Landau's symbol, is a symbolism used in complexity theory, computer 
science, and mathematics to describe the asymptotic behavior 
of functions.

  TRUE

  FALSE

 2. Landau's symbol comes from the name of the German number theoretician Edmund Landau who invented the notation. The letter O is used because ________________
A Big O expression has an order. In this context, the order quantifies 
how rapidly the function grows with input n and the shape of that growth.

Useful link:  http://web.mit.edu/16.070/www/lecture/big_o.pdf

  O stands for 'Order' and all Big O algorithms involve sorting

  O stands for 'Operand' as the notation refers to algorithms applied to numerical operands only

   the rate of growth of a function is also called its order

  O stands for 'Omega' and Omega refers to the last or final element in a sequence

 3. Refer to the following diagram and definition. Select the option that correctly defines A, B and C
A_B_C_Order_of_complexity.png

  A = quadratic; B = linear; C = constant

  A = constant; B = quadratic; C = linear

  A = constant; B = linear; C = quadratic

  A = linear; B = constant; C = quadratic

 4. Fill in the blanks for the following excerpt
Note that the big-O expressions do not have constants or 
low-order terms. This is because, ______________________
________________________________________________________
......(a constant-time algorithm will be faster than a 
linear-time algorithm, which will be faster than a 
quadratic-time algorithm).

  when N gets large enough, constants and low-order terms start to be significant and matter

  when N gets large enough, constants and low-order terms don't matter

  when N gets small enough, constants and low-order terms start to grow exponentially

  when N gets small enough, constants and low-order terms start to aggregate

 5. In general, how can you determine the running time of a piece of code? The answer is that it depends on what kinds of statements are used. Fill in the blanks for A, B and C
A_B_C_how_to_determine_complexity.png

  A = constant; B = constant; C = O(n)

  A = quadratic; B = constant; C = O(2)

  A = linear; B = linear; C = O(n)

  A = constant; B = constant; C = O(1)

 6. Read the following excerpt and see if you can fill in the blank with the most appropriate option
How efficient is an algorithm or piece of code? 
Efficiency covers lots of resources,
including:

• ________________________ 
• memory usage
• disk usage
• network usage

All are important but we will mostly talk about 
time complexity (______________)

  CPU (time) usage

  algorithm's author

  code length

  human error

 7. Read the following excerpt and decide which of the following statements is a natural conclusion and most accurate

  Complexity affects performance and performance will also affect complexity

  Complexity does not affect performance but performance will always affect complexity

   Complexity affects performance but not the other way around.

  Complexity does not affect performance and performance does not affect complexity. They are mutually exclusive

 8. The time required by a function/procedure is proportional to the number of "basic operations" that it performs. What is an example of a basic operation?

  one test (e.g., x = 0)

  one assignment (e.g. x := 0)

  All the listed options are valid examples of a basic operation

  one arithmetic operation (e.g., +, *).

 9. Can you fill in the blanks for the following excerpt?
Some functions/procedures perform the same number of 
operations every time they are called. 

For example, if StackSize in the Stack implementation always
returns the number of elements currently in the stack (or reports
that the stack is empty), then we say that StackSize takes
____________________

  polynomial time

  O(n)+1 time

  linear time

  constant time
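
The idea can be sketched in code. The Stack class below is a minimal, hypothetical Python version (not the implementation the excerpt refers to); the point is that size() performs the same amount of work whether the stack holds zero elements or a million, so it takes constant time, O(1).

```python
class Stack:
    """A minimal stack sketch used to illustrate a constant-time operation."""

    def __init__(self):
        self._items = []

    def push(self, item):
        self._items.append(item)

    def size(self):
        # One operation regardless of how many elements are stored: O(1).
        return len(self._items)

s = Stack()
for i in range(1000):
    s.push(i)
print(s.size())  # 1000
```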

 10. Fill in the blanks
In the BubbleSort algorithm, the number of
elements in the array, determines the number 
of operations performed by the algorithm.

This parameter (number of elements) is called the 
___________________________

  problem size / input size

  computational complexity

  queue size

  array or algorithm size

 11. Decide whether the following excerpt is true or false
When we are trying to find the complexity of the 
function/procedure/algorithm/program, we are not 
interested in the exact number of operations that are being
performed. Instead, we are interested in the relation of 
the number of operations to the problem size

  FALSE

  TRUE

 12. Typically, we are usually interested in the _____________: what is the ___________ number of operations that might be performed for a given problem size

  best case / minimum

  worst case / maximum

  worst case / minimum

  best case / maximum

 13. In reference to the following excerpt, in the worst case, inserting at the beginning of the array, _________________________________________
For example, inserting an element into an array, we 
have to move the current element and all of the elements that
come after it one place to the right in the array

  ALL of the elements in the array must be moved

  just the LAST element in the array must be moved

  just ONE of the elements in the array must be moved

  NONE of the elements in the array must be moved
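
The worst case described in the excerpt can be demonstrated with a short Python sketch (insert_at is a hypothetical helper written for illustration): inserting at the front forces every existing element to shift one place to the right, so the move count equals the array length, i.e. the operation is O(n) in the worst case.

```python
def insert_at(arr, index, value):
    """Insert value at index, shifting later elements right; returns the move count."""
    arr.append(None)                  # grow the array by one slot
    moves = 0
    for i in range(len(arr) - 1, index, -1):
        arr[i] = arr[i - 1]           # shift one element one place to the right
        moves += 1
    arr[index] = value
    return moves

data = [10, 20, 30, 40]
moved = insert_at(data, 0, 5)         # worst case: insert at the beginning
print(data)   # [5, 10, 20, 30, 40]
print(moved)  # 4 -- every original element had to move
```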

 14. For a linear-time algorithm, if the problem size doubles, the number of operations will be cut in half

  FALSE

  TRUE
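
The claim is easy to check empirically with a toy operation counter (a sketch, not a formal proof): for a linear-time algorithm, doubling the problem size doubles the number of operations rather than halving it.

```python
def count_ops(n):
    """Count the basic operations of a linear pass over n items."""
    ops = 0
    for _ in range(n):
        ops += 1          # one basic operation per element
    return ops

print(count_ops(1000))    # 1000
print(count_ops(2000))    # 2000 -- doubling n doubles the work
```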

 15. _______describes an algorithm that will always execute in the same time (or space) regardless of the size of the input data set.

  O(2 to the power of N)

  O(1)

  O(N to the power of 2)

  O(N)

 16. _______ describes an algorithm whose performance will grow linearly and in direct proportion to the size of the input data set.

  O(2 to the power of N)

  O(N to the power of 2)

  O(1)

  O(N)

 17. _____________ represents an algorithm whose performance is directly proportional to the square of the size of the input data set

  O(N to the power of 2)

  O(1)

  O(N)

  O(2 to the power of N)

 18. ____________ denotes an algorithm whose growth doubles with each addition to the input data set. The growth curve of an O(2 to the power of N) function is exponential - starting off very shallow, then rising meteorically

  O(N)

  O(2 to the power of N)

  O(1)

  O(N to the power of 2)

 19. O(log n) >> Example: Finding an item in a sorted array with a ____________ or a balanced search tree as well as all operations in a Binomial heap.

  logo search (using log tables)

  bubble search

  linear search

  binary search
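
A minimal Python sketch of the O(log n) search on a sorted array: each comparison halves the remaining range, so roughly log2(n) steps suffice.

```python
def binary_search(sorted_items, target):
    """Return the index of target in a sorted list, or -1 if absent: O(log n)."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1      # discard the lower half
        else:
            hi = mid - 1      # discard the upper half
    return -1

print(binary_search(list(range(0, 100, 2)), 42))  # 21
```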

 20. Determining if a number is even or odd; using a constant-size lookup table would be an example of:

  O(2 to the power of N) - exponential

  O(N) - linear

  O(1) - constant

  O(N to the power of 2) - quadratic
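
Both variants from the question can be sketched in a few lines of Python (the lookup table keyed by the last decimal digit is one possible reading of "constant-size lookup table"): either way, the work done does not depend on the magnitude of the input, so the check is O(1).

```python
def is_even(n):
    # One arithmetic test, independent of how large n is: O(1).
    return n % 2 == 0

# A constant-size lookup table, keyed by the last decimal digit (an
# illustrative assumption about what the table contains).
LAST_DIGIT_IS_EVEN = {d: d % 2 == 0 for d in range(10)}

def is_even_lookup(n):
    return LAST_DIGIT_IS_EVEN[abs(n) % 10]

print(is_even(7), is_even_lookup(7))            # False False
print(is_even(123456), is_even_lookup(123456))  # True True
```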

 21. Finding the (exact) solution to the travelling salesman problem using dynamic programming; determining if two logical statements are equivalent using brute-force search would be:

  linear

  quadratic

  exponential

  constant

 22. Multiplying two n-digit numbers by a simple algorithm; bubble sort (worst case or naive implementation), Shell sort, quicksort (worst case), selection sort or insertion sort

  exponential

  quadratic

  constant

  linear
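
The quadratic growth of a naive bubble sort can be made visible by counting comparisons (a sketch of the naive version; optimized bubble sorts stop early): the inner and outer loops each run on the order of n times, giving n * (n - 1) comparisons.

```python
def bubble_sort(items):
    """Naive bubble sort; returns (sorted list, comparison count ~ n^2)."""
    items = list(items)
    comparisons = 0
    n = len(items)
    for i in range(n):              # outer loop: n passes
        for j in range(n - 1):      # inner loop: n - 1 comparisons per pass
            comparisons += 1
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
    return items, comparisons

_, c10 = bubble_sort(range(10, 0, -1))
_, c20 = bubble_sort(range(20, 0, -1))
print(c10, c20)  # 90 380 -- doubling n roughly quadruples the comparisons
```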

 23. The loop executes N times, so the sequence of statements also executes N times. If we assume the statements are O(1), the total time for the for loop is:
for I in 1 .. N loop
sequence of statements
end loop;

  N * O(N), which is O(1) overall

  N * N, which is O(N) overall

  N * 1 which is O(1) overall

   N * O(1), which is O(N) overall.
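
The Ada-style loop above translates directly to Python; counting iterations confirms the reasoning: an O(1) body executed N times costs N * O(1) = O(N).

```python
def run_loop(n):
    """Mirror of 'for I in 1 .. N loop ... end loop;' with an O(1) body."""
    ops = 0
    for i in range(1, n + 1):   # the loop body runs exactly n times
        ops += 1                # stands in for the O(1) sequence of statements
    return ops

print(run_loop(5))   # 5
print(run_loop(50))  # 50 -- N * O(1) = O(N)
```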

 24. The outer loop executes N times. Every time the outer loop executes, the inner loop executes M times. As a result, the statements in the inner loop execute a total of:
Nested Loops
=============

for I in 1 .. N loop
for J in 1 .. M loop
sequence of statements
end loop;
end loop;

  O(N) * M times. Thus, the complexity is O(N1 x M2)

  N * M times. Thus, the complexity is O(N * M).

  N * N times. Thus, the complexity is O(N * N)

  M*O times. Thus, the complexity is O(N x 2)
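
A Python translation of the nested loops makes the count concrete: the inner body runs M times for each of the N outer iterations, N * M times in total, so the complexity is O(N * M).

```python
def nested_ops(n, m):
    """Count the executions of the innermost statement in the nested loops."""
    ops = 0
    for i in range(n):       # outer loop: N iterations
        for j in range(m):   # inner loop: M iterations per outer iteration
            ops += 1
    return ops

print(nested_ops(4, 6))  # 24 == 4 * 6, i.e. O(N * M)
```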

 25. When a statement involves a function/ procedure call, the complexity of the statement includes the complexity of the function/ procedure

  FALSE

  TRUE
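
This last point can be illustrated with a small sketch (the helper names are invented for the example): a loop that runs n times is not automatically O(n) if each iteration calls an O(n) function, because the call's complexity counts toward the statement's complexity, giving O(n * n) = O(n^2) overall.

```python
def linear_work(items):
    """A helper that does O(n) work per call."""
    return sum(1 for _ in items)

def caller(items):
    total = 0
    for _ in items:                   # the loop runs n times...
        total += linear_work(items)   # ...and each call costs O(n)
    return total                      # so the loop is O(n * n) = O(n^2)

print(caller(range(10)))  # 100
```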