Time Complexity

UC Berkeley, Fall 2004, E77
http://jagger.me.berkeley.edu/~pack/e77

Copyright 2005, Andy Packard. This work is licensed under the Creative Commons Attribution-ShareAlike License. To view a copy of this license, visit http://creativecommons.org/licenses/by-sa/2.0/ or send a letter to Creative Commons, 559 Nathan Abbott Way, Stanford, California 94305, USA.

Time complexity of algorithms

Dependency of:
– the time it takes to solve a problem
– as a function of the problem dimension/size

Examples:
– Sorting a list of length n
– Searching a list of length n
– Multiplying an n×n matrix by an n×1 vector

The time to solve a problem might depend on the data:
– Average-case time
– Best-case time: the data is well suited for the algorithm (can't be counted on)
– Worst-case time: the data is such that the algorithm performs poorly (time-wise)

Worst-case gives an upper bound on how much time will be needed to solve any instance of the problem.

Example: N-by-N matrix, N-by-1 vector, multiply

    Y = zeros(N,1);                      % initialize space, c1·N
    for i=1:N                            % initialize "for" loop, c2·N
        Y(i) = 0.0;                      % scalar assignment, c3 (N times)
        for j=1:N                        % initialize "for" loop, c2·N
            Y(i) = Y(i) + A(i,j)*x(j);   % c4 (3 accesses, 1 add, 1 multiply; N times)
        end                              % end of loop, return/exit, c5
    end                                  % end of loop, return/exit, c5

Total = c1·N + c2·N + N·(c3 + c2·N + N·(c4 + c5) + c5)
      = (c2 + c4 + c5)·N² + (c1 + c2 + c3 + c5)·N
      = c6·N² + c7·N
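The N² growth of the double loop can be confirmed by counting the inner multiply-add steps. A minimal sketch in Python (the slide's code is MATLAB; the function name `matvec_op_count` is my own):

```python
def matvec_op_count(N):
    """Count inner-loop multiply-add steps in the double-loop matrix-vector
    multiply from the slide. A, x are dummies; only the count matters."""
    count = 0
    Y = [0.0] * N                        # Y = zeros(N,1)
    A = [[1.0] * N for _ in range(N)]
    x = [1.0] * N
    for i in range(N):
        Y[i] = 0.0                       # scalar assignment, c3
        for j in range(N):
            Y[i] = Y[i] + A[i][j] * x[j] # c4: 3 accesses, 1 add, 1 multiply
            count += 1
    return count

print(matvec_op_count(10))   # 100 = 10^2
print(matvec_op_count(20))   # 400: doubling N quadruples the work
```

Doubling N multiplies the dominant c6·N² term by 4, which the counts reflect directly.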

Asymptotic time complexity of algorithms

Dependency of:
– the time it takes to solve a problem
– as a function of the problem dimension/size

but the formula may only be valid for "large" problems. So, we keep track of "growth rates" of the computational workload as a function of problem dimension.

Order, called "big O" notation

Given two functions f and g, say that "f is of order g" if there is a constant c and a value x0 such that

    f(x) ≤ c·g(x) for all x > x0

So, apart from a fixed multiplicative constant, the function g is an upper bound on the function f, valid for large values of its argument.

Notation: write f = O(g) to mean "f is of order g". Sometimes write f(x) = O(g(x)) to remind us how the arguments are labeled.

Not equality, but "belongs to a class"

Recall that f(n) = O(g(n)) means that there is a constant c and a value n0 such that f(n) ≤ c·g(n) for all n > n0. The = sign does not mean equality! It means that f is an element of the set of functions which are eventually bounded by (different) constant multiples of g. Therefore, it is correct/ok to write

    3n = O(n),   3n = O(n²),   3n = O(n³·log n)
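These memberships can be spot-checked numerically against the definition. A minimal sketch in Python, where `eventually_bounded` and the particular constants c, n0 are my own choices (a finite check over a range, not a proof):

```python
import math

def eventually_bounded(f, g, c, n0, n_max=10_000):
    """Spot-check f(n) <= c*g(n) for all n0 < n <= n_max."""
    return all(f(n) <= c * g(n) for n in range(n0 + 1, n_max + 1))

f = lambda n: 3 * n
print(eventually_bounded(f, lambda n: n, c=3, n0=0))                   # True: 3n = O(n)
print(eventually_bounded(f, lambda n: n * n, c=1, n0=3))               # True: 3n = O(n^2)
print(eventually_bounded(f, lambda n: n**3 * math.log(n), c=1, n0=2))  # True: 3n = O(n^3·log n)
```

Note that different choices of g admit different (c, n0) pairs; for g(n) = n² even c = 1 works once n > 3.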

Big O: Examples

Example: f(n) = 31, g(n) = 1.

Note: for all n > 0, 31 ≤ 31·1, i.e. f(n) ≤ 31·g(n). For all n, f is bounded above by 31g. Write f(n) = O(g(n)), or 31 = O(1).

Big O: Examples

Example: f(n) = 4n² + 7n + 12, g(n) = n².

Note: for n > 8, 4n² + 7n + 12 < 5n². For large enough n, f is bounded above by 5g. Write f(n) = O(g(n)), f = O(g), or 4n² + 7n + 12 = O(n²).

[Plot: f(n) = 4n² + 7n + 12 and 5·g(n) = 5n², plotted for n from 0 to 15, values up to 1200.]
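The crossover claimed above can be checked directly. A minimal sketch in Python (the finite check range is my own choice):

```python
# Verify the slide's claim: for n > 8, 4n^2 + 7n + 12 < 5n^2,
# so f = O(n^2) with c = 5 and n0 = 8.
f = lambda n: 4 * n**2 + 7 * n + 12
g = lambda n: n**2

print(f(8) < 5 * g(8))                                  # False: 324 vs 320, n = 8 is too small
print(all(f(n) < 5 * g(n) for n in range(9, 10_000)))   # True: bound holds from n = 9 on
```

The check at n = 8 shows why the definition needs the "for all n > n0" clause: the bound is allowed to fail for small n.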

Big O: Relationships

Suppose f1 and f2 satisfy: there is a value n0 such that f1(n) ≤ f2(n) for all n > n0. Then f1(n) + f2(n) ≤ 2·f2(n) for all n > n0. Hence

    f1 + f2 = O(f2)

Example 3: 27·log(n) + 19 + n = O(n), with f1(n) = 27·log(n) + 19 and f2(n) = n.

Generalization: f1 = O(f2) ⇒ f1 + f2 = O(f2)

Big O: Relationships

Suppose positive functions f1, f2, g1, g2 satisfy f1 = O(g1) and f2 = O(g2). Then

    f1·f2 = O(g1·g2)

Why? There exist c1, c2, n1, n2 such that fi(n) ≤ ci·gi(n) for all n > ni. Take c0 = c1·c2 and n0 = max(n1, n2). Multiply to give

    f1(n)·f2(n) ≤ c0·g1(n)·g2(n) for all n > n0

Example: (12n² + 2·log n)·(27·log(n) + 19 + n) = O(n³), with f1(n) = 12n² + 2·log n and f2(n) = 27·log(n) + 19 + n.
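The product rule for this example can be spot-checked with explicit constants. A minimal sketch in Python; the constants c0 = 13·47 and n0 = 1 are my own (from f1(n) ≤ 13n² and f2(n) ≤ 47n for n ≥ 1), not the tightest possible:

```python
import math

# Product-rule example: (12n^2 + 2 log n)(27 log n + 19 + n) = O(n^3).
f1 = lambda n: 12 * n**2 + 2 * math.log(n)
f2 = lambda n: 27 * math.log(n) + 19 + n
g  = lambda n: n**3

# f1(n) <= 13 n^2 (since 2 log n <= n^2) and f2(n) <= 47 n for n >= 1,
# so c0 = 13 * 47 witnesses f1*f2 = O(n^3).
c0 = 13 * 47
print(all(f1(n) * f2(n) <= c0 * g(n) for n in range(2, 10_000)))   # True
```

Multiplying the two witnesses (c1, n1) and (c2, n2) is exactly the construction in the "Why?" step above.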

Asymptotic: Relationships

Obviously, for any positive function g, g = O(g). Let β be a positive constant. Then β·g = O(g) as well.

Example 4: 13n³ = O(n³). Message: this bounds the growth rate. If n doubles, then the bound grows by 8.

Example 5: log₅₁₂(n) = O(log n)

Example: N-by-N matrix, N-by-1 vector, multiply

    Y = zeros(N,1);                      % initialize space, c1·N
    for i=1:N                            % initialize "for" loop, c2·N
        Y(i) = 0.0;                      % scalar assignment, c3 (N times)
        for j=1:N                        % initialize "for" loop, c2·N
            Y(i) = Y(i) + A(i,j)*x(j);   % c4 (N times)
        end                              % end of loop, return/exit, c5
    end                                  % end of loop, return/exit, c5

Total = c6·N² + c7·N
• Problem size affects (is, in fact) N
• Processor speed, processor and language architecture, i.e., technology, affect c6 and c7

Hence, "this algorithm of matrix-vector multiply has O(N²) complexity."

Time complexity of familiar tasks

Task                                                        Growth rate
Matrix/vector multiply                                      O(N²)
Getting a specific element from a list                      O(1)
Dividing a list in half, dividing one half in half, etc.    O(log₂ N)
Binary search                                               O(log₂ N)
Scanning (brute-force search) a list                        O(N)
Nested for loops (k levels)                                 O(N^k)
MergeSort                                                   O(N·log₂ N)
BubbleSort                                                  O(N²)
Generate all subsets of a set of data                       O(2^N)
Generate all permutations of a set of data                  O(N!)
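Two rows of the table, O(N) scanning and O(log₂ N) binary search, can be contrasted by counting loop iterations. A minimal sketch in Python; the function names are my own, and binary search assumes a sorted list:

```python
def linear_scan_steps(data, target):
    """Brute-force search; worst case examines every element: O(N)."""
    steps = 0
    for item in data:
        steps += 1
        if item == target:
            break
    return steps

def binary_search_steps(data, target):
    """Halve the search range each step: O(log2 N) on a sorted list."""
    lo, hi, steps = 0, len(data) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if data[mid] == target:
            break
        elif data[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

data = list(range(1024))                # sorted list, N = 1024
print(linear_scan_steps(data, 1023))    # 1024 steps: worst case touches all N elements
print(binary_search_steps(data, 1023))  # 11 steps: about log2(1024) + 1
```

Doubling N adds roughly 1024 steps to the scan but only one step to the binary search, which is the practical meaning of the two growth rates.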
