SEQUENCING & SORTING
Budi Hartono, ST, MPM

An algorithm is a sequence:
• Instructions are performed one by one
• Each instruction is done exactly once
• The processor performs the process exactly as instructed

Sequence does matter

Algorithm Sequence_1
Declaration: A, B : integer
Description:
  A ← 10
  A ← 2*A
  B ← A
  Write(B)    { prints 20 }

Algorithm Sequence_2
Declaration: A, B : integer
Description:
  A ← 10
  B ← A
  A ← 2*A
  Write(B)    { prints 10 }
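The effect of statement order can be checked directly. A minimal Python sketch of the two algorithms above (the function names are illustrative, not from the slides):

```python
def sequence_1():
    a = 10
    a = 2 * a   # double A first
    b = a       # then copy into B
    return b

def sequence_2():
    a = 10
    b = a       # copy into B first
    a = 2 * a   # then double A
    return b

print(sequence_1(), sequence_2())  # 20 10
```

The same three statements produce different results purely because of their order.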

Example – Calculating the Midpoint

Given two points P1(x1, y1) and P2(x2, y2), the midpoint P3(x3, y3) is:

  x3 = (x1 + x2) / 2
  y3 = (y1 + y2) / 2

Algorithm MidPoint
Declaration
  type point : record < x, y : real >
  P1, P2, P3 : point
Description
  read(P1.x, P1.y)
  read(P2.x, P2.y)
  P3.x ← (P1.x + P2.x) / 2
  P3.y ← (P1.y + P2.y) / 2
  Write(P3.x, P3.y)
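A minimal Python sketch of the MidPoint algorithm, representing each point as an (x, y) tuple instead of a record:

```python
def midpoint(p1, p2):
    """Return the midpoint of two points given as (x, y) tuples."""
    x3 = (p1[0] + p2[0]) / 2
    y3 = (p1[1] + p2[1]) / 2
    return (x3, y3)

print(midpoint((0.0, 0.0), (4.0, 2.0)))  # (2.0, 1.0)
```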

Example – Converting Seconds to Hours

4000 seconds = 1 hour + 6 minutes + 40 seconds

Algorithm Conversion
Declaration
  type hour : record < hh : integer
                       mm : integer
                       ss : integer >
  H : hour
  totalSeconds : integer
  spillage : integer   { temporary variable }
Description
  read(totalSeconds)
  H.hh ← totalSeconds div 3600
  spillage ← totalSeconds mod 3600
  H.mm ← spillage div 60
  H.ss ← spillage mod 60
  Write(H.hh, H.mm, H.ss)
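The div/mod steps above map directly onto Python's divmod. A small sketch:

```python
def to_hms(total_seconds):
    """Split seconds into (hours, minutes, seconds).

    hh is total div 3600; the spillage (remainder) is then split
    into minutes and seconds, mirroring the algorithm above.
    """
    hh, spillage = divmod(total_seconds, 3600)
    mm, ss = divmod(spillage, 60)
    return hh, mm, ss

print(to_hms(4000))  # (1, 6, 40)
```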

SORTING

What Are Sorting Algorithms?
• A way to put items in sequence based on some value attached to each item
• Examples: sorting the cards when playing poker; sorting the scores after an examination

Why Are Sorting Algorithms Important?
• Wide use in research across almost all disciplines:
  – Statistics – categorizing samples
  – Chemistry – handling experimental data
  – Sociology – telephone questionnaires
• Foundation of more complex algorithms

Sorting
• Two types: ascending and descending
  – Ascending: 3, 18, 25, 44, 99
  – Descending: 99, 44, 25, 18, 3
• Many methods, e.g.:
  – Straight selection (already discussed)
  – Bubble sort / exchange sort
  – Straight insertion
  – Quick sort

How To Evaluate Different Sorting Algorithms?
• Basic evaluation:
  – Time to complete the sorting
  – Expressed as a function of the number of items to be sorted
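This evaluation can be made concrete with a small timing sketch (the helper name is illustrative, not from the slides), measuring sorting time as the input size grows:

```python
import random
import time

def time_sort(sort_fn, n):
    """Return the time in seconds sort_fn takes on n random integers."""
    data = [random.randrange(n) for _ in range(n)]
    start = time.perf_counter()
    sort_fn(data)
    return time.perf_counter() - start

# Time Python's built-in sort on growing inputs.
for n in (1_000, 2_000, 4_000):
    print(n, time_sort(sorted, n))
```

Plotting the measured time against n reveals the algorithm's growth rate.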

(1) Straight Selection
Recall the previous discussion. Example array (N = 6), to be ordered from the smallest value:

  Index: 1  2  3  4  5  6
  Value: 12 29 17 56 11 23

• Iterations: N − 1 times
• Find the smallest remaining element
• Exchange it into position

(1) Straight Selection Example

  Pass #  Results
  0       12 29 17 56 11 23
  1       11 29 17 56 12 23
  2       11 12 17 56 29 23
  3       11 12 17 56 29 23
  4       11 12 17 23 29 56
  5       11 12 17 23 29 56

Total cycles of ‘comparison processing’ (K):
  K = (N² − N) / 2, where N = number of data items
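A straight-selection sketch in Python, also counting the comparisons to check the K formula above:

```python
def selection_sort(data):
    """Straight selection: N-1 passes; each pass finds the smallest
    remaining element and exchanges it into position."""
    a = list(data)
    n = len(a)
    comparisons = 0
    for i in range(n - 1):            # N-1 iterations
        smallest = i
        for j in range(i + 1, n):     # find the smallest
            comparisons += 1
            if a[j] < a[smallest]:
                smallest = j
        a[i], a[smallest] = a[smallest], a[i]  # position exchange
    return a, comparisons

result, k = selection_sort([12, 29, 17, 56, 11, 23])
print(result, k)  # [11, 12, 17, 23, 29, 56] 15, and (6² - 6) / 2 = 15
```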

(2) Bubble Sort
How do we find the smallest number (min)? By comparing two adjacent numbers; a TEMP variable is needed for the swap. Example array (N = 6):

  Index: 1  2  3  4  5  6
  Value: 87 74 71 54 88 60

Bubble Sort Example

  Pass #  Results
  0       87 74 71 54 88 60
  1       74 71 54 87 60 88
  2       71 54 74 60 87 88
  3       54 71 60 74 87 88
  4       54 60 71 74 87 88
  5       54 60 71 74 87 88

Total cycles of ‘comparison processing’ (K):
  K = (N² − N) / 2, where N = number of data items
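A minimal bubble-sort (exchange-sort) sketch in Python; the tuple swap plays the role of the TEMP variable:

```python
def bubble_sort(data):
    """Exchange sort: repeatedly compare adjacent pairs and swap
    them when out of order, so large values 'bubble' to the end."""
    a = list(data)
    n = len(a)
    for i in range(n - 1):              # N-1 passes
        for j in range(n - 1 - i):      # compare adjacent pairs
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]  # swap (TEMP)
    return a

print(bubble_sort([87, 74, 71, 54, 88, 60]))  # [54, 60, 71, 74, 87, 88]
```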

Homework
I leave the flowchart for the bubble-sort method as homework.

Bubble Sort – Time Complexity
• Number of items: n
• Worst case: (n − 1) + (n − 2) + … + 2 + 1 = n(n − 1) / 2
• Time complexity: O(n²)
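The worst-case sum can be verified numerically; a quick Python check of the identity above:

```python
# Check (n-1) + (n-2) + ... + 2 + 1 = n(n-1)/2 for several sizes.
for n in (2, 6, 100):
    assert sum(range(1, n)) == n * (n - 1) // 2
print("comparison-count identity holds")
```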

(3) Straight Insertion
Efficient for sorting a small number of elements. Example array (N = 8):

  Index: 1  2  3  4  5  6  7  8
  Value: 10 5  7  9  2  1  8  6

(3) Straight Insertion Example

  Pass #  Results
  0       10  5  7  9  2  1  8  6
  1        5 10  7  9  2  1  8  6
  2        5  7 10  9  2  1  8  6
  3        5  7  9 10  2  1  8  6
  4        2  5  7  9 10  1  8  6
  5        1  2  5  7  9 10  8  6
  6        1  2  5  7  8  9 10  6
  7        1  2  5  6  7  8  9 10

Straight Insertion – Flow Chart (as structured pseudocode)

  START
  Read N, X[I]
  FOR I = 1 TO N-1
      T = X[I]
      X[0] = T          { sentinel }
      J = I - 1
      WHILE T < X[J]
          X[J+1] = X[J]
          J = J - 1
      X[J+1] = T
  NEXT I
  PRINT RESULT
  STOP
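A Python sketch following the flow chart above; instead of the X[0] sentinel, the loop guard checks the index directly, since Python lists have no spare slot at index 0:

```python
def insertion_sort(data):
    """Straight insertion: take X[I], shift larger elements of the
    sorted prefix one place right, then insert X[I] into the gap."""
    x = list(data)
    for i in range(1, len(x)):
        t = x[i]                      # T = X[I]
        j = i - 1
        while j >= 0 and t < x[j]:    # WHILE T < X[J]
            x[j + 1] = x[j]           # shift right
            j -= 1
        x[j + 1] = t                  # X[J+1] = T
    return x

print(insertion_sort([10, 5, 7, 9, 2, 1, 8, 6]))  # [1, 2, 5, 6, 7, 8, 9, 10]
```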

Straight Insertion – Time Complexity
• Number of items: n
• Worst case: 1 + 2 + 3 + … + (n − 1) = n(n − 1) / 2
• Time complexity: O(n²)

(4) Binary (Heap) Insertion
More efficient than the straight insertion method. Similar to straight insertion, but with a different method of insertion. Example array (N = 8):

  Index: 1  2  3  4  5  6  7  8
  Value: 10 5  7  9  2  1  8  6

(4) Binary (Heap) Insertion
Steps:
1. Define the location
2. Shift the ordered data
3. Insert the new data

(4) Binary (Heap) Insertion – (1) Define Location

The location is found by binary search on the ordered part of the array:
• LB = 1
• UB = I − 1
• MID = (LB + UB) div 2

Compare the data to be inserted, Q, with X[MID]:
• if Q < X[MID], continue the search in the lower half (UB = MID − 1)
• otherwise, continue in the upper half (LB = MID + 1)

The search ends when LB and UB cross; Q is then inserted at the final LB.

(4) Binary (Heap) Insertion – (2) Shift and (3) Insert

Shift the ordered elements to the right of the found location one place to the right, then insert the new data. The pass-by-pass results are identical to those of straight insertion; only the search for the insertion point differs. The binary search traces for two of the iterations:

  Iteration  Data  LB       UB       MID      Location
  5          1     1, 1     5, 2     3, 1     1
  6          8     1, 4, 4  6, 6, 4  3, 5, 4  5
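The three steps — binary search for the location, shift, insert — can be sketched in Python; bisect_left performs the LB/UB/MID search, and list.insert does the shift-and-insert:

```python
import bisect

def binary_insertion_sort(data):
    """Binary insertion: for each new element, (1) find its location
    in the ordered prefix by binary search, then (2) shift and
    (3) insert in one step."""
    ordered = []
    for q in data:
        loc = bisect.bisect_left(ordered, q)  # (1) define location
        ordered.insert(loc, q)                # (2)+(3) shift and insert
    return ordered

print(binary_insertion_sort([10, 5, 7, 9, 2, 1, 8, 6]))  # [1, 2, 5, 6, 7, 8, 9, 10]
```

Binary search reduces the comparisons per pass to O(log n), though the shifting still makes the overall worst case O(n²).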


Summary of Sorting Algorithms

  Algorithm       Time                 Notes
  selection-sort  O(n²)                slow (good for small inputs)
  insertion-sort  O(n²)                slow (good for small inputs)
  quick-sort      O(n log n) expected  fastest (good for large inputs)
  heap-sort       O(n log n)           fast (good for large inputs)
  merge-sort      O(n log n)           fast (good for huge inputs)

(5) Quick Sort
By C. A. R. Hoare. Based on two operations:
• Splitting the array into two sub-arrays by placing the first array element in a middle position such that all the numbers to its right are greater than all the numbers to its left (DivideAndConquer).
• Repeating the previous step recursively, dividing each sub-array into two sub-arrays in the same manner, until an empty sub-array is reached.

(5) Quick Sort Example

Array (N = 7): 10, 2, 17, 7, 16, 3, 9

  Index: 1  2  3  4  5  6  7
  Value: 10 2  17 7  16 3  9

The pivot X ends up at location J, with the smaller elements to its left and the greater elements to its right.

Quick-Sort
Quick-sort is a randomized sorting algorithm based on the divide-and-conquer paradigm:
• Divide: pick a random element x (called the pivot) and partition S into
  – L: elements less than x
  – E: elements equal to x
  – G: elements greater than x
• Recurse: sort L and G
• Conquer: join L, E, and G

Partition
We partition an input sequence as follows:
• We remove, in turn, each element y from S, and
• We insert y into L, E, or G, depending on the result of its comparison with the pivot x

Each insertion and removal is at the beginning or at the end of a sequence, and hence takes O(1) time. Thus, the partition step of quick-sort takes O(n) time. The pivot is sometimes chosen as the first element of S (assuming S is unsorted), or as the median of the first, middle, and last elements of S.

Algorithm partition(S, p)
  Input: sequence S, position p of pivot
  Output: subsequences L, E, G of the elements of S less than, equal to, or greater than the pivot, respectively
  L, E, G ← empty sequences
  x ← S.remove(p)
  while ¬S.isEmpty()
      y ← S.remove(S.first())
      if y < x
          L.insertLast(y)
      else if y = x
          E.insertLast(y)
      else { y > x }
          G.insertLast(y)
  return L, E, G
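A Python sketch of quick-sort using the L/E/G partition from the algorithm above, with list comprehensions standing in for the sequence operations:

```python
import random

def quick_sort(s):
    """Quick-sort via the L/E/G partition: pick a random pivot x,
    split into less/equal/greater, recurse on L and G, then join."""
    if len(s) <= 1:
        return list(s)
    x = random.choice(s)                # pivot
    l = [y for y in s if y < x]         # L: elements less than x
    e = [y for y in s if y == x]        # E: elements equal to x
    g = [y for y in s if y > x]         # G: elements greater than x
    return quick_sort(l) + e + quick_sort(g)   # conquer: join L, E, G

print(quick_sort([7, 2, 9, 4, 3, 7, 6, 1]))  # [1, 2, 3, 4, 6, 7, 7, 9]
```

This version builds new lists rather than partitioning in place, which keeps the correspondence with L, E, and G explicit at the cost of extra memory.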

Quick-Sort Tree
An execution of quick-sort is depicted by a binary tree:
• Each node represents a recursive call of quick-sort and stores
  – the unsorted sequence before the execution and its pivot
  – the sorted sequence at the end of the execution
• The root is the initial call
• The leaves are calls on subsequences of size 0 or 1 (i.e., we recurse on L or G only if it has 2 or more members)

Example tree for the input 7 4 9 6 2:

  7 4 9 6 2 → 2 4 6 7 9
    4 2 → 2 4
      2 → 2
    7 9 → 7 9
      9 → 9

Execution Example

The following sequence of slides traces quick-sort, step by step, on the input
7 2 9 4 3 7 6 1 → 1 2 3 4 6 7 7 9:
• Pivot selection
• Partition, recursive call, pivot selection
• Partition, recursive call, base case
• Recursive call, …, base case, join
• Recursive call, pivot selection
• Partition, …, recursive call, base case
• Join, join
[Figure: the quick-sort tree after each of these steps]

Worst-Case Running Time
• The worst case for quick-sort occurs when the pivot is the unique minimum or maximum element
• Then one of L and G has size n − 1 and the other has size 0
• The running time is proportional to the sum n + (n − 1) + … + 2 + 1
• Thus, the worst-case running time of quick-sort is O(n²)

  depth  time
  0      n
  1      n − 1
  …      …
  n − 1  1

Expected Running Time
Consider a recursive call of quick-sort on a sequence of size s:
• Good call: the sizes of L and G are each less than 3s/4
• Bad call: one of L and G has size greater than 3s/4

A call is good with probability 1/2 — half of the possible pivots cause good calls:

  1 2 3 4 | 5 6 7 8 9 10 11 12 | 13 14 15 16
  bad pivots   good pivots        bad pivots

Expected Running Time, Part 2
• Probabilistic fact: the expected number of coin tosses required to get k heads is 2k
• For a node of depth i, we expect:
  – i/2 ancestors are good calls
  – the size of the input sequence for the current call is at most (3/4)^(i/2) · n
• Therefore:
  – for a node of depth 2·log_(4/3) n, the expected input size is one
  – the expected height of the quick-sort tree is O(log n)
• The amount of work done at the nodes of the same depth is O(n)
• Thus, the expected running time of quick-sort is O(n log n)
[Figure: quick-sort tree with time per level O(n), expected height O(log n), total expected time O(n log n)]


Notice the following points in the DivideAndConquer procedure:
• Two search operations start together, one from the left and one from the right.
• At the beginning, the index Left is the same as First, and the index Right is the same as Last.
• During the operation, Right moves to the left and Left moves to the right.
• The search operations continue as long as the two searches have not met; they meet when Right = Left.
• At this point the WHILE loop ends, and the first element takes its new position, stored in the index Mid.
• The DivideAndConquer procedure does the main work of arranging the array passed to it; it is called from the quicksort procedure.
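The points above can be sketched as an in-place two-pointer partition in Python. This is one possible reading of the slide's description (exact index conventions in the original are not shown), with Left and Right scanning toward each other and the first element taken as the pivot:

```python
def divide_and_conquer(x, first, last):
    """Partition x[first..last] in place around the pivot x[first].
    Left scans right, Right scans left, until they meet at Mid."""
    pivot = x[first]
    left, right = first, last
    while left < right:
        while x[right] > pivot:                   # Right moves to the left
            right -= 1
        while left < right and x[left] <= pivot:  # Left moves to the right
            left += 1
        if left < right:
            x[left], x[right] = x[right], x[left]
    mid = right                          # the searches have met
    x[first], x[mid] = x[mid], x[first]  # pivot takes its new position
    return mid

def quicksort(x, first, last):
    """Recursively sort x[first..last] using DivideAndConquer."""
    if first < last:
        mid = divide_and_conquer(x, first, last)
        quicksort(x, first, mid - 1)
        quicksort(x, mid + 1, last)

data = [10, 2, 17, 7, 16, 3, 9]
quicksort(data, 0, len(data) - 1)
print(data)  # [2, 3, 7, 9, 10, 16, 17]
```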
