L11 Sorting & Searching

THREADED TREES.
✘ Binary tree traversal algorithms are written using either recursion or
programmer-written stacks. If the tree must be traversed frequently,
using stacks rather than recursion may be more efficient.
✘ A third alternative is a threaded tree. In a threaded tree, null pointers are
replaced with pointers to their successor nodes.
✘ To build a threaded tree, first build a standard binary search tree.
✘ Then traverse the tree, changing the null right pointers to point to their
successors.
✘ The traversal of a threaded tree is straightforward. Once you locate the
far-left node, you simply loop, following the thread (the right pointer) to
the next node. No recursion or stack is needed. When you find a null thread
(right pointer), the traversal is complete.
THREADED TREES. Inorder traversal (LNR)

Without threads, after reaching the far-left leaf, backtracking is performed
to process the right subtrees while navigating back up. This is especially
inefficient when the parent node has no right subtree.
Binary Trees
◼ Threaded Binary Tree

⚫ A binary tree with n nodes has n + 1 null pointers
⚫ These null pointers waste space
⚫ Replace them with threads pointing to the inorder successor and/or inorder predecessor (if any)

INORDER TRAVERSAL IS: 1 3 5 6 7 8 9 11 13

 Single threaded: makes inorder traversal easier
 Double threaded: makes inorder and postorder easier
Binary Trees

◼ Threaded Binary Tree (Continued…)


⚫ Implementation requirements
Use a boolean value – thread or child pointer (0 : child, 1 : thread)

⚫ Advantage: Stack not required for inorder traversal


⚫ Disadvantage: Adjustment of pointers during insertion and deletion of nodes
Binary Trees
◼ Threaded Binary Tree (Continued…)
⚫ Example (for inorder traversal)
Inorder traversal: 1 3 5 6 7 8 9 11 13
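The traversal described above can be sketched in Python. This is an illustrative sketch, not code from the slides; the node layout (a `right_is_thread` flag playing the role of the 0/1 boolean) is an assumption:

```python
class Node:
    """Node of a right-threaded binary tree.

    right_is_thread == True means `right` points to the inorder
    successor instead of a real child (0: child, 1: thread).
    """
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None
        self.right_is_thread = False

def leftmost(node):
    # Descend left links to reach the first node in inorder.
    while node.left is not None:
        node = node.left
    return node

def inorder_threaded(root):
    """Inorder traversal using threads: no stack, no recursion."""
    out = []
    node = leftmost(root) if root else None
    while node is not None:
        out.append(node.key)
        if node.right_is_thread:
            node = node.right            # follow thread to successor
        elif node.right is not None:
            node = leftmost(node.right)  # real child: go to its leftmost
        else:
            node = None                  # null right pointer: traversal done
    return out
```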
SORTING & SEARCHING.
Looking for data.
SORTING.
One of the most common data-processing
applications.
The process through which data are arranged
according to their values is called SORTING.
If data were not ordered, hours could be spent trying to
find a single piece of information.
Example: The difficulty of finding someone’s telephone
number in a telephone book that had no internal order.
SORT CLASSIFICATIONS.
TYPES OF SORTS.
Internal sort: all data are held in primary memory during the sorting process.
External sort: uses primary memory for the current data being sorted and
secondary storage for data not fitting in primary memory.
THREE INTERNAL SORT FAMILIES.

Insertion sorts: insertion sort, shell sort
Selection sorts: selection sort, heap sort
Exchange sorts: bubble sort, quick sort
SORT ORDER.
Data may be sorted in either ascending or descending sequence.
If the order of the sort is not specified, it is assumed to be ascending.
Examples of common data sorted in ascending sequence are the dictionary
and the telephone book.
Examples of common descending data are percentages of games won in a
sporting event such as baseball, or grade-point averages for honor students.
SORT STABILITY.
Is an attribute of a sort, indicating that data with
equal keys maintain their relative input order
in the output.

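As an illustration of stability, Python's built-in sort is stable, so records with equal keys keep their relative input order (the names and ages are made-up sample data):

```python
# Python's built-in sort is stable: records with equal keys keep
# their relative input order.  (Illustrative sample data only.)
people = [("Zed", 25), ("Amy", 30), ("Bob", 25), ("Cal", 30)]

by_age = sorted(people, key=lambda p: p[1])  # sort on age only
# Zed still precedes Bob (both 25); Amy still precedes Cal (both 30).
print(by_age)
```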
SORT EFFICIENCY.
Is a measure of the relative efficiency of a sort,
usually an estimate of the number of comparisons
and moves required to order an unordered list.

The best possible comparison-based sorting algorithms are the O(n log n) sorts.
PASSES.
During the sorting process, the data are traversed
many times. Each traversal of the data is referred
to as a sort pass.

Depending on the algorithm, the sort pass may traverse the
whole list or just a section of the list.
Also, a characteristic of a sort pass is the placement of one
or more elements in the sorted list.
SELECTION SORT.
✘ In each pass of the selection sort, the smallest element is
selected from the unsorted sublist and exchanged with the
element at the beginning of the unsorted list.
✘ If there is a list of n elements, therefore, n – 1 passes are
needed to completely rearrange the data.
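The pass structure described above can be sketched in Python (an illustrative sketch, not code from the slides):

```python
def selection_sort(a):
    """In-place selection sort: n - 1 passes; each pass swaps the
    smallest element of the unsorted sublist to its beginning."""
    n = len(a)
    for i in range(n - 1):              # i marks the wall: a[:i] is sorted
        smallest = i
        for j in range(i + 1, n):       # scan the unsorted sublist
            if a[j] < a[smallest]:
                smallest = j
        a[i], a[smallest] = a[smallest], a[i]   # exchange into place
    return a
```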
SELECTION SORT.

Example 2
Selection Sort Efficiency.
✘ It contains two loops.
✘ The outer loop executes n – 1 times. The inner loop also executes
up to n – 1 times.
✘ This is a classic example of the quadratic loop. Its sort effort,
in big-O notation, is therefore O(n²).
INSERTION SORT.
✘ Given a list, it is divided into two parts: sorted and
unsorted.
✘ In each pass the first element of the unsorted sublist is
transferred to the sorted sublist by inserting it at the
appropriate place.
✘ If the list has n elements, it will take at most n – 1 passes to
sort the data.
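A minimal Python sketch of these passes (illustrative; the names `hold` and `walker` follow the slides' terminology):

```python
def insertion_sort(a):
    """In-place insertion sort: each pass inserts the first unsorted
    element into its correct place in the sorted sublist."""
    for current in range(1, len(a)):
        hold = a[current]               # element being inserted
        walker = current - 1
        while walker >= 0 and hold < a[walker]:
            a[walker + 1] = a[walker]   # shift larger elements right
            walker -= 1
        a[walker + 1] = hold            # drop into the opened slot
    return a
```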
INSERTION SORT.
INSERTION SORT EFFICIENCY.
✘ The outer loop executes n – 1 times, from 1 through the last element in the list.
✘ For each outer loop, the inner loop executes from 0 to current times,
depending on the relationship between the hold key and the walker key.
✘ On the average, the inner loop processes through the data in half of the sorted
list. Because the inner loop depends on the setting for current, which is
controlled by the outer loop, we have a dependent quadratic loop, which is
mathematically stated as

✘ In big-O notation the dependent quadratic loop is O(n2).


✘ Therefore, the insertion sort efficiency is O(n2).
BUBBLE SORT.

✘ The list is divided into two sublists: sorted and unsorted.


✘ The smallest element is bubbled from the unsorted sublist and moved to
the sorted sublist.
✘ After moving the smallest to the sorted list, the wall moves one element to
the right, increasing the number of sorted elements and decreasing the
number of unsorted ones.
✘ Each time an element moves from the unsorted sublist to the sorted
sublist, one sort pass is completed.
✘ Given a list of n elements, the bubble sort requires up to n – 1 passes to
sort the data.
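The wall-and-pass description above can be sketched in Python (an illustrative sketch with the early-exit flag discussed in the efficiency slide; not code from the slides):

```python
def bubble_sort(a):
    """Bubble sort: each pass bubbles the smallest unsorted element
    down to the wall; an exchange flag allows early exit."""
    n = len(a)
    for current in range(n - 1):        # wall position for this pass
        sorted_flag = True
        # Walk from the end toward the wall, bubbling the smallest down.
        for walker in range(n - 1, current, -1):
            if a[walker] < a[walker - 1]:
                a[walker], a[walker - 1] = a[walker - 1], a[walker]
                sorted_flag = False
        if sorted_flag:                 # no exchange: list already sorted
            break
    return a
```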
BUBBLE SORT.
BUBBLE SORT EFFICIENCY.
✘ Uses two loops to sort the data.
✘ The outer loop tests two conditions: the current index and a sorted flag.
✘ Assuming that the list is not sorted until the last pass, we loop through
the array n times. The number of iterations of the inner loop depends on
the current location in the outer loop, so on average it loops through half
the list. The total number of loops is the product of both loops,
roughly n × n/2 = n²/2.
✘ The bubble sort efficiency is therefore O(n²).


QUICK SORT.
✘ Quick sort, also an exchange sort method, was developed by C. A. R.
Hoare in 1962.
✘ Quick sort is an exchange sort in which a pivot key is placed in its
correct position in the array while the other elements, possibly widely
dispersed across the list, are rearranged around it.
✘ It is more efficient than the bubble sort because a typical exchange
involves elements that are far apart, so fewer exchanges are
required to correctly position an element.
QUICK SORT.
✘ Also called partition-exchange sort.
✘ Each iteration of the quick sort selects an element, known as the pivot,
and divides the list into three groups:
- a partition of elements whose keys are less than the pivot’s key,
- the pivot element, placed in its ultimately correct location in the list,
- a partition of elements whose keys are greater than or equal to the pivot’s key.
✘ The pivot can be any element of the array: the first element, the last
element or any random element.
✘ The approach is recursive.
LOGIC OF PARTITION.
✘ In the array {52, 37, 63, 14, 17, 8, 6, 25} , 25 is taken as pivot.
✘ First pass: {6, 8, 17, 14, 25, 63, 37, 52}
✘ After the first pass the pivot is set at its position in the final
sorted array, with all the elements smaller than it on its left and
all the elements larger than it on its right.
✘ Next, {6 8 17 14} and {63 37 52} are considered as two
separate subarrays.
✘ The same recursive logic is applied to them, and this continues
until the complete array is sorted.
QUICK SORT ALGORITHM.
✘ Two Algorithms:
- Quick Sort Recursive
- Partition
QUICK SORT EXAMPLES.
QUICK SORT EXAMPLES. Pivot Element = 6
QUICK SORT EXAMPLES. (PARTITIONS)
QUICK SORT ALGORITHM.
Algorithm Quicksort (Array, Low, High)
1. If (Low < High) Then
1. Set Mid = Partition (Array, Low, High)
2. Quicksort (Array, Low, Mid – 1)
3. Quicksort (Array, Mid + 1, High)
2. End
QUICK SORT ALGORITHM.
Algorithm Partition (Array, Low, High)
1. Set Key = Array[Low], i = Low + 1, j = High
2. Repeat Steps A through C
   A. while (i < High && Key ≥ Array[i]) i++
   B. while (Key < Array[j]) j--
   C. if (i < j) then
         swap Array[i] with Array[j]
      else
         swap Array[Low] with Array[j]
         return j {position for Key}
3. End
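A Python rendering of the two algorithms above, keeping the same first-element pivot. This is a sketch in the standard formulation, not a line-by-line transcription of the pseudocode:

```python
def partition(a, low, high):
    """Place the pivot a[low] in its final position and return that
    position; smaller keys end up to its left, others to its right."""
    key = a[low]
    i, j = low + 1, high
    while True:
        while i < high and a[i] <= key:   # advance past keys <= pivot
            i += 1
        while a[j] > key:                 # retreat past keys > pivot
            j -= 1
        if i < j:
            a[i], a[j] = a[j], a[i]       # exchange out-of-place pair
        else:
            a[low], a[j] = a[j], a[low]   # drop pivot into place
            return j

def quicksort(a, low, high):
    """Recursive quick sort over a[low..high] (inclusive), in place."""
    if low < high:
        mid = partition(a, low, high)
        quicksort(a, low, mid - 1)        # sort the left partition
        quicksort(a, mid + 1, high)       # sort the right partition
    return a
```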
QUICK SORT EFFICIENCY.
✘ Quick sort is considered the best general-purpose sort known today.
✘ To calculate the complexity of quick sort, the number of comparisons to sort an
array of n elements (index ranges from 0 to n – 1) is f(n), given the following:
- An array of zero or one element is already sorted. This means f(0) = f(1) = 0.
- If pivot is at index i, two subarrays exist. The left subarray has (i) elements,
and the right subarray has (n – 1 – i) elements.
✘ The number of comparisons to sort the left subarray is f(i), and the number of
comparisons to sort the right subarray is f (n – 1 – i), where i can be between 0
to n – 1.
✘ Continuously dividing the array into subarrays requires about n comparisons
at each level of recursion.
✘ The efficiency of quick sort is O(n log n).
DIVIDE & CONQUER.
✘ Divide-and-conquer is a general
algorithm design paradigm.
✘ Divide: divide the input data S into two
disjoint subsets S1 and S2.
✘ Recur: solve the subproblems
associated with S1 and S2.
✘ Conquer: combine the solutions for S1
and S2 into a solution for S.
✘ The base cases for the recursion are
subproblems of size 0 or 1.
MERGE SORT.
✘ Invented by John von Neumann in 1945, merge sort is an example of
divide and conquer.
✘ Repeatedly divides the data in half, sorts each half, and combines the
sorted halves into a sorted whole.
✘ Basic algorithm:
- Divide the list into two roughly equal halves.
- Sort the left half.
- Sort the right half.
- Merge the two sorted halves into one sorted list.
✘ Often implemented recursively
✘ Runtime: O(n log n).
MERGE SORT ALGORITHM.
Algorithm mergeSort(S)
Input: sequence S with n elements
Output: sequence S sorted
if S.size() > 1
   (S1, S2) ← partition(S, n/2)
   mergeSort(S1)
   mergeSort(S2)
   S ← merge(S1, S2)

Algorithm merge(S1, S2, S)
Input: two arrays S1 and S2 of size n1 and n2, sorted in
non-decreasing order, and an empty array S of size at least (n1 + n2)
Output: S containing the elements from S1 and S2 in sorted order
i ← 1
j ← 1
while (i ≤ n1) and (j ≤ n2)
   if S1[i] ≤ S2[j] then
      S[i + j − 1] ← S1[i]
      i ← i + 1
   else
      S[i + j − 1] ← S2[j]
      j ← j + 1
while (i ≤ n1)
   S[i + j − 1] ← S1[i]
   i ← i + 1
while (j ≤ n2)
   S[i + j − 1] ← S2[j]
   j ← j + 1
Merge Sort
example.

split

22 18 12 -4 58 7 31 42
split split
22 18 12 -4 58 7 31 42
split split split split
22 18 12 -4 58 7 31 42
merge merge merge merge
18 22 -4 12 7 58 31 42
merge merge
-4 12 18 22 7 31 42 58
merge
-4 7 12 18 22 31 42 58
SEARCHING.
✘ One of the most common and time-consuming operations in
computer science is searching:
✘ the process used to find the location of a target among a
list of objects.
✘ The two basic search algorithms are:
✘ sequential search, including three interesting variations,
and
✘ binary search.
SEQUENTIAL SEARCH.
✘ Used whenever the list is not ordered.
✘ Generally, technique used only for small lists or lists that are not
searched often.
✘ Process: start searching for the target at the beginning of the list and
continue until the target is found or the end of the list is reached.
✘ This approach has two possible outcomes: the element is found (successful
search) or the end of the list is reached (unsuccessful search).
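A minimal Python sketch of this process (the `(found, index)` return shape is an illustrative choice, not from the slides):

```python
def sequential_search(a, target):
    """Scan the list from the beginning; return (found, index)."""
    for i, value in enumerate(a):
        if value == target:
            return True, i         # successful search
    return False, len(a)           # reached end of list: unsuccessful
```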
Sequential Search example for a successful search.
Sequential Search example for an unsuccessful search.
The efficiency of the sequential search is O(n).
BINARY SEARCH.
✘ The sequential search algorithm is very slow. If an array of
1000 elements exists, 1000 comparisons are made in the
worst case.
✘ If the array is not sorted, the sequential search is the only
solution.
✘ However, if the array is sorted, we can use a more efficient
algorithm called binary search.
✘ Generally speaking, Binary search used whenever the list
starts to become large.
BINARY SEARCH.
✘ Begins by testing the data in the element at the middle of the array to
determine if the target is in the first or the second half of the list.
✘ If target in first half, there is NO need to check the second half.
✘ If target in second half, NO need to test the first half.
✘ In other words, half the list is eliminated from further consideration with
just one comparison.
✘ This process is repeated, eliminating half of the remaining list with each
test, until the target is found or it is determined not to exist in the list.
✘ To find the middle of the list, three variables needed: one to identify the
beginning of the list, one to identify the middle of the list, and one to
identify the end of the list.
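The three-variable scheme above can be sketched in Python (illustrative; `first`, `mid` and `last` name the beginning, middle and end indices):

```python
def binary_search(a, target):
    """Iterative binary search on a sorted list; returns the index
    of target, or -1 if it is not in the list."""
    first, last = 0, len(a) - 1
    while first <= last:
        mid = (first + last) // 2      # middle of the remaining list
        if a[mid] == target:
            return mid                 # successful search
        elif a[mid] < target:
            first = mid + 1            # target must be in second half
        else:
            last = mid - 1             # target must be in first half
    return -1                          # unsuccessful search
```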
Binary Search example for a successful search.
Binary Search example for an unsuccessful search.
The efficiency of the binary search is O(log n).
THE END FOR TODAY
15 JUNE 2021.