Analysis of Time Complexity:


Definition:

Time complexity measures the amount of time an algorithm takes to complete as a function of the size of its input.

It describes how an algorithm's running time grows as the input size increases.

 

Notation:

Time complexity is commonly expressed in Big O notation, written O(f(n)), where f(n) is an upper bound on the growth rate of the algorithm's running time with respect to the input size n. Constant factors and lower-order terms are dropped, since only the dominant term matters as n grows large.
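As a rough illustration (the function below and its step counts are invented for this post), an algorithm that performs about 3n + 2 basic operations on an input of size n is O(n), because Big O keeps only the dominant term and drops constant factors:

def sum_and_extremes(values):
    # Hypothetical example: one pass over the input.
    # Roughly 3 basic operations per element (an addition and two
    # comparisons) plus a few setup steps, i.e. about 3n + 2 steps.
    # Big O drops the constant factor and the lower-order constant,
    # so the running time is O(n).
    total = 0
    smallest = float("inf")
    largest = float("-inf")
    for v in values:                     # the loop body runs n times
        total += v                       # 1 addition per element
        smallest = min(smallest, v)      # 1 comparison per element
        largest = max(largest, v)        # 1 comparison per element
    return total, smallest, largest

print(sum_and_extremes([4, 1, 7, 3]))    # (15, 1, 7)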

 

Classes of Time Complexity:

1. O(1) - Constant Time:

   - The running time remains constant regardless of the input size.

 

2. O(log n) - Logarithmic Time:

   - The running time grows logarithmically with the input size.

 

3. O(n) - Linear Time:

   - The running time increases linearly with the input size.

 

4. O(n log n) - Linearithmic Time:

   - The running time grows in proportion to n times log n; common for efficient comparison-based sorting algorithms such as Merge Sort and Heap Sort.

 

5. O(n^2) - Quadratic Time:

   - The running time grows quadratically with the input size; short code sketches illustrating each of these classes are shown below.
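
The following Python sketches are illustrative only; the function names and inputs are invented for this post and are not taken from any particular library.

import bisect

def get_first(items):
    # O(1) - constant time: a single index lookup, independent of len(items).
    return items[0]

def contains_sorted(sorted_items, target):
    # O(log n) - logarithmic time: binary search halves the search range each step.
    i = bisect.bisect_left(sorted_items, target)
    return i < len(sorted_items) and sorted_items[i] == target

def total(items):
    # O(n) - linear time: one pass over all n elements.
    s = 0
    for x in items:
        s += x
    return s

def sort_items(items):
    # O(n log n) - linearithmic time: comparison-based sorting, as in
    # Merge Sort or Heap Sort (Python's built-in sort is also in this class).
    return sorted(items)

def has_duplicate(items):
    # O(n^2) - quadratic time: the nested loops compare every pair of elements.
    n = len(items)
    for i in range(n):
        for j in range(i + 1, n):
            if items[i] == items[j]:
                return True
    return False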

 

Example:

For a sorting algorithm with O(n log n) time complexity, doubling the input size slightly more than doubles the running time, because 2n log(2n) = 2n log n + 2n.
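A back-of-the-envelope check, assuming the running time is exactly proportional to n * log2(n) (an idealization, since real constant factors vary):

import math

def nlogn(n):
    # Idealized operation count proportional to n * log2(n).
    return n * math.log2(n)

small = nlogn(1_000_000)    # about 1.99e7
large = nlogn(2_000_000)    # about 4.19e7
print(large / small)        # about 2.10 -- slightly more than double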
