Grokking Algorithm Complexity and Big-O

Logarithmic Time and Space: O(log n)

Logarithmic Time Complexity O(\log n) and Logarithmic Space Complexity O(\log n) occur in algorithms that repeatedly reduce the problem size by a constant factor (often halving it). This pattern is common in algorithms that use recursive division, such as computing powers or performing binary search.
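
For instance, binary search discards half of the remaining range on every comparison, so it inspects at most about \log_2(n) elements. Here is a minimal recursive sketch (illustrative only; not part of this lesson's code):

Python3

def binary_search(arr, target, lo=0, hi=None):
    # Search a sorted list; return the index of target, or -1 if absent.
    if hi is None:
        hi = len(arr) - 1
    if lo > hi:                     # empty range: target is not present
        return -1
    mid = (lo + hi) // 2
    if arr[mid] == target:
        return mid
    if arr[mid] < target:           # target can only be in the upper half
        return binary_search(arr, target, mid + 1, hi)
    return binary_search(arr, target, lo, mid - 1)   # lower half

print(binary_search([1, 3, 5, 7, 9, 11], 7))  # 3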

Key Characteristics

In an algorithm with O(\log n) time and space complexity:

  • Time Complexity: The runtime grows logarithmically as the input size increases, because the problem size shrinks by a constant factor at each step.

  • Space Complexity: In recursive implementations, the memory usage grows logarithmically with the input size because the depth of the call stack equals the number of halving steps.

Code Example

Here’s a recursive example that demonstrates O(\log n) complexity. This function calculates the largest power of two less than or equal to a given number, n, by repeatedly halving n.

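The original listing isn't reproduced here, so below is a minimal sketch consistent with the description above and the explanation that follows; the lesson's exact code may differ. Note that, as written, it returns the exponent k (the number of halvings), and 2**k is then the largest power of two that does not exceed n.

Python3

def largest_power_of_two(n):
    # Base case: once n has been reduced to 1 (or less), stop recursing.
    if n <= 1:
        return 0
    # Halve n and add 1 to the count; the recursion depth is about log2(n).
    return 1 + largest_power_of_two(n // 2)

# Example: for n = 100 the function returns 6, and 2**6 = 64 is the
# largest power of two that does not exceed 100.
print(largest_power_of_two(100))  # 6
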
  • Explanation: This function keeps dividing n by 2 until n becomes 1 or less, adding 1 to the count at each step. The final count is the number of halvings performed, so 2 raised to that count is the largest power of two that does not exceed n.

Analyzing the Time Complexity Using Recurrence Relation

To understand the time complexity, let’s set up a recurrence relation for the function:

Step 1: Define the Recurrence Relation

  • Let T(n) represent the time complexity of largest_power_of_two for an input of size n.
  • Each time we call largest_power_of_two(n // 2), we halve n, so we can express the time complexity as:

T(n) = T\left(\frac{n}{2}\right) + O(1)

Here:

  • T\left(\frac{n}{2}\right) represents the recursive call with input size halved.
  • O(1) represents the constant work done in each recursive step (the addition).

Step 2: Solve the Recurrence Relation

  • Let’s expand this recurrence relation step by step (writing the O(1) term as 1 for simplicity) to count the total number of recursive calls.

T(n) = T\left(\frac{n}{2}\right) + 1 = T\left(\frac{n}{4}\right) + 1 + 1 = T\left(\frac{n}{8}\right) + 1 + 1 + 1 = \dots = T\left(\frac{n}{2^k}\right) + k

  • The process continues until the input size is reduced to 1, at which point the function stops. This happens when \frac{n}{2^k} = 1, which simplifies to:

n = 2^k

  • Taking the base-2 logarithm of both sides, we get:

k = \log_2(n)

  • So, T(n) = O(\log n) because there are \log_2(n) recursive calls before reaching the base case.
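
We can sanity-check this count empirically. In the sketch below, count_halvings is an illustrative helper (not part of the lesson) that mirrors the recursion's halving steps and compares the step count against \lfloor \log_2(n) \rfloor:

Python3

import math

def count_halvings(n):
    # Mirrors the recursion: one step per halving until n reaches 1.
    count = 0
    while n > 1:
        n //= 2
        count += 1
    return count

for n in (8, 1024, 10**6):
    print(n, count_halvings(n), math.floor(math.log2(n)))
    # 8 -> 3 3, 1024 -> 10 10, 1000000 -> 19 19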

Analyzing Space Complexity Using Recurrence Relation

The space complexity also grows logarithmically because each recursive call uses a stack frame, and the maximum number of stack frames corresponds to the depth of recursion.

  1. Define the Space Recurrence Relation:
    • Let S(n) represent the space complexity.
    • Each recursive call adds one frame to the call stack, and the stack grows as the input is halved:

S(n) = S\left(\frac{n}{2}\right) + O(1)

  2. Recursive Depth and Space Complexity:
    • As with the time complexity, the recursion depth reaches \log_2(n).
    • Therefore, the space complexity is also O(\log n). An iterative comparison is sketched below.
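
Since the O(\log n) space cost comes entirely from the call stack, rewriting the same halving logic as a loop brings the auxiliary space down to O(1). The variant below is an illustrative rewrite, not part of the lesson's code:

Python3

def largest_power_of_two_iterative(n):
    # Same halving logic as the recursive version, but a single loop
    # frame replaces the stack of recursive calls: O(1) extra space.
    count = 0
    while n > 1:
        n //= 2
        count += 1
    return count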
