Calculate Space and Time Complexity of an Algorithm: A Beginner’s Guide

Dev Diwan

As developers, understanding the efficiency of an algorithm is crucial. Time and space complexity are tools we use to estimate how well an algorithm scales as input size grows. These metrics help us predict performance and choose the best approach for solving a problem.

In this article, we’ll take a detailed, beginner-friendly look at:

  1. What time and space complexity mean
  2. How to calculate time complexity step by step
  3. How to calculate space complexity step by step
  4. Common Big-O notations
  5. Examples of analyzing algorithms

By the end, you’ll have the confidence to evaluate algorithms like a pro.

What Is Time Complexity?

Time complexity measures how the runtime of an algorithm changes as the size of the input grows. It’s expressed using Big-O notation, which represents the upper bound of an algorithm’s growth.

For example:

  • An algorithm with O(n) complexity grows linearly with input size.
  • An algorithm with O(n²) complexity grows quadratically.

Why Time Complexity Matters:

  • Helps predict how an algorithm performs on large inputs.
  • Allows comparison of different approaches to the same problem.

Step-by-Step Guide to Calculating Time Complexity

1. Count the Basic Operations

Break the algorithm into individual steps. Identify the operations that depend on the input size (n), such as loops or recursive calls.

Example:

function sumArray(arr) {
  let sum = 0;                           // Step 1
  for (let i = 0; i < arr.length; i++) { // Step 2
    sum += arr[i];                       // Step 3
  }
  return sum;                           // Step 4
}
  • Step 1: A single operation (constant time) → O(1)
  • Step 2: A loop that runs n times → O(n)
  • Step 3: Repeated inside the loop, so it also contributes O(n)
  • Step 4: A single return statement → O(1)

Total Time Complexity:

O(1) + O(n) + O(n) + O(1) = O(n)
Since we only keep the dominant term (the one that grows fastest), the time complexity is O(n).
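The same rule applies when steps run one after another: sequential loops add, they don't multiply. As a small illustrative sketch (the sumAndMax helper below is ours, not from the article):

```javascript
// Two sequential passes over the same array: O(n) + O(n) = O(2n),
// which simplifies to O(n) once constant factors are dropped.
function sumAndMax(arr) {
  let sum = 0;
  for (let i = 0; i < arr.length; i++) { // first pass: O(n)
    sum += arr[i];
  }
  let max = arr[0];
  for (let i = 1; i < arr.length; i++) { // second pass: another O(n)
    if (arr[i] > max) max = arr[i];
  }
  return { sum, max }; // total work is still linear in n
}
```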

2. Nested Loops

When loops are nested, multiply their complexities.

Example:

function printPairs(arr) {
  for (let i = 0; i < arr.length; i++) {   // Outer loop: O(n)
    for (let j = 0; j < arr.length; j++) { // Inner loop: O(n)
      console.log(arr[i], arr[j]);         // O(1)
    }
  }
}
  • The outer loop runs n times.
  • For each iteration of the outer loop, the inner loop runs n times.
  • The total operations are n * n = n².

Time Complexity: O(n²)

3. Recursive Algorithms

For recursion, time complexity is often determined using a recurrence relation.

Example: Fibonacci Sequence

function fibonacci(n) {
  if (n <= 1) return n; // Base case: O(1)
  return fibonacci(n - 1) + fibonacci(n - 2); // Recursive case
}

Each call to fibonacci makes two recursive calls, creating a binary tree of calls. The total number of calls grows exponentially.

Time Complexity: O(2ⁿ)
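One way to see the recurrence T(n) = T(n-1) + T(n-2) + O(1) in action is to count the calls empirically. A quick sketch (the calls counter is our addition, not part of the original function):

```javascript
// Counting actual invocations shows the exponential blow-up: the naive
// recursion revisits the same subproblems over and over.
let calls = 0;
function fib(n) {
  calls++; // one more node in the call tree
  if (n <= 1) return n;
  return fib(n - 1) + fib(n - 2);
}

fib(10);
console.log(calls); // 177 calls just to compute fibonacci(10)
```

Doubling n roughly squares the call count, which is exactly what O(2ⁿ) growth predicts.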

What Is Space Complexity?

Space complexity measures the total memory an algorithm uses relative to the input size. This includes:

  1. Fixed space: Memory used by variables, constants, and program instructions.
  2. Dynamic space: Memory required for inputs, outputs, and data structures like arrays or recursion stacks.

Why Space Complexity Matters:

  • It’s crucial for memory-constrained environments.
  • Helps optimize algorithms that process large data sets.

Step-by-Step Guide to Calculating Space Complexity

1. Analyze Fixed Space Requirements

Count variables, constants, and any additional data structures.

Example:

function addTwoNumbers(a, b) {
  let sum = a + b; // Uses one variable
  return sum;      // No additional structures
}
  • Memory for a, b, and sum is constant.
  • Space Complexity: O(1)

2. Account for Input Size

The size of input data can affect space complexity.

Example:

function reverseArray(arr) {
  let reversed = []; // Additional array
  for (let i = arr.length - 1; i >= 0; i--) {
    reversed.push(arr[i]);
  }
  return reversed;
}
  • The reversed array grows with input size n.
  • Space Complexity: O(n) (due to the new array).
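For contrast, if mutating the input is acceptable, the same task can be done with O(1) extra space. A minimal sketch (the reverseInPlace name is ours):

```javascript
// Swapping elements from both ends toward the middle reverses the array
// without allocating a second one, so extra space is O(1).
function reverseInPlace(arr) {
  let left = 0, right = arr.length - 1;
  while (left < right) {
    [arr[left], arr[right]] = [arr[right], arr[left]]; // constant-space swap
    left++;
    right--;
  }
  return arr;
}
```

Time complexity is still O(n), but no memory grows with the input.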

3. Account for Recursive Calls

Each recursive call adds a frame to the call stack, consuming memory.

Example:

function factorial(n) {
  if (n <= 1) return 1; // Base case (covers n = 0 as well)
  return n * factorial(n - 1);
}

For n = 5, the recursion depth is 5 (each call is stored on the stack).
Space Complexity: O(n)
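An iterative version avoids the growing call stack entirely. A sketch for comparison (the factorialIterative name is our assumption):

```javascript
// A loop keeps only one accumulator, so no call stack builds up:
// time stays O(n) but space drops from O(n) to O(1).
function factorialIterative(n) {
  let result = 1;
  for (let i = 2; i <= n; i++) { // single running product
    result *= i;
  }
  return result;
}
```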

Common Big-O Notations

O(1) — Constant Time/Space

An algorithm is said to have O(1) complexity if its runtime or memory usage does not depend on the input size. It always takes the same amount of time or space, regardless of how large the input is.

Example: Accessing an element in an array using its index.

const arr = [1, 2, 3, 4];
console.log(arr[2]); // Always takes constant time

O(log n) — Logarithmic

Logarithmic complexity occurs when the algorithm divides the problem into smaller parts, reducing the input size by a constant factor at each step. This often happens in algorithms like binary search, where the input size is halved with every iteration.

Example: Binary Search

function binarySearch(arr, target) {
  let left = 0, right = arr.length - 1;
  while (left <= right) {
    const mid = Math.floor((left + right) / 2);
    if (arr[mid] === target) return mid;
    else if (arr[mid] < target) left = mid + 1;
    else right = mid - 1;
  }
  return -1;
}

In binary search, each step reduces the search space by half, leading to O(log n) complexity.
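The halving is easy to verify empirically. A sketch that counts iterations instead of returning an index (binarySearchSteps is our instrumented variant, not standard code):

```javascript
// A step counter makes the logarithmic growth visible: multiplying the
// array length by 64 adds only ~6 extra iterations.
function binarySearchSteps(arr, target) {
  let left = 0, right = arr.length - 1, steps = 0;
  while (left <= right) {
    steps++; // one halving of the search space
    const mid = Math.floor((left + right) / 2);
    if (arr[mid] === target) return steps;
    else if (arr[mid] < target) left = mid + 1;
    else right = mid - 1;
  }
  return steps;
}

// Searching for a missing value forces the worst case:
const small = Array.from({ length: 16 }, (_, i) => i);
const large = Array.from({ length: 1024 }, (_, i) => i);
console.log(binarySearchSteps(small, -1)); // 4 steps  = log2(16)
console.log(binarySearchSteps(large, -1)); // 10 steps = log2(1024)
```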

O(n) — Linear

Linear complexity indicates that the runtime of the algorithm grows proportionally with the size of the input. If the input size doubles, the runtime also doubles.

Example: Traversing a list

function printArray(arr) {
  for (let i = 0; i < arr.length; i++) {
    console.log(arr[i]);
  }
}

In this example, the loop runs once for every element in the array, making it O(n).

O(n log n) — Quasilinear

Quasilinear complexity occurs in algorithms that involve both dividing the input (logarithmic) and performing linear work at each division step. Sorting algorithms like Merge Sort (always) and Quick Sort (on average) fall into this category.

Example: Merge Sort

function mergeSort(arr) {
  if (arr.length <= 1) return arr;
  const mid = Math.floor(arr.length / 2);
  const left = mergeSort(arr.slice(0, mid));
  const right = mergeSort(arr.slice(mid));
  return merge(left, right); // merge is defined in the full example below
}

Here, the array is divided logarithmically, and the merging process takes linear time, resulting in O(n log n) complexity.

O(n²) — Quadratic

Quadratic complexity arises when an algorithm involves nested loops, where each iteration of the outer loop triggers a complete iteration of the inner loop. This is common in algorithms that compare or process pairs of elements.

Example: Nested Loops

function printPairs(arr) {
  for (let i = 0; i < arr.length; i++) {
    for (let j = 0; j < arr.length; j++) {
      console.log(arr[i], arr[j]);
    }
  }
}

The outer loop runs n times, and for each iteration, the inner loop also runs n times, making it O(n²).

O(2ⁿ) — Exponential

Exponential complexity occurs when the work roughly doubles with each additional input element. Recursive solutions that explore all possibilities, like the naive recursive Fibonacci, often exhibit this behavior.

Example: Recursive Fibonacci

function fibonacci(n) {
  if (n <= 1) return n;
  return fibonacci(n - 1) + fibonacci(n - 2);
}

In this example, each call to fibonacci spawns two more calls, resulting in on the order of 2ⁿ calls for input n.
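This blow-up can be avoided by caching results (memoization), so each subproblem is computed only once. A minimal sketch, assuming a plain object as the cache (the fibonacciMemo name is ours):

```javascript
// Caching each fibonacci(k) the first time it is computed collapses the
// binary call tree: every subproblem is solved once, so time drops from
// O(2^n) to O(n), at the cost of O(n) space for the cache.
function fibonacciMemo(n, memo = {}) {
  if (n <= 1) return n;                      // base case, as before
  if (memo[n] !== undefined) return memo[n]; // reuse a cached result
  memo[n] = fibonacciMemo(n - 1, memo) + fibonacciMemo(n - 2, memo);
  return memo[n];
}

console.log(fibonacciMemo(50)); // fast, where the naive version would hang
```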

Example: Analyzing a Complex Algorithm

Let’s analyze the time and space complexity of a simple sorting function.

function mergeSort(arr) {
  if (arr.length <= 1) return arr; // Base case: O(1)

  const mid = Math.floor(arr.length / 2);
  const left = mergeSort(arr.slice(0, mid)); // Recursive call
  const right = mergeSort(arr.slice(mid));   // Recursive call

  return merge(left, right);
}

function merge(left, right) {
  let result = [];
  let i = 0, j = 0;

  while (i < left.length && j < right.length) { // Merging: O(n)
    if (left[i] < right[j]) {
      result.push(left[i]);
      i++;
    } else {
      result.push(right[j]);
      j++;
    }
  }
  return result.concat(left.slice(i)).concat(right.slice(j));
}

Time Complexity:

  • The recursion halves the array at each step, giving O(log n) levels of recursion.
  • At each level, merging traverses the entire array → O(n).
  • Total Time Complexity: O(n log n).

Space Complexity:

  • The recursion stack requires O(log n) space for dividing the array.
  • Additional memory is used for temporary arrays during merging → O(n).
  • Total Space Complexity: O(n).
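Putting the two functions together, a quick usage check (the definitions above are repeated here so the snippet runs on its own):

```javascript
// Merge sort: split in half, sort each half recursively, merge in linear time.
function mergeSort(arr) {
  if (arr.length <= 1) return arr;
  const mid = Math.floor(arr.length / 2);
  return merge(mergeSort(arr.slice(0, mid)), mergeSort(arr.slice(mid)));
}

// Merge two already-sorted arrays into one sorted array.
function merge(left, right) {
  const result = [];
  let i = 0, j = 0;
  while (i < left.length && j < right.length) {
    if (left[i] < right[j]) result.push(left[i++]);
    else result.push(right[j++]);
  }
  return result.concat(left.slice(i)).concat(right.slice(j));
}

console.log(mergeSort([5, 2, 9, 1, 7])); // [1, 2, 5, 7, 9]
```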

Key Takeaways

  1. Time complexity measures how runtime scales with input size, while space complexity measures memory usage.
  2. Use Big-O notation to describe the upper limit of growth.
  3. Break algorithms into steps and loops to calculate complexity systematically.
  4. Always keep track of recursion depth and additional data structures.

Mastering these concepts takes practice. Start by analyzing simple algorithms, then move on to complex ones. With time, you’ll be able to write efficient code and confidently explain its performance!
