What is Space Complexity, and why should every software engineer know about it?

Generally, most engineers don’t think much about the memory their code uses. But when it comes to writing optimal solutions, we need to consider both the time an algorithm takes to run and the extra space it occupies.

You should check out the article Time Complexities in depth, which covers time complexities in detail with example snippets and graphs, making that topic easy to learn.

Space Complexity

The most widely accepted definition is: the space complexity of an algorithm or a computer program is the amount of memory space required to solve an instance of the computational problem, expressed as a function of the characteristics of the input. In other words, it is the memory an algorithm needs until it finishes executing.

Ideally, we want our algorithm to be optimal and scalable, running in linear, logarithmic, or in some cases constant time, while storing a minimal amount of data in memory.

Beyond running time, we also need to account for the amount of extra space our algorithm uses.

There are cases where you need a memory-efficient algorithm, and cases where memory is not a concern.

For example, memory is rarely the deciding factor when a single person is using the application or device.

Imagine a single user on an Apple Watch or a MacBook. Here we hardly need to worry about memory, since only one user is on the device; what we really care about is how fast the device or application feels.

Isn’t this what every mobile app developer wants? Speed!

Time and space complexities play a major role in determining an algorithm’s performance and memory usage.

1. Constant Space – O(1)

Now let us consider the opposite case, where many users access the device or application; here we need to consider both speed and memory efficiency.

Here is a code snippet example:

public class SpaceComplexity {
  public static void main(String[] args) {
    String[] movies = {"The Shawshank Redemption", "Taxi Driver", "Schindler's List", "The Princess Bride"};
    constantSpace(movies);
  }

  // O(n) time complexity
  // O(1) space complexity
  private static void constantSpace(String[] movies) {
    // A single index variable is reused on every iteration,
    // so the extra memory does not grow with the input size.
    for (int index = 0; index < movies.length; index++) {
      System.out.println(movies[index]);
    }
  }
}
ConstantSpace

In the above snippet, we created a variable index, initialized it to 0, and iterated over the array of String elements.

The index variable here uses a single unit of memory.

We iterate from 0 to movies.length - 1, visiting every element in the array while reusing the exact memory location where index is stored, so the extra space can be considered constant, or O(1) (since we reuse the same memory location for the index value).

Hence the extra space occupied by this algorithm is constant, which is O(1).

Finally,

  • Time complexity is O(n)
  • Space complexity is O(1)
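
Another common pattern that keeps space constant is accumulating a result in a single variable. Below is a minimal sketch (the sumOfLengths helper is hypothetical, not one of the article’s snippets) that adds up the lengths of all movie titles while allocating only two extra variables, no matter how large the input array gets.

public class ConstantSpaceSum {
  public static void main(String[] args) {
    String[] movies = {"The Shawshank Redemption", "Taxi Driver", "Schindler's List", "The Princess Bride"};
    System.out.println(sumOfLengths(movies)); // prints the total length of all titles
  }

  // O(n) time complexity
  // O(1) space complexity: only 'total' and 'index' are allocated,
  // regardless of how many movies are in the input array.
  private static int sumOfLengths(String[] movies) {
    int total = 0;
    for (int index = 0; index < movies.length; index++) {
      total += movies[index].length();
    }
    return total;
  }
}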

2. Linear Space Complexity – O(n)

Here is another example that occupies roughly the same amount of extra space as the input array itself.

Below, the movies array has 4 top movies present in it.

Let us copy this array to understand how the space complexity grows linearly with the input size.

import java.util.Arrays;

public class SpaceComplexity {
  public static void main(String[] args) {
    String[] movies = {"The Shawshank Redemption", "Taxi Driver", "Schindler's List", "The Princess Bride"};
    linearSpace(movies);
  }

  // O(n) time complexity
  // O(n) space complexity
  private static void linearSpace(String[] movies) {
    String[] moviesCopy = new String[movies.length]; // O(n) space: one extra slot per input element

    for (int index = 0; index < movies.length; index++) {
      moviesCopy[index] = movies[index];
    }

    System.out.println(Arrays.toString(moviesCopy));
  }
}
Linear Space

In the above code example, we have an input array that contains 4 movie items.

We created a brand-new String array moviesCopy and used a for loop to copy all the items from movies into moviesCopy.

What are we actually doing here? We created a String array of size 4 in memory, which means memory is allocated for this array to store the 4 elements consecutively. If the input had n elements, the copy would need n extra slots.
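
The same copy can also be written with the JDK’s Arrays.copyOf, which likewise allocates a brand-new array of the requested length and copies the elements into it, so the extra space is still O(n). A minimal sketch:

import java.util.Arrays;

public class CopyOfExample {
  public static void main(String[] args) {
    String[] movies = {"The Shawshank Redemption", "Taxi Driver", "Schindler's List", "The Princess Bride"};

    // Arrays.copyOf allocates a new array of length movies.length
    // and copies every reference into it: O(n) extra space, O(n) time.
    String[] moviesCopy = Arrays.copyOf(movies, movies.length);

    System.out.println(Arrays.toString(moviesCopy));
  }
}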

Another code example below –

import java.util.Arrays;

public class SpaceComplexity {
  public static void main(String[] args) {
    // The array itself occupies space proportional to its length.
    int[] array = new int[5];

    // Fill the array with the squares 1, 4, 9, 16, 25.
    for (int index = 0; index < array.length; index++) {
      array[index] = (index + 1) * (index + 1);
    }
    System.out.println(Arrays.toString(array)); // [1, 4, 9, 16, 25]
  }
}

The space is directly proportional to the size of the array; in other words, the space grows as the input size grows.

Finally,

  • Time complexity is O(n), as we iterate over all the elements.
  • Space complexity is O(n), as we allocate one extra slot per element.
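
To make the linear growth explicit, here is a small sketch (the buildSquares helper is hypothetical, not one of the article’s snippets) that allocates an array whose size depends directly on a parameter n; doubling n doubles the memory the method needs.

import java.util.Arrays;

public class LinearSpaceGrowth {
  public static void main(String[] args) {
    System.out.println(Arrays.toString(buildSquares(5)));  // [1, 4, 9, 16, 25]
    System.out.println(Arrays.toString(buildSquares(10))); // twice the input size, twice the space
  }

  // O(n) time complexity
  // O(n) space complexity: the result array holds exactly n elements,
  // so the memory used grows in direct proportion to n.
  private static int[] buildSquares(int n) {
    int[] squares = new int[n];
    for (int index = 0; index < n; index++) {
      squares[index] = (index + 1) * (index + 1);
    }
    return squares;
  }
}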

So, overall, time and space complexity together determine how fast an algorithm runs and how much memory it uses.