Algorithm complexity

An algorithm is a procedure that takes some input, performs predefined steps, and produces a result. Everything between the input and the output is the formal definition of the algorithm. In this article, I will try to explain what algorithm complexity is and how we calculate it.

Fundamental algorithms in programming

What are the core algorithms in programming that every one of us uses each day at work?

  • Sorting algorithms – used to sort data based on a condition. For instance, sort products by price descending, or sort a company organizational tree by years of service. Some famous sorting algorithms are Selection Sort, Bubble Sort, Quick Sort, Merge Sort, and Insertion Sort.
  • Searching algorithms – provide fast and efficient searching in a collection of data based on a condition. For instance – search in a file, database, tree structure, array, and so on. Some famous searching algorithms are Binary Search, Linear Search, and Interpolation Search.
  • Traversal algorithms – used for tree and graph traversal. The idea is to find an optimal way to visit each node exactly once. The most popular traversal algorithms are Depth-First Search (DFS) and Breadth-First Search (BFS). A short example of each family is sketched after this list.
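
To make these three families concrete, here is a minimal Python sketch showing one representative of each: a built-in sort, a binary search with the standard bisect module, and a breadth-first traversal. The data and names (products, graph, bfs) are illustrative choices of mine, not taken from any particular library.

    from collections import deque
    import bisect

    # Sorting: sort products by price, descending (Python's built-in Timsort).
    products = [("mouse", 25), ("laptop", 900), ("monitor", 180)]
    products.sort(key=lambda p: p[1], reverse=True)

    # Searching: binary search in a sorted list via the standard bisect module.
    prices = sorted(price for _, price in products)
    index = bisect.bisect_left(prices, 180)
    found = index < len(prices) and prices[index] == 180

    # Traversal: breadth-first search (BFS) over a graph, visiting each node once.
    graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}

    def bfs(start):
        visited, queue, order = {start}, deque([start]), []
        while queue:
            node = queue.popleft()
            order.append(node)
            for neighbor in graph[node]:
                if neighbor not in visited:
                    visited.add(neighbor)
                    queue.append(neighbor)
        return order

    print(products)   # [('laptop', 900), ('monitor', 180), ('mouse', 25)]
    print(found)      # True
    print(bfs("A"))   # ['A', 'B', 'C', 'D']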

What is algorithm complexity?

The complexity can be represented as a function of the input size, which says how many operations the algorithm will need to complete. For instance:

O(n²)

This complexity means that for a list with n = 100 elements, the algorithm will need 100² = 10000 operations. That's quite simple, but you may ask: what if this is a search algorithm and it finds the element on the very first iteration? Wouldn't that be complexity O(1)? The answer is that the calculated complexity is for the worst case – the element is found on the last iteration. But let's see what types of complexity we may measure:

  • Best case complexity – as in the example above, this happens when an element is found on the very first iteration. You should never rely on this case when calculating complexity; it is very rare, and it is better to be pessimistic.
  • Average case complexity – this is useful when the algorithm is going to be executed many times. For instance, sorting or searching performance may differ across different inputs, so in this case it is OK to calculate the average value.
  • Worst case complexity – this is the longest time an algorithm can take to complete. And even if it seems unlikely to happen, it guarantees that the slowest case is analyzed. When the algorithm runs once or very rarely on a large amount of data, it is crucial for it to be fast and efficient. In such situations, consider the worst-case complexity.
For instance, searching for an element in an array of n elements (a linear search) has the following complexities:

  • Best case – O(1): the element is the very first one checked.
  • Average case – O(n/2), which simplifies to O(n): on average, half of the elements are checked.
  • Worst case – O(n): the element is the last one checked, or is not present at all.
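
As a concrete illustration, here is a minimal linear-search sketch in Python; the function name and sample data are my own, chosen for illustration:

    def linear_search(items, target):
        # Best case:  target is the first element  -> 1 comparison,  O(1)
        # Average:    ~n/2 comparisons             -> O(n)
        # Worst case: target is last or missing    -> n comparisons, O(n)
        for i, item in enumerate(items):
            if item == target:
                return i
        return -1

    data = [7, 3, 9, 1, 4]
    print(linear_search(data, 7))   # 0  – best case, found immediately
    print(linear_search(data, 4))   # 4  – worst case, found on the last iteration
    print(linear_search(data, 8))   # -1 – worst case, not found at all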

Types of complexity

There are several types of complexity, based on how the number of operations grows with the input:

  • Constant complexity – O(1): the number of operations doesn't depend on the input. n is no longer a parameter in the complexity function. For instance, getting the first element of an array has complexity O(1), regardless of whether the array has 5, 10, or 100 elements.
  • Logarithmic complexity – O(log n): the complexity doesn't increase as fast as the input data. For instance, in a binary search tree half of the remaining elements can be excluded at each step, which drastically reduces the number of operations. In cases when the algorithm can repeatedly exclude half of the elements, logarithmic complexity may be achieved.
  • Linear complexity – O(n): the complexity grows proportionally with the growth of the input. Searching for an element one by one in an array has O(n) complexity.
  • Polynomial complexity – O(nᵏ): the complexity depends on the input size raised to some power. For instance, a naive way to find the minimal number in an array of n elements is to compare each number against every other number; in the worst case, n² operations will be executed, giving O(n²).
  • Exponential complexity – O(2ⁿ): the complexity doubles for each new element added to the input. Exhaustively searching all paths in a graph may lead to such complexity. A small sketch of each class follows below.
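
To tie the classes together, the sketch below (illustrative, with names of my own choosing) implements one small function per class, from O(1) down to O(2ⁿ). The exponential example uses naive Fibonacci, whose recursion tree roughly doubles at each level.

    def constant(items):
        # O(1): one operation, regardless of how many items there are.
        return items[0]

    def logarithmic(sorted_items, target):
        # O(log n): binary search halves the remaining range on every step.
        low, high = 0, len(sorted_items) - 1
        while low <= high:
            mid = (low + high) // 2
            if sorted_items[mid] == target:
                return mid
            if sorted_items[mid] < target:
                low = mid + 1
            else:
                high = mid - 1
        return -1

    def linear(items, target):
        # O(n): examines each element once, one by one.
        for i, item in enumerate(items):
            if item == target:
                return i
        return -1

    def quadratic(items):
        # O(n²): compares every pair of elements (naive duplicate check).
        for i in range(len(items)):
            for j in range(i + 1, len(items)):
                if items[i] == items[j]:
                    return True
        return False

    def exponential(n):
        # O(2ⁿ): naive Fibonacci – the call tree roughly doubles per level.
        if n < 2:
            return n
        return exponential(n - 1) + exponential(n - 2)

    data = [4, 8, 15, 16, 23, 42]
    print(constant(data))           # 4
    print(logarithmic(data, 23))    # 4
    print(linear(data, 42))         # 5
    print(quadratic([1, 2, 3, 2]))  # True
    print(exponential(10))          # 55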