Big O Notation and the Magic of Time Complexity

A guide to comparing algorithms’ behaviour

What is Big-O Notation?

Counting Instructions
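
The idea behind Big-O is to count how many elementary instructions a piece of code executes as a function of its input size n. As a rough illustration (the sumArray function below is a hypothetical example, not code from this article), each line is annotated with how many times it runs for an array of length n:

// Hypothetical example: annotating how many times each line runs
// for an input array of length n.
function sumArray(numbers: number[]): number {
  let total = 0;                          // 1 instruction
  for (let i = 0; i < numbers.length; i++) {
    total += numbers[i];                  // n instructions
  }
  return total;                           // 1 instruction
}

// Roughly n + 2 instructions in total, which simplifies to O(n).
console.log(sumArray([1, 4, 6, 8])); // 19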

Worst-Case Scenario
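
Big-O usually describes the worst case, i.e. the input that forces an algorithm to do the most work. A hypothetical linear search (a sketch, not the article's code) makes the difference concrete: the best case finds the target on the first comparison, while the worst case has to inspect all n elements.

// Hypothetical linear search showing best vs. worst case.
function linearSearch(haystack: number[], needle: number): number {
  for (let i = 0; i < haystack.length; i++) {
    if (haystack[i] === needle) {
      return i;              // best case: found on the first comparison
    }
  }
  return -1;                 // worst case: n comparisons and not found
}

const data = [1, 4, 6, 8, 9, 23, 35, 56];
console.log(linearSearch(data, 1));  // 0, best case
console.log(linearSearch(data, 99)); // -1, worst case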

Asymptotic Behaviour
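
Asymptotic analysis keeps only the fastest-growing term and drops constant factors, because for large inputs nothing else matters. The short sketch below (illustrative numbers, not taken from the article) shows why a cost function such as 3n² + 5n + 100 is still classified as O(n²).

// Illustrative cost function f(n) = 3n^2 + 5n + 100.
// As n grows, the n^2 term dominates and the ratio f(n) / n^2
// settles around the constant 3, so Big-O reports O(n^2).
const f = (n: number): number => 3 * n * n + 5 * n + 100;

for (const n of [10, 100, 1000, 10000]) {
  console.log(`n = ${n}: f(n) = ${f(n)}, f(n) / n^2 = ${(f(n) / (n * n)).toFixed(3)}`);
}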

Big-O Classes of Runtime Complexity

Figure: graphs of the number of operations N versus input size n for each complexity class

O( 1 ) — Constant Time
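
A constant-time operation takes the same number of steps no matter how large the input is. Two minimal, hypothetical examples:

// Hypothetical constant-time operations: the work done does not
// depend on how many elements the array contains.
function getFirst(items: number[]): number | undefined {
  return items[0];          // a single index access
}

function isEven(value: number): boolean {
  return value % 2 === 0;   // a single arithmetic check
}

console.log(getFirst([1, 4, 6, 8, 9, 23, 35, 56])); // 1
console.log(isEven(23));                            // false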

O( log n ) — Logarithmic Time

// Initial sorted array, n = 8 (searching for the value 1)
[1, 4, 6, 8, 9, 23, 35, 56]
// With 9 as pivot, we keep checking the left half, n = 8 * 1/2 = 4
[1, 4, 6, 8]
// With 6 as pivot, we keep checking the left half, n = 4 * 1/2 = 2
[1, 4]
// With 4 as pivot, we keep checking the left half, n = 2 * 1/2 = 1
[1]
8 * (1 / 2)³ = 1
n * (1 / 2)^k = 1
n * (1 / 2^k) = 1
n = 2^k
log₂(n) = k
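
In other words, the number of halving steps k grows with log₂(n), which is what makes the search logarithmic. The halving trace above corresponds to binary search on a sorted array; a minimal sketch of it (not the article's own code) could look like this:

// Binary search: each iteration halves the remaining search range,
// so a sorted array of n elements needs at most about log2(n) + 1 steps.
function binarySearch(sorted: number[], target: number): number {
  let low = 0;
  let high = sorted.length - 1;

  while (low <= high) {
    const mid = Math.floor((low + high) / 2);
    if (sorted[mid] === target) {
      return mid;                 // found the target
    } else if (sorted[mid] < target) {
      low = mid + 1;              // discard the left half
    } else {
      high = mid - 1;             // discard the right half
    }
  }
  return -1;                      // target is not in the array
}

console.log(binarySearch([1, 4, 6, 8, 9, 23, 35, 56], 4)); // 1
console.log(binarySearch([1, 4, 6, 8, 9, 23, 35, 56], 7)); // -1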

O( n ) — Linear Time
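
In linear time the work grows in direct proportion to the input size: doubling the array doubles the number of steps. Finding the largest element is a typical example, sketched here as a hypothetical findMax function.

// Hypothetical linear-time example: finding the largest element
// requires visiting every item exactly once, so the work grows
// in direct proportion to n.
function findMax(numbers: number[]): number {
  let max = -Infinity;
  for (const value of numbers) {
    if (value > max) {
      max = value;
    }
  }
  return max;
}

console.log(findMax([1, 4, 6, 8, 9, 23, 35, 56])); // 56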

O( n log n ) — Linearithmic Time
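
Efficient comparison sorts fall into this class: the input is halved about log₂(n) times, and each level of splitting does O(n) work to merge the pieces back together. A sketch of merge sort (not the article's own code) illustrates the pattern.

// Merge sort: the array is halved about log2(n) times, and each level
// of recursion does O(n) work merging, giving O(n log n) overall.
function mergeSort(input: number[]): number[] {
  if (input.length <= 1) return input;

  const mid = Math.floor(input.length / 2);
  const left = mergeSort(input.slice(0, mid));
  const right = mergeSort(input.slice(mid));

  // Merge the two sorted halves in linear time.
  const merged: number[] = [];
  let i = 0;
  let j = 0;
  while (i < left.length && j < right.length) {
    merged.push(left[i] <= right[j] ? left[i++] : right[j++]);
  }
  return merged.concat(left.slice(i), right.slice(j));
}

console.log(mergeSort([23, 1, 56, 8, 4, 35, 9, 6]));
// [1, 4, 6, 8, 9, 23, 35, 56]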

O( n² ) — Quadratic Time
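
Quadratic time usually comes from two nested loops over the same input, roughly n · n steps. A hypothetical duplicate check that compares every element with every later element shows the shape:

// Hypothetical quadratic-time example: two nested loops compare
// every element with every other element, roughly n * n steps.
function hasDuplicates(values: number[]): boolean {
  for (let i = 0; i < values.length; i++) {
    for (let j = i + 1; j < values.length; j++) {
      if (values[i] === values[j]) {
        return true;              // worst case still scans ~n²/2 pairs
      }
    }
  }
  return false;
}

console.log(hasDuplicates([1, 4, 6, 8, 9, 23, 35, 56])); // false
console.log(hasDuplicates([1, 4, 6, 4]));                // true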

O( 2^n ) — Exponential Time
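
Exponential time roughly doubles with every extra input element, which is why it becomes unusable very quickly. The classic illustration is the naive recursive Fibonacci, where each call spawns two further calls:

// Naive recursive Fibonacci: each call branches into two more calls,
// so the number of calls roughly doubles as n increases, giving O(2^n).
function fib(n: number): number {
  if (n <= 1) return n;
  return fib(n - 1) + fib(n - 2);
}

console.log(fib(10)); // 55, instant
console.log(fib(30)); // 832040, already millions of recursive calls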

O( n! ) — Factorial Time
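
Factorial time shows up when an algorithm has to consider every possible ordering of its input. Simply generating all n! permutations already costs factorial time, as this hypothetical sketch shows:

// Generating every permutation of n items: there are n! orderings,
// so even listing them takes factorial time.
function permutations<T>(items: T[]): T[][] {
  if (items.length <= 1) return [items];

  const result: T[][] = [];
  for (let i = 0; i < items.length; i++) {
    const rest = [...items.slice(0, i), ...items.slice(i + 1)];
    for (const perm of permutations(rest)) {
      result.push([items[i], ...perm]);
    }
  }
  return result;
}

console.log(permutations([1, 4, 6]).length); // 3! = 6 orderings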

Optimising
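
One common way to optimise is to trade memory for time. As an illustration (not necessarily the example used in this article), the quadratic duplicate check from the O(n²) section can be rewritten as a single O(n) pass with a Set:

// Illustrative optimisation: the nested-loop duplicate check becomes
// a single pass when we remember seen values in a Set, trading O(n)
// extra memory for O(n) time.
function hasDuplicatesFast(values: number[]): boolean {
  const seen = new Set<number>();
  for (const value of values) {
    if (seen.has(value)) return true; // lookup and insert are O(1) on average
    seen.add(value);
  }
  return false;
}

console.log(hasDuplicatesFast([1, 4, 6, 8, 9, 23, 35, 56])); // false
console.log(hasDuplicatesFast([1, 4, 6, 4]));                // true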

Conclusion

Additional Resources
