Big O, also known as Big O notation, represents an algorithm's worst-case complexity. It defines the runtime required to execute an algorithm by describing how the algorithm's performance changes as the input size grows. In plain terms, if an algorithm runs input + 2 steps, Big O drops the constant and calls it O(n): the runtime grows linearly with the size of the input.
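As a concrete sketch (linear search, chosen here as an illustration and not taken from the article), the worst case is when the target is absent, so the loop touches every one of the n elements:

```python
def linear_search(items, target):
    """Return the index of target, or -1 if absent.

    Worst case: target is not in items, so the loop runs
    len(items) times -- O(n).
    Best case: target is the first element -- O(1).
    """
    for index, value in enumerate(items):
        if value == target:
            return index
    return -1

# Worst case for n = 5: the target is missing, all 5 elements are checked.
print(linear_search([3, 1, 4, 1, 5], 9))  # -> -1
```

Big O keeps only the worst case and the dominant term, so bookkeeping steps before or after the loop (the "+ 2") do not change the classification: the search is O(n) regardless.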

8 min read · From freecodecamp.org
Table of contents

- What is Big O?
- Big O Complexity Chart
- Big O Time Complexity Examples
- Wrapping Up