Big O, also known as Big O notation, describes an algorithm's worst-case complexity. It gives an upper bound on how an algorithm's running time grows as the input size grows. In plain terms, if an algorithm performs roughly n + 2 steps for an input of size n, its complexity is O(n): Big O drops constant terms and keeps only the dominant factor, because that is what matters as n becomes large.
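To make the worst-case idea concrete, here is a minimal sketch (the function name and data are illustrative, not from the article) of linear search, a classic O(n) algorithm: in the worst case the loop must examine every element before giving up.

```python
def linear_search(items, target):
    """Return the index of target in items, or -1 if it is absent.

    Worst case: target is missing (or sits in the last slot), so the
    loop runs len(items) times -- O(n) comparisons.
    Best case: target is the first element -- a single comparison.
    """
    for i, value in enumerate(items):
        if value == target:
            return i
    return -1

# Best case: one comparison.
print(linear_search([7, 3, 9], 7))   # -> 0
# Worst case: the loop checks all three elements.
print(linear_search([7, 3, 9], 5))   # -> -1
```

Big O only characterizes the worst case here; the best and average cases are described by separate notations (Omega and Theta), which is why the same function can finish in one step yet still be O(n).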