Explain the concept of Big O notation
Big O notation is a mathematical notation used to describe the time or space complexity of an algorithm. It characterizes how the algorithm's running time or memory usage grows as the input size increases, giving an upper bound on that growth rate while ignoring constant factors and lower-order terms. For example, O(n) means the cost grows at most linearly with the input size, while O(1) means the cost stays constant regardless of input size.
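As a minimal sketch, here are three small Python functions (the names and scenarios are illustrative, not from any particular library) that exhibit common growth rates:

```python
def get_first(items):
    # O(1): constant time, independent of len(items)
    return items[0] if items else None

def contains(items, target):
    # O(n): linear time, scans each element once in the worst case
    for item in items:
        if item == target:
            return True
    return False

def has_duplicate(items):
    # O(n^2): quadratic time, compares every pair of elements
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False
```

Doubling the input size leaves the O(1) function's work unchanged, roughly doubles the worst-case work of the O(n) function, and roughly quadruples the work of the O(n^2) function, which is exactly the growth behavior the notation captures.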