Simple Analogy for Time Complexity
Time complexity has always been one of those topics I had read about and could recite a very nice definition of if asked. But there's that moment of clarity we seek for every concept we learn, when it finally makes sense, and I was missing that clarity.

But there’s good news! I found that clarity today and I wanted to share it with everyone. So here goes…
Assume you are at your house and you want to reach your friend's house. There's only one route (for now) between your houses, and there are three signals on that route.
Now, which scenario would take you the most time to reach your friend's house? The one where you have to stop at all the signals. (All reds! Isn't that just the worst?) And it would obviously be best if every signal were green as you passed it; that would take you the least time.
Now compare this with a program (or if you want to get fancy, an algorithm).
These two conditions give us an idea of the most time our algorithm could take to run, an upper bound (all reds), i.e. the worst-case complexity of our program, represented by something called Big-O. They also give us an idea of the least time it could take, a lower bound (all greens), i.e. the best-case complexity, represented by Big-Omega. And when the upper and lower bounds meet, we have a tight bound, represented by Big-Theta (often loosely associated with the average case, which lies somewhere between the two).
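To see best and worst cases in actual code, here's a small sketch (a hypothetical example, not from the article) using linear search. Finding the target at the very first position is "all greens"; scanning the whole list is "all reds".

```python
# Linear search: a toy illustration of best vs. worst case.

def linear_search(items, target):
    """Return the index of target in items, or -1 if it is absent."""
    for i, value in enumerate(items):
        if value == target:
            return i
    return -1

route = [3, 7, 1, 9, 4]

# Best case ("all greens"): target is the first element -> 1 comparison, Omega(1).
print(linear_search(route, 3))  # -> 0

# Worst case ("all reds"): target is last (or missing) -> n comparisons, O(n).
print(linear_search(route, 4))  # -> 4
```

Big-O describes the second scenario: no matter how unlucky the input, the search never does more than n comparisons.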
That wasn't so difficult, was it? Now, imagine there are multiple routes to your friend's house. You have to find the route that takes the least time even when you hit all reds, so that you can reach your friend's house on time.
Easy! You can calculate the most time each route takes using the method described above and pick the one with the least worst-case time. Again, compare this to an algorithm. You would want to do this before running an algorithm on a very large input. You would be able to say with certainty that even in the worst case, algorithm no. 1 (route 1) runs faster than algorithm no. 2 (route 2), and that would justify choosing it.
The most common example of this in the programming world is sorting. Each sorting algorithm iterates through the list of numbers differently and has a different number of loops (signals). By comparing their worst-case times, you will be in a better position to pick one over the other.
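As a rough sketch of picking a "route" by its worst case, the snippet below (assumed implementations, chosen only for illustration) counts comparisons made by bubble sort, an O(n²) algorithm, versus merge sort, an O(n log n) algorithm, on a reverse-sorted list, which is a worst-case input for bubble sort.

```python
def bubble_sort(items):
    """Bubble sort; returns (sorted list, number of comparisons made)."""
    data = list(items)
    comparisons = 0
    n = len(data)
    for i in range(n - 1):
        for j in range(n - 1 - i):
            comparisons += 1
            if data[j] > data[j + 1]:
                data[j], data[j + 1] = data[j + 1], data[j]
    return data, comparisons

def merge_sort(items):
    """Merge sort; returns (sorted list, number of comparisons made)."""
    if len(items) <= 1:
        return list(items), 0
    mid = len(items) // 2
    left, c_left = merge_sort(items[:mid])
    right, c_right = merge_sort(items[mid:])
    merged, comparisons = [], c_left + c_right
    i = j = 0
    while i < len(left) and j < len(right):
        comparisons += 1
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged, comparisons

worst = list(range(16, 0, -1))  # reverse-sorted: all reds for bubble sort
_, bubble_cmp = bubble_sort(worst)
_, merge_cmp = merge_sort(worst)
print(bubble_cmp, merge_cmp)  # bubble sort needs far more comparisons here
```

Both "routes" reach the same destination (a sorted list), but their worst-case comparison counts differ, which is exactly the information Big-O captures.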
That’s it!
DISCLAIMER: This is by no means a theoretical explanation of time complexity or asymptotic notations. This article is for those who aren't able to grasp the concept quickly. Many details have been skipped, and you should definitely read about this topic in a book like CLRS if you want to understand it in more depth.