Why Time Limits Define Your Strategy
In competitive programming, every problem has a time limit — typically 1 or 2 seconds. Before writing any code, experienced competitors do a quick back-of-the-envelope calculation to determine which complexity class their solution must fall into. This single skill separates those who pass problems from those who time out.
The rule of thumb most competitive programmers use: a modern judge can execute roughly 10⁸ simple operations per second. Use this to reverse-engineer the required complexity from the input size.
The Input Size → Complexity Cheat Sheet
| Input Size (n) | Required Complexity | Typical Approach |
|---|---|---|
| n ≤ 10 | O(n!) or O(2ⁿ) | Brute force, full permutations |
| n ≤ 20 | O(2ⁿ) or O(n² · 2ⁿ) | Bitmask DP |
| n ≤ 500 | O(n³) | Floyd-Warshall, naive DP |
| n ≤ 5,000 | O(n²) | Quadratic DP, bubble sort |
| n ≤ 10⁶ | O(n log n) | Sorting, segment trees |
| n ≤ 10⁸ | O(n) | Linear scan, two pointers |
| n > 10⁸ | O(log n) or O(1) | Binary search, math formula |
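For drilling purposes, the cheat sheet can be encoded as a small helper. The thresholds are taken directly from the table above; the function name and string labels are illustrative, not standard:

```python
def complexity_target(n):
    """Map a maximum input size n to the complexity budget
    suggested by the cheat sheet. A heuristic, not a hard rule."""
    if n <= 10:
        return "O(n!) or O(2^n)"
    if n <= 20:
        return "O(2^n) or O(n^2 * 2^n)"
    if n <= 500:
        return "O(n^3)"
    if n <= 5_000:
        return "O(n^2)"
    if n <= 1_000_000:
        return "O(n log n)"
    if n <= 100_000_000:
        return "O(n)"
    return "O(log n) or O(1)"
```

Reading constraints and calling this mentally is exactly the reverse-engineering step described above.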
How to Apply This in a Contest
- Read the constraints first. What is the maximum value of n?
- Identify the required complexity from the table above.
- Think of algorithms in that class. What data structures or techniques live at O(n log n)?
- Design your solution backward from the complexity budget.
For example: if n ≤ 200,000 and the time limit is 2 seconds, you have roughly 2 × 10⁸ operations — meaning O(n log n) is safe, but O(n²) (which would be 4 × 10¹⁰) will time out hard.
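That arithmetic is easy to sanity-check in a few lines. The 2 × 10⁸ budget below assumes the ~10⁸ simple operations per second rule of thumb and a 2-second limit:

```python
import math

n = 200_000
budget = 2 * 10**8             # ~10^8 ops/s * 2-second time limit

n_log_n = n * math.log2(n)     # about 3.5 * 10^6, well inside budget
n_squared = n * n              # 4 * 10^10, two orders of magnitude over

assert n_log_n < budget        # O(n log n) fits
assert n_squared > budget      # O(n^2) times out
```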
Hidden Constant Factors
Big-O notation hides constant factors. In practice, these matter:
- An O(n log n) solution with heavy cache misses may be slower than a well-tuned O(n²) solution for small n.
- Python is roughly 50–100x slower than C++ — factor this in. An O(n²) solution might pass in C++ but TLE in Python at the same n.
- Avoid unnecessary memory allocations inside hot loops.
- I/O speed matters. Use `sys.stdin.readline` in Python and `ios_base::sync_with_stdio(false)` in C++.
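A common Python fast-I/O pattern goes one step further and reads all of stdin in a single call, then parses tokens in bulk. This sketch assumes a made-up task (read n, then sum n integers) purely to have something to output:

```python
import sys

def solve(data: bytes) -> str:
    """Parse all input tokens up front instead of line by line."""
    tokens = data.split()
    n = int(tokens[0])
    nums = list(map(int, tokens[1:1 + n]))
    return str(sum(nums))

if __name__ == "__main__":
    # One read() syscall for the whole input, one write() for the output.
    sys.stdout.write(solve(sys.stdin.buffer.read()) + "\n")
```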
Common Pitfalls
Nested Loops That Look Linear
A loop with an inner while that advances its own pointer isn't always O(n²). The two-pointer technique runs in O(n) total because each pointer moves at most n times in total — not n times per outer iteration.
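A concrete instance is the classic sliding-window problem of finding the longest contiguous subarray whose sum stays within a limit (assuming nonnegative values). The inner `while` never moves `left` backward, so both pointers advance at most n times total:

```python
def longest_subarray_at_most(nums, limit):
    """Length of the longest contiguous subarray with sum <= limit.
    Looks like nested loops, but `left` only ever moves forward,
    so the total work across all iterations is O(n)."""
    left = 0
    total = 0
    best = 0
    for right, x in enumerate(nums):
        total += x
        while total > limit:       # shrink window from the left
            total -= nums[left]
            left += 1
        best = max(best, right - left + 1)
    return best
```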
Sorting Bottleneck
Even if your main algorithm is O(n), if you sort inside a loop you may unknowingly have O(n² log n). Always identify where sorting happens.
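A minimal illustration of the fix, using a made-up query task (answer q queries, each asking for the k-th smallest element of the same array): hoist the sort out of the loop, turning O(q · n log n) into O(n log n + q):

```python
def kth_smallest_all(nums, queries):
    """Answer each query k with the k-th smallest element (0-indexed).
    Sorting once up front instead of once per query is the whole trick."""
    ordered = sorted(nums)              # sort ONCE, outside the loop
    return [ordered[k] for k in queries]
```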
Recursion Depth and Stack Overflow
Deep recursion in DFS on large inputs can cause stack overflow. Iterative DFS with an explicit stack avoids this.
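One way to sketch the iterative version, assuming the graph is given as an adjacency list (the name `dfs_order` is illustrative):

```python
def dfs_order(adj, start):
    """Depth-first traversal with an explicit stack instead of recursion,
    so depth is limited by heap memory rather than the call stack."""
    seen = {start}
    order = []
    stack = [start]
    while stack:
        node = stack.pop()
        order.append(node)
        for nxt in adj[node]:
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return order
```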
Practice Drill
Before your next contest, open a problem and cover the solution section. Read only the constraints and time limit. Write down your complexity target and one candidate algorithm. Then compare to the editorial. Do this consistently and your intuition will sharpen rapidly.
The Meta-Skill
Complexity thinking isn't just for contests. It makes you a better engineer: you'll instinctively spot when a proposed solution won't scale, propose better alternatives, and communicate clearly with teammates about performance tradeoffs.