Summary of "Time & Space Complexity - DSA Series by Shradha Ma'am"
This comprehensive lecture covers the fundamental concepts of Time Complexity and Space Complexity in Data Structures and Algorithms (DSA), focusing on their importance, calculation methods, common types, and practical relevance, especially in coding interviews and competitive programming.
Main Ideas and Concepts
1. Introduction to Time and Space Complexity
- Time and Space Complexity measure the efficiency of algorithms.
- Time Complexity refers to the amount of time an algorithm takes as a function of input size.
- Space Complexity refers to the extra memory an algorithm uses relative to input size.
- Both complexities are critical in interviews and coding platforms.
- Focus is mainly on worst-case Time Complexity for real-world relevance.
2. Time Complexity
- Measured by counting the number of operations relative to input size (usually denoted as n).
- Execution time varies by machine and environment, so complexity abstracts away actual time to focus on growth rate.
- Example: Linear Search
- Worst-case Time Complexity is O(n), where n is the array size.
- Best-case is O(1) (target found at the first element).
- Time Complexity is expressed using Big O notation which focuses on the dominant term and ignores constants.
- Steps to simplify Time Complexity:
- Remove constants.
- Keep the largest term.
- Common time complexities:
- O(1) — Constant time (e.g., accessing an element by index).
- O(n) — Linear time (e.g., Linear Search).
- O(n²) — Quadratic time (e.g., Bubble Sort, Selection Sort).
- O(log n) — Logarithmic time (e.g., Binary Search).
- O(n log n) — Linearithmic time (e.g., Merge Sort, Quick Sort average case).
- O(2^n), O(n!) — Exponential and factorial time (e.g., brute-force recursion, permutations).
- Explanation of worst case, best case, and average case scenarios.
- Introduction to amortized Time Complexity (e.g., HashMap operations).
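The linear-search bounds discussed above (worst case O(n), best case O(1)) can be sketched as follows. The lecture does not specify an implementation language, so this is a minimal Python version:

```python
def linear_search(arr, target):
    """Return the index of target in arr, or -1 if absent.

    Worst case O(n): target is at the last position or missing,
    so all n elements are examined.
    Best case O(1): target sits at index 0.
    """
    for i, value in enumerate(arr):
        if value == target:
            return i
    return -1
```

Counting the comparisons, not the wall-clock time, is what lets the analysis ignore machine and environment differences.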
3. Space Complexity
- Measures extra space used by the algorithm excluding input storage.
- Example: creating a new array to store squares of elements results in O(n) Space Complexity.
- Variables and auxiliary space that do not depend on input size contribute O(1) space.
- Recursive algorithms use additional space for the call stack proportional to recursion depth.
- Space Complexity can be visualized similarly to Time Complexity with graphs.
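The O(n)-versus-O(1) auxiliary-space distinction above can be illustrated with the squares example from the lecture; this is a small Python sketch:

```python
def squares(arr):
    # O(n) auxiliary space: allocates a new list with one entry per element
    return [x * x for x in arr]

def sum_of_squares(arr):
    # O(1) auxiliary space: only a single accumulator, regardless of input size
    total = 0
    for x in arr:
        total += x * x
    return total
```

Both functions run in O(n) time; only their extra memory differs, which is exactly what Space Complexity measures once the input itself is excluded.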
4. Common Examples and Calculations
- Linear Search: O(n) time, O(1) space.
- Selection Sort and Bubble Sort: O(n²) time due to nested loops.
- Binary Search: O(log n) time on sorted arrays.
- Merge Sort:
- Time Complexity: O(n log n) due to recursive splitting and merging.
- Space Complexity: O(n) due to temporary arrays used in merging.
- Factorial Calculation (Recursion):
- Time Complexity: O(n) due to n recursive calls.
- Space Complexity: O(n) due to call stack depth.
- Fibonacci (Naive Recursion):
- Time Complexity: O(2^n) due to exponential growth of calls.
- Space Complexity: O(n) due to recursion stack.
- Prime Number Checking (Optimized):
- Time Complexity: O(√n) by checking divisors up to square root of n.
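The optimized prime check above relies on the fact that any composite n has a divisor no larger than √n. A Python sketch of the idea (the lecture's own code may differ):

```python
def is_prime(n):
    # O(sqrt(n)) time: any composite n has a divisor d with d*d <= n,
    # so trial division can stop at the square root
    if n < 2:
        return False
    i = 2
    while i * i <= n:  # integer comparison avoids floating-point sqrt
        if n % i == 0:
            return False
        i += 1
    return True
```

For n = 10^12 this loop runs about 10^6 times instead of 10^12, which is the whole point of the √n bound.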
5. Recursion and Recursion Trees
- Recursion trees help visualize and calculate Time Complexity.
- Total calls multiplied by work per call gives overall Time Complexity.
- Example: factorial recursion has a linear recursion tree.
- Fibonacci naive recursion leads to an exponential recursion tree.
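The recursion-tree rule (total calls × work per call) can be checked empirically. This hypothetical helper returns both the Fibonacci value and the number of calls the naive recursion makes, showing the exponential growth of the tree:

```python
def fib_calls(n):
    """Return (fib(n), total_calls) for the naive recursion.

    Each call does O(1) work and spawns two subcalls, so the call
    count grows roughly like 2^n — the exponential recursion tree.
    """
    if n <= 1:
        return n, 1
    a, calls_a = fib_calls(n - 1)
    b, calls_b = fib_calls(n - 2)
    return a + b, calls_a + calls_b + 1
```

By contrast, factorial recursion makes exactly n calls, so its tree is a single linear chain.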
6. Big O notation and Related Notations
- Big O (O) — Upper bound, worst case.
- Omega (Ω) — Lower bound, best case.
- Theta (Θ) — Tight bound; applies when the upper and lower bounds match (often loosely associated with the average case).
- Other notations like little o and little ω exist but are less common in interviews.
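For reference, the standard textbook definitions behind these notations (stated here as commonly given, not verbatim from the lecture) are:

```latex
\begin{align*}
f(n) = O(g(n)) &\iff \exists\, c > 0,\ n_0 : f(n) \le c\,g(n) \ \text{for all } n \ge n_0 \\
f(n) = \Omega(g(n)) &\iff \exists\, c > 0,\ n_0 : f(n) \ge c\,g(n) \ \text{for all } n \ge n_0 \\
f(n) = \Theta(g(n)) &\iff f(n) = O(g(n)) \ \text{and} \ f(n) = \Omega(g(n))
\end{align*}
```

These definitions make precise why constants and lower-order terms can be dropped: they are absorbed into the constant c and the threshold n₀.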
7. Practical Usage and Constraints
- Coding platforms generally allow about 10^8 operations per second.
- Constraints in problems guide which Time Complexity algorithms are feasible.
- For example, for n ≤ 10^5, O(n log n) or better is usually required.
- For smaller n, O(n²) or even exponential algorithms might be acceptable.
- Understanding constraints helps choose or optimize algorithms accordingly.
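The constraint reasoning above can be made concrete with a rough feasibility check. The 10^8 figure is the judge throughput assumed in the lecture, not an exact limit:

```python
import math

OPS_PER_SECOND = 10 ** 8  # rough throughput assumed in the lecture

def feasible(estimated_ops):
    # an algorithm fits a 1-second limit if it needs at most ~10^8 operations
    return estimated_ops <= OPS_PER_SECOND

n = 10 ** 5
n_log_n = n * math.log2(n)  # ~1.7 * 10^6 operations — comfortably feasible
n_squared = n * n           # 10^10 operations — far too slow
```

So for n ≤ 10^5, an O(n log n) algorithm fits easily while O(n²) does not, which is exactly how constraints guide the choice of algorithm.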
Methodologies and Instructional Points
- Calculating Time Complexity:
- Identify loops and recursive calls.
- Count operations inside loops.
- Use summation formulas for nested loops.
- Simplify by ignoring constants and lower order terms.
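The summation rule for nested loops can be demonstrated directly. In this sketch the inner body runs (n−1) + (n−2) + … + 0 = n(n−1)/2 times, and dropping the constant and lower-order term leaves O(n²):

```python
def pair_count(n):
    # nested loops over all ordered pairs i < j:
    # the inner body executes (n-1) + (n-2) + ... + 0 = n(n-1)/2 times,
    # whose dominant term is n^2/2 -> O(n^2) after removing constants
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            count += 1
    return count
```

This is the same nested-loop shape that makes Bubble Sort and Selection Sort quadratic.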