DEFINE: ALGORITHM & TIME AND SPACE COMPLEXITY OF AN ALGORITHM
DEFINE: ALGORITHM
An algorithm is a set of well-defined, step-by-step instructions or procedures designed to perform a specific task or solve a particular problem.
Characteristics of Algorithms:
1. Finite - An algorithm must complete in a limited number of steps.
2. Well-Defined - Each step should be clear and unambiguous.
3. Effective - Each operation is feasible and can be carried out within a practical timeframe.
4. Input - It should accept zero or more inputs to work with.
5. Output - It should produce at least one output (result).
Types of Algorithms:
Sorting Algorithms: Organize data in a certain order (e.g., quicksort, mergesort, bubblesort).
Search Algorithms: Locate specific data within a structure (e.g., binary search, linear search).
Graph Algorithms: Solve problems related to graphs, like finding the shortest path (e.g., Dijkstra’s algorithm, BFS, DFS).
Dynamic Programming Algorithms: Solve problems by breaking them down into overlapping sub-problems (e.g., Fibonacci sequence, knapsack problem).
Greedy Algorithms: Make the best choice at each step with the hope of finding a global optimum (e.g., Kruskal’s algorithm for minimum spanning tree).
Backtracking Algorithms: Use a trial-and-error approach to solve constraint satisfaction problems (e.g., n-queens, subset sum).
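To make one of the types above concrete, here is a minimal sketch of binary search in Python (it assumes the input list is already sorted in ascending order):

```python
def binary_search(items, target):
    """Return the index of target in the sorted list items, or -1 if absent."""
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2          # middle of the remaining range
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            low = mid + 1                # target is in the right half
        else:
            high = mid - 1               # target is in the left half
    return -1

print(binary_search([2, 5, 8, 12, 16, 23], 12))  # prints 3
```

Each iteration halves the search range, which is why binary search runs in O(log n) time, as discussed below.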
TIME AND SPACE COMPLEXITY OF AN ALGORITHM
Measuring time and space complexity of an algorithm helps us understand how efficiently it performs, especially as the size of input data grows. Here’s a simple way to think about each:
1. Time Complexity (How fast an algorithm runs)
Goal: Measure how the runtime of an algorithm changes as the input size increases.
Big O Notation (O): We express time complexity in terms of "Big O" notation, which describes the upper bound of the algorithm’s growth rate. Common time complexities include:
O(1): Constant time (e.g., accessing an array element).
O(log n): Logarithmic time (e.g., binary search).
O(n): Linear time (e.g., looping through an array).
O(n^2): Quadratic time (e.g., nested loops).
O(2^n): Exponential time (e.g., some recursive solutions).
Example: For a list of `n` numbers, if an algorithm loops through the list once, its time complexity is O(n). If it has nested loops (like checking pairs), it might be O(n^2).
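The difference between O(n) and O(n^2) is easy to see in code. Below is an illustrative sketch (the function names and the duplicate-finding task are just examples chosen for this note):

```python
def contains_duplicate_quadratic(nums):
    """O(n^2) time: nested loops compare every pair of elements."""
    for i in range(len(nums)):
        for j in range(i + 1, len(nums)):
            if nums[i] == nums[j]:
                return True
    return False

def contains_duplicate_linear(nums):
    """O(n) time: one pass, using a set to remember what we've seen."""
    seen = set()
    for x in nums:
        if x in seen:
            return True
        seen.add(x)
    return False
```

Note the trade-off: the linear version is faster but uses O(n) extra space for the set, which leads into the next topic.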
2. Space Complexity (How much memory an algorithm uses)
Goal: Measure how much extra memory (space) the algorithm requires as input size increases.
Like time complexity, we use Big O notation to describe space complexity.
O(1): Constant space (memory use does not grow with input size, e.g., a few variables).
O(n): Linear space (extra memory grows linearly with input, e.g., storing an array of size `n`).
Example: An algorithm that sorts a list in place (like heapsort) uses O(1) extra space; quicksort also sorts in place, but its recursion still needs O(log n) stack space on average. An algorithm that creates a new list to store the sorted elements, as merge sort does, uses O(n) extra space.
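A small sketch contrasting O(1) and O(n) extra space, using list reversal as the task (the function names are illustrative):

```python
def reverse_in_place(nums):
    """O(1) extra space: swap elements with two pointers, no new list."""
    left, right = 0, len(nums) - 1
    while left < right:
        nums[left], nums[right] = nums[right], nums[left]
        left += 1
        right -= 1
    return nums

def reverse_copy(nums):
    """O(n) extra space: build and return a brand-new reversed list."""
    return [nums[i] for i in range(len(nums) - 1, -1, -1)]
```

Both produce the same result, but the first only ever holds two index variables, while the second allocates a whole new list of size `n`.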
By analyzing time and space complexity, you can predict how well an algorithm will scale as data size grows!