# DSA Fundamentals: Time Complexity, Space Complexity, Recursion

**Time and Space Complexity:**

Time and space complexity are measures used to analyze algorithms' efficiency in terms of resources consumed. Time complexity represents the amount of time an algorithm takes to complete as a function of the input size, while space complexity represents the amount of memory space an algorithm requires. Big O notation is a standardized way to express and compare these complexities.

Key Concepts:

- Time complexity:
- Time complexity measures the amount of time an algorithm takes to complete as a function of input size.
- Example: O(n²) indicates that the time taken by the algorithm grows quadratically with the input size.
- Space complexity:
- Space complexity measures the amount of memory space an algorithm requires as a function of input size.
- Example: O(n) indicates that the space required by the algorithm grows linearly with the input size.
- Big O Notation:
- Big O notation is used to describe the upper bound or worst-case scenario of the time or space complexity of an algorithm.
- Example: O(n) represents linear complexity, O(log n) represents logarithmic complexity, and O(1) reflects constant complexity.
- Best, Worst, and Average Cases:
- Algorithms may have different time and space complexities for best-case, worst-case, and average-case scenarios.
- Example: Quicksort has an average-case time complexity of O(n log n) but a worst-case time complexity of O(n²).
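To make these growth rates concrete, the following sketch counts the basic operations performed by a linear and a quadratic loop (the function names are illustrative, not part of any standard library):

```python
def count_linear_ops(n):
    # One operation per element: O(n)
    ops = 0
    for _ in range(n):
        ops += 1
    return ops

def count_quadratic_ops(n):
    # One operation per pair of elements: O(n^2)
    ops = 0
    for _ in range(n):
        for _ in range(n):
            ops += 1
    return ops

for n in (10, 100):
    print(n, count_linear_ops(n), count_quadratic_ops(n))
```

Doubling n doubles the linear count but quadruples the quadratic one, which is exactly what the Big O expressions predict.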

Understanding Time Complexity:

- Constant Time (O(1)):
- Algorithms with a constant complexity have execution times that do not depend on input size.
- Example (Python code):

```python
def constant_time_algorithm(array):
    return array[0]
```

- Linear Time (O(n)):
- Algorithms with linear time complexity have execution times that grow linearly with input size.
- Example (Python code):

```python
def linear_time_algorithm(array):
    for element in array:
        print(element)
```

- Logarithmic Time (O(log n)):
- Algorithms with logarithmic time complexity have execution times that grow logarithmically with input size.
- Example (Python code):

```python
def binary_search(sorted_array, target):
    # Binary search: halves the search range on each step
    low, high = 0, len(sorted_array) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_array[mid] == target:
            return mid
        elif sorted_array[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1
```

- Quadratic Time (O(n²)):

- Algorithms with quadratic time complexity have execution times that grow quadratically with input size.
- Example (Python code):

```python
def quadratic_time_algorithm(array):
    for i in array:
        for j in array:
            print(i, j)
```

Understanding Space Complexity:

- Constant Space (O(1)):
- Algorithms with constant space complexity use a fixed amount of memory regardless of input size.
- Example (Python code):

```python
def constant_space_algorithm():
    variable = 5
    return variable
```

- Linear Space (O(n)):
- Algorithms with linear space complexity use an amount of memory that scales linearly with the input size.
- Example (Python code):

```python
def linear_space_algorithm(array):
    result = []
    for element in array:
        result.append(element)
    return result
```

- Quadratic Space (O(n²)):

- Algorithms with quadratic space complexity use an amount of memory that grows quadratically with the input size.
- Example (Python code):

```python
def quadratic_space_algorithm(array):
    result = []
    for i in array:
        for j in array:
            result.append((i, j))
    return result
```

Note: All the code examples above are written in Python.

**Recursion:**

From Wikipedia,

In computer science, recursion is a method of solving a computational problem where the solution depends on solutions to smaller instances of the same problem. Recursion solves such recursive problems by using functions that call themselves from within their own code. The approach can be applied to many types of problems, and recursion is one of the central ideas of computer science.

Key Concepts:

- Base Case:
- A base case is the condition that stops the recursive calls, preventing infinite recursion.
- Example: In a factorial calculation, the base case is often factorial(0) = 1.
- Recursive Call:
- A recursive call involves invoking the same function within itself to solve a smaller instance of the problem.
- Example: In the factorial example, factorial(n) = n * factorial(n-1) is a recursive call.
- Stack Memory:
- Recursive calls utilize the call stack, where each function call creates a new frame on the stack.
- Example: Visualizing stack memory can help understand the sequence of recursive calls.
- Indirect Recursion:
- Indirect recursion occurs when two or more functions call each other in a circular manner.
- Example: Function A calls function B, which, in turn, calls function A.
- Tail Recursion:
- Definition: Tail recursion is a special form where the recursive call is the last operation in the function.
- Example: return tail_recursive_function(n-1, accumulator * n) is tail recursion, whereas return n + tail_recursive_function(n-1) is not, because the addition still has to happen after the recursive call returns.
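The last two concepts can be sketched briefly (the function names here are illustrative): `is_even` and `is_odd` call each other indirectly, and `factorial_tail` carries its running product in an accumulator so the recursive call is the final operation.

```python
def is_even(n):
    # Indirect recursion: is_even calls is_odd, which calls is_even
    return True if n == 0 else is_odd(n - 1)

def is_odd(n):
    return False if n == 0 else is_even(n - 1)

def factorial_tail(n, accumulator=1):
    # Tail recursion: the recursive call is the last operation;
    # the running product is passed along instead of multiplied afterwards.
    if n == 0:
        return accumulator
    return factorial_tail(n - 1, accumulator * n)
```

Note that CPython does not perform tail-call optimization, so this form does not actually save stack frames in Python; the distinction matters in languages whose compilers optimize tail calls.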

Recursive Problem-Solving:

- Factorial calculation:
- Example (Python code):

```python
def factorial(n):
    if n == 0:
        return 1
    else:
        return n * factorial(n-1)
```

- Fibonacci sequence:
- Example (Python code):

```python
def fibonacci(n):
    if n <= 1:
        return n
    else:
        return fibonacci(n-1) + fibonacci(n-2)
```

- Binary search (recursive):
- Example (Python code):

```python
def binary_search(arr, target, low, high):
    if low <= high:
        mid = (low + high) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            return binary_search(arr, target, mid + 1, high)
        else:
            return binary_search(arr, target, low, mid - 1)
    else:
        return -1
```

Understanding:

- Base Case: Defines the condition to stop recursion and return a result.
- Recursive Call: Invokes the same function within itself to solve smaller instances of the problem.
- Stack Memory: Recursive calls utilize the call stack, which can lead to a stack overflow if not managed properly.
- Indirect Recursion: Multiple functions call each other in a circular manner to solve a problem.
- Tail Recursion: A special form where the recursive call is the last operation in the function.

**Searching and Sorting:**

Searching and sorting are fundamental operations in computer science and programming. Searching involves finding a specific element within a collection, while sorting rearranges elements in a specific order. Knowledge of common searching and sorting algorithms is essential for efficient data manipulation and retrieval.

Common Search Algorithms:

- Linear search:
- Linear search sequentially checks each element in a collection until a match is found or the entire collection is traversed.
- Example (Python code):

```python
def linear_search(arr, target):
    for i in range(len(arr)):
        if arr[i] == target:
            return i
    return -1
```

- Binary Search:
- Binary search is a divide-and-conquer algorithm for finding a target in a sorted collection by repeatedly dividing the search range.
- Example (Python code):

```python
def binary_search(arr, target):
    low, high = 0, len(arr) - 1
    while low <= high:
        mid = (low + high) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1
```

Common Sorting Algorithms:

- Bubble Sort:
- Bubble sort is a simple sorting algorithm that repeatedly steps through the input list element by element, comparing the current element with the one after it, swapping their values if needed. These passes through the list are repeated until no swaps have to be performed during a pass, meaning that the list has become fully sorted. The algorithm, which is a comparison sort, is named for the way the larger elements "bubble" up to the top of the list.
- Example (Python code):

```python
def bubble_sort(arr):
    n = len(arr)
    for i in range(n - 1):
        for j in range(0, n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
```

- Insertion Sort:
- Insertion sort is a simple sorting algorithm that builds the final sorted array (or list) one item at a time by comparisons. It is much less efficient on large lists than more advanced algorithms such as quicksort, heapsort, or merge sort.

- Example (Python code):

```python
def insertion_sort(arr):
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        while j >= 0 and key < arr[j]:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key
```

Understanding:

- Linear Search:
- Sequentially checks elements until a match is found or the entire collection is traversed.
- Time complexity: O(n).

- Binary Search:
- Efficiently finds a target in a sorted collection using a divide-and-conquer approach.
- Time complexity: O(log n).
- Bubble Sort:
- Repeatedly compares and swaps adjacent elements until the entire list is sorted.
- Time complexity: O(n²).
- Insertion Sort:
- Builds a sorted sequence one element at a time by inserting elements into the correct position.
- Time complexity: O(n²).

**Bit Manipulation:**

From Wikipedia,

Bit manipulation is the act of algorithmically manipulating bits or other pieces of data shorter than a word. Computer programming tasks that require bit manipulation include low-level device control, error detection and correction algorithms, data compression, encryption algorithms, and optimization. For most other tasks, modern programming languages allow the programmer to work directly with abstractions instead of bits that represent those abstractions.

Basic Bit Manipulation Operations:

- Bitwise AND (&):
- Performs a bitwise AND operation between the corresponding bits of two numbers.
- Example: `result = num1 & num2`

- Bitwise OR (|):
- Performs a bitwise OR operation between the corresponding bits of two numbers.
- Example: `result = num1 | num2`

- Bitwise XOR (^):
- Performs a bitwise XOR (exclusive OR) operation between the corresponding bits of two numbers.
- Example: `result = num1 ^ num2`

- Bitwise NOT (~):
- Inverts the bits of a number, changing 0s to 1s and vice versa.
- Example: `result = ~num`

- Left Shift (<<):
- Shifts the bits of a number to the left by a specified number of positions, filling with zeros.
- Example: `result = num << 2`

- Right Shift (>>):
- Shifts the bits of a number to the right by a specified number of positions, filling with zeros (for unsigned numbers) or copies of the sign bit (for signed numbers).
- Example: `result = num >> 3`
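The operators above are easiest to see on concrete values (a minimal sketch; the variable names are illustrative):

```python
a, b = 0b1100, 0b1010  # 12 and 10

print(bin(a & b))   # bits set in both -> 0b1000 (8)
print(bin(a | b))   # bits set in either -> 0b1110 (14)
print(bin(a ^ b))   # bits set in exactly one -> 0b110 (6)
print(~a)           # -13: Python ints satisfy ~x == -x - 1
print(bin(a << 2))  # shift left by 2 multiplies by 4 -> 0b110000 (48)
print(bin(a >> 3))  # shift right by 3 floor-divides by 8 -> 0b1 (1)
```

Note that Python integers are arbitrary-precision and signed, so `~` and `>>` behave like operations on an infinite two's-complement representation rather than on a fixed-width word.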

Applications:

- Bitwise Flags:
- Using individual bits as flags to represent various states or options in a compact manner.
- Optimizations:
- Certain algorithms and data structures can be optimized using bitwise operations for memory efficiency.
- Data compression:
- Techniques like Huffman coding use bit manipulation for data compression.
- Cryptography:
- Some cryptographic algorithms involve bit-level operations for encryption and decryption.
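The bitwise-flags application from the list above can be sketched as follows (the flag names and values are made up for illustration):

```python
# Each option gets its own bit
FLAG_READ  = 1 << 0  # 0b001
FLAG_WRITE = 1 << 1  # 0b010
FLAG_EXEC  = 1 << 2  # 0b100

permissions = FLAG_READ | FLAG_WRITE        # set two flags with OR
can_write = bool(permissions & FLAG_WRITE)  # test a flag with AND
permissions &= ~FLAG_WRITE                  # clear a flag with AND NOT
print(bin(permissions))  # only the read flag remains set
```

This packs many boolean options into a single integer, which is why the pattern appears in file permission bits and option masks.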

Understanding:

- Bitwise AND, OR, XOR:
- Perform operations between the corresponding bits of two numbers.
- Bitwise NOT:
- Inverts a number's bits, changing 0s to 1s and vice versa.
- Left Shift:
- Shifts a number's bits to the left by a specified number of positions.
- Right Shift:
- Shifts bits of a number to the right by a specified number of positions.
