- The above algorithm’s <b>time complexity</b> is exponential `O(2ⁿ)`, where `n` represents the total number of items. This can also be confirmed from the above recursion tree: it contains a total of `31` 😲 recursive calls, calculated as `2ⁿ + 2ⁿ - 1` (here `n = 4`), which is <i>asymptotically</i> equivalent to `O(2ⁿ)`.
- The <b>space complexity</b> is `O(n)`. This space will be used to store the recursion stack. The recursive algorithm works in a depth-first fashion, which means that we can’t have more than `n` recursive calls on the call stack at any time.
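To make the exponential blow-up concrete, here is a minimal sketch of such a brute-force recursion; the `solveKnapsack` name, its parameters, and the sample input are illustrative assumptions, not taken verbatim from the solution above:

```js
// Brute-force 0/1 Knapsack: at every index we either skip the item
// or (if it fits) include it, and return the better of the two profits.
function solveKnapsack(profits, weights, capacity, currIndex = 0) {
  // base case: no capacity left, or no items left to consider
  if (capacity <= 0 || currIndex >= profits.length) return 0;

  // option 1: include the current item, if its weight does not exceed the capacity
  let profitWithCurrent = 0;
  if (weights[currIndex] <= capacity) {
    profitWithCurrent =
      profits[currIndex] +
      solveKnapsack(profits, weights, capacity - weights[currIndex], currIndex + 1);
  }

  // option 2: skip the current item
  const profitWithoutCurrent = solveKnapsack(profits, weights, capacity, currIndex + 1);

  return Math.max(profitWithCurrent, profitWithoutCurrent);
}

console.log(solveKnapsack([1, 6, 10, 16], [1, 2, 3, 5], 7)); // 22 (items with weights 2 and 5)
```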
### Overlapping Sub-problems
#### Time & Space Complexity
- Since our <b>Memoization</b> array `memo[profits.length][capacity+1]` stores the results for all subproblems, we can conclude that we will not have more than `N*C` subproblems (where `N` is the number of items and `C` is the knapsack capacity). This means that our <b>time complexity</b> will be `O(N*C)`.
- The above algorithm will use `O(N*C)` space for the <b>Memoization</b> array. Other than that, we will use `O(N)` space for the recursion call-stack. So the total <b>space complexity</b> will be `O(N*C + N)`, which is <i>asymptotically</i> equivalent to `O(N*C)`.
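A possible top-down version consistent with this analysis, where the cache is a two-dimensional `memo` array indexed by the item index and the remaining capacity (function and variable names are illustrative):

```js
// Top-down 0/1 Knapsack with memoization: memo[currIndex][remaining]
// caches the best profit achievable from `currIndex` onward with the
// given remaining capacity, so each subproblem is solved only once.
function solveKnapsack(profits, weights, capacity) {
  const memo = Array.from({ length: profits.length }, () =>
    Array(capacity + 1).fill(null)
  );

  function recurse(remaining, currIndex) {
    if (remaining <= 0 || currIndex >= profits.length) return 0;
    if (memo[currIndex][remaining] !== null) return memo[currIndex][remaining];

    // option 1: include the current item, if it fits
    let profit1 = 0;
    if (weights[currIndex] <= remaining) {
      profit1 =
        profits[currIndex] + recurse(remaining - weights[currIndex], currIndex + 1);
    }

    // option 2: skip the current item
    const profit2 = recurse(remaining, currIndex + 1);

    memo[currIndex][remaining] = Math.max(profit1, profit2);
    return memo[currIndex][remaining];
  }

  return recurse(capacity, 0);
}

console.log(solveKnapsack([1, 6, 10, 16], [1, 2, 3, 5], 7)); // 22
```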
### Bottom-up Dynamic Programming
#### What is the time and space complexity of the above solution?
- Since our <i>memoization</i> array `dp[profits.length][capacity+1]` stores the results for all the subproblems, we can conclude that we will not have more than `N*C` subproblems (where `N` is the number of items and `C` is the knapsack capacity). This means that our <b>time complexity</b> will be `O(N*C)`.
- The above algorithm will use `O(N*C)` space for the <i>memoization</i> array. Other than that, we will use `O(N)` space for the recursion call-stack. So the total <b>space complexity</b> will be `O(N*C + N)`, which is <i>asymptotically</i> equivalent to `O(N*C)`.
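The cache does not have to be a two-dimensional array; a plain object keyed by a string such as `currIndex + '|' + capacity` gives the same `O(N*C)` bounds while only storing the subproblems that are actually reached. A sketch of that variation (names and sample input are illustrative):

```js
// Memoized 0/1 Knapsack using an object as the cache instead of a 2D array.
// The asymptotic bounds stay O(N*C) time and O(N*C + N) space.
function solveKnapsack(profits, weights, capacity) {
  const memo = {};

  function recurse(remaining, currIndex) {
    if (remaining <= 0 || currIndex >= profits.length) return 0;

    const key = `${currIndex}|${remaining}`;
    if (key in memo) return memo[key];

    // option 1: include the current item, if it fits
    let profit1 = 0;
    if (weights[currIndex] <= remaining) {
      profit1 =
        profits[currIndex] + recurse(remaining - weights[currIndex], currIndex + 1);
    }

    // option 2: skip the current item
    const profit2 = recurse(remaining, currIndex + 1);

    memo[key] = Math.max(profit1, profit2);
    return memo[key];
  }

  return recurse(capacity, 0);
}

console.log(solveKnapsack([1, 6, 10, 16], [1, 2, 3, 5], 7)); // 22
```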
### Bottom-up Dynamic Programming
- Since our <b>memoization</b> array `dp[str.length][str.length]` stores the results for all the <i>subproblems</i>, we can conclude that we will not have more than `N*N` <i>subproblems</i> (where `N` is the length of the input <i>sequence</i>). This means that our <b>time complexity</b> will be `O(N²)`.
- The above algorithm will use `O(N²)` <b>space</b> for the <b>memoization</b> array. Other than that, we will use `O(N)` <b>space</b> for the <i>recursion call-stack</i>. So the total <b>space complexity</b> will be `O(N² + N)`, which is <i>asymptotically</i> equivalent to `O(N²)`.
### Bottom-up Dynamic Programming
Since we want to try all the <b>subsequences</b> of the given <i>sequence</i>, we can use a two-dimensional array to store our results. We can start from the beginning of the <i>sequence</i> and keep adding one element at a time. At every step, we will try all of its <b>subsequences</b>. So for every `startIndex` and `endIndex` in the given string, we will choose one of the following two options:
A <b>basic brute-force solution</b> could be to try all the <i>subsequences</i> of the given number sequence. We can process one number at a time, so we have two options at any step:
1. If the current number is greater than the previous number that we included, we can <i>increment our count</i> and make a <i>recursive call</i> for the remaining array.
2. We can skip the current number to make a <i>recursive call</i> for the remaining array.
The length of the <b>longest increasing subsequence</b> will be the maximum number returned by the two recursive calls from the above two options.
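A minimal sketch of this brute-force recursion, passing along the index of the previously included number so we know when the count may be incremented (the `findLISLength` name and the sample array are illustrative assumptions):

```js
// Brute-force Longest Increasing Subsequence: at every index we either
// include the current number (if it is bigger than the previous one we
// took) or skip it, and return the larger of the two lengths.
function findLISLength(nums) {
  function recurse(currIndex, prevIndex) {
    if (currIndex === nums.length) return 0;

    // option 1: include nums[currIndex] if it extends the increasing subsequence
    let taken = 0;
    if (prevIndex === -1 || nums[currIndex] > nums[prevIndex]) {
      taken = 1 + recurse(currIndex + 1, currIndex);
    }

    // option 2: skip nums[currIndex]
    const skipped = recurse(currIndex + 1, prevIndex);

    return Math.max(taken, skipped);
  }

  return recurse(0, -1);
}

console.log(findLISLength([4, 2, 3, 6, 10, 1, 12])); // 5 -> [2, 3, 6, 10, 12]
```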
- The <b>time complexity</b> of the above algorithm is exponential `O(2ⁿ)`, where `n` is the length of the input array.
- The <b>space complexity</b> is `O(n)`, which is used to store the <i>recursion stack</i>.
### Top-down Dynamic Programming with Memoization
To overcome the <i>overlapping subproblems</i>, we can use an array to store the already solved <i>subproblems</i>.
The two changing values for our <i>recursive function</i> are the `currIndex` and the `prevIndex`. Therefore, we can store the results of all <i>subproblems</i> in a two-dimensional array. (Another alternative could be to use a <i>hash-table</i> whose key would be the string `currIndex + "|" + prevIndex`.)
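A possible memoized version of that recursion, using a two-dimensional `dp` array indexed by `currIndex` and `prevIndex + 1` so that the initial `-1` (no previous element) case also gets a slot (names and sample input are illustrative):

```js
// Top-down LIS with memoization: dp[currIndex][prevIndex + 1] caches the
// LIS length computed from `currIndex` onward for a given previous index.
function findLISLength(nums) {
  const dp = Array.from({ length: nums.length }, () =>
    Array(nums.length + 1).fill(-1)
  );

  function recurse(currIndex, prevIndex) {
    if (currIndex === nums.length) return 0;
    if (dp[currIndex][prevIndex + 1] !== -1) return dp[currIndex][prevIndex + 1];

    // option 1: include the current number if it keeps the subsequence increasing
    let taken = 0;
    if (prevIndex === -1 || nums[currIndex] > nums[prevIndex]) {
      taken = 1 + recurse(currIndex + 1, currIndex);
    }

    // option 2: skip the current number
    const skipped = recurse(currIndex + 1, prevIndex);

    dp[currIndex][prevIndex + 1] = Math.max(taken, skipped);
    return dp[currIndex][prevIndex + 1];
  }

  return recurse(0, -1);
}

console.log(findLISLength([4, 2, 3, 6, 10, 1, 12])); // 5
```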
- Since our <i>memoization</i> array `dp[nums.length][nums.length]` stores the results for all the <i>subproblems</i>, we can conclude that we will not have more than `N*N` <i>subproblems</i> (where `N` is the length of the input sequence). This means that our <b>time complexity</b> will be `O(N²)`.
- The above algorithm will use `O(N²)` <b>space</b> for the <i>memoization array</i>. Other than that, we will use `O(N)` <b>space</b> for the <i>recursion call-stack</i>. So the total <b>space complexity</b> will be `O(N² + N)`, which is <i>asymptotically</i> equivalent to `O(N²)`.
### Bottom-up Dynamic Programming
The above algorithm tells us two things:
1. If the number at the `currIndex` is bigger than the number at the `prevIndex`, we increment the count for <b>LIS</b> up to the `currIndex`.
2. But if there is a bigger <b>LIS</b> without including the number at the `currIndex`, we take that.
So we need to find all the <i>increasing subsequences</i> for the number at index `i`, from all the previous numbers (i.e., numbers till index `i-1`), to eventually find the <i>longest increasing subsequence</i>.
If `i` represents the `currIndex` and `j` represents the `prevIndex`, our <i>recursive formula</i> would look like:
```js
if num[i] > num[j] => dp[i] = dp[j] + 1, if there is no bigger LIS for 'i'
```
Here is the code for our <b>bottom-up dynamic programming approach</b>:
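A sketch consistent with the recursive formula above, where `dp[i]` stores the length of the LIS ending at index `i` (variable names and the sample input are illustrative):

```js
// Bottom-up LIS: dp[i] holds the length of the longest increasing
// subsequence that ends at index i. Every dp[i] starts at 1 (the number
// by itself), and we try to extend it from every smaller, earlier number.
function findLISLength(nums) {
  if (nums.length === 0) return 0;

  const dp = Array(nums.length).fill(1);
  let maxLength = 1;

  for (let i = 1; i < nums.length; i++) {
    for (let j = 0; j < i; j++) {
      // if num[i] > num[j] => dp[i] = dp[j] + 1, if there is no bigger LIS for 'i'
      if (nums[i] > nums[j] && dp[i] < dp[j] + 1) {
        dp[i] = dp[j] + 1;
      }
    }
    maxLength = Math.max(maxLength, dp[i]);
  }

  return maxLength;
}

console.log(findLISLength([4, 2, 3, 6, 10, 1, 12])); // 5
```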