Posts

[Algorithm] Dynamic Programming-2 (DAG shortest path)

[Figure: example DAG]
1. Shortest paths in DAGs

The graph above is a DAG (directed acyclic graph), so we can linearize it, i.e. put its nodes in topological order. Let dist(X) be the minimum distance from S to X. To calculate dist(D), comparing dist(B)+1 and dist(C)+3 is enough, because B and C are the only two predecessors of D. Thus dist(D) = min{dist(B)+1, dist(C)+3}. This means that if we compute the dist values in left-to-right (topological) order, then by the time we get to a node v we already have all the information we need to compute dist(v).

Implementation in Python:

from collections import defaultdict, deque

def shortest_path(graph, start):
    # Step 1: compute a topological order of the graph
    visited = defaultdict(bool)
    stack = deque()

    def topological_sort(node):
        visited[node] = True
        for neighbour in graph[node]:
            if not visited[neighbour]:
                topological_sort(neighbour)
        stack.appendleft(node)

    for node in graph:
        if not visited[node]:
            ...
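For completeness, here is a minimal, self-contained sketch of the whole procedure (topological sort, then relaxing dist values in that order). The (neighbour, weight) adjacency format and the example graph are assumptions for illustration; only the edges B→D (weight 1) and C→D (weight 3) come from the text above.

from collections import defaultdict, deque

def dag_shortest_paths(graph, start):
    # graph: node -> list of (neighbour, weight) pairs; every node is assumed
    # to appear as a key (with an empty list if it has no outgoing edges).
    visited = defaultdict(bool)
    order = deque()

    def topological_sort(node):
        visited[node] = True
        for neighbour, _ in graph[node]:
            if not visited[neighbour]:
                topological_sort(neighbour)
        order.appendleft(node)  # prepend, so the deque ends up in topological order

    for node in graph:
        if not visited[node]:
            topological_sort(node)

    # Step 2: visit nodes in topological order and relax each outgoing edge once,
    # so dist(v) is already final by the time v is processed.
    dist = {node: float('inf') for node in graph}
    dist[start] = 0
    for node in order:
        if dist[node] == float('inf'):
            continue  # unreachable from start
        for neighbour, weight in graph[node]:
            if dist[node] + weight < dist[neighbour]:
                dist[neighbour] = dist[node] + weight
    return dist

# Hypothetical example graph: only B->D (weight 1) and C->D (weight 3) are from the text.
example = {
    'S': [('A', 1), ('C', 2)],
    'A': [('B', 6)],
    'B': [('D', 1)],
    'C': [('A', 4), ('D', 3)],
    'D': [],
}
print(dag_shortest_paths(example, 'S'))  # {'S': 0, 'A': 1, 'B': 7, 'C': 2, 'D': 5}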

[Algorithm] Dynamic Programming-1 (Definition, Fibonacci)

Dynamic programming is a method for solving large problems by breaking them down into smaller subproblems. By reusing the solution to each subproblem instead of recomputing it, you can solve the overall problem much faster. This approach typically works well when a problem has two characteristics:

- Optimal substructure: the optimal solution to the overall problem can be constructed from the optimal solutions to its subproblems.
- Overlapping subproblems: the same subproblems recur many times.

Let's look at the Fibonacci sequence. When computing the n-th term of the sequence, recursion is a very easy approach:

def fibonacci(n):
    if n <= 2:
        return 1
    else:
        return fibonacci(n - 1) + fibonacci(n - 2)

In this code there are too many repeated calculations. For example, take fibonacci(5). When you compute fibonacci(5), you go through the process described above. From the diagram, you can see that op...
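The dynamic-programming fix is to store each subproblem's answer so it is computed only once. A minimal sketch, using a memo dictionary for the top-down version and a simple loop for the bottom-up version (illustrative code, not necessarily the post's own):

def fibonacci_memo(n, memo=None):
    # Top-down: cache each answer so overlapping subproblems are solved only once.
    if memo is None:
        memo = {}
    if n <= 2:
        return 1
    if n not in memo:
        memo[n] = fibonacci_memo(n - 1, memo) + fibonacci_memo(n - 2, memo)
    return memo[n]

def fibonacci_bottom_up(n):
    # Bottom-up: build answers from the smallest subproblems upward.
    if n <= 2:
        return 1
    prev, curr = 1, 1
    for _ in range(n - 2):
        prev, curr = curr, prev + curr
    return curr

print(fibonacci_memo(50))       # 12586269025
print(fibonacci_bottom_up(50))  # 12586269025

Either version turns the exponential recursion into linear time, because each value fibonacci(k) is computed exactly once.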