Dynamic Programming is a technique that trades space for time. It stores computed results (which might be needed again in the future) in memory, thus avoiding the need to recompute them every time.
An example of a problem that can be solved with this technique is the Longest Common Subsequence (LCS) problem.
Say we want to calculate the length of the LCS of two strings.
LCS (“ABCD”,”BD”) = LCS(“ABC”,”B”) + “D”
Note that if the last characters match (“D” here), we compute the LCS of the remaining portions and append this character. If the last characters do not match, then we remove the last character from one of the strings and match what remains against the other, taking whichever choice gives the better result.
To find the length, we can write it as a recursive function
LCS (“ABCD”,”BD”) = 1 + LCS (“ABC”,”B”) (since last chars match)
LCS(“ABCD”,”BDM”) = max( LCS(“ABC”,”BDM”), LCS(“ABCD”,”BD”) ) (since last chars don’t match)
The recursion makes a lot of calls, and most of them have identical arguments. Note that LCS can be called with at most (M+1)(N+1) different arguments, since each argument is a prefix of one of the two strings (of lengths M and N). That is, there are only polynomially many subproblems, so we can apply the dynamic programming technique.
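As a rough sketch of this idea (in Python; the function names and the use of functools.lru_cache as the memoization cache are illustrative choices, not the only way to do it), the recurrence above can be written top-down with memoization:

```python
# Minimal sketch of the LCS-length recurrence with memoization (top-down DP).
from functools import lru_cache

def lcs_length(a, b):
    @lru_cache(maxsize=None)
    def lcs(i, j):
        # lcs(i, j) = length of the LCS of a[:i] and b[:j]
        if i == 0 or j == 0:
            return 0
        if a[i - 1] == b[j - 1]:
            # last characters match: take them and recurse on the rest
            return 1 + lcs(i - 1, j - 1)
        # last characters differ: drop the last character of one string or the other
        return max(lcs(i - 1, j), lcs(i, j - 1))

    return lcs(len(a), len(b))

print(lcs_length("ABCD", "BD"))   # 2 ("BD")
print(lcs_length("ABCD", "BDM"))  # 2
```

Because the cache stores each (i, j) result, every subproblem is computed only once, which is exactly the space-for-time trade mentioned above.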
For dynamic programming problems, the principle of optimality must be satisfied. It states that for the global problem to be solved optimally, each subproblem must be solved optimally. (Not all optimization problems satisfy this; sometimes it is better to lose a little on one subproblem to make a big gain on another.) In addition, there must be only polynomially many subproblems.
Other problems that can be solved using the DP technique are listed below; a small memoized Fibonacci sketch follows the list.
Fibonacci series
Combinations nCr
All-pairs shortest paths
Matrix chain multiplication
Regular expression matching
Longest ascending subsequence
Reliable machine design
Independent set of a graph
Edit distance (insert, delete, replace)
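For instance, Fibonacci with memoization (a hedged sketch in Python): the naive recursion recomputes the same subproblems exponentially often, while caching each value makes the whole computation linear in n.

```python
# Sketch: Fibonacci with memoization. Without the cache, fib(n) recomputes
# the same subproblems exponentially often; with it, each value is computed once.
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(50))  # 12586269025
```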
May 12, 2009 at 6:08 pm
Can you give a comparative study between the dynamic programming, greedy, and divide-and-conquer methods?
May 13, 2009 at 9:49 am
Divide and Conquer (e.g. Binary Search, MergeSort, QuickSort)
The problem is divided into smaller pieces (divide), the smaller parts are solved recursively (conquer), and the solutions are combined.
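A small sketch of this divide/conquer/combine pattern (MergeSort in Python; the structure is the point, not the exact code):

```python
# Sketch of divide and conquer: MergeSort.
def merge_sort(xs):
    if len(xs) <= 1:                 # base case: already sorted
        return xs
    mid = len(xs) // 2
    left = merge_sort(xs[:mid])      # divide, then conquer each half recursively
    right = merge_sort(xs[mid:])
    return merge(left, right)        # combine the two sorted halves

def merge(left, right):
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out

print(merge_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```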
Backtracking (e.g. DFS, 8 Queens, Permutation Generation, Subset Generation)
You consider all possible choices for solving a problem and recursively solve the subproblems under the assumption that each choice is taken. The set of recursive calls generates a tree in which the different sets of choices are explored one after another.
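A small sketch of that pattern, using subset generation (one of the examples above); for each element the algorithm tries both choices, recursing under the assumption that the choice is taken:

```python
# Sketch of backtracking: generate all subsets by considering, for each element,
# the two choices "exclude it" and "include it", recursing under each assumption.
def subsets(items):
    result = []

    def backtrack(i, chosen):
        if i == len(items):
            result.append(list(chosen))
            return
        # choice 1: exclude items[i]
        backtrack(i + 1, chosen)
        # choice 2: include items[i]
        chosen.append(items[i])
        backtrack(i + 1, chosen)
        chosen.pop()  # undo the choice before returning

    backtrack(0, [])
    return result

print(subsets([1, 2, 3]))  # all 8 subsets of {1, 2, 3}
```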
Dynamic Programming (e.g. Longest Common Subsequence, many combinatorial optimization problems)
The LCS problem discussed above can be seen as traversing such a tree: at certain nodes we branch off (choosing LCS(“ABC”,”BDM”) or LCS(“ABCD”,”BD”) in the above example). To determine the best choice we have to consider all possible choices and then pick the optimal one.
Dynamic programming is an optimization technique for backtracking algorithms. When subproblems need to be solved repeatedly (i.e., when there are many duplicate branches in the backtracking algorithm) time can be saved by solving all of the subproblems first (bottom-up, from smallest to largest) and storing the solution to each subproblem in a table.
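For example, the LCS length can be computed bottom-up by filling a table of subproblem solutions, from the smallest prefixes to the full strings (a sketch mirroring the memoized version above):

```python
# Sketch: bottom-up DP for LCS length. table[i][j] holds the LCS length of
# a[:i] and b[:j]; subproblems are solved smallest-first and stored in the table.
def lcs_length_bottom_up(a, b):
    m, n = len(a), len(b)
    table = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                table[i][j] = 1 + table[i - 1][j - 1]
            else:
                table[i][j] = max(table[i - 1][j], table[i][j - 1])
    return table[m][n]

print(lcs_length_bottom_up("ABCD", "BDM"))  # 2
```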
Greedy Algorithms
When “the best” choice can be determined without considering all possible choices.
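A small sketch of a greedy algorithm (activity selection: always pick the compatible activity that finishes earliest; for this problem that local choice is known to be safe, so no other choices need to be explored):

```python
# Sketch of a greedy algorithm: activity selection.
# Sorting by finish time and always taking the earliest-finishing compatible
# activity is "the best" choice here, without exploring all possibilities.
def select_activities(intervals):
    selected = []
    last_finish = float("-inf")
    for start, finish in sorted(intervals, key=lambda iv: iv[1]):
        if start >= last_finish:
            selected.append((start, finish))
            last_finish = finish
    return selected

print(select_activities([(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (6, 10), (8, 11)]))
# [(1, 4), (5, 7), (8, 11)]
```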
http://en.wikibooks.org/wiki/Algorithms/Introduction