Dynamic programming algorithms and their use in the real world

In the past, I studied classical problems and DP algorithms (coin change, longest increasing subsequence, longest common subsequence, etc.).

I know that these algorithms have practical applications (e.g. in genetics, to name one area). But I doubt that these algorithms remain practical in modern computer science, where input sizes are very large and problems are not solved on a single machine.

My point is that these algorithms are rather difficult to parallelize (see, for example, Parallel dynamic programming), and memory usage is quadratic in most formulations, which makes it hard to process sufficiently large inputs.

Does anyone have real-world use cases?

2 answers

Practical use: diff. It is an essential utility on Linux that finds the differences between two files by solving the longest common subsequence problem with a DP algorithm.
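To make the connection concrete, here is a minimal sketch of the classic LCS recurrence that tools like diff build on (the function name `lcs_length` is mine, not from any particular implementation):

```python
def lcs_length(a: str, b: str) -> int:
    # dp[i][j] = length of the LCS of a[:i] and b[:j]
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1   # extend a common subsequence
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])  # drop one character
    return dp[m][n]
```

Real diff implementations use more refined algorithms on top of this idea (and recover the actual edit script, not just the length), but the DP table is the conceptual core.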

DP algorithms are used because, in many cases, they are the only practical solution. And besides, there is nothing wrong with them.

Memory usage: often you can use a sliding window to reduce memory usage. Fibonacci, when solved with a naive bottom-up DP, requires O(n) memory; a sliding window improves this to O(1) memory. (I know there is a closed-form constant-time solution, but that is beside the point.)
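The sliding-window idea is just keeping the last two values instead of the whole table, for instance:

```python
def fib(n: int) -> int:
    # Bottom-up DP with a sliding window: only the last two
    # values are kept, so memory is O(1) instead of O(n).
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a
```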

Parallelization: top-down DPs are often easy to parallelize. Bottom-up ones may or may not be. The example by @amit (parallelizing the longest common subsequence) is a good one: the tiles of any given diagonal can be solved independently as long as the previous diagonals are known.


The longest common subsequence and longest common substring problems are important for string analysis [e.g. gene sequence analysis], and they can be solved efficiently with dynamic programming.

Note that you can parallelize this algorithm: you process the table in iterations along the anti-diagonals [from the bottom-left corner to the top-right], so the total number of iterations is 2n-1. Within each diagonal, no cell depends on any other cell of the same diagonal, so you can parallelize here: each thread takes a block of cells on that diagonal.

Also note that data synchronization in this scheme is minimal: each thread only needs to pass data to its neighboring threads, so it works even without shared memory.

In addition, both problems, as mentioned by @larsmans, can be solved in linear space: at each point you only need to "remember" the current diagonal and the two previous ones, not the entire matrix.
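The same linear-space bound is easiest to see with a row-by-row sweep (a sketch of my own; the diagonal variant keeps three diagonals instead of two rows, but the space is linear either way):

```python
def lcs_length_linear(a: str, b: str) -> int:
    # Only the previous row of the DP table is ever needed,
    # so space is O(min(m, n)) instead of O(m * n).
    if len(b) > len(a):
        a, b = b, a  # iterate over the shorter string in the inner loop
    prev = [0] * (len(b) + 1)
    for i in range(1, len(a) + 1):
        curr = [0] * (len(b) + 1)
        for j in range(1, len(b) + 1):
            if a[i - 1] == b[j - 1]:
                curr[j] = prev[j - 1] + 1
            else:
                curr[j] = max(prev[j], curr[j - 1])
        prev = curr
    return prev[len(b)]
```

This gives only the length; recovering the subsequence itself in linear space takes an extra divide-and-conquer step (Hirschberg's technique).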

Another common problem solved by dynamic programming is polynomial interpolation. Interpolation can be performed efficiently using Newton interpolation, which first requires computing the divided differences, and those are built using dynamic programming.
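A minimal sketch of that DP (function names are mine): each divided difference is built from two smaller ones via f[x_i..x_j] = (f[x_{i+1}..x_j] - f[x_i..x_{j-1}]) / (x_j - x_i), and the table can be filled in place:

```python
def divided_differences(xs, ys):
    # Build the Newton-form coefficients in place: after pass k,
    # coef[i] holds the divided difference f[x_{i-k}, ..., x_i].
    coef = list(ys)
    n = len(xs)
    for k in range(1, n):
        for i in range(n - 1, k - 1, -1):  # go backwards to reuse the array
            coef[i] = (coef[i] - coef[i - 1]) / (xs[i] - xs[i - k])
    return coef  # c_0 .. c_{n-1} of the Newton form

def newton_eval(coef, xs, x):
    # Horner-style evaluation of the Newton form at point x.
    result = coef[-1]
    for i in range(len(coef) - 2, -1, -1):
        result = result * (x - xs[i]) + coef[i]
    return result
```

For example, interpolating the points of y = x^2 at nodes 0, 1, 2, 3 and evaluating at x = 5 reproduces 25, since the interpolant of a quadratic through four points is the quadratic itself.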


Source: https://habr.com/ru/post/1394985/
