Strictly speaking, the term iteration refers to the general concept of iterative algorithms, which solve a problem by successively producing (ideally more and more accurate) approximations of some "ideal" solution. Generally, the more iterations, the more accurate ("better") the result, but also the more computation has to be performed.
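As a toy illustration of that trade-off (this is just a generic sketch of iterative refinement, not how OpenNLP's trainer works internally), consider Newton's method for approximating the square root of 2:

def sqrt2_approx(iterations):
    x = 1.0                      # initial guess
    for _ in range(iterations):
        x = 0.5 * (x + 2.0 / x)  # refine the previous approximation
    return x

for n in (1, 2, 5):
    print(n, sqrt2_approx(n))    # the error shrinks as n grows

Each pass reuses the previous approximation to produce a better one; stopping earlier saves work but gives a coarser answer.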
The term cutoff (a.k.a. cutoff frequency) denotes a method for reducing the size of n-gram language models (which OpenNLP uses, for example, for its part-of-speech tagger). Consider the following example:
Sentence 1 = "The cat likes mice."
Sentence 2 = "The cat likes fish."
Bigram model = {"the cat" : 2, "cat likes" : 2, "likes mice" : 1, "likes fish" : 1}
If you set the cutoff frequency to 1 for this example, the n-gram model will be reduced to
Bigram model = {"the cat" : 2, "cat likes" : 2}
That is, the cutoff method removes from the language model those n-grams that occur rarely in the training data. Reducing the size of n-gram language models is sometimes necessary, because even the number of bigrams (not to mention trigrams, 4-grams, etc.) explodes for large corpora. The remaining information (the n-gram counts) can then be used for a statistical estimate of the probability of a word (or its POS tag) given the (N-1) preceding words (or POS tags).
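A minimal sketch of the counting and cutoff step, reproducing the example above (the function names are made up for illustration; the cutoff semantics here follow the example, i.e. keep only n-grams seen strictly more often than the cutoff):

from collections import Counter

def bigram_counts(sentences):
    counts = Counter()
    for s in sentences:
        tokens = s.lower().rstrip(".").split()
        for a, b in zip(tokens, tokens[1:]):
            counts[f"{a} {b}"] += 1
    return counts

def apply_cutoff(counts, cutoff):
    # drop n-grams whose count does not exceed the cutoff
    return {ng: c for ng, c in counts.items() if c > cutoff}

model = bigram_counts(["The cat likes mice.", "The cat likes fish."])
print(apply_cutoff(model, 1))   # {'the cat': 2, 'cat likes': 2}

With a cutoff of 1, the rare bigrams "likes mice" and "likes fish" are discarded, exactly as in the reduced model shown earlier.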