Is there a lazy / iterator equivalent to `str.split()`?

Possible duplicate:
Is there a generator version of string.split() in Python?

str.split(delim) splits a string into a list of tokens separated by the delimiter delim. The entire list of tokens is built and returned in one go.
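For example (interactive session shown just to illustrate the eager behaviour):

    >>> "a,b,c,d".split(",")
    ['a', 'b', 'c', 'd']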

When working with large blocks of text, it can be useful to process the tokens lazily, that is, to obtain only one token at a time, as needed. (The example that comes to mind is processing a large piece of text held in memory.)

Is there a built-in or standard-library function that performs a lazy split()? Something from itertools? A rough sketch of the behaviour I am after is shown below.
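To make the question concrete, here is a minimal hand-rolled sketch of what I mean; the name lazy_split is hypothetical, not an existing API:

    def lazy_split(text, delim):
        """Yield the same tokens as text.split(delim), one at a time,
        without building the whole list up front."""
        start = 0
        while True:
            pos = text.find(delim, start)
            if pos == -1:
                # No more delimiters: the rest of the string is the last token.
                yield text[start:]
                return
            yield text[start:pos]
            start = pos + len(delim)

    # list(lazy_split("a,b,c", ",")) == ["a", "b", "c"]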

1 answer

Not an exact equivalent, but re.finditer() performs the string search lazily, so it can be used to pull tokens out on demand.
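For instance, a sketch of extracting whitespace-separated tokens lazily; the pattern here is only illustrative, adjust it to your delimiter:

    import re

    text = "alpha beta gamma"

    # re.finditer() returns an iterator of match objects,
    # so each token is produced only when requested.
    tokens = (m.group(0) for m in re.finditer(r"\S+", text))

    print(next(tokens))  # 'alpha'
    print(next(tokens))  # 'beta'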


Source: https://habr.com/ru/post/1402222/

