IEnumerable<T> and "yield return"

Good afternoon,

I am writing a simple lexer, which is basically a modified version of this one. After producing each token I need to make small changes to it and re-analyze it in order to re-check its type. Also, after lexical analysis I of course need to go over the entire list of tokens again to do some sort of "parsing" on it. My question is: can using IEnumerable<Token> and yield return in the lexer slow the whole program down? Would it be preferable to build a List<Token> and return it with a conventional return statement? And what about iterating over an IEnumerable versus a List: which one is faster?
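
To make the question concrete, here is roughly the difference I mean (a simplified sketch, not my actual lexer; the Token record and the splitting logic are just stand-ins):

```csharp
using System.Collections.Generic;

public record Token(string Type, string Value);

public static class LexerSketch
{
    // Variant 1: lazy. Tokens are produced one at a time as the caller iterates.
    public static IEnumerable<Token> Tokenize(string input)
    {
        foreach (var part in input.Split(' '))        // stand-in for real matching
            yield return new Token("word", part);
    }

    // Variant 2: eager. The whole list is built before anything is returned.
    public static List<Token> TokenizeToList(string input)
    {
        var tokens = new List<Token>();
        foreach (var part in input.Split(' '))
            tokens.Add(new Token("word", part));
        return tokens;
    }
}
```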

Many thanks.

+3
4 answers

You are asking the wrong question; you should worry more about the cost of the Regex. Enumerating the tokens will be a very small part of the total work. It makes no sense to optimize code that could be made twice as fast but only improves the program's overall performance by 1%.

Write the code, profile it, and you will know what to do for version 2. Given that tools like this run on "human time" (nobody notices the difference when a program takes twice as long if it only takes 20 milliseconds to begin with), the most likely outcome is "nothing needs to change".
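
If you do want numbers rather than guesses, a quick Stopwatch measurement tells you more than any rule of thumb. A minimal sketch, assuming a Tokenize entry point like the one sketched in the question:

```csharp
using System;
using System.Diagnostics;
using System.Linq;

var input = string.Concat(Enumerable.Repeat("foo = bar + 42 ; ", 10_000));

var sw = Stopwatch.StartNew();
var tokens = LexerSketch.Tokenize(input).ToList();   // force the full enumeration
sw.Stop();

Console.WriteLine($"{tokens.Count} tokens in {sw.ElapsedMilliseconds} ms");
```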

+5

It is possible that this will affect performance, but it also lets you create the iterator lazily: the consumer can pull tokens one at a time as it needs them, instead of waiting for the whole input to be tokenized.

The flip side is that an iterator produced with yield return is re-evaluated every time you enumerate it. Since you need to go over the whole token list again for your "parsing" pass, it makes sense to materialize the IEnumerable<T> into a List<T> once (for example with ToList()) and then iterate that List<T> for every later pass; otherwise each pass would re-run the lexer.
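
A sketch of what I mean, reusing the Tokenize stand-in from the question (the names are illustrative only):

```csharp
using System.Collections.Generic;
using System.Linq;

string sourceText = "foo = bar + 42 ;";

// Enumerate the lazy tokenizer exactly once and cache the result.
List<Token> tokens = LexerSketch.Tokenize(sourceText).ToList();

// Every later pass iterates the cached List<Token>; the lexer is not re-run.
foreach (var token in tokens)
{
    // first pass: re-check / adjust token types
}

foreach (var token in tokens)
{
    // second pass: the "parsing" step
}
```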

+3

IEnumerable with yield makes the compiler generate a GetEnumerator() implementation (a small state-machine class) in the IL for you.

So it is not free, but the overhead is tiny compared to the actual lexing work; measure it before worrying about it.
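
Roughly, a yield return method becomes a hidden class whose MoveNext() advances the state machine one step per token. A hand-written equivalent looks something like this (very simplified compared to the real compiler output; Token is the type from the question sketch):

```csharp
using System.Collections;
using System.Collections.Generic;

// Hand-written stand-in for the compiler-generated enumerator behind
// "yield return": one token is produced per MoveNext() call, no list is built.
public class TokenEnumerator : IEnumerator<Token>
{
    private readonly string[] _parts;   // stand-in for the real lexer state
    private int _index = -1;

    public TokenEnumerator(string input) => _parts = input.Split(' ');

    public Token Current => new Token("word", _parts[_index]);
    object IEnumerator.Current => Current;

    public bool MoveNext() => ++_index < _parts.Length;
    public void Reset() => _index = -1;
    public void Dispose() { }
}
```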

+1

Echoing what the others have said: the enumeration is not where the time goes, the regular expressions are.

Looking at the lexer you linked to, every token definition is tried with Regex.Match() against the current position in the input, and that is the expensive part.

Many tokens (single characters such as '{' and '}', keywords, fixed operators) are plain strings that can be matched with a simple string or character comparison instead of a regular expression. If you check those first, you avoid running a Regex at all for the most common cases.

Of course, this only works for simple, fixed tokens; the more complex ones will still require something like a regular expression. Perhaps you could extend TokenDefinition to indicate whether a match is a plain string or a regex. That would reduce the number of regular expressions that get executed while keeping the necessary flexibility.
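
For example, a TokenDefinition along these lines (a sketch; I don't know the exact shape of the type in the lexer you linked to, so the member names here are made up):

```csharp
using System.Text.RegularExpressions;

// A token definition that matches either a fixed literal or a regex.
public class TokenDefinition
{
    public string Type { get; init; }
    public string Literal { get; init; }   // set for simple tokens like "{" or "}"
    public Regex Pattern { get; init; }    // set for complex tokens like identifiers

    // Returns the length of the match at 'position', or 0 if nothing matches there.
    public int MatchLength(string input, int position)
    {
        if (Literal != null)
        {
            // Cheap ordinal comparison; no Regex machinery involved.
            return position + Literal.Length <= input.Length
                && string.CompareOrdinal(input, position, Literal, 0, Literal.Length) == 0
                ? Literal.Length
                : 0;
        }

        var m = Pattern.Match(input, position);
        return m.Success && m.Index == position ? m.Length : 0;
    }
}
```

The lexer would then try the literal definitions first and fall back to the regex ones only when none of the literals match.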

0

Source: https://habr.com/ru/post/1776468/

