In terms of raw performance, it seems obvious that cached variables will be more efficient than re-reading the registry (which means system calls and possibly even disk I/O, instead of a simple memory access when the setting is cached). But, as @MarkRansom notes in the comments, 100 registry accesses are unlikely to noticeably affect your program's performance unless the routine is called very frequently (for example, in a tight loop).
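To make the difference concrete, here is a minimal sketch (Win32/C++, assuming a hypothetical DWORD setting `MaxItems` under `HKEY_CURRENT_USER\Software\MyApp`; adjust the key and value names to your case) contrasting re-reading the registry on every call with reading it once and caching the result:

```cpp
#include <windows.h>

// Hypothetical setting: a DWORD named "MaxItems" under HKEY_CURRENT_USER\Software\MyApp.
// Link against advapi32.lib.

// Re-reads the registry on every call: a system call (and possibly disk I/O) each time.
DWORD GetMaxItemsUncached()
{
    DWORD value = 0;
    DWORD size  = sizeof(value);
    if (RegGetValueW(HKEY_CURRENT_USER, L"Software\\MyApp", L"MaxItems",
                     RRF_RT_REG_DWORD, nullptr, &value, &size) != ERROR_SUCCESS)
        value = 100;  // fall back to a default if the value is missing
    return value;
}

// Reads the registry once and caches the result: a plain memory access afterwards.
DWORD GetMaxItemsCached()
{
    static const DWORD cached = GetMaxItemsUncached();  // initialized on first use
    return cached;
}
```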
As usual with any performance/optimization question: don't worry about it unless you actually know it is a performance problem (for example, your profiler tells you so, or you can easily demonstrate it yourself).
However, there is one more problem.
You say "I am not interested in parameters that change at runtime", but IMHO you should be: what happens if the user changes a setting while your program is running? Say you start a calculation based on certain options, and the calculation makes specific assumptions based on them; then a setting suddenly changes. That can easily break your invariants/assumptions and introduce consistency problems.
So, regardless of any performance concerns, you should cache your user settings in variables so that your program stays consistent even if the user modifies them at runtime.
In other words, it is not only about performance, but above all about the correctness of the program.
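As an illustration of that point, one simple pattern (just a sketch; the `Settings` struct and its fields are made up) is to take a local snapshot of the cached settings at the start of a calculation, so that even a legitimate refresh of the cache cannot change the calculation's assumptions halfway through:

```cpp
#include <windows.h>

// Hypothetical cached settings; how the cache gets refreshed is discussed below.
struct Settings
{
    DWORD maxItems;
    bool  useFastPath;
};

// Updated only at well-defined points (in a multithreaded program,
// protect this with a mutex or use atomics).
Settings g_cachedSettings{100, true};

void RunCalculation()
{
    const Settings local = g_cachedSettings;  // snapshot taken once, up front

    // ... the whole calculation uses 'local', so even if g_cachedSettings is
    // refreshed in the meantime, the invariants of this run are preserved ...
}
```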
@CaptainObvlious raises an interesting point: if you really need to pick up changes whenever the user (or another application) modifies the settings, then do it in a controlled way (as he suggests, watching the registry for change notifications is one option) and update your cached variables only at that point. This ensures your settings cannot change in the middle of a calculation that expects them to stay the same.
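Here is a rough sketch of what such controlled monitoring could look like with the Win32 API (`RegNotifyChangeKeyValue`), again using the hypothetical `Software\MyApp` key from above; a real implementation would run this on its own thread and handle shutdown:

```cpp
#include <windows.h>
#include <atomic>

// Cached copy of the setting; refreshed only at well-defined points.
std::atomic<DWORD> g_maxItems{100};

static DWORD ReadMaxItems()  // same RegGetValueW read as in the earlier sketch
{
    DWORD value = 100;
    DWORD size  = sizeof(value);
    RegGetValueW(HKEY_CURRENT_USER, L"Software\\MyApp", L"MaxItems",
                 RRF_RT_REG_DWORD, nullptr, &value, &size);
    return value;
}

// Runs on its own thread: refreshes the cache only when the key actually changes.
void WatchSettings()
{
    HKEY key = nullptr;
    if (RegOpenKeyExW(HKEY_CURRENT_USER, L"Software\\MyApp", 0,
                      KEY_NOTIFY, &key) != ERROR_SUCCESS)
        return;

    g_maxItems = ReadMaxItems();  // initial load

    for (;;)
    {
        // Blocks until some value under the key is modified.
        if (RegNotifyChangeKeyValue(key, FALSE, REG_NOTIFY_CHANGE_LAST_SET,
                                    nullptr, FALSE) != ERROR_SUCCESS)
            break;
        g_maxItems = ReadMaxItems();  // controlled refresh of the cached value
    }
    RegCloseKey(key);
}
```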