I am learning algorithms and need some help. I'm new to this, so forgive me if my question is unclear. While studying, I keep seeing notations like N log N, N^2, and so on.
I don't quite understand how these notations are used to measure the efficiency/performance of different algorithms. I understand logarithms themselves very well, but the way they are applied to analyzing algorithm performance drives me crazy.
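To show what I mean, here is a toy sketch I wrote myself (not from any book, so it may be naive): one function does a nested loop over all pairs, which I'm told grows like N^2, and the other keeps halving N like binary search does, which is supposedly where the log N comes from.

```python
def count_pair_steps(n):
    # Nested loop over all pairs: the step count grows like N^2.
    steps = 0
    for i in range(n):
        for j in range(n):
            steps += 1
    return steps

def count_halving_steps(n):
    # Repeated halving, like binary search: steps grow like log2(N).
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

print(count_pair_steps(16))     # 256 = 16^2
print(count_halving_steps(16))  # 4 = log2(16)
```

Counting steps like this makes sense to me for small N, but I don't see how to get from here to statements like "merge sort is N log N".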
Can someone point me to a textbook where these notations are explained, so I can thoroughly understand the basics? I really want to understand them and am ready to put in the work.
Thank you for your help.
Kap.