
Why do I get an unexpected value when I add a char to the empty string literal (as in "" + c)?

Let me show you my code first:

```cpp
void testStringAdd() {
    char c = '$';
    string str = "";

    str += c;  // the same as `str = str + c;`
    cout << str << "---";
    cout << "size : " << str.size() << endl;

    str = "" + c;  // the same as `str = c + "";`
    cout << str << "---";
    cout << "size : " << str.size() << endl;
}
```

I expected the output to be:

$ --- size: 1

$ --- size: 1

But the actual output on VS2013 is:

$ --- size: 1

--- size: 0

This is an interesting phenomenon; why does it behave so strangely?

Note: if I write `string str = "";`, then `str == ""` returns true.

1 answer

In str = "" + c; , "" not std::string , it is a string literal of type const char[] , and then decaying to const char* and "" + c becomes pointer arithmetic.

In this case, since `c` has a positive value (`'$'` is 36 in ASCII), `"" + c` produces a pointer far past the end of the one-byte array, which is undefined behavior (UB), meaning anything is possible. You may be lucky (or unlucky) enough that the program appears to work.

As pointed out in the comments, an explicit conversion to `std::string` fixes the problem:

 str = std::string("") + c; 

Or use the `std::string` literal operator (available since C++14):

 str = ""s + c; 

Source: https://habr.com/ru/post/1246970/
