I have a C# application that calls a C++ DLL.
In C#, I have the code as follows:
[DllImport(@"111.dll", CharSet = CharSet.Unicode)] public extern static String Func1(String arg); ...... String arg = "test text"; String retstring = Func1(arg);
In C++, I have the function defined as follows:
extern "C" { __declspec(dllexport) LPWSTR Func1(LPWSTR arg) { .... LPWSTR ret1 = L"1?2?3?4?5"; LPWSTR ret2 = SomeActualFunction(arg); retturn ret1;
If I return ret1 from Func1() in C++, everything works fine, and in the VS2008 memory window I can see the correct Unicode bytes. In C++, the bytes of ret1 are
"31 00 3f 00 32 00 3f 00 33 00 3f 00 34 00 3f 00 35 00"
and in C# the bytes of retstring are
"28 67 a3 f7 fe 07 00 00 0a 00 00 00 09 00 00 00 31 00 3f 00 32 00 3f 00 33 00 3f 00 34 00 3f 00 35 00"
I think the C# bytes
"28 67 a3 f7 fe 07 00 00 0a 00 00 00 09 00 00 00"
are the header of the System.String object.
What's more, if I add the following line immediately before the return in the C++ code, I also get the correct string in C#:
ret2 = L"1?2?3?4?5";
But when I return ret2 from the C++ DLL, the returned string retstring in C# seems to be corrupted. The bytes in the C++ DLL are correct Unicode when I inspect them, but the bytes of retstring in the C# code are
"28 67 a3 f7 fe 07 00 00 0a 00 00 00 09 00 00 00 dd dd dd dd dd dd dd dd dd dd dd dd ....".
The only thing I can notice is that ret2 is longer than ret1: ret2 has several hundred WCHARs.
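In case it helps, here is a sketch of how I could declare the import to get back the raw pointer and convert it myself, instead of letting the marshaler build the string for me (the Func1Raw name is just a C#-side alias for illustration; the entry point is still Func1):

using System;
using System.Runtime.InteropServices;

class NativeDiag
{
    // Same native export, but declared to return the raw pointer so the
    // default string marshaling (and its handling of the buffer) is skipped.
    [DllImport(@"111.dll", CharSet = CharSet.Unicode, EntryPoint = "Func1")]
    public static extern IntPtr Func1Raw(string arg);

    static void Main()
    {
        IntPtr p = Func1Raw("test text");
        Console.WriteLine("returned pointer: 0x{0:X}", p.ToInt64());

        // Read the unmanaged buffer directly as a null-terminated UTF-16 string.
        string s = Marshal.PtrToStringUni(p);
        Console.WriteLine(s);
    }
}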
Any ideas? Thanks in advance.