There are two possible answers:
If you want all Unicode file names to be representable, you can hard-code the assumption that the file system uses UTF-8 file names. This is the "modern" approach on Linux desktops. Simply convert your strings from wchar_t (UTF-32) to UTF-8 with library functions (`iconv` works well) or with your own implementation (but look up the specification so you don't get it wrong, as Shelvian did), then use `fopen`.
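As a sketch of that first approach, here is a hypothetical helper (the name `wcs_to_utf8` is mine, not from any library) that converts a wchar_t string to a freshly allocated UTF-8 byte string via `iconv`. It assumes a glibc-style platform where wchar_t is UTF-32 and `iconv_open` accepts `"WCHAR_T"` as an encoding name:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <wchar.h>
#include <iconv.h>

/* Hypothetical helper: convert a wchar_t string to a newly
 * malloc'd UTF-8 string. Returns NULL on conversion failure.
 * Assumes wchar_t is UTF-32 and iconv knows "WCHAR_T" (glibc). */
static char *wcs_to_utf8(const wchar_t *ws)
{
    iconv_t cd = iconv_open("UTF-8", "WCHAR_T");
    if (cd == (iconv_t)-1)
        return NULL;

    size_t inleft = wcslen(ws) * sizeof(wchar_t);
    /* UTF-8 needs at most 4 bytes per code point, plus the NUL. */
    size_t outsize = wcslen(ws) * 4 + 1;
    char *out = malloc(outsize);
    if (!out) {
        iconv_close(cd);
        return NULL;
    }

    char *inptr = (char *)ws;   /* iconv works on byte pointers */
    char *outptr = out;
    size_t outleft = outsize - 1;
    if (iconv(cd, &inptr, &inleft, &outptr, &outleft) == (size_t)-1) {
        free(out);
        iconv_close(cd);
        return NULL;
    }
    *outptr = '\0';
    iconv_close(cd);
    return out;
}
```

The caller would then pass the result straight to `fopen` and `free` it afterwards; note that this deliberately bypasses the locale, which is exactly the "assume UTF-8" trade-off described above.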
If you want something more standards-oriented, use `wcsrtombs` to convert the wchar_t string to a multibyte char string in the locale's encoding (which, one hopes, is UTF-8 on any modern system) and then use `fopen`. Note that this requires you to have set the locale beforehand with `setlocale(LC_CTYPE, "")` or `setlocale(LC_ALL, "")`.
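The standards-oriented path can be sketched as a hypothetical `fopen_wcs` wrapper (my name for it): measure the converted length with a first `wcsrtombs` call, allocate, convert, and open. If the name cannot be represented in the locale encoding, `wcsrtombs` fails and the wrapper returns NULL:

```c
#include <locale.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <wchar.h>

/* Hypothetical wrapper: open a file named by a wchar_t string,
 * converting through the locale's multibyte encoding.
 * Requires setlocale(LC_CTYPE, "") to have been called first. */
static FILE *fopen_wcs(const wchar_t *wname, const char *mode)
{
    mbstate_t st;
    memset(&st, 0, sizeof st);
    const wchar_t *src = wname;

    /* First pass with a NULL destination just measures the length. */
    size_t len = wcsrtombs(NULL, &src, 0, &st);
    if (len == (size_t)-1)
        return NULL;  /* name not representable in the locale encoding */

    char *name = malloc(len + 1);
    if (!name)
        return NULL;

    memset(&st, 0, sizeof st);
    src = wname;
    wcsrtombs(name, &src, len + 1, &st);  /* second pass converts */

    FILE *f = fopen(name, mode);
    free(name);
    return f;
}
```

Typical usage would be `setlocale(LC_CTYPE, "");` once at program start, then `FILE *f = fopen_wcs(L"data.txt", "r");`. Checking the return of the measuring pass is what catches names the current locale cannot express.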
And finally, not quite the answer, but the recommendation:
Storing file names as wchar_t strings is probably a serious mistake. Instead, store file names as abstract byte strings, and convert them to wchar_t only just in time for display in the user interface (if that is even necessary: many UI toolkits take plain byte strings and interpret them as characters for you). This way you avoid many nasty corner cases, and you never end up in a situation where some files are inaccessible because of their names.
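The recommended design can be sketched as follows: the byte string stays the canonical handle for `fopen`, and a hypothetical `display_name` helper (my name, not a library function) produces a wchar_t copy only for the UI, returning NULL when the bytes are not valid in the locale encoding so the caller can fall back to showing the raw bytes:

```c
#include <locale.h>
#include <stdlib.h>
#include <string.h>
#include <wchar.h>

/* Hypothetical helper: produce a wide-character copy of a file
 * name for display only. The original byte string remains the
 * canonical form used with fopen, so a NULL return here never
 * makes the file itself inaccessible. */
static wchar_t *display_name(const char *bytes)
{
    mbstate_t st;
    memset(&st, 0, sizeof st);
    const char *src = bytes;

    size_t len = mbsrtowcs(NULL, &src, 0, &st);  /* measure */
    if (len == (size_t)-1)
        return NULL;  /* invalid in locale encoding: show raw bytes */

    wchar_t *ws = malloc((len + 1) * sizeof(wchar_t));
    if (!ws)
        return NULL;

    memset(&st, 0, sizeof st);
    src = bytes;
    mbsrtowcs(ws, &src, len + 1, &st);  /* convert for display */
    return ws;
}
```

The key design point is the direction of the conversion: bytes are authoritative and the wide form is a disposable view, so a file with a malformed name can still be opened, copied, or deleted even if it cannot be rendered nicely.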