How do I pass a UTF-16 char from a C++/CLI function to a .NET function? What types do I use on the C++/CLI side and how do I convert it?
I've currently defined the C++/CLI function as follows:
```cpp
wchar_t GetCurrentTrackID(); // 'wchar_t' is the C++ unicode char equivalent to .NET's 'char'?
```

The .NET wrapper is defined as:
```cpp
System::Char GetCurrentTrackID(); // here, 'char' means UTF-16 char
```

I'm currently using this to convert it, but when testing it I only get a null character. How do I properly convert a Unicode char code to its `char` equivalent for .NET?
```cpp
#pragma managed
return (System::Char)player->GetCurrentTrackID();
```

- To my knowledge, `System::Char` can seamlessly be cast to and from `wchar_t` (at least, on Windows). You don't even need an explicit cast. – Medinoc, Jun 4, 2013
1 Answer
They are directly compatible. You can assign a `Char` to a `wchar_t` and the other way around without a cast; the compiler will not emit any kind of conversion call. This is true for many simple value types in C++/CLI: `Boolean` vs `bool`, `SByte` vs `char`, `Byte` vs `unsigned char`, `Int16` vs `short`, `Int32` vs `int` or `long`, `Int64` vs `long long`, `Single` vs `float`, `Double` vs `double`, plus their unsigned varieties. The compiler treats them as aliases since they have the exact same binary representation.
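For illustration, a minimal sketch of that alias behavior (C++/CLI compiled with `/clr`; the variable names are hypothetical):

```cpp
// C++/CLI, compiled with /clr
wchar_t native = L'A';
System::Char managed = native;   // no cast needed, no conversion emitted
wchar_t roundTrip = managed;     // same in the other direction

// The same aliasing holds for the other primitive pairs:
System::Boolean b = true;        // bool
System::Int32   i = 42;          // int
System::Double  d = 3.14;        // double
```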
But this does not hold for strings or arrays: they are classes with a non-trivial implementation that doesn't match their native counterparts at all, so those require an explicit conversion.
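As a hedged sketch of one such explicit conversion, using the msclr marshaling library that ships with Visual C++ (the function name and the `std::wstring` input here are assumptions for illustration):

```cpp
#include <msclr/marshal_cppstd.h>
#include <string>

// Strings are not layout-compatible, so the character data must be
// copied into a newly allocated System::String.
System::String^ ToManagedString(const std::wstring& native)
{
    return msclr::interop::marshal_as<System::String^>(native);
}
```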
