Passing an enum hex parameter from C# to C

I have a C function that takes an enum parameter whose values are defined in hexadecimal. I need to call this function from C#. My current approach does not seem to be correct, because the C function returns the wrong number.

Here is the declaration of my C function:

enum tags {
    TAG_A = -1,
    TAG_B = 0x00,
    TAG_C = 0xC1,
    ...
};

int myfunction(enum tags t);

Here is my C# code:

enum tags {
    TAG_A = -1,
    TAG_B = 0x00,
    TAG_C = 0xC1,
    ...
}

[DllImport("mylibraryname")]
public static extern int myfunction(tags t);

myfunction(tags.TAG_B);

I am on a Mac, using Mono and Xcode for all of this. The C function itself can be assumed correct, because I am loading an open-source library. I suspect something is wrong with the hexadecimal numbers, but I'm not sure.

Solution:

I accepted one of the answers, although what actually resolved my problem was explicitly setting the underlying type of the C# enum. So in C# I now have:

enum tags : long { TAG_A = -1, TAG_B = 0x00, TAG_C = 0xC1, ... }

2 answers

Hexadecimal is just another way of writing an integer literal; it is not relevant to your problem. For example, TAG_B = 0x00 and TAG_B = 0 mean exactly the same thing.

Perhaps the problem is a size mismatch: the size of a C enum is implementation-defined (often 32 bits, but it can be smaller), while a C# enum defaults to a 32-bit underlying type. Instead of creating an enumeration in C#, you could try passing the values directly as short constants:

static class tags
{
    public static short TAG_A = -1;
    public static short TAG_B = 0x00;
    public static short TAG_C = 0xC1;
    // ...
}

[DllImport("mylibraryname")]
public static extern int myfunction(short t);

myfunction(tags.TAG_B);

Or, as LB suggested, you can simply specify the underlying type of the enum:

enum tags : short
{
    TAG_A = -1,
    TAG_B = 0x00,
    TAG_C = 0xC1,
    // ...
}

On my architecture, sizeof(enum_t), where enum_t is an enumeration typedef, returns 4 bytes; so if the C# enum is also 4 bytes, I don't see a problem.

Check the size of the enumeration with sizeof on your architecture; if the sizes match, the problem is elsewhere.


Source: https://habr.com/ru/post/1446946/
