I have a very simple piece of semaphore code that works fine on Linux, but I cannot for the life of me get it to work properly on OS X. It returns the strangest results.
#include <iostream>
#include <fcntl.h>
#include <stdio.h>
#include <semaphore.h>
int main()
{
    sem_t* test;
    test = sem_open("test", O_CREAT, 0, 1);
    int value;
    sem_getvalue(test, &value);
    printf("Semaphore initialized to %d\n", value);
}
Compiling this on OS X with g++ gives the following output:
iQudsi:Desktop mqudsi$ g++ test.cpp
iQudsi:Desktop mqudsi$ ./a.out
Semaphore initialized to -1881139893
While on Ubuntu I get a decidedly more reasonable result:
iQudsi:Desktop mqudsi$ g++ test.cpp -lrt
iQudsi:Desktop mqudsi$ ./a.out
Semaphore initialized to 1
I have been at this for 3 hours straight and cannot figure out why OS X returns such strange results.
I tried using a file path as the semaphore name; it made no difference.
I would be grateful for any help I could receive.
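For anyone trying to reproduce this, here is the same snippet with error checking bolted on (just a diagnostic sketch; the only additions are the SEM_FAILED check on sem_open, a return-value check on sem_getvalue, and perror calls to report errno):

#include <cstdio>
#include <fcntl.h>
#include <semaphore.h>

int main()
{
    // Same call as in the repro above: name "test", O_CREAT, mode 0, initial value 1
    sem_t* test = sem_open("test", O_CREAT, 0, 1);
    if (test == SEM_FAILED) {
        // sem_open returns SEM_FAILED and sets errno on error
        perror("sem_open");
        return 1;
    }

    int value;
    if (sem_getvalue(test, &value) == -1) {
        // sem_getvalue returns -1 and sets errno if it fails
        perror("sem_getvalue");
        return 1;
    }
    printf("Semaphore initialized to %d\n", value);
}

That should at least distinguish a failed sem_open from a failed sem_getvalue.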