I'm going to write a diffusion-limited aggregation (DLA) simulation, and I'm wondering whether to use C or C++.
C++ would be nicer for design reasons, but I wonder whether C would perform better. I'm well aware of algorithmic efficiency and have already chosen the best algorithm I could, so I'm not talking about improving O(n^2) to O(log n) or the like. I'm trying to shave down my constant factors, so to speak.
If you don't know DLA, it basically boils down to an array of doubles (between 10^3 and 10^6 elements) and a loop in which random pairs of elements are picked and compared (greater/smaller) against large parts of the array.
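Very loosely, the hot loop has this flavor (the bounds and the update here are just placeholders, not the real DLA code):

    #include <cstdlib>
    #include <vector>

    int main() {
        std::vector<double> a(100000);                // somewhere between 10^3 and 10^6 doubles
        for (long step = 0; step < 10000000; ++step) {
            std::size_t i = std::rand() % a.size();   // pick a random pair of entries
            std::size_t j = std::rand() % a.size();
            if (a[i] > a[j])                          // greater/smaller comparison
                a[j] = a[i];                          // placeholder update
        }
        return 0;
    }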
So the performance differences that matter come down to data access and function calls:
- Data access: a C struct vs. a C++ class with public data members vs. a C++ class with private data members and accessors.
- Function calls: plain C functions vs. C++ member functions (a small sketch of these variants follows this list).
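Concretely, these are the variants I mean (the names are just illustrative):

    // C style: plain struct, data accessed directly, free function.
    struct particle_c { double x; };
    double square_c(const struct particle_c* p) { return p->x * p->x; }

    // C++ with public data: same layout, but the function is a member.
    class ParticlePublic {
    public:
        double x;
        double square() const { return x * x; }
    };

    // C++ with private data and accessors: every read/write goes through a call.
    class ParticlePrivate {
    public:
        void set_x(double v) { x_ = v; }
        double get_x() const { return x_; }
        double square() const { return x_ * x_; }
    private:
        double x_;
    };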
I suppose the definitive way to judge this is to look at the generated assembly (for example, comparing the number of moves/loads, jumps, and calls)? That of course depends on the compiler (you could end up comparing a terrible C compiler against a good C++ compiler, for instance). I use the GNU compilers (gcc and g++).
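For example, the listings can be produced along these lines (filenames are just placeholders, and the optimization level should match whatever you actually build with):

    gcc -S -O2 -o dla_c.s   dla.c
    g++ -S -O2 -o dla_cpp.s dla.cpp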
I found that the assembly generated by gcc and g++ is almost identical in terms of the number of jumps (none), moves/loads, and calls for the following two programs:
C program
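(roughly; a struct plus a free function, mirroring the C++ listing below)

    #include <stdlib.h>

    struct particle { double x; };

    /* free function instead of a member function */
    double square(const struct particle* p) { return p->x * p->x; }

    int main(void) {
        struct particle* particles = malloc(10 * sizeof(struct particle));
        double res;
        particles[0].x = 60.42;
        res = square(&particles[0]);
        return 0;
    }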
C++ program
class particle {
public:
    double x;
    double square() { return x * x; }
};

int main() {
    particle* particles = new particle[10];
    double res;
    particles[0].x = 60.42;
    res = particles[0].square();
    return 0;
}
If I use private data members in the C++ program, I do of course get an extra call in the assembly for particles[0].setx(60.42).
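For reference, the private-data variant looks roughly like this:

    class particle {
    public:
        void setx(double v) { x = v; }      // the accessor that introduces the extra call
        double square() { return x * x; }
    private:
        double x;
    };

    int main() {
        particle* particles = new particle[10];
        double res;
        particles[0].setx(60.42);           // this is the call that shows up extra in the assembly
        res = particles[0].square();
        return 0;
    }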
Does this mean I might as well choose C++ over C, since they produce almost the same assembly? And should I avoid private data members because they add extra function calls (i.e., is that extra call in the assembly actually expensive)?