Today I ran a simple test to compare the speed of Java and C: a simple loop that counts a variable i up from 0 to two billion.
I really expected the C program to be faster than the Java one, so the result surprised me:
time for Java: approx. 1.8 seconds
time for C: approx. 3.6 seconds
I do NOT think Java is a faster language in general, but I do NOT understand why the loop runs twice as fast in Java as in C in my simple programs.
Did I make a basic mistake in one of the programs, or is my MinGW compiler badly configured or something like that?
JAVA-PROGRAM

public class Jrand {
    public static void main(String[] args) {
        long startTime = System.currentTimeMillis();
        int i;
        for (i = 0; i < 2000000000; i++) {
            // Do nothing
        }
        long endTime = System.currentTimeMillis();
        float totalTime = (endTime - startTime);
        System.out.println("time: " + totalTime / 1000);
    }
}
C-PROGRAM
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main() {
    clock_t startTime;
    startTime = clock();
    int i;
    for (i = 0; i <= 2000000000; i++) {
        // Do nothing
    }
    clock_t endTime;
    endTime = clock();
    float totalTime = endTime - startTime;
    printf("%f", totalTime / 1000);
    return 0;
}
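In case it is relevant, here is a variant of the C program I could also try, on the assumption that an empty loop might be treated differently by an optimizing compiler than by an unoptimized build. It gives the loop an observable result (the running sum and its printout are my own additions, not part of the original test) and divides by CLOCKS_PER_SEC instead of a hard-coded 1000:

#include <stdio.h>
#include <time.h>

int main() {
    clock_t startTime = clock();
    long long sum = 0;
    int i;
    for (i = 0; i <= 2000000000; i++) {
        sum += i;  /* give the loop a visible effect so it cannot be dropped */
    }
    clock_t endTime = clock();
    /* CLOCKS_PER_SEC is portable; on MinGW it happens to be 1000 */
    printf("sum = %lld, time = %f\n", sum, (double)(endTime - startTime) / CLOCKS_PER_SEC);
    return 0;
}

Compiling this with and without optimization (for example gcc -O2) should show whether the empty loop itself is what is being measured.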