I am trying to write a simple program that prints the first 16 kilobytes of a binary file (a Game Boy ROM) in hexadecimal, in 16-bit chunks. However, during the for loop my program invariably segfaults, and it segfaults at a different point in the array each run. Here is the code:
    #include <stdio.h>
    #include <stdint.h>

    int main() {
        uint16_t buffer[8000];
        FILE *ROM = fopen("rom.gb", "rb");
        if (ROM == NULL) {
            printf("Error");
            fclose(ROM);
            return 1;
        }
        fread(buffer, sizeof(buffer), 1, ROM);
        int i;
        for (i = 0; i < sizeof(buffer); ++i) {
            if (buffer[i] < 16) {
                printf("000%x ", buffer[i]);
            } else if (buffer[i] < 256) {
                printf("00%x ", buffer[i]);
            } else if (buffer[i] < 4096) {
                printf("0%x ", buffer[i]);
            } else {
                printf("%x ", buffer[i]);
            }
        }
        fclose(ROM);
        return 0;
    }
Before I switched from char to uint16_t (since the Game Boy has a 16-bit address space), this did not happen. In fact, if I add an additional unused array like
unsigned char buffer2[16000];
next to the declaration of the first buffer, I get the expected result. So my questions are: why does adding an unused variable stop the program from segfaulting? And how can I avoid this and declare a large array that is not completely used by the program?