I am new to C programming, and lately I have been experimenting with some low-level I/O functions in C on my x86 Linux system in order to better understand the internals. As an exercise, I decided to write a small program that reads data from a file and prints it to the console in hexadecimal. My program reads the first 512 bytes of a file and passes the buffer to a function that prints the result.
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/stat.h>
#include <sys/types.h>

#define BYTES 512

void printout(unsigned char *data)
{
    for (int i = 0; i < BYTES; i++) {
        if (i % 16 == 0)
            printf("%07x ", i);        /* offset at the start of each row */
        printf("%02x", data[i]);
        if (i % 2)
            printf(" ");               /* space after every second byte */
        if ((i + 1) % 16 == 0)
            printf("\n");              /* 16 bytes per row */
    }
    printf("%07x\n", BYTES);
}

int main(int argc, char **argv)
{
    unsigned char data[BYTES];
    char *f;
    int fd;

    if (argv[1] == NULL) {
        fprintf(stderr, "Usage: %s <filename>\n", argv[0]);
        return EXIT_FAILURE;
    }
    f = argv[1];

    fd = open(f, O_RDONLY);
    if (fd == -1) {
        fprintf(stderr, "Error: Could not open file %s\n", f);
        return EXIT_FAILURE;
    }

    read(fd, data, BYTES);
    printout(data);
    close(fd);
    return EXIT_SUCCESS;
}
Unfortunately, when I compare my output with hexdump or od, it looks like my program is reversing the byte order:
$ myprogram /dev/sda | head -1
0000000 eb63 9000 0000 0000 0000 0000 0000 0000
$ hexdump /dev/sda | head -1
0000000 63eb 0090 0000 0000 0000 0000 0000 0000
Since od -x --endian=little produces the same output as hexdump, I am sure the problem is in my code, but I have no idea where. I would appreciate it if someone could explain why this is happening and what I am missing. I have searched the internet but haven't found anything useful yet.
Thank you for your help!
Regards, Dirk