Segfault at an open() call when trying to open a very large file

I'm taking a networking class at school and am using C and GDB for the first time. Our assignment is to create a web server that interacts with a client browser. I'm making good progress and can open files and send them to the client. Everything goes fine until I open a very large file, and then I segfault. I'm not a C/GDB pro, so I'm sorry if I'm asking a stupid question and failing to see the solution myself, but when I looked at the core dump, I saw that my seg fault comes from here:

if (-1 == (openfd = open(path, O_RDONLY)))

In particular, we were instructed to open the file and send it to the client browser. My algorithm:

  • Open / Error catch
  • Read file into a buffer / Error catch
  • Send file

We were also instructed to make sure that the server does not crash when sending very large files, but my problem seems to be in opening them. I can send all of my smaller files. The file in question is 29.5 MB.

The whole algorithm:

ssize_t send_file(int conn, char *path, int len, int blksize, char *mime) {
  int openfd; // File descriptor for file we open at path
  int temp; // Counter for the size of the file that we send
  char buffer[len]; // Buffer to read the file we are opening that is len big

  // Open the file
  if (-1 == (openfd = open(path, O_RDONLY))) {
    send_head(conn, "", 400, strlen(ERROR_400));
    (void) send(conn, ERROR_400, strlen(ERROR_400), 0);
    logwrite(stdout, CANT_OPEN);
    return -1;
  }

  // Read from file
  if (-1 == read(openfd, buffer, len)) {
    send_head(conn, "", 400, strlen(ERROR_400));
    (void) send(conn, ERROR_400, strlen(ERROR_400), 0);
    logwrite(stdout, CANT_OPEN);
    return -1;
  }
  (void) close(openfd);

  // Send the buffer now
  logwrite(stdout, SUC_REQ);
  send_head(conn, mime, 200, len);      
  send(conn, &buffer[0], len, 0);
  return len;
}

I don't know if this is all just down to my being new to Unix/C; sorry if so. =( Any help is greatly appreciated.

+3
4 answers

Instead of using a variable-length array, perhaps try allocating memory with malloc.

char *buffer = malloc (len);

...

free (buffer);

I just ran some simple tests on my system, and when I use very large variable-length arrays (around the size you're dealing with), I also get a SEGFAULT.

+3

You can't count on reading an arbitrarily large file into a single in-memory buffer in one go; for very large files that approach falls apart, which is exactly what you're seeing.

Instead, read the file in fixed-size chunks (8192 bytes is a common choice, though the best size depends on the system) in a loop, sending each chunk as you read it, and stop when read() returns 0 (end of file) or -1 (an error; check errno to find out what went wrong).

+4

You have a stack overflow, on stackoverflow (how fitting).

Local variables are allocated on the stack, and the stack has a small default size (typically a few megabytes). A roughly 30 MB local array blows right past that limit, so the program crashes before it can do anything useful with the buffer.

The debugger reports the fault at the open() call, but that's just where the overflowed stack first gets touched; the real cause is the oversized local buffer.

As Platinum Azure and dreamlax said, allocate the buffer with malloc (or new in C++) instead.

+4

The others have already identified the cause, but to spell it out:

Local (automatic) variables live on the stack, and the stack has a fairly small fixed limit (a few megabytes by default), which buffer[len] for a 29.5 MB file far exceeds. The crash shows up at open() only because that's where the trampled stack is first used, not because open() itself is at fault.

Either map the file into memory (mmap()) or allocate the buffer on the heap with malloc(). As a side note, path should probably be declared const char*, since the function doesn't modify it.

+2

Source: https://habr.com/ru/post/1728564/
