C++: Reading a file into a char array

I am using the following code to read a file into a character array. For a small file (say, 2 MB) it runs correctly, but for a large file (140 MB) on my 18 GB Ubuntu server it gives a segmentation fault. Can someone help me solve this? I think 18 GB is enough to hold a 240 MB file in memory. I am using 64-bit Ubuntu and compiling with g++.

    ifstream is;
    char chararray[fileSize];
    is.read(chararray, fileSize);
+4
3 answers

If the array is a local variable, you will get a stack overflow because it is placed on the stack, which is typically only a few megabytes. Allocate the array on the heap instead, either directly with new or indirectly with std::vector.
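A minimal sketch of the heap-allocated variant with new[], assuming the stream is already open and fileSize is known (the function name is just for illustration):

    #include <cstddef>
    #include <fstream>

    void readIntoHeapBuffer(std::ifstream& is, std::size_t fileSize)
    {
        char* chararray = new char[fileSize];   // heap allocation, not limited by the stack
        is.read(chararray, static_cast<std::streamsize>(fileSize));
        // ... use the buffer ...
        delete[] chararray;                     // must be released manually
    }

With std::vector (as in the answer below) the release happens automatically, which is why it is usually preferred over raw new[].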

Or use memory mapping; see mmap.
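A rough sketch of the mmap route on Linux (the file name bigfile.bin is a placeholder and error handling is kept minimal):

    #include <sys/mman.h>
    #include <sys/stat.h>
    #include <fcntl.h>
    #include <unistd.h>

    int main()
    {
        int fd = open("bigfile.bin", O_RDONLY);              // placeholder file name
        if (fd < 0) return 1;

        struct stat st;
        if (fstat(fd, &st) != 0) { close(fd); return 1; }    // file size from fstat

        // Map the whole file read-only; the kernel pages it in on demand.
        void* p = mmap(nullptr, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
        if (p == MAP_FAILED) { close(fd); return 1; }
        char* data = static_cast<char*>(p);

        // ... read from data[0] .. data[st.st_size - 1] ...

        munmap(data, st.st_size);
        close(fd);
        return 0;
    }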

+5

Instead of allocating the char array on the stack, I would use std::vector, which allocates its storage dynamically on the heap:

    std::vector<char> buffer(fileSize);
    is.read(&buffer[0], fileSize);
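For completeness, a sketch of the whole read, including one way to obtain fileSize (the file name is a placeholder):

    #include <fstream>
    #include <vector>

    int main()
    {
        // Open at the end so tellg() reports the file size.
        std::ifstream is("bigfile.bin", std::ios::binary | std::ios::ate);
        if (!is) return 1;

        std::streamsize fileSize = is.tellg();
        is.seekg(0, std::ios::beg);

        std::vector<char> buffer(fileSize);     // storage lives on the heap
        is.read(buffer.data(), fileSize);       // buffer.data() is the same as &buffer[0]
        return 0;
    }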
+2

The GNU toolchain ships a utility called size for this! Compile the program with gcc, and then you can get the size of the resulting binary:

    gcc -Wall test.c
    size

This is for a plain C program. Since you did not pass any file argument, size uses ./a.out as its default input.

If you want to apply some optimization, the commands look like this:

    praveenvinny@ubuntu:~/Project/New$ gcc -Wall -O1 -fauto-inc-dec test.c -o Output
    praveenvinny@ubuntu:~/Project/New$ size Output
       text    data     bss     dec     hex filename
       1067     256       8    1331     533 Output

Use the text section for the code size. You can also add data and bss if you want to include the size of the global data.
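As a rough illustration of what ends up in each section (exact placement depends on the toolchain, so treat this only as a sketch):

    // Sketch: typical section placement reported by `size`.
    int initialized_global = 42;   // counted in data (initialized globals)
    int uninitialized_global;      // counted in bss  (zero-initialized globals)

    int main()                     // machine code is counted in text
    {
        return initialized_global + uninitialized_global;
    }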

That covers the code size. The following command

    time -f "%e" -o Output.log ./a.out

will write the runtime to a log file called Output.log (note that the -f and -o options need GNU time, /usr/bin/time, rather than the shell built-in time).

+1

Source: https://habr.com/ru/post/1447184/

