The fastest way to load a huge text file into an int array

I have a large text file (100+ MB) with one integer per line, about 10 million numbers in total. Of course, the size and the number of lines can vary, so I do not know them in advance.

I want to load the file into an int[] as fast as possible. This was my first solution:

public int[] fileToArray(String fileName) throws IOException
{
    // Read the whole file into memory as a List<String>
    List<String> list = Files.readAllLines(Paths.get(fileName));
    // Then parse each line into the result array
    int[] res = new int[list.size()];
    int pos = 0;
    for (String line: list)
    {
        res[pos++] = Integer.parseInt(line);
    }
    return res;
}

It was reasonably fast: 5.5 seconds in total, of which 5.1 s went to the readAllLines call and 0.4 s to the loop.
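
For reference, here is a minimal sketch of how such a two-phase timing can be taken with System.nanoTime (the harness below is illustrative, not the measurement code actually used):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;

public class LoadTiming
{
    public static void main(String[] args) throws IOException
    {
        long t0 = System.nanoTime();
        // Phase 1: read every line of the file into memory
        List<String> list = Files.readAllLines(Paths.get(args[0]));
        long t1 = System.nanoTime();
        // Phase 2: parse each line into an int[]
        int[] res = new int[list.size()];
        int pos = 0;
        for (String line : list)
        {
            res[pos++] = Integer.parseInt(line);
        }
        long t2 = System.nanoTime();
        System.out.printf("read: %.2f s, parse: %.2f s, %d ints%n",
                (t1 - t0) / 1e9, (t2 - t1) / 1e9, res.length);
    }
}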

But then I decided to try using BufferedReader and came up with this different solution:

public int[] fileToArray(String fileName) throws IOException
{
    BufferedReader bufferedReader = new BufferedReader(new FileReader(new File(fileName)));
    // Collect the parsed values in a growable list of boxed Integers
    ArrayList<Integer> ints = new ArrayList<Integer>();
    String line;
    while ((line = bufferedReader.readLine()) != null)
    {
        ints.add(Integer.parseInt(line));
    }
    bufferedReader.close();

    // Copy (and unbox) the list into the final int[]
    int[] res = new int[ints.size()];
    int pos = 0;
    for (Integer i: ints)
    {
        res[pos++] = i.intValue();
    }
    return res;
}

It was even faster: 3.1 seconds, with 3 s in the while loop and only 0.1 s in the for loop.

Still, it bothers me that everything is first collected into an ArrayList of boxed Integers and only then copied into the int[].

Is there a way to make this faster, without the intermediate ArrayList?
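
One hypothetical way to drop the list is to read the file twice: a first pass to count lines, then a second pass that parses straight into a pre-sized array. This is only a sketch (the method name is mine); whether the extra read pass pays off depends on how well the OS caches the file:

public int[] fileToArrayTwoPass(String fileName) throws IOException
{
    // Pass 1: count the lines so the array can be sized exactly
    int count = 0;
    try (BufferedReader br = new BufferedReader(new FileReader(fileName)))
    {
        while (br.readLine() != null)
        {
            count++;
        }
    }
    // Pass 2: parse each line straight into the pre-sized int[]
    int[] res = new int[count];
    int pos = 0;
    try (BufferedReader br = new BufferedReader(new FileReader(fileName)))
    {
        String line;
        while ((line = br.readLine()) != null)
        {
            res[pos++] = Integer.parseInt(line);
        }
    }
    return res;
}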

For comparison: an equivalent FreePascal program using TStringList and StrToInt handles the same file in 1.9 s.

EDIT: The Java version is now faster than the FreePascal one: 330 ~ 360 ms.
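
Timings in that range are sensitive to JVM warm-up, so a simple harness along these lines (illustrative; it assumes the fileToArray method above and a fileName variable) gives steadier numbers, because the JIT compiles the hot paths during the first iterations:

// Warm-up: run the loader a few times so the hot code gets JIT-compiled
for (int i = 0; i < 5; i++)
{
    fileToArray(fileName);
}
// Measured run
long t0 = System.nanoTime();
int[] data = fileToArray(fileName);
System.out.printf("%d ints in %.0f ms%n", data.length, (System.nanoTime() - t0) / 1e6);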

If you are on Java 8, you do not need the ArrayList at all: BufferedReader's lines() method gives you a stream of the file's lines, which can be mapped straight to int and collected into an array.

Wrap the reader in try-with-resources so it is closed automatically:

try (BufferedReader br = new BufferedReader(new FileReader(fileName))) {
    // Stream the lines, parse each one, and collect straight into an int[]
    return br.lines()
             .mapToInt(Integer::parseInt)
             .toArray();
}

There is no intermediate collection, no boxing into Integer, and no second copying loop: each line is parsed directly into the resulting int[].
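
A near-equivalent variation (an aside, not part of the original answer) uses Files.lines instead of an explicit BufferedReader; the stream reads the file lazily, and try-with-resources closes the underlying file (requires java.nio.file.Files, java.nio.file.Paths and java.util.stream.IntStream):

public int[] fileToArray(String fileName) throws IOException
{
    // Files.lines returns a lazy Stream<String>; closing the stream closes the file
    try (IntStream ints = Files.lines(Paths.get(fileName)).mapToInt(Integer::parseInt))
    {
        return ints.toArray();
    }
}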

Note: when timing this, use a warmed-up JVM and average several runs; a single cold run largely measures JIT compilation and disk caching.

Source: https://habr.com/ru/post/1651967/

