My application has the following code, which performs two functions:
1. Parse a file that contains "n" records.
2. Make two web service calls for each ID in the file.
public static List<String> parseFile(String fileName) {
    List<String> idList = new ArrayList<String>();
    try {
        BufferedReader cfgFile = new BufferedReader(new FileReader(new File(fileName)));
        String line = null;
        cfgFile.readLine(); // skip the header line
        while ((line = cfgFile.readLine()) != null) {
            if (!line.trim().equals("")) {
                String[] fields = line.split("\\|"); // records are pipe-delimited
                idList.add(fields[0]);               // keep only the first field (the ID)
            }
        }
        cfgFile.close();
    } catch (IOException e) {
        System.out.println(e + " Unexpected File IO Error.");
    }
    return idList;
}
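A minimal call looks like this (the file name is a placeholder, not my real path):

List<String> idList = parseFile("records.cfg");
System.out.println("Parsed " + idList.size() + " IDs");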
When I try to parse a file with 1 million records, the Java process exits after processing a certain amount of data with java.lang.OutOfMemoryError: Java heap space. I can partially understand that the process is failing because of the sheer amount of data. Please suggest how I should handle files this large.
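One direction I am considering is a minimal sketch, assuming each ID can be processed independently as it is read: stream the file and hand each ID to a callback instead of accumulating all of them in a list. Here parseFile is a reworked variant of my method and processId is a hypothetical callback standing in for the two web service calls; neither appears in my original code.

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.function.Consumer;

public class IdStream {
    // Stream the file line by line; each ID goes straight to the callback,
    // so memory use stays roughly constant regardless of the record count.
    public static void parseFile(String fileName, Consumer<String> processId) {
        try (BufferedReader cfgFile = new BufferedReader(new FileReader(fileName))) {
            cfgFile.readLine(); // skip the header line, as in the original code
            String line;
            while ((line = cfgFile.readLine()) != null) {
                if (!line.trim().isEmpty()) {
                    processId.accept(line.split("\\|")[0]); // first pipe-delimited field
                }
            }
        } catch (IOException e) {
            System.out.println(e + " Unexpected File IO Error.");
        }
    }
}

It would be called as, for example, parseFile("records.cfg", id -> callWebServices(id)), where callWebServices wraps the two web service calls. Is something along these lines the right way to go?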
EDIT: Does this part of the code, new BufferedReader(new FileReader(new File(fileName))), read the entire file into memory, and is it therefore affected by the file size?