I am trying to find the most efficient way to search a directory full of text files (roughly 2000 files of about 150 lines each) for a keyword. If I were only ever searching for one keyword, performance would not be much of a problem, but in my application I want to be able to search for another keyword later, possibly many times. So re-reading the entire collection of files on every search is time-consuming, and keeping everything in memory seems expensive.
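To make the problem concrete, this is roughly what a single search looks like for me right now (a minimal Python sketch just for illustration; the actual application is a Windows desktop app, and the function name is made up):

```python
import os

def naive_search(directory, keyword):
    """Re-read every file on every search -- the slow approach I want to avoid."""
    matches = []
    for name in os.listdir(directory):
        path = os.path.join(directory, name)
        if not os.path.isfile(path):
            continue
        with open(path, encoding="utf-8") as f:
            if keyword in f.read():
                matches.append(name)
    return matches
```

With ~2000 files this means 2000 file opens and reads for every single keyword lookup.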
What would be the best way to do this? I don't have access to a SQL database or anything like that, so I can't dump the contents into a database and query it repeatedly; this will just be a regular Windows application.
The most primitive approach I can think of is to dump all the files into one huge XML file and search that, rather than re-reading every file in the directory each time I search for a keyword. But even that seems like it could be quite expensive, doesn't it?
I will know the name of the directory in advance, so I can pre-process its contents, if that would help with optimization.
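For example, is something like an inverted index the right direction for that pre-processing step? A rough sketch of what I mean (Python just for illustration; `build_index` and the directory path are hypothetical):

```python
import os
import re
from collections import defaultdict

def build_index(directory):
    """One-time pre-processing pass: map each word to the set of files containing it."""
    index = defaultdict(set)
    for name in os.listdir(directory):
        path = os.path.join(directory, name)
        if not os.path.isfile(path):
            continue
        with open(path, encoding="utf-8") as f:
            for word in re.findall(r"\w+", f.read().lower()):
                index[word].add(name)
    return index

# Build once at startup; afterwards each keyword lookup is a dictionary hit.
index = build_index(r"C:\my\text\files")  # hypothetical path
print(index.get("keyword", set()))
```

I'm guessing that with ~2000 files of 150 lines each, such an index would fit comfortably in memory, since it stores each distinct word once plus sets of file names, but I'd like to know if there is a better approach.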
Any suggestions are welcome, thanks.