You can read all the rows into a regular list and then index it directly.
import csv

with open('bigfile.csv', newline='') as longishfile:
    reader = csv.reader(longishfile)
    rows = [r for r in reader]
print(rows[9])
print(rows[88])
If the file is massive this can exhaust your memory, but for files under roughly 10,000 lines you should not see any noticeable slowdown.
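For truly large files, a sketch of an alternative that avoids loading every row: stream the file and keep only the rows you actually need. The helper name `pick_rows` and the sample data are my own illustration, not part of the original answer.

```python
import csv
import io

def pick_rows(fileobj, wanted):
    """Return {index: row} for the requested row indices,
    reading the file one row at a time instead of all at once."""
    wanted = set(wanted)
    picked = {}
    for i, row in enumerate(csv.reader(fileobj)):
        if i in wanted:
            picked[i] = row
            if len(picked) == len(wanted):
                break  # stop early once every wanted row is found
    return picked

# In-memory CSV standing in for 'bigfile.csv' in this example
data = io.StringIO("a,1\nb,2\nc,3\nd,4\n")
rows = pick_rows(data, [1, 3])
print(rows[1])  # ['b', '2']
print(rows[3])  # ['d', '4']
```

This keeps memory proportional to the number of rows you request rather than the size of the file.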