I have a CSV file where a) each line must first be converted to XML, and b) the converted XML is then sent to the Rails side for a database write operation.
Below is my Flow code for it.
flow = csv_rows
  |> Flow.from_enumerable()
  |> Flow.partition()
  |> Flow.map(&CSV.generate_xml/1)
  |> Flow.map(&CSV.save_to_rails_databse/1)
  |> Flow.run()
Everything works fine for a small CSV file, but when the CSV file is very large (say, 20,000 records), the second operation (writing to the database on the Rails side) breaks down: Elixir sends so many requests to the Rails side at the same time that the database reaches its maximum limit.
Would it be good to handle events in batches of 50, and would min_demand and max_demand be useful in this case?
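Something like the sketch below is what I have in mind (the stage count and demand values are only illustrative, and CSV.generate_xml/1 and CSV.save_to_rails_databse/1 are the same functions as above):

csv_rows
  |> Flow.from_enumerable(max_demand: 50)
  # 4 consumer stages, so at most 4 requests hit the Rails side at once;
  # each stage buffers at most 50 rows and asks for more when it drops to 25
  |> Flow.partition(stages: 4, max_demand: 50, min_demand: 25)
  |> Flow.map(&CSV.generate_xml/1)
  |> Flow.map(&CSV.save_to_rails_databse/1)
  |> Flow.run()

As far as I understand, stages bounds how many processes write concurrently, while max_demand/min_demand only control how many rows each stage requests and buffers at a time, so with 20,000 rows the number of in-flight requests would stay bounded. Is that the right approach?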