I want to learn node.js and MongoDB, which look suitable for something I would like to build. As a small project to help me learn, I thought I would copy the "posts" table from my phpBB3 forum (MySQL) into a MongoDB collection, so I did something like this, where db is the MongoDB database connection and client is the MySQL database connection:
db.collection('posts', function (err, data) {
  client.query('select * from phpbb_posts', function (err, rs) {
    data.insert(rs);
  });
});
This works fine on small tables, but my posts table contains about 100,000 rows, and the query never returns even when I leave it running for an hour. I suspect it is trying to load the entire table into memory and then insert it all at once.
So what I would like to do is read a chunk of rows at a time and insert them. However, I don't see how to read a subset of rows in node.js, and, even more of a problem, I can't see how to iterate through the queries one at a time when I only get notified of completion through a callback.
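To make the question concrete, here is a sketch of the sequential-batch pattern I'm imagining: recurse into the next batch only from inside the previous batch's callback. The fetchBatch and insertBatch functions below are hypothetical synchronous stand-ins for the real MySQL LIMIT/OFFSET query and MongoDB insert, just to show the control flow:

```javascript
// Fake "table" of 10 rows standing in for phpbb_posts.
var sourceRows = [];
for (var i = 0; i < 10; i++) sourceRows.push({ post_id: i });
var copied = []; // stands in for the mongodb collection

// Stand-in for: client.query('select * from phpbb_posts limit ? offset ?', ...)
function fetchBatch(offset, limit, cb) {
  cb(null, sourceRows.slice(offset, offset + limit));
}

// Stand-in for: collection.insert(rows, cb)
function insertBatch(rows, cb) {
  copied = copied.concat(rows);
  cb(null);
}

// The pattern: each step starts only after the previous callback fires,
// so at most one batch is in flight at a time.
function copyInBatches(batchSize, done) {
  function next(offset) {
    fetchBatch(offset, batchSize, function (err, rows) {
      if (err) return done(err);
      if (rows.length === 0) return done(null); // no more rows: finished
      insertBatch(rows, function (err) {
        if (err) return done(err);
        next(offset + rows.length); // recurse only after the insert completes
      });
    });
  }
  next(0);
}

copyInBatches(3, function (err) {
  if (err) throw err;
  console.log('copied ' + copied.length + ' rows');
});
```

Is this recursive-callback approach the right idiom in node.js, or is there a better way?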
Any ideas how I can best do this? (I'm looking for a node.js solution, since I'd like to learn how to solve this kind of problem there; I could no doubt do it easily a different way.)