To make sure this works, I will implement a fake dataset and a fakeAsyncOperation that reads data from it asynchronously. To accurately model your data, each request against the fake dataset returns a response with data and pages fields.
let fake = new Map([
  ['root', {data: 'root', pages: ['a', 'b', 'c', 'd']}],
  ['a', {data: 'a', pages: ['a/a', 'a/b']}],
  ['a/a', {data: 'a/a', pages: []}],
  ['a/b', {data: 'a/b', pages: ['a/b/a']}],
  ['a/b/a', {data: 'a/b/a', pages: []}],
  ['b', {data: 'b', pages: ['b/a']}],
  ['b/a', {data: 'b/a', pages: ['b/a/a']}],
  ['b/a/a', {data: 'b/a/a', pages: ['b/a/a/a']}],
  ['b/a/a/a', {data: 'b/a/a/a', pages: []}],
  ['c', {data: 'c', pages: ['c/a', 'c/b', 'c/c', 'c/d']}],
  ['c/a', {data: 'c/a', pages: []}],
  ['c/b', {data: 'c/b', pages: []}],
  ['c/c', {data: 'c/c', pages: []}],
  ['c/d', {data: 'c/d', pages: []}],
  ['d', {data: 'd', pages: []}]
]);

let fakeAsyncOperation = (page) => {
  return new Promise(resolve => {
    setTimeout(resolve, 100, fake.get(page))
  })
}
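For example, a single request against the fake dataset resolves (after the 100 ms delay) with one page's record:

// a single request resolves with one page's data and its subpage ids
fakeAsyncOperation('root').then(res => console.log(res))
// => { data: 'root', pages: [ 'a', 'b', 'c', 'd' ] }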
Next we have your foo function. I renamed doo to enqueue because it works like a queue. It takes two parameters: acc, which tracks the accumulated data, and the queue of remaining pages, which is destructured as [x, ...xs].
I used the new async/await syntax, which makes this especially nice to solve. We don't need to manually construct any Promises or deal with any manual .then chains.
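For comparison, here is a rough sketch (my own, not part of the answer) of what the same recursion would look like without async/await, using an explicit .then chain and the fakeAsyncOperation defined above:

// hypothetical .then-based version of the recursion, for comparison only
function enqueueWithThen (acc, [x, ...xs]) {
  if (x === undefined)
    return Promise.resolve(acc)
  else
    return fakeAsyncOperation(x).then(({data, pages}) =>
      enqueueWithThen([...acc, data], [...xs, ...pages]))
}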
I freely used the spread syntax in the recursive call because I find it more readable, but you can easily replace it with the concat calls acc.concat([data]) and xs.concat(pages) if you prefer them. This is functional programming, so just pick whichever immutable operation you like and use it.
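For reference, here is the recursive call written both ways; the concat form is only an illustration of the alternative just mentioned:

// spread form, as used in enqueue below
return enqueue([...acc, data], [...xs, ...pages])

// equivalent concat form
return enqueue(acc.concat([data]), xs.concat(pages))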
Finally, unlike other answers that use Promise.all, this one processes each page sequentially. If a page happened to have 50 subpages, Promise.all would try to make 50 requests in parallel, and that might be undesirable. Converting a program from parallel to serial processing is not necessarily simple, which is the reason for providing this answer. (The sequential solution itself follows right after the sketch below.)
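For contrast, here is a rough sketch (not part of the answer) of what a Promise.all-based traversal might look like, reusing the fakeAsyncOperation from above. Note that sibling requests run concurrently and the results come back in depth-first order rather than queue order:

// hypothetical parallel variant, for comparison only: every page's
// subpages are requested at once with Promise.all
function fooParallel (page) {
  return fakeAsyncOperation(page).then(({data, pages}) =>
    Promise.all(pages.map(fooParallel))
      .then(results => [data, ...results.flat()]))
}

fooParallel('root').then(pages => console.log(pages))
// depth-first order, e.g. [ 'root', 'a', 'a/a', 'a/b', 'a/b/a', 'b', ... ]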
function foo (page) {
  async function enqueue (acc, [x, ...xs]) {
    if (x === undefined)
      return acc
    else {
      let {data, pages} = await fakeAsyncOperation(x)
      return enqueue([...acc, data], [...xs, ...pages])
    }
  }
  return enqueue([], [page])
}

foo('root').then(pages => console.log(pages))
Output
[ 'root', 'a', 'b', 'c', 'd', 'a/a', 'a/b', 'b/a', 'c/a', 'c/b', 'c/c', 'c/d', 'a/b/a', 'b/a/a', 'b/a/a/a' ]
Note
I am glad that the foo function in my solution is not too far from your original - I think you will appreciate that. Both use an inner helper function for the recursion and approach the problem in a similar way. async/await keeps the code nice and readable (imo). All in all, I think this is a great solution for a rather complex problem.
Oh, and don't forget about circular links. There are no cycles in my dataset, but if page 'a' had pages: ['b'] and page 'b' had pages: ['a'], you could expect infinite recursion. Because this answer processes the pages one at a time, that would be very easy to fix (by checking the accumulated acc value for an already-seen page id). Handling this with parallel page processing is much more difficult (and beyond the scope of this answer).
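For illustration, here is a minimal sketch of that fix. It assumes, as in the fake dataset above, that each page's data field doubles as its page id, so acc itself can serve as the visited list:

// hypothetical variant of foo with a cycle guard, for illustration only
function fooSafe (page) {
  async function enqueue (acc, [x, ...xs]) {
    if (x === undefined)
      return acc
    else if (acc.includes(x))
      // page already visited: skip it and keep draining the queue
      return enqueue(acc, xs)
    else {
      let {data, pages} = await fakeAsyncOperation(x)
      return enqueue([...acc, data], [...xs, ...pages])
    }
  }
  return enqueue([], [page])
}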