I have a list of URLs that I am scraping. What I want to do is send all the successfully scraped page data to a channel and, when I'm done, collect it into a slice. I don't know how many successful fetches I will get, so I can't give the slice a fixed length. I expected the code to reach wg.Wait(), block there until every goroutine had called wg.Done(), and then move on, but it never reached the close(queue) statement.
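My original ordering was roughly this (simplified, with the worker goroutines the same as in the working version further down), with the wait and the close inline on the main goroutine:

wg.Wait()    // never returns: the workers are stuck sending on the unbuffered queue
close(queue) // never reached
for a := range queue {
	activities = append(activities, a)
}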
Looking for a similar problem, I came across this Stack Overflow answer, where the author does something like this:
ports := make(chan string)
toScan := make(chan int)

var wg sync.WaitGroup

for i := 0; i < 100; i++ {
	wg.Add(1)
	go func() {
		defer wg.Done()
		for p := range toScan {
			ports <- worker(*host, p)
		}
	}()
}

go func() {
	wg.Wait()
	close(ports)
}()
As soon as I wrapped my wg.Wait() in a goroutine the same way, close(queue) was reached:
urls := getListOfURLS()
activities := make([]Activity, 0, limit)
queue := make(chan Activity)

var wg sync.WaitGroup

// one goroutine per URL; failed extractions are logged and skipped
for i, activityURL := range urls {
	wg.Add(1)
	go func(i int, url string) {
		defer wg.Done()
		activity, err := extractDetail(url)
		if err != nil {
			log.Println(err)
			return
		}
		queue <- activity
	}(i, activityURL)
}

// wait for every worker to finish, then close the channel
// so the range loop below can terminate
go func() {
	wg.Wait()
	close(queue)
}()

for a := range queue {
	activities = append(activities, a)
}
Why does close(queue) have to be called in a separate goroutine together with wg.Wait()? I thought every worker's deferred wg.Done() would run, wg.Wait() would then stop blocking on the main goroutine, and close(queue) would be reached. Why does it deadlock instead?
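For completeness, here is a minimal, self-contained sketch of the behaviour I am asking about (an int payload stands in for my Activity values):

package main

import (
	"fmt"
	"sync"
)

func main() {
	queue := make(chan int)
	var wg sync.WaitGroup

	for i := 0; i < 3; i++ {
		wg.Add(1)
		go func(n int) {
			defer wg.Done()
			queue <- n // blocks until main starts receiving below
		}(i)
	}

	// Moving these two lines out of the goroutine deadlocks the program:
	// wg.Wait() never returns while the workers are stuck sending.
	go func() {
		wg.Wait()
		close(queue)
	}()

	for v := range queue {
		fmt.Println(v)
	}
}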