
go - Limiting the number of running goroutines

Reposted · Author: IT王子 · Updated: 2023-10-29 01:25:00

I have a list of URLs to process, but I want to cap the number of goroutines running at a time. For example, if I have 30 URLs, I want only 10 goroutines working in parallel.

My attempt is as follows:

parallel := flag.Int("parallel", 10, "max parallel requests allowed")
flag.Parse()
urls := flag.Args()

var wg sync.WaitGroup
client := rest.Client{}

results := make(chan string, *parallel)

for _, url := range urls {
	wg.Add(1)
	go worker(url, client, results, &wg)
}

for res := range results {
	fmt.Println(res)
}

wg.Wait()
close(results)

My understanding was that if I create a buffered channel of size `parallel`, the code would block until I read from the results channel, which would unblock the code and allow another goroutine to be spawned. However, this code does not seem to block once all the URLs have been processed. Can someone explain to me how channels can be used to limit the number of running goroutines?

Best answer

Create the desired number of workers instead of one worker per URL:

parallel := flag.Int("parallel", 10, "max parallel requests allowed")
flag.Parse()

// Workers get URLs from this channel
urls := make(chan string)

// Feed the workers with URLs
go func() {
	for _, u := range flag.Args() {
		urls <- u
	}
	// Workers will exit from range loop when channel is closed
	close(urls)
}()

var wg sync.WaitGroup
client := rest.Client{}

results := make(chan string)

// Start the specified number of workers.
for i := 0; i < *parallel; i++ {
	wg.Add(1)
	go func() {
		defer wg.Done()
		for url := range urls {
			worker(url, client, results)
		}
	}()
}

// When workers are done, close results so that main will exit.
go func() {
	wg.Wait()
	close(results)
}()

for res := range results {
	fmt.Println(res)
}

Regarding go - limiting the number of running goroutines, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/55203251/
