# coroutines
nwh

07/16/2019, 5:25 PM
Reposting because I never got an answer: What's the recommended way to do a worker-pool-like structure? E.g. computing values concurrently (like by using the IO dispatcher) but only allowing a certain number to run at once
Matt Thompson

07/16/2019, 5:29 PM
a Channel with a buffer?
nwh

07/16/2019, 5:43 PM
I can't figure out how to make a channel work. Let's take a naive example and assume I have to launch 100 tasks:
```kotlin
val ch = Channel<Int>(5)
repeat(100) {
   launch {
      delay(Random.nextLong(1000))
      ch.send(it)
   }
}
```
It actually launches 100 coroutines immediately. It should launch 5 and only start new ones as the channel's buffer empties. But launching a task doesn't occupy the channel's buffer; only a task finishing (and calling send) does.
withoutclass

07/16/2019, 6:08 PM
bj0

07/16/2019, 6:08 PM
I believe he was talking about using a buffered channel on input to your coroutine
withoutclass

07/16/2019, 6:09 PM
Fan-Out is how I'd do it
bj0

07/16/2019, 6:09 PM
does sound like something that would be in the guide...
nwh

07/16/2019, 6:11 PM
Yeah I have @withoutclass. A fan-out has the same issue and I don't think it's what I'm hoping to accomplish
withoutclass

07/16/2019, 6:11 PM
What issue?
if you want 5 workers, you have 1 channel of work with 5 launched coroutine "workers" reading from it
so you could do something like
repeat(5) { launch { // read stuff }}
then you have launched 5 async coroutines to read from the channel
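The fan-out shape withoutclass describes can be sketched end to end. This is a minimal, self-contained version: one channel of work items and exactly `workers` coroutines reading from it, so concurrency is bounded by the worker count rather than by any channel buffer. `doWork` is a hypothetical stand-in for the real suspending task.

```kotlin
import kotlinx.coroutines.*
import kotlinx.coroutines.channels.Channel

// Hypothetical stand-in for the real suspending task.
suspend fun doWork(item: Int): Int {
    delay(10) // simulate work
    return item * 2
}

// Runs doWork over all items with at most `workers` coroutines in flight.
suspend fun processAll(items: List<Int>, workers: Int): List<Int> = coroutineScope {
    val work = Channel<Int>()    // channel of work items
    val results = Channel<Int>()

    // Exactly `workers` readers: this is what bounds concurrency.
    repeat(workers) {
        launch { for (item in work) results.send(doWork(item)) }
    }

    // Feed all items, then close so the workers' for-loops terminate.
    launch {
        items.forEach { work.send(it) }
        work.close()
    }

    List(items.size) { results.receive() }.sorted()
}

fun main() = runBlocking {
    println(processAll((0 until 10).toList(), workers = 5)) // prints [0, 2, 4, 6, 8, 10, 12, 14, 16, 18]
}
```

Closing the work channel is what lets each worker's `for` loop end naturally once the queue drains.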
nwh

07/16/2019, 6:13 PM
Hm. I'll try it again
withoutclass

07/16/2019, 6:13 PM
Check the example in the guide again that I sent, particularly:
```kotlin
fun main() = runBlocking<Unit> {
//sampleStart
    val producer = produceNumbers()
    repeat(5) { launchProcessor(it, producer) }
    delay(950)
    producer.cancel() // cancel producer coroutine and thus kill them all
//sampleEnd
}
```
Here they have 1 producer channel, `producer`, and then launch 5 workers by repeating `launchProcessor` 5 times
hth
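For completeness, the two helpers that snippet relies on come from the same fan-out section of the coroutines guide (reproduced here so the example is runnable): `produceNumbers` is an endless producer of integers, and `launchProcessor` is one worker reading from a shared channel.

```kotlin
import kotlinx.coroutines.*
import kotlinx.coroutines.channels.ReceiveChannel
import kotlinx.coroutines.channels.produce

// Endless producer of integers, one every 100 ms.
fun CoroutineScope.produceNumbers() = produce<Int> {
    var x = 1 // start from 1
    while (true) {
        send(x++) // produce next
        delay(100) // wait 0.1s
    }
}

// One worker: iterates over the shared channel until it is closed/cancelled.
fun CoroutineScope.launchProcessor(id: Int, channel: ReceiveChannel<Int>) = launch {
    for (msg in channel) {
        println("Processor #$id received $msg")
    }
}
```

Cancelling the producer closes its channel, which terminates the `for` loops in all the processors.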
nwh

07/16/2019, 6:17 PM
Right it's just strange to think of my loop of 100 actions as a producer of the values 0-99 considering it's completely sync
The blocking work is actually in the "processor" in this example, not the producer
withoutclass

07/16/2019, 6:18 PM
They aren't the same: you have work to populate your channel, the `repeat(100)`. This is completely separate from your "worker" coroutines, of which you want 5
If you just want this channel to produce 100 random values, I would convert it to an actual `producer`
Honestly I think your issue here is that your `repeat` is outside of your `launch`, which launches 100 coroutines instead of launching 1 coroutine that does a loop 100 times
nwh

07/16/2019, 6:21 PM
The real world example is making API requests. 5 requests should be made at once and as soon as a single one finishes, another should be launched (so there should always be 5 running). The blocking work is making the request itself
withoutclass

07/16/2019, 6:22 PM
That doesn't really change anything here, you'll still want to fan-out if you're driving those 5 workers off of values in a channel
nwh

07/16/2019, 6:23 PM
And what values would be coming through the channel in that example, the requests?
withoutclass

07/16/2019, 6:23 PM
No, I assumed that your real example was somehow related to the example you've posted here
as in 5 workers being driven by a channel of 100 items
if you just want 5 coroutines to run continuously making requests but you want to hard limit them to 5, you could just make an executor with a max of 5 threads
and then have all the launched workers use that thread pool
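withoutclass's executor suggestion can be sketched like this (the names are illustrative). Note, as gildor points out further down, that this caps parallelism only when the work actually blocks a thread; a suspending HTTP client would not be limited by it.

```kotlin
import java.util.concurrent.Executors
import kotlinx.coroutines.*

fun main() {
    // A 5-thread pool wrapped as a coroutine dispatcher: at most 5 blocking
    // requests run at once because there are only 5 threads to run them on.
    val pool = Executors.newFixedThreadPool(5).asCoroutineDispatcher()
    runBlocking {
        val results = (1..20).map { n ->
            async(pool) {
                Thread.sleep(50) // stand-in for a blocking API request
                n * n
            }
        }.awaitAll()
        println(results.sum()) // prints 2870
    }
    pool.close() // shut the underlying executor down
}
```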
nwh

07/16/2019, 6:52 PM
I guess that would work too, it just feels weird to rely on the thread limit instead
Tolriq

07/16/2019, 7:03 PM
nwh

07/16/2019, 7:17 PM
Thanks @Tolriq I'll take a look
gildor

07/16/2019, 11:16 PM
An executor with a limited number of threads will not help in the case of coroutines if you use an asynchronous http client
It really depends on what your code looks like. If you just want to process a lot of input in parallel, but not more than N at the same time, the easiest way is to use this approach:
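The snippet gildor posted next isn't included in this excerpt. One common way to express "at most N in flight" directly in coroutines, independent of any thread pool, is kotlinx.coroutines' `Semaphore`; the sketch below is an assumption about that shape, not necessarily the approach that followed.

```kotlin
import kotlinx.coroutines.*
import kotlinx.coroutines.sync.Semaphore
import kotlinx.coroutines.sync.withPermit

fun main() = runBlocking {
    // At most 5 coroutines are inside withPermit at once; the rest suspend
    // (no thread is blocked) until a permit frees up.
    val semaphore = Semaphore(permits = 5)
    val results = (0 until 100).map { i ->
        async {
            semaphore.withPermit {
                delay(10) // stand-in for a suspending API call
                i * 2
            }
        }
    }.awaitAll()
    println(results.sum()) // prints 9900
}
```

This bounds concurrency for suspending work too, which is why it fits the asynchronous-http-client case the thread-pool trick doesn't cover.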