# coroutines
g
Hello, I’ve got a question about coroutines and whether what I’m doing is counter-productive: I’ve got 111 URLs to download and used coroutines with async {} to do it in parallel. Using CommonPool it takes 10 seconds, but using async(newSingleThreadContext()) per URL I get it down to 6 seconds, almost twice as fast! It’s a Gradle plugin, meaning the program finishes a few seconds after downloading all the data, so I’m not worried about the number of threads. Does it make sense to do it the way I did, or is there a better way to be faster while using fewer threads?
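(A rough sketch of the two variants described above, written against the current kotlinx.coroutines API, where `Dispatchers.Default` plays the role of `CommonPool`; `URL(...).readText()` stands in for whatever download code the plugin actually uses.)

```kotlin
import kotlinx.coroutines.*
import java.net.URL

// Variant 1: async on a shared pool that is limited to the number of CPU cores.
suspend fun downloadPooled(urls: List<String>): List<String> = coroutineScope {
    urls.map { url ->
        async(Dispatchers.Default) { URL(url).readText() }   // blocking call on a cores-sized pool
    }.awaitAll()
}

// Variant 2: one dedicated thread per URL, so every download really runs in parallel.
suspend fun downloadPerThread(urls: List<String>): List<String> = coroutineScope {
    urls.map { url ->
        val ctx = newSingleThreadContext("download-$url")    // a fresh thread per URL
        async(ctx) {
            try { URL(url).readText() } finally { ctx.close() }
        }
    }.awaitAll()
}
```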
d
`CommonPool` is a limited thread pool; its size is based on the number of cores you have. Your second version just spawns a new thread for every URL, so they really do all download in parallel. Neither is ideal, though: ideally you'd use non-blocking IO for this instead of spawning 111 threads.
`CommonPool` is made for coroutines that mostly "do nothing" most of the time (i.e. "wait 20 seconds, then do this").
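(To illustrate that last point: a cores-sized pool copes fine with far more coroutines than threads, as long as they suspend rather than block. A minimal sketch, again assuming the current API where `Dispatchers.Default` is the successor of `CommonPool`:)

```kotlin
import kotlinx.coroutines.*

fun main() = runBlocking {
    // 1000 coroutines on a pool with only as many threads as CPU cores:
    // delay() suspends and frees the thread instead of blocking it,
    // so everything still finishes in roughly one second.
    val jobs = List(1_000) { i ->
        launch(Dispatchers.Default) {
            delay(1_000)
            if (i % 100 == 0) println("coroutine $i done")
        }
    }
    jobs.joinAll()
}
```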
k
yes, non-blocking IO would be optimal. I wonder if there's a non-blocking HTTP client out there
a
if you won’t have thousands of threads running, you can still use blocking IO with a dispatcher suited for it, something like:
`val BLOCKING_IO = Executors.newCachedThreadPool().asCoroutineDispatcher()`
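(The same snippet filled out with imports and a usage sketch; the `downloadBlocking` helper and `URL.readText()` are illustrative, not from the original plugin.)

```kotlin
import java.net.URL
import java.util.concurrent.Executors
import kotlinx.coroutines.*

// A cached thread pool grows on demand and reuses idle threads,
// so blocking IO doesn't starve the cores-sized default pool.
val BLOCKING_IO = Executors.newCachedThreadPool().asCoroutineDispatcher()

suspend fun downloadBlocking(urls: List<String>): List<String> = coroutineScope {
    urls.map { url ->
        async(BLOCKING_IO) { URL(url).readText() }   // each blocking call gets a pooled thread
    }.awaitAll()
}
```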
k
g
Really interesting, thank you all!
I still hope I can use coroutines for their imperative style of writing.
Nice, with @adeln’s solution I get it done in 6 seconds and 725 milliseconds.
So this dispatcher runs all those requests in parallel without opening 111 threads then? How?
d
It doesn't run them all in parallel. For blocking IO you need one thread per request.
g
ah yes, I see: it reuses threads, so it’s not the fastest possible. As you say, only one thread per request will be the fastest.
a
the `BLOCKING_IO` dispatcher would create 111 threads and cache them for reuse
`CommonPool` size is limited by the number of cores
basically, you shouldn’t use `CommonPool` with blocking IO
g
Got it, thanks!
I saw it running requests in batches of 5 or 6, so now it makes sense
k
@galex you can use one of the libraries I posted and still use coroutines, because these libraries return `CompletableFuture`s which you can `await` using coroutines
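(The specific libraries aren't preserved in this log. As one example of the pattern, Java 11's `HttpClient` returns a `CompletableFuture` from `sendAsync`, and the `await()` extension from kotlinx-coroutines suspends on it without tying up a thread per request.)

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse
import kotlinx.coroutines.*
import kotlinx.coroutines.future.await

// Non-blocking downloads: sendAsync() hands back a CompletableFuture and
// await() suspends the coroutine until it completes, so no thread is
// blocked per request.
suspend fun fetchAll(urls: List<String>): List<String> = coroutineScope {
    val client = HttpClient.newHttpClient()
    urls.map { url ->
        async {
            val request = HttpRequest.newBuilder(URI.create(url)).build()
            client.sendAsync(request, HttpResponse.BodyHandlers.ofString())
                .await()
                .body()
        }
    }.awaitAll()
}
```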
g
I will look into those as well, thank you @kirillrakhman
g
Interesting, thank you