mkojadinovic

09/24/2021, 12:36 PM
Hi, I am trying to use coroutines in my Spring Boot project. I am using RestTemplate for networking, which does blocking calls. Is it safe if I do the following:
private suspend fun getSomething() = withContext(Dispatchers.IO) {
        RestTemplate().exchange(
            "url", HttpMethod.POST, HttpEntity("body"), String::class.java
        )
    }
Will this produce many threads that just wait, or am I safe that this will use some limited thread pool? And if I want to limit the thread pool size to 2, will that also work?
Joffrey

09/24/2021, 12:42 PM
This will offload the blocking code to Dispatchers.IO, which is meant for this, so it's generally OK to do. The IO dispatcher uses a thread pool that grows as needed to handle all the simultaneous blocking operations. It will block threads, but not the thread of the caller of getSomething(), and that's why it's useful. The IO pool will not grow indefinitely though, only up to 64 threads IIRC. If you want to limit this to fewer threads, you can also create your own thread pool and provide it as the dispatcher instead of Dispatchers.IO. For that you can use newFixedThreadPoolContext, but don't forget to close it in whatever lifecycle hook Spring provides for your component.
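A minimal sketch of that idea (the class and names here are illustrative, not from the original question):

```kotlin
import kotlinx.coroutines.*

// Hypothetical service that confines its blocking calls to a dedicated 2-thread pool.
class BlockingCallService {
    // A fixed pool of 2 threads, usable as a coroutine dispatcher.
    private val restDispatcher = newFixedThreadPoolContext(2, "rest-pool")

    suspend fun getSomething(): String = withContext(restDispatcher) {
        // A real service would call RestTemplate().exchange(...) here;
        // Thread.sleep stands in for the blocking network call.
        Thread.sleep(50)
        "response"
    }

    // Call this from whatever destruction hook Spring provides
    // (e.g. a @PreDestroy method) so the pool threads are released.
    fun close() {
        restDispatcher.close()
    }
}
```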
That being said, if you can use an asynchronous client instead (like Spring's WebClient), it would be much better 🙂
mkojadinovic

09/24/2021, 12:49 PM
Ok, so withContext(Dispatchers.IO) is not a magical wand. It is still better to go with WebClient.
:yes: 1
I am curious what happens if my service (with RestTemplate and withContext(Dispatchers.IO)) gets hit 300 times at the same time. Having in mind that I only have 64 threads, would 236 requests be rejected?
Joffrey

09/24/2021, 1:18 PM
No, the extra tasks should be queued in the IO dispatcher, waiting for dispatch onto a free thread when one becomes available.
By the way, if you go with coroutines in Spring I suggest you use Spring WebFlux, which allows suspending functions in your controllers. It also allows you to set a request timeout that will cancel the relevant coroutine if it stays suspended for too long (see the spring.mvc.async.request-timeout property - I know it says MVC but it works for WebFlux)
:thank-you: 1
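The cancellation behaviour can be pictured with plain coroutines (a sketch, not WebFlux itself): withTimeout cancels a coroutine that stays suspended too long, which is essentially what the request timeout does to a controller coroutine.

```kotlin
import kotlinx.coroutines.*

// Stand-in for a handler that suspends for longer than allowed.
suspend fun slowHandler(): String {
    delay(5_000) // pretend this waits on a slow upstream service
    return "ok"
}

// Runs the handler under a deadline, the way a request timeout would.
fun handleWithTimeout(timeoutMs: Long): String = runBlocking {
    try {
        withTimeout(timeoutMs) { slowHandler() }
    } catch (e: TimeoutCancellationException) {
        "timed out" // the coroutine was cancelled and its resources freed
    }
}
```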
mkojadinovic

09/24/2021, 1:36 PM
Meaning, if I am not sure whether some code is blocking, I can just use withContext(Dispatchers.IO) and stop worrying about it. Can I be sure that, if I use withContext(Dispatchers.IO) with some blocking code, all of my users will be served eventually? Of course, if I know of a non-blocking solution, I should go for it.
Joffrey

09/24/2021, 1:59 PM
if I am not sure if some code is blocking I can just use withContext(Dispatchers.IO) and stop worrying about it
In general you can use Dispatchers.IO for blocking code and not worry too much about it, although of course if you can find a non-blocking way it will save some resources.
Can I be sure that if I use withContext(Dispatchers.IO) with some blocking code that all of my users will be served eventually?
Using Dispatchers.IO will not drop tasks when the thread pool has reached max capacity and all threads are busy. When this happens, the callers of withContext(IO) will just be suspended longer, but eventually the task will run. Of course, unless the server runs out of memory before it has a chance to run them - nothing is magical here 😄
mkojadinovic

09/24/2021, 2:15 PM
Ok, thanks for the clarification.
Joffrey

09/24/2021, 2:35 PM
Here is a little proof of what I said (it takes ~6 seconds to run): https://pl.kotl.in/hFfavZpV_ You can clearly see the first 64 tasks being executed immediately, and the next round of 64 being queued etc.
:thank-you: 1
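The playground snippet itself isn't reproduced above; a demo along these lines (a reconstruction from the description, so the actual linked code may differ) makes the batching visible: 192 blocking tasks on the 64-thread IO pool complete in three ~2-second rounds, about 6 seconds in total.

```kotlin
import kotlinx.coroutines.*
import kotlin.system.measureTimeMillis

// Launches `tasks` coroutines that each block for `sleepMs` on Dispatchers.IO
// and returns the total elapsed time in milliseconds.
fun runDemo(tasks: Int, sleepMs: Long): Long = runBlocking {
    measureTimeMillis {
        val jobs = List(tasks) { i ->
            launch(Dispatchers.IO) {
                Thread.sleep(sleepMs) // simulate a blocking call
                println("task $i done on ${Thread.currentThread().name}")
            }
        }
        jobs.forEach { it.join() }
    }
}

fun main() {
    // 192 tasks / 64 IO threads = 3 rounds of ~2s each, so ~6s total.
    println("took ${runDemo(192, 2000)}ms")
}
```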
mkojadinovic

09/24/2021, 3:15 PM
Hmm, I only see "hello world" example
🤔 1
Now I see it, second try
👌 1
uli

09/24/2021, 5:34 PM
Sure, at some point your clients might run into timeouts. If answering a request takes 1 second and you get more than 64 requests per second, your queue will get longer and longer.
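That point as back-of-envelope arithmetic (a hypothetical helper, assuming a fixed 64-thread pool and a constant 1-second service time):

```kotlin
// Queue growth when arrivals outpace throughput.
// Throughput is threads / serviceTimeSec requests per second;
// anything beyond that accumulates in the dispatcher's queue.
fun backlogAfter(
    seconds: Int,
    arrivalsPerSec: Int,
    threads: Int = 64,
    serviceTimeSec: Int = 1,
): Int {
    val throughputPerSec = threads / serviceTimeSec
    return maxOf(0, (arrivalsPerSec - throughputPerSec) * seconds)
}

fun main() {
    // At 100 req/s against 64 req/s of throughput, the queue grows by 36/s.
    println(backlogAfter(seconds = 10, arrivalsPerSec = 100)) // prints 360
}
```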