Hi, I am using #ktor server to run an API in production. The API has several endpoints, and in one specific use case an endpoint responds very slowly. I have added lots of measurements inside my app, and the response time from my API's point of view looks consistent, but from a customer's point of view it can be ten times what I measured. I have reproduced the issue on my computer: I just have to use Apache Bench (ab) to stress the app a bit, as it is stressed in production, and response times climb very quickly. It looks like Ktor is being overwhelmed by the requests. I'd like to monitor the "queue" of requests that Ktor has to manage. Is there a way to do so? Any idea how I can track down this issue? Thanks.
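Ktor does not expose Netty's internal request queue directly, but one way to approximate the load is to count in-flight requests with an interceptor on the `Monitoring` pipeline phase. A minimal sketch, assuming the Ktor 1.x API (the port and log format here are illustrative, not from the original post):

```kotlin
import io.ktor.application.*
import io.ktor.response.*
import io.ktor.routing.*
import io.ktor.server.engine.*
import io.ktor.server.netty.*
import java.util.concurrent.atomic.AtomicInteger

// Number of requests currently inside the pipeline. If this grows steadily
// under load, requests are arriving faster than they are being served.
val inFlight = AtomicInteger(0)

fun main() {
    embeddedServer(Netty, port = 8080) {
        intercept(ApplicationCallPipeline.Monitoring) {
            val current = inFlight.incrementAndGet()
            val start = System.nanoTime()
            try {
                proceed()
            } finally {
                inFlight.decrementAndGet()
                val elapsedMs = (System.nanoTime() - start) / 1_000_000
                call.application.log.info("in-flight=$current, served in ${elapsedMs}ms")
            }
        }
        routing {
            get("/") { call.respondText("ok") }
        }
    }.start(wait = true)
}
```

Comparing this in-server elapsed time against what `ab` reports client-side should show whether the extra latency accrues before the call enters the pipeline (queuing in Netty) or while the response is being written out.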
10/04/2020, 7:31 PM
Maybe you can add an intermediate step to measure the calls. Are you using an Nginx reverse proxy?
10/04/2020, 8:11 PM
No, I don’t use a reverse proxy in production, but on my computer, with Ktor/Netty running in Docker alongside my Redis database, I can reproduce the same response-time issue 😞
I have made progress on my issue: it seems to occur only with large responses from my API. Do you know if there is a way to customize Netty's buffer size to improve response time? Or is there a way to tune Ktor/Netty to handle very large responses?
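Ktor's Netty engine exposes a few tuning knobs through the `configure` block of `embeddedServer`, and `configureBootstrap` lets you set raw Netty channel options such as the socket send buffer. A sketch assuming Ktor 1.x; the specific numbers are illustrative starting points to experiment with, not recommendations:

```kotlin
import io.ktor.application.*
import io.ktor.response.*
import io.ktor.routing.*
import io.ktor.server.engine.*
import io.ktor.server.netty.*
import io.netty.channel.ChannelOption

fun main() {
    embeddedServer(Netty, port = 8080, configure = {
        // How many calls may sit queued before Netty starts rejecting them.
        requestQueueLimit = 16
        // How long writing a slow response may take before it is aborted;
        // large payloads may need more than the default.
        responseWriteTimeoutSeconds = 30
        // Raw Netty bootstrap options: enlarge the per-connection send
        // buffer so big responses need fewer write round-trips
        // (the 1 MiB value is an assumption to tune against your payloads).
        configureBootstrap = {
            childOption(ChannelOption.SO_SNDBUF, 1 * 1024 * 1024)
        }
    }) {
        routing {
            get("/big") {
                // Stand-in for a large response body.
                call.respondText("x".repeat(10_000_000))
            }
        }
    }.start(wait = true)
}
```

It is worth re-running the same `ab` stress test after each single change, so you can tell which knob actually moves the tail latency for large responses.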