# ktor
j
I've switched out Netty for CIO as the Ktor server to see if the performance would be better, but it seems much worse. Is CIO supposed to outperform Netty now or at some point, or can it be tweaked to do that? On Netty, my throughput is about 38,000 requests per minute at 2.5 seconds per request. On CIO it's only about 21,000, and each request takes almost 5 seconds
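As an aside, the two sets of numbers are internally consistent: by Little's law, in-flight requests ≈ throughput × latency, so both engines are saturating at roughly the same concurrency and CIO is simply serving each request slower. A quick check of the reported figures (pure arithmetic, nothing Ktor-specific):

```kotlin
// Little's law sanity check: in-flight requests ≈ throughput (req/s) × latency (s).
fun inFlight(requestsPerMinute: Double, latencySeconds: Double): Double =
    requestsPerMinute / 60.0 * latencySeconds

fun main() {
    // Netty: ~38,000 req/min at ~2.5 s each -> ~1583 requests in flight
    println(inFlight(38_000.0, 2.5))
    // CIO: ~21,000 req/min at ~5 s each -> ~1750 requests in flight
    println(inFlight(21_000.0, 5.0))
}
```

Both work out to roughly 1,600–1,750 requests in flight, which lines up with a load test of around 2,000 concurrent users hitting a saturated server.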
I see there are things that can be tweaked on CIO; are these values the defaults, or are there no limits by default? https://ktor.io/clients/http-client/engines.html#cio
Seems like those are the defaults, but it doesn't look like they can be set when using the embeddedServer option: https://github.com/ktorio/ktor/blob/master/ktor-client/ktor-client-cio/jvm/src/io/ktor/client/engine/cio/CIOEngineConfig.kt
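Worth noting that CIOEngineConfig.kt is the configuration for the CIO HTTP *client* engine, which is separate from the CIO *server* engine that embeddedServer starts, so it wouldn't be settable there either way. On the client side those limits are tweakable through the HttpClient builder; a sketch assuming the Ktor 1.x client API (the 1000 values here are arbitrary examples, not recommendations):

```kotlin
import io.ktor.client.HttpClient
import io.ktor.client.engine.cio.CIO

// Requires the ktor-client-cio dependency; config fragment only.
val client = HttpClient(CIO) {
    engine {
        // CIOEngineConfig: total connections kept by the engine
        maxConnectionsCount = 1000
        endpoint {
            // EndpointConfig.maxConnectionsPerRoute (the default is 100)
            maxConnectionsPerRoute = 1000
        }
    }
}
```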
j
@janvladimirmostert can I ask how you get those metrics?
I won't be able to answer your question, but I would also like to get that info out of Ktor
j
That would be loader.io. I created a test that simulates 2,000 users concurrently hitting a specific page that I'm trying to optimize for spikes in traffic. Obviously there are a couple of bottlenecks I should still fix: at 500 users the load times are still around 0.5 seconds, where I want them, but as soon as the load goes to 2,000 concurrent users, that spikes to 2.5 seconds per request, which is too slow. This was on Netty. When I now swap out Netty for CIO (coroutine I/O), do literally nothing else to the code, and run a bunch of warm-up load tests, the latency almost doubles and the throughput halves. I'm thinking it has something to do with the request limit per route, which seems to be set to 100; not sure how to change that when using the embedded configuration
class EndpointConfig {
    /**
     * Maximum connections per single route.
     */
    var maxConnectionsPerRoute: Int = 100
}
100 is seriously low when you're trying to squeeze a lot of performance out of a single instance
All that changes now is adding the CIO dependency, replacing Netty with CIO, and adding the experimental annotation on top of main
@JvmStatic
@KtorExperimentalAPI
fun main(args: Array<String>) {
    embeddedServer(
        //Netty,
        CIO,
        watchPaths = listOf("module"),
        module = Application::module,
        port = if (ENV.env == LOCAL) {
            8080
        } else {
            80
        }
    ).apply {
        start(wait = true)
    }
}
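For the server side, the embeddedServer overload above also accepts a configure lambda that exposes the engine's configuration; for CIO that should be CIOApplicationEngine.Configuration, plus the thread-pool sizes inherited from ApplicationEngine.Configuration. A sketch assuming the Ktor 1.x API (the values shown are arbitrary examples, and I don't see a server-side equivalent of maxConnectionsPerRoute there):

```kotlin
// Requires ktor-server-cio; config fragment only.
embeddedServer(
    CIO,
    port = 8080,
    configure = {
        // CIO-specific (CIOApplicationEngine.Configuration):
        connectionIdleTimeoutSeconds = 45
        // Inherited from ApplicationEngine.Configuration:
        connectionGroupSize = 8   // threads accepting new connections
        workerGroupSize = 16      // threads processing connections
        callGroupSize = 32        // threads running application calls
    },
    module = Application::module
).start(wait = true)
```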
Haven't looked very deeply into it yet; will play around with it further tonight and see if I can get that 5 seconds down to under 1 second
a
Last thing I remember was that CIO performance would be optimized later.
j
OK, any idea if there's a rough ETA for that? Going to switch back to Netty until then