amanda.hinchman-dominguez
03/27/2025, 11:26 PM
checkoutShopper2 ran consistently faster with heavyWorkForProcessingInventory
// imports needed for this snippet
import kotlinx.coroutines.*
import java.math.BigInteger
import java.util.Random
import kotlin.system.measureTimeMillis

fun main(): Unit = runBlocking {
    val time = measureTimeMillis {
        val openCheckoutLanes = launch(Dispatchers.Default) {
            val shoppers = listOf(
                async { checkoutShopper2("Jake", 3) },
                async { checkoutShopper2("Zubin", 10) },
                async { checkoutShopper2("Amber", 4) },
                async { checkoutShopper2("Ren", 3) }
            )
            shoppers.awaitAll().forEach {
                println(" $it is checked out!")
            }
        }
        openCheckoutLanes.join()
    }
    println("Shoppers have been checked out. Time: ${time / 1000.0} seconds")
}
// average runs from 7 to 25 seconds
private suspend fun checkoutShopper(name: String, numberOfItems: Int): String {
    log("Checking out $name. ")
    withContext(Dispatchers.Default) {
        println(" $name has $numberOfItems items. Checking out...")
        (1..numberOfItems).forEach { elem ->
            println(" item $elem scanned for $name.")
        }
        // we HAVE to wait for this, i.e. it represents heavy background work like payment processing
        launch { heavyWorkForProcessingInventory() }.join()
    }
    return name
}
// average runs from 4 to 12 seconds
private suspend fun checkoutShopper2(name: String, numberOfItems: Int): String {
    coroutineScope {
        log("Checking out $name. ")
        log(" $name has $numberOfItems items. Checking out...")
        (1..numberOfItems).forEach { elem ->
            println(" item $elem scanned for $name.")
        }
        // we HAVE to wait for this, i.e. it represents heavy background work like payment processing
        heavyWorkForProcessingInventory()
    }
    return name
}
private fun heavyWorkForProcessingInventory(): BigInteger = BigInteger.probablePrime(4096, Random())
// sub out to notice slower runs
private suspend fun heavyWorkForProcessingInventory2(): BigInteger = withContext(Dispatchers.Default) {
    BigInteger.probablePrime(4096, Random())
}
Seems like using withContext is surprisingly heavy even when used in the same context, and I'm not sure why. My understanding is that a dispatcher will check whether a context switch is needed, and if not, work continues in the same context (according to the code comments).
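A minimal sketch of how one might isolate that cost (not from the thread; the repetition count and trivial workload are arbitrary placeholders): run the same loop once as plain calls and once wrapped in withContext targeting the dispatcher that is already current, so any difference is withContext's own bookkeeping rather than a thread switch.

import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.runBlocking
import kotlinx.coroutines.withContext
import kotlin.system.measureNanoTime

fun main(): Unit = runBlocking(Dispatchers.Default) {
    val repetitions = 100_000
    var sink = 0L

    // Baseline: plain calls on the dispatcher we are already running on.
    val direct = measureNanoTime {
        repeat(repetitions) { sink += it }
    }

    // Same dispatcher requested again: no thread switch should be needed,
    // but withContext still compares contexts and wraps the block in a new
    // coroutine on every call.
    val rewrapped = measureNanoTime {
        repeat(repetitions) {
            withContext(Dispatchers.Default) { sink += it }
        }
    }

    println("direct: ${direct / 1e6} ms, withContext: ${rewrapped / 1e6} ms (sink=$sink)")
}

The absolute numbers will vary by machine; what matters is only whether the second measurement is consistently higher than the first.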
uli
03/28/2025, 9:20 AM

uli
03/28/2025, 9:26 AM
fun main(): Unit = runBlocking {
    val time = measureTimeMillis {
        val shoppers = listOf(
            async { checkoutShopper2("Jake", 3) },
            async { checkoutShopper2("Zubin", 10) },
            async { checkoutShopper2("Amber", 4) },
            async { checkoutShopper2("Ren", 3) }
        )
        shoppers.awaitAll().forEach {
            println(" $it is checked out!")
        }
    }
    println("Shoppers have been checked out. Time: ${time / 1000.0} seconds")
}
private suspend fun checkoutShopper2(name: String, numberOfItems: Int): String {
    log("Checking out $name. ")
    log(" $name has $numberOfItems items. Checking out...")
    (1..numberOfItems).forEach { elem ->
        println(" item $elem scanned for $name.")
    }
    // we HAVE to wait for this, i.e. it represents heavy background work like payment processing
    heavyWorkForProcessingInventory()
    return name
}
private fun heavyWorkForProcessingInventory(): BigInteger = BigInteger.probablePrime(4096, Random())
// sub out to notice slower runs
private suspend fun heavyWorkForProcessingInventory2(): BigInteger = withContext(Dispatchers.Default) {
    BigInteger.probablePrime(4096, Random())
}

Dmitry Khalanskiy [JB]
03/28/2025, 11:19 AM
launch { heavyWorkForProcessingInventory() }.join()
This is just a roundabout way of saying heavyWorkForProcessingInventory(); there is no benefit to writing it this way at all.
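A small self-contained sketch of that equivalence (the delay and the function name are placeholders standing in for the heavy work, not from the thread): joining a freshly launched coroutine immediately just serializes it, so the caller ends up waiting exactly as it would with a direct call.

import kotlinx.coroutines.coroutineScope
import kotlinx.coroutines.delay
import kotlinx.coroutines.launch
import kotlinx.coroutines.runBlocking

// Placeholder for heavy background work such as payment processing.
suspend fun doHeavyWork() { delay(100) }

fun main(): Unit = runBlocking {
    coroutineScope {
        // Variant A: launch + immediate join. The caller suspends until the
        // child finishes, so nothing runs concurrently with it.
        launch { doHeavyWork() }.join()
    }
    coroutineScope {
        // Variant B: a plain call. Same sequential behavior, minus the
        // overhead of creating and joining a child coroutine.
        doHeavyWork()
    }
}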
amanda.hinchman-dominguez
03/28/2025, 7:42 PM
fun log(message: String) {
    println("$message | current thread: ${Thread.currentThread().name}")
}
re: second bullet point. That is a good callout; it should be the case for both for a more accurate comparison. I'll have to read the rest of this thread later tonight and I'll come back with updates!

uli
03/28/2025, 8:05 PM

amanda.hinchman-dominguez
03/29/2025, 6:13 PM
checkoutShopper2 consistently starts lower, suggesting that withContext is a bit more expensive when used this often. I also notice the time goes up with each repeated run in IntelliJ no matter which solution I use; I would have to script this and run the code in the terminal to see whether I get the same trend there, or whether this behavior is more related to the environment.
@Dmitry Khalanskiy [JB] thank you for pointing out the lack of difference, I got that fixed.
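A hedged sketch of that kind of scripted comparison (not from the thread; it reuses the checkoutShopper2 defined above, and the run count is arbitrary): repeat the whole four-shopper scenario in one process and report the spread, so a single noisy run (JIT warm-up, GC, background load) stands out instead of skewing a one-off timing.

import kotlinx.coroutines.async
import kotlinx.coroutines.awaitAll
import kotlinx.coroutines.runBlocking
import kotlin.system.measureTimeMillis

fun main(): Unit = runBlocking {
    val runs = 10
    val timings = (1..runs).map {
        measureTimeMillis {
            listOf(
                async { checkoutShopper2("Jake", 3) },
                async { checkoutShopper2("Zubin", 10) },
                async { checkoutShopper2("Amber", 4) },
                async { checkoutShopper2("Ren", 3) }
            ).awaitAll()
        }
    }.sorted()
    println("min=${timings.first()} ms, median=${timings[timings.size / 2]} ms, max=${timings.last()} ms")
}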
amanda.hinchman-dominguez
03/29/2025, 6:14 PM