mudasar187
06/02/2023, 7:00 PM
withType<Test>().configureEach {
    useJUnitPlatform()
    testLogging {
        showExceptions = true
        showStackTraces = true
        exceptionFormat = FULL
        events = setOf(TestLogEvent.PASSED, TestLogEvent.SKIPPED, TestLogEvent.FAILED)
    }
    // To speed up the build we can use these options
    maxParallelForks = Runtime.getRuntime().availableProcessors() / 2
    reports.forEach { report -> report.required.value(false) }
}
I also tried to speed up the build by setting these configurations in gradle.properties:
org.gradle.parallel=true
org.gradle.caching=true
org.gradle.unsafe.configuration-cache=true
My question is: is this config
org.gradle.parallel=true
the same as this config
maxParallelForks = Runtime.getRuntime().availableProcessors() / 2
? Or is the first one for the build and the second one for tests? Please explain; I tried to understand it from the documentation, but it didn’t answer my question.
Adam S
06/02/2023, 7:05 PM
org.gradle.parallel=true means multiple subprojects will run their tasks in parallel. So if you have 10 subprojects, multiple subprojects can do work at the same time.
There’s a good visualization in the docs (Figure 1 and Figure 2): https://docs.gradle.org/current/userguide/performance.html#parallel_execution
mudasar187
06/02/2023, 7:06 PM
Adam S
06/02/2023, 7:06 PM
Adam S
06/02/2023, 7:08 PM
maxParallelForks affects how Gradle launches separate Java processes. If you enable multiple forks, then run the tests and look at your task manager, you’ll probably see lots of JVM processes launching and dying. https://docs.gradle.org/8.1.1/userguide/performance.html#execute_tests_in_parallel
mudasar187
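For illustration only (not from the thread): forking is configured per Test task in build.gradle.kts. A hedged sketch, where the concrete numbers are assumptions rather than recommendations:

```kotlin
// Sketch: fork-related options on Gradle's Test task (Kotlin DSL)
tasks.withType<Test>().configureEach {
    // Run test classes in up to this many forked JVMs at once;
    // coerceAtLeast(1) guards against 0 on single-core machines
    maxParallelForks = (Runtime.getRuntime().availableProcessors() / 2).coerceAtLeast(1)
    // Optionally restart each forked JVM after N test classes to bound memory use
    forkEvery = 100
}
```

This complements org.gradle.parallel=true: the property parallelises tasks across subprojects, while maxParallelForks parallelises test execution within a single Test task.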
06/02/2023, 7:09 PM
maxParallelForks = Runtime.getRuntime().availableProcessors() / 2
? I have seen multiple options here. Some have
maxParallelForks = (Runtime.getRuntime().availableProcessors() / 2).takeIf { it > 0 } ?: 1
and some only have
maxParallelForks = Runtime.getRuntime().availableProcessors()
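A hedged aside on the difference between those variants: plain availableProcessors() / 2 becomes 0 on a single-core machine, and a Test task rejects maxParallelForks < 1; that is what the takeIf guard (or coerceAtLeast(1)) protects against. A small standalone sketch (safeForks is a hypothetical helper name, not part of Gradle):

```kotlin
// Why some builds write `.takeIf { it > 0 } ?: 1`: integer division by 2
// yields 0 on a single-core machine, and Gradle requires maxParallelForks >= 1.
fun safeForks(availableProcessors: Int): Int =
    (availableProcessors / 2).takeIf { it > 0 } ?: 1

fun main() {
    println(safeForks(1)) // 1 (the guard kicks in, since 1 / 2 == 0)
    println(safeForks(8)) // 4
}
```

The third variant, maxParallelForks = Runtime.getRuntime().availableProcessors(), uses every core, which maximises test parallelism but can starve other build work; the / 2 variants leave headroom.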
Adam S
06/02/2023, 7:11 PM
mudasar187
06/02/2023, 7:14 PM
Adam S
06/02/2023, 7:14 PM
Vampire
06/02/2023, 8:21 PM
Vampire
06/02/2023, 8:21 PM
mudasar187
06/03/2023, 8:34 PM
maxParallelForks settings @Vampire?
Vampire
06/03/2023, 9:24 PM
Adam S
06/04/2023, 8:20 AM
maxParallelForks will probably help, but it’s best to experiment with the different options to see what works (as with all questions about optimisation!)
mudasar187
06/04/2023, 8:28 AM
mudasar187
06/04/2023, 8:28 AM
mudasar187
06/04/2023, 8:31 AM
Adam S
06/04/2023, 8:36 AM
./gradlew check
and see how long the build takes!
It’s a good idea to re-run that a few times, in case the tests need to ‘warm up’ (by downloading dependencies or re-compiling). If you have Build Cache enabled, then add --no-build-cache and/or --rerun-tasks to make the playing field fair, so each run has a fresh start. BUT it’s also worth testing with a more ‘real life’ scenario: usually you’re running tests locally on your machine, so try to optimise for a fully-cached project.
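The two benchmarking modes described above can be sketched as shell commands (assumes a Gradle wrapper in the project; the flags are the ones named in the thread):

```shell
# Fair comparison between settings: force a fresh start on each run
./gradlew check --no-build-cache --rerun-tasks

# "Real life" run: let the build cache and up-to-date checks do their work
./gradlew check
```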
An easier solution is to use Gradle Build Scan to send build metrics to Gradle and get a fancy HTML report. That will help you visualise how a build performs and dig into any problems.
You can also add the --profile flag, which will generate a high-level performance report in $buildDir/reports/profile. It’s more basic than Build Scan, but it’s faster to open and should give a good indication.
Adam S
06/04/2023, 8:37 AM
Adam S
06/04/2023, 8:42 AM
org.gradle.parallel=true, but also Gradle will be able to cache the outputs of projects, so Gradle will only re-run tests in subprojects that have changes
• try enabling Build Cache. This will also help on your build server. For example, if you make a change to the README then Gradle will realise “there are no code changes, therefore I can assume the output of compiling src/main is the same, and because the tests passed last time, I can assume they’ll pass again.”
There’s a blog with more info: https://blog.gradle.org/stop-rerunning-tests
mudasar187
06/04/2023, 9:28 AM
Build Cache in GitHub Actions 🙂
Adam S
06/04/2023, 9:30 AM
Vampire
06/04/2023, 2:42 PM
Vampire
06/04/2023, 2:46 PM
--rerun instead of --rerun-tasks, if your Gradle version is new enough and you are only interested in running that one task.
06/04/2023, 2:47 PM
mudasar187
06/05/2023, 7:05 AM
org.gradle.parallel=true, and if I change a test so that it should fail, I still get a build success. How can I configure it so that when I change my tests or code, the tests are rerun?
06/05/2023, 7:09 AM
mudasar187
06/05/2023, 7:11 AM
mudasar187
06/05/2023, 7:12 AM
My build.gradle.kts file:
import com.github.jengelman.gradle.plugins.shadow.tasks.ShadowJar
import org.gradle.api.tasks.testing.logging.TestExceptionFormat.FULL
import org.gradle.api.tasks.testing.logging.TestLogEvent
import org.jetbrains.kotlin.gradle.tasks.KotlinCompile
import org.openapitools.generator.gradle.plugin.tasks.GenerateTask
plugins {
    kotlin("jvm") version "1.8.21"
    kotlin("plugin.serialization") version "1.8.21"
    id("org.openapi.generator") version "6.6.0"
    id("com.github.johnrengelman.shadow") version "8.1.1"
    application
}
group = "no.nav.sokos"
repositories {
    mavenCentral()
    maven { url = uri("https://maven.pkg.jetbrains.space/public/p/ktor/eap") }
}
val ktorVersion = "2.3.0"
val logbackVersion = "1.4.7"
val logstashVersion = "7.3"
val jacksonVersion = "2.15.1"
val prometheusVersion = "1.11.0"
val kotlinLoggingVersion = "3.0.5"
val janionVersion = "3.1.9"
val natpryceVersion = "1.6.10.0"
val kotestVersion = "5.6.2"
dependencies {
    // Ktor server
    implementation("io.ktor:ktor-server-core-jvm:$ktorVersion")
    implementation("io.ktor:ktor-server-call-logging-jvm:$ktorVersion")
    implementation("io.ktor:ktor-server-call-id-jvm:$ktorVersion")
    implementation("io.ktor:ktor-server-netty-jvm:$ktorVersion")
    implementation("io.ktor:ktor-server-content-negotiation-jvm:$ktorVersion")
    implementation("io.ktor:ktor-server-swagger:$ktorVersion")
    // Ktor client
    implementation("io.ktor:ktor-client-content-negotiation:$ktorVersion")
    implementation("io.ktor:ktor-client-core-jvm:$ktorVersion")
    implementation("io.ktor:ktor-client-apache-jvm:$ktorVersion")
    implementation("io.ktor:ktor-serialization-jackson-jvm:$ktorVersion")
    // Security
    implementation("io.ktor:ktor-server-auth-jvm:$ktorVersion")
    implementation("io.ktor:ktor-server-auth-jwt-jvm:$ktorVersion")
    // Jackson
    implementation("io.ktor:ktor-serialization-jackson:$ktorVersion")
    implementation("com.fasterxml.jackson.core:jackson-databind:$jacksonVersion")
    implementation("com.fasterxml.jackson.module:jackson-module-kotlin:$jacksonVersion")
    implementation("com.fasterxml.jackson.datatype:jackson-datatype-jsr310:$jacksonVersion")
    // Monitoring
    implementation("io.ktor:ktor-server-metrics-micrometer-jvm:$ktorVersion")
    implementation("io.micrometer:micrometer-registry-prometheus:$prometheusVersion")
    // Logging
    implementation("io.github.microutils:kotlin-logging-jvm:$kotlinLoggingVersion")
    runtimeOnly("org.codehaus.janino:janino:$janionVersion")
    runtimeOnly("ch.qos.logback:logback-classic:$logbackVersion")
    runtimeOnly("net.logstash.logback:logstash-logback-encoder:$logstashVersion")
    // Config
    implementation("com.natpryce:konfig:$natpryceVersion")
    // Test
    testImplementation("io.kotest:kotest-assertions-core-jvm:$kotestVersion")
    testImplementation("io.kotest:kotest-runner-junit5:$kotestVersion")
    testImplementation("io.kotest:kotest-assertions-core:$kotestVersion")
}
application {
    mainClass.set("no.nav.sokos.prosjektnavn.ApplicationKt")
}
sourceSets {
    main {
        java {
            srcDirs("$buildDir/generated/src/main/kotlin")
        }
    }
}
kotlin {
    jvmToolchain {
        languageVersion.set(JavaLanguageVersion.of(17))
    }
}
tasks {
    withType<KotlinCompile>().configureEach {
        dependsOn("openApiGenerate")
    }
    withType<GenerateTask>().configureEach {
        generatorName.set("kotlin")
        generateModelDocumentation.set(false)
        inputSpec.set("$rootDir/src/main/resources/openapi/pets.json")
        outputDir.set("$buildDir/generated")
        globalProperties.set(
            mapOf(
                "models" to ""
            )
        )
        configOptions.set(
            mapOf(
                "library" to "jvm-ktor",
                "serializationLibrary" to "jackson"
            )
        )
    }
    withType<ShadowJar>().configureEach {
        enabled = true
        archiveFileName.set("app.jar")
        manifest {
            attributes["Main-Class"] = "no.nav.sokos.prosjektnavn.ApplicationKt"
        }
    }
    named("jar") {
        enabled = false
    }
    withType<Test>().configureEach {
        testLogging {
            showExceptions = true
            showStackTraces = true
            exceptionFormat = FULL
            events = setOf(TestLogEvent.PASSED, TestLogEvent.SKIPPED, TestLogEvent.FAILED)
        }
        reports.forEach { report -> report.required.value(false) }
    }
}
And my gradle.properties file:
kotlin.code.style=official
org.gradle.caching=true
# If you get an error pointing you to this page
# https://docs.gradle.org/8.0/userguide/configuration_cache.html#config_cache:requirements:disallowed_types
# then comment out the line below
org.gradle.unsafe.configuration-cache=true
Vampire
06/05/2023, 7:21 AM
“It does not get UP-TO-DATE and reruns the tests”
UP-TO-DATE would be strange, as you throw away any previous result using clean, and thus void some of the biggest strengths of Gradle. But I hope you only do that for your experiments and did not bring it over from Maven, where you needed it for correct results.
“When I change my tests I still get the same result”
Well, debug your tests then. Also, never trust a test you did not see fail for the correct reason.
mudasar187
06/05/2023, 7:25 AM
./gradlew build shadowJar
mudasar187
06/05/2023, 7:32 AM
mudasar187
06/05/2023, 7:34 AM
Vampire
06/05/2023, 7:51 AM
useJUnitPlatform(), or when using the new test suites DSL, useKotest(...) 🙂
mudasar187
06/05/2023, 8:21 AM
useJUnitPlatform(), but no options give me useKotest(). https://kotest.io/docs/quickstart/
Vampire
06/05/2023, 9:39 AM
useKotest() is not for the test task itself, but for when you use the new JVM test suites DSL. There, no useJUnitPlatform() exists, but framework-specific methods like useKotest(), where you can also specify a version and do not need to add Kotest as a test dependency manually, but get it automatically.
mudasar187
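For context, a minimal sketch of the incubating JVM Test Suite DSL in build.gradle.kts (an assumption-laden illustration, not from the thread; requires Gradle 7.3+ with a JVM plugin applied, and the Jupiter version is an example):

```kotlin
// Sketch: JVM Test Suite DSL (incubating). Instead of calling
// useJUnitPlatform() on the Test task, each suite picks a
// framework-specific method that also wires up the dependency.
testing {
    suites {
        val test by getting(JvmTestSuite::class) {
            // Adds the Jupiter dependency and configures the JUnit Platform for you
            useJUnitJupiter("5.9.3")
        }
    }
}
```

As the thread goes on to note, there is no useKotest() method in this DSL; Kotest users typically keep useJUnitPlatform() on the Test task and add kotest-runner-junit5 as a dependency, as in the build file above.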
06/05/2023, 9:49 AM
Vampire
06/05/2023, 9:51 AM
Vampire
06/05/2023, 9:51 AM
Vampire
06/05/2023, 9:52 AM
useKotest(), but it hasn’t. There is only JUnit 4, Jupiter, Spock, and TestNG.
Marek Kubiczek
10/16/2024, 1:44 PM
Vampire
10/16/2024, 2:55 PM