# scripting
b
Hello, We’re in the process of migrating the Gradle Kotlin DSL from the legacy Kotlin script API to the latest experimental one so we can take advantage of the new implicit receivers feature, but I’m having a little trouble figuring out how to migrate one of our script templates. In the troublesome scenario, the list of default imports must be computed dynamically from data passed to the compiler from the outside. With the legacy API we were able to achieve this via a custom
ScriptDependenciesResolver
that reads the data from the given script resolver environment, which can be configured via the
-Xscript-resolver-environment
command line argument. The custom resolver is set up via the
ScriptTemplateDefinition
annotation: https://github.com/gradle/gradle/blob/6f683805c6b782b1eb91c01591aecabe6fe535fb/subprojects/kotlin-dsl/src/main/kotlin/org/gradle/kotlin/dsl/precompile/PrecompiledProjectScript.kt#L52-L54 My understanding is that it is NOT possible to mix the legacy
ScriptTemplateDefinition
annotation with the new
KotlinScript
one to get both the dynamic default imports computed from the script resolver environment and the implicit receivers from a custom
compilationConfiguration
setup. So the question then is: does the experimental API allow the list of default imports for a given script to be computed from command line arguments passed to the compiler?
```kotlin
@KotlinScript(
    compilationConfiguration = DynamicDefaultImportsConfiguration::class
    // other settings
)
open class ScriptWithDynamicDefaultImports

object DynamicDefaultImportsConfiguration : ScriptCompilationConfiguration({
    refineConfiguration {
        beforeCompiling {
            it.compilationConfiguration.with {
                defaultImports(
                    // How to read arguments passed to the compiler from here?
                    TODO()
                )
            }.asSuccess()
        }
    }
})
```
n
cc @ilya.chernikov
i
Hi @bamboo! The previous solution is a bit hacky anyway, and we haven’t yet found time to analyze and maybe design a better way to access the compiler command line from scripts. But, first of all, the old “environment” is still accessible internally via
```kotlin
get(ScriptCompilationConfiguration.hostConfiguration)?.get(ScriptingHostConfiguration.getEnvironment)?.invoke()
```
but more importantly, since you can use your own host, you can add a new key to the compilation configuration and pass the required parameters via it directly, without using the command line. But maybe it’s a bit too early for gradle to switch to the new API anyway: I had an intention to implement a refinement callback for “sections”, to remove the necessity to use the lexer directly on your side. But if you’re eager to use the new API, I can probably try to prioritize this task as well.
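A minimal sketch of how that environment access could be used from the refinement callback in the example above; the "gradleImplicitImports" environment key is invented for illustration and the exact import paths are assumptions, only the hostConfiguration/getEnvironment access itself comes from the snippet:
```kotlin
import kotlin.script.experimental.api.*
import kotlin.script.experimental.host.*

object DynamicDefaultImportsConfiguration : ScriptCompilationConfiguration({
    refineConfiguration {
        beforeCompiling { context ->
            // Reach the legacy script resolver environment through the host configuration.
            val environment = context.compilationConfiguration[ScriptCompilationConfiguration.hostConfiguration]
                ?.get(ScriptingHostConfiguration.getEnvironment)
                ?.invoke()
            // "gradleImplicitImports" is a made-up environment key, just for the sketch.
            @Suppress("UNCHECKED_CAST")
            val imports = environment?.get("gradleImplicitImports") as? List<String> ?: emptyList()
            context.compilationConfiguration.with {
                defaultImports(imports)
            }.asSuccess()
        }
    }
})
```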
b
Hey @ilya.chernikov!
Using the old environment should get things going for now, I’ll give it a try, thanks!
since you can use your own host, you can add a new key to the compilation configuration and pass required parameters via it directly, without using command line.
This is very interesting
Do you have any pointers to the specific place in the docs or maybe an example that would make things a little more concrete? 🙂
i
I fixed the environment access code above a bit.
b
I see 👍
i
About examples: Here is how you use a basic host implementation to compile your script - https://github.com/JetBrains/kotlin/blob/master/libraries/examples/scripting/jvm-simple-script/host/src/org/jetbrains/kotlin/script/examples/jvm/simple/host/host.kt#L18 And you can have a look at any definition of a configuration key to see how to define your own. The configuration is a polymorphic container, which you can extend as you like just by defining new keys.
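A rough sketch of the "own key" idea under those assumptions (all names are invented, and the exact PropertiesCollection/builder usage should be double-checked against the current API):
```kotlin
import kotlin.script.experimental.api.*
import kotlin.script.experimental.util.PropertiesCollection

// A new, host-defined key carrying the imports computed on the Gradle side.
val ScriptCompilationConfigurationKeys.hostProvidedImports by PropertiesCollection.key<List<String>>()

object ConfigurationWithHostProvidedImports : ScriptCompilationConfiguration({
    refineConfiguration {
        beforeCompiling { context ->
            context.compilationConfiguration.with {
                // Fold whatever the host put into the key into the default imports.
                defaultImports(context.compilationConfiguration[ScriptCompilationConfiguration.hostProvidedImports].orEmpty())
            }.asSuccess()
        }
    }
})

// The embedding host sets the key directly, no command line involved.
fun configurationWith(imports: List<String>): ScriptCompilationConfiguration =
    ConfigurationWithHostProvidedImports.with {
        hostProvidedImports(imports)
    }
```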
b
Is it possible to use those same configuration keys to configure the Gradle Kotlin plugin
compileKotlin
tasks somehow?
i
You mean in the case when you compile scripts with gradle?
b
The scenario for these templates is a little different
It’s not Gradle compiling the scripts directly
but a regular
compileKotlin
task which has its configuration augmented by the
kotlin-dsl
plugin
the
kotlin-dsl
plugin adds the required script templates to the classpath and configures the script environment via
freeCompilerArgs
i
As far as I remember, you’ve been calling our compiler directly from gradle to compile build scripts. Has something changed here, or are we talking about a different type of scripts?
b
right, this is a different scenario
this is to allow folks to implement Gradle plugins by writing Gradle scripts instead of classes
so the templates are used for scripts in a regular Kotlin sourceset
we call them precompiled script plugins
or precompiled scripts
we still have the scenario where Gradle compiles build scripts by calling the Kotlin compiler directly
that scenario was mostly migrated to the new script API, except for the IDE support
i
Ok, I see.
First, about precompiled scripts: if you want to use compileKotlin, then the compiler will use the script compilation configuration it can extract for the given script name. There are several ways to provide this configuration, but there is no direct functionality I can think of that would allow you to pass data from the
kotlin-dsl
extension to it. So you can either continue to use the “environment” hack until we come up with something, or create your own out-of-band data exchange.
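One possible shape of such an out-of-band exchange, purely as a sketch (the system property name and file format are invented, not anything Gradle ships):
```kotlin
import java.io.File
import kotlin.script.experimental.api.*

object PrecompiledScriptConfiguration : ScriptCompilationConfiguration({
    refineConfiguration {
        beforeCompiling { context ->
            // "gradle.kotlin.dsl.implicitImportsFile" is a hypothetical property the build
            // would set, pointing at a file with one import per line.
            val imports = System.getProperty("gradle.kotlin.dsl.implicitImportsFile")
                ?.let(::File)
                ?.takeIf { it.isFile }
                ?.readLines()
                .orEmpty()
            context.compilationConfiguration.with {
                if (imports.isNotEmpty()) defaultImports(imports)
            }.asSuccess()
        }
    }
})
```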
b
As long as the hack is working it shouldn’t be a problem. This is not directly exposed to users in any way.
Meaning, we can evolve the internal protocol when we’re ready to.
About the IDE support, the problem is a little similar
We currently expose a
ScriptDependenciesResolver
to compute the classpath
and we would like to tell the IDE the template has implicit receivers
i
About the compilation from gradle, it will be nice if you can switch to compilation via a host (or via the more low-level script compiler & evaluator), as in the linked sample, rather than calling the compiler internals. And I will eventually make a replacement for the lexer usage, so you’ll be able to rely on the supported API instead (when it becomes really supported after graduating from experimental status, anyway).
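The linked sample boils down to roughly this, adapted to the template name from this thread (the template class and import paths are assumptions):
```kotlin
import java.io.File
import kotlin.script.experimental.api.EvaluationResult
import kotlin.script.experimental.api.ResultWithDiagnostics
import kotlin.script.experimental.host.toScriptSource
import kotlin.script.experimental.jvmhost.BasicJvmScriptingHost
import kotlin.script.experimental.jvmhost.createJvmCompilationConfigurationFromTemplate

// Compile and evaluate a script through the basic JVM host instead of the compiler internals.
fun evalScript(scriptFile: File): ResultWithDiagnostics<EvaluationResult> {
    val compilationConfiguration =
        createJvmCompilationConfigurationFromTemplate<ScriptWithDynamicDefaultImports>()
    return BasicJvmScriptingHost().eval(scriptFile.toScriptSource(), compilationConfiguration, null)
}
```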
b
Yes, absolutely, we’re planning to switch to the new compilation method after 6.0 is released.
i
And in all cases it will be nice if you provide proper script templates with configurations attached via the
KotlinScript
annotation. In that case the IDE support should work out of the box.
b
We are switching to the new template mechanism internally so we can replace the interface delegation we use in our templates with an implicit receiver instead
In that case the IDE support should work out of the box.
It’s just not clear to me at this point what the replacement for the
ScriptDependenciesResolver
in that case would be
i
I see the problem with script naming though - your precompiled scripts seem to have the same file pattern as the build scripts. At the moment we will have problems distinguishing them.
b
Maybe we should move the precompiled script templates to a separate module. For the batch compilation case we use the
-script-templates
compiler argument to tell exactly which templates should be used.
i
It’s just not clear to me at this point what the replacement for the
ScriptDependenciesResolver
in that case would be
In your example above, the block:
```kotlin
refineConfiguration {
    beforeCompiling {
```
is one of the “resolvers” in the old API - it is a callback that is called by the compiler, so you can change the compilation configuration according to your needs. So, e.g., in this block you can now add dependencies to the configuration.
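For instance, a sketch of such a handler adding classpath entries (updateClasspath is assumed to come from kotlin.script.experimental.jvm, and resolveScriptClasspath() is a placeholder for however the dependencies are actually obtained):
```kotlin
import java.io.File
import kotlin.script.experimental.api.*
import kotlin.script.experimental.jvm.*

fun resolveScriptClasspath(): List<File> = TODO("obtain the script classpath, e.g. from Gradle")

object ConfigurationWithResolvedDependencies : ScriptCompilationConfiguration({
    refineConfiguration {
        beforeCompiling { context ->
            context.compilationConfiguration.with {
                // Plays the role of the classpath the old resolver used to return.
                updateClasspath(resolveScriptClasspath())
            }.asSuccess()
        }
    }
})
```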
b
OR better, have a single set of templates and use them for precompiled scripts
So, e.g. now in this block you can add dependencies to the configuration.
I see, so we could make a blocking TAPI model call at that point
i
Maybe we should move the precompiled script templates to a separate module.
The problem with distinction is IDE-specific: we can easily separate compilation, but in the IDE we have a single set of templates per project.
b
there’s also ongoing work on the TAPI models
The problem with distinction is IDE-specific, we can easily separate compilation, but in the IDE we have a single set of templates per project.
Conceptually, the API exposed by the templates should be exactly the same
and that’s why the IDE support currently works just fine with a single set of templates
i
With dependencies?
b
what do you mean?
the dependencies are computed differently depending on where in the source tree we find the scripts
(if it’s in a Kotlin sourceset we return the compileKotlin compile dependencies, otherwise we return the regular per script dependencies)
the single resolver can handle it because the decision is taken by the TAPI model builder
I hope that’s in the direction of your question 🙂
i
More or less, except that in the IDE it is processed differently: the ones in a sourceset are probably part of a module, so the module dependencies are calculated via the project model. But we haven’t decided what to do with scripts in this case. For the moment we ignore (in the IDE) all dependencies that are coming from the configuration for such scripts and use module dependencies instead. But this blocks certain interesting use cases, so we may consider taking configuration dependencies into account. For the out-of-sources scripts we use dependencies from the configuration only. So if you have the same base configuration for both script types, in the future you’ll need to decide which dependencies to add to the configuration.
b
For the moment we ignore (in the IDE) all dependencies that are coming from the configuration for scripts
Ah, I didn’t know that
i
I mean only the scripts which are a part of a module/sourceset.
b
Right. What about the default imports? Because I think we do compute them on the fly for different scripts and the IDE seems to honour them.
I mean, different scripts which are part of a module sourceset.
We give them different default imports to make certain extensions visible or not depending on the set of plugins they apply
i
Everything else is fine; dependencies are not, because such “extending dependencies for a subset of the sources in the sourceset” seems not to be supported in the IDEA project model.
b
I see 👍
i
Coming back to the lexer stuff, what I’m planning to do is to have a callback like:
```kotlin
refineConfiguration {
    onSections("plugins", "dependencies") {
```
that would allow you to process sections of the scripts without using the lexer directly. What do you think about such an API?
b
Currently, we compile the
plugins
/
buildscript
/
initscript
sections as separate scripts
so we would basically need to be able to “cancel” the compilation of certain sections
i
I don’t think you need a cancel; in this
onSections
block, you need to execute another compiler for the section before continuing with the results.
b
also, I’m not sure we would be able to give up using the lexer completely because we take different decisions on how to compile scripts depending on which sections are present
i
What in particular is changing?
b
to speed up repeated executions, we emit a class (that gets cached) that knows exactly which parts of the script need to be compiled
this class is like a specialized interpreter for the script
it makes loading scripts from the cache uniform
and helps with keeping some things precompiled
for a concrete example, take the following script:
```kotlin
plugins { java }

java { }
```
upon analysing this script, we see that we always have to run the
plugins
block before we can compile the rest of the script
so what we do is, we emit a
Program
(that’s what we call this uniform façade to the scripts)
we compile the
plugins
section at this point because we know it needs to run every time
and we make a
Program
that executes the precompiled
plugins
script and THEN compiles the rest after the plugins have been applied
that strategy saves a few cycles when plugins or buildscript block changes cause the body of the script to be recompiled
it is complicated though
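Purely to illustrate the idea (all names below are invented for this sketch, not Gradle’s actual internals): the cached Program for the example above runs the precompiled plugins block first and only then compiles and runs the rest of the script.
```kotlin
// Invented types, for illustration only.
interface CompiledScript { fun run() }
interface ScriptHost { fun compile(source: String): CompiledScript }

interface Program { fun execute(host: ScriptHost) }

class StagedProgram(
    private val precompiledPluginsBlock: CompiledScript, // compiled eagerly, runs on every execution
    private val remainingScriptBody: String              // compiled only after the plugins are applied
) : Program {
    override fun execute(host: ScriptHost) {
        precompiledPluginsBlock.run()           // plugins { java } must run first...
        host.compile(remainingScriptBody).run() // ...so that java { } resolves in the body
    }
}
```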
i
Ok, I see. Sounds complicated, yes. So, it seems this
onSections
callback doesn’t fit your use case. I’ll think about it further. I still don’t like the fact that you’re using compiler internals. Maybe the old (and still unused) solution with the
source-sections
compiler plugin should be revived after all.
b
Hm, the compiler doesn’t seem to recognise the template configured via
-script-templates
anymore now that I changed from
ScriptTemplateDefinition
to
KotlinScript
(trying to test that environment access snippet)
ok, it works if I use both annotations 🙂
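The workaround amounts to something like this sketch (the resolver name is a placeholder for the existing legacy resolver; DynamicDefaultImportsConfiguration is the configuration from earlier in the thread):
```kotlin
import kotlin.script.experimental.annotations.KotlinScript
import kotlin.script.templates.ScriptTemplateDefinition

// Legacy annotation so -script-templates (and the old IDE path) still recognizes the class...
@ScriptTemplateDefinition(
    resolver = LegacyScriptDependenciesResolver::class, // placeholder for the existing resolver
    scriptFilePattern = ".+\\.gradle\\.kts"
)
// ...plus the new annotation carrying the experimental compilation configuration.
@KotlinScript(
    compilationConfiguration = DynamicDefaultImportsConfiguration::class
)
open class ScriptWithDynamicDefaultImports
```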
i
🙂 a nice hack
metal 1
b
for now it seems this will do
I’ll play a bit more with it and let you know
thanks again, Ilya
i
Actually, you can use a complicated plugin option format:
```
-Pplugin:kotlin.scripting:script-definitions=...
```
or - better - use the script definitions discovery mechanism.
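In Gradle Kotlin DSL terms that could look roughly like the following; the class name is a placeholder, the option string is taken from the message above, and the discovery alternative relies on a marker resource named after the template class:
```kotlin
// build.gradle.kts of the module compiling the precompiled scripts (illustrative).
tasks.withType<org.jetbrains.kotlin.gradle.tasks.KotlinCompile>().configureEach {
    kotlinOptions.freeCompilerArgs += listOf(
        "-P", "plugin:kotlin.scripting:script-definitions=org.example.ScriptWithDynamicDefaultImports"
    )
}
// Discovery alternative: ship an empty marker file on the classpath at
// META-INF/kotlin/script/templates/org.example.ScriptWithDynamicDefaultImports
```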
👍 2
b
Status update: the migration to the new
KotlinScript
annotation for compilation worked out great and Gradle 6.0 will ship with the new implicit receiver based script templates. Thanks again!
We are still using the old style templates for the IDE support and now I’m looking into migrating those.
Does IntelliJ already handle custom `compilationConfiguration`s?
i
Thank you for the great news!
Yes, IDE support should now be on the same or better level as with the old templates.
b
Great. I got IntelliJ to recognize my new
@KotlinScript
based template yesterday but I didn’t manage to get it to see the
baseClass(...)
I gave the template, will continue to investigate it later.
One question though:
is it possible to get the IDE to continue to use the resolver from
@ScriptTemplateDefinition
to get the classpath and default imports but get the remaining settings (mainly
implicitReceivers
) from the `@KotlinScript`’s
compilationConfiguration
?
i
No, mixing is not possible, unfortunately. Why would you need it?
b
As long as a refinement handler can make a blocking TAPI call to Gradle to get the data, there’s no need for it.
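A sketch of what such a handler could look like; only GradleConnector and the Tooling API calls are real, while ScriptTemplateModel, its members, and the way the project directory is obtained are all assumptions:
```kotlin
import java.io.File
import org.gradle.tooling.GradleConnector
import kotlin.script.experimental.api.*
import kotlin.script.experimental.jvm.*

// Hypothetical TAPI model exposing what the refinement needs.
interface ScriptTemplateModel {
    val classPath: List<File>
    val implicitImports: List<String>
}

fun fetchModel(projectDir: File): ScriptTemplateModel {
    val connection = GradleConnector.newConnector().forProjectDirectory(projectDir).connect()
    try {
        return connection.getModel(ScriptTemplateModel::class.java) // blocking TAPI call
    } finally {
        connection.close()
    }
}

object TapiBackedConfiguration : ScriptCompilationConfiguration({
    refineConfiguration {
        beforeCompiling { context ->
            val model = fetchModel(File(".")) // real code would derive the project dir from the context
            context.compilationConfiguration.with {
                updateClasspath(model.classPath)
                defaultImports(model.implicitImports)
            }.asSuccess()
        }
    }
})
```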
Almost there! I got the IDE to recognize my template with an implicit receiver and the implicit receiver is taken into account for error checking BUT it is NOT taken into account for content assistance
meaning, I don’t get suggestions for members of an implicit receiver type
any thoughts?
also it seems
annotationsForSamWithReceivers
has no effect in IntelliJ, should it be supported already?
maybe you can spot whether I’m doing something wrong
ping @ilya.chernikov ☝️
i
@bamboo, I see what is missing with
annotationsForSamWithReceivers
- will check whether I can fix it quickly. About the implicit receiver - I need to dig a bit deeper.
b
Thanks! Let me know if you’d like me to create a YT issue for it.
i
@bamboo, the problem with
samWithReceiver
is a bit worse than I thought. I need some more time to implement the required functionality for the new scripting API. I’ll try to do it as soon as possible, but I’m far from sure it isn’t already too late for 1.3.60. Anyway, here is the issue to watch - https://youtrack.jetbrains.com/issue/KT-34294
b
Thanks, @ilya.chernikov
My plan is to finish converting the remaining templates to uncover any remaining issues and give you a proper nightly you can use to validate the changes and the IntelliJ UX
then we can see if the new templates go with Gradle 6.1 and Kotlin 1.3.60 OR if we have to wait for the next release train
in parallel to this, there’s the new TAPI model work happening to optimize the script classpath computation
I imagine there might be some overlap and the migration to the new templates might cause some rework on your side
which is why I think getting a 6.0 compatible Gradle nightly with the new templates will be key
how does that sound to you?
i
Sounds good to me. Will try to push my part through as soon as possible.
Hi @bamboo! I think I fixed the problem with sam-with-receiver, although maybe the wrong one. 🙂 What I meant with KT-34294 is that SAM with receiver is not properly processed when using the new API to compile, rather than calling the internal compiler API directly, as you still seem to be doing. I fixed the processing, so if you are ever able to switch to the new API, it should work. But I also hopefully fixed the problem with IDEA, which wasn’t mentioned in the issue - I’m now configuring the sam-with-receiver annotation for IDEA as well. I haven’t been able to test it yet, though - I have problems trying to build your branch with a locally-built Kotlin. The fixes will be in the next 1.3.60 EAP (in the next few days), so as soon as it is out and I have some time, I’ll try to test it again on your branch. Or you can try it yourself then.
The problem with implicit receiver is being investigated by @nastelaz now.
t
@ilya.chernikov any update on implicit receivers recognition by IntelliJ? I'm trying to use them but I can't get the IDE to stop erroring on unknown references and to give me completion suggestions.
i
@Tmpod the problem is fixed for the upcoming 1.3.70 - https://youtrack.jetbrains.com/issue/KT-34740 (although unresolved references could be a sign of a different problem, so maybe it would help if you give some more details, in a new thread.)
t
Great! The unresolved references are connected with the implicit receivers since I only get those on the methods the wrapping class has.
Is there any ETA for 1.3.70?
i
~ Feb.
t
Cool
I'll just ignore those errors until 1.3.70 comes out :p
i
t
will do, thanks! totally missed this message too :/