# scripting
l
Hi all, is there any way to save the compiled scripts to class files, and then evaluate them later by loading class files with a specific classloader? Thanks!
or you could look into the scripting internals using this as a starting point
i
The script compiler returns a serializable `CompiledScript`, which you can save and use later with the evaluator. You can also provide a base classloader for the evaluator via the evaluation configuration. There is a caching API (Nikky pointed above to the sample implementation of it) that uses this to avoid recompilation. If you need the bare bytes, internally it is implemented in `KJvmCompiledScript` (https://github.com/JetBrains/kotlin/blob/master/libraries/scripting/jvm-host/src/kotlin/script/experimental/jvmhost/impl/KJvmCompiledScript.kt#L31), which currently holds the serialized class bytes (see `KJvmCompiledModule`). But please take into account that these are implementation details for now.
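Since `CompiledScript` is serializable, the save/load round trip is plain Java serialization. A minimal sketch of just that part (the compiler and evaluator calls are omitted; `saveCompiled`/`loadCompiled` are my own helper names, not part of the scripting API):

```kotlin
import java.io.File
import java.io.ObjectInputStream
import java.io.ObjectOutputStream
import java.io.Serializable

// Persist any Serializable object, e.g. a CompiledScript produced by the
// JVM script compiler, so it can be shipped and evaluated later.
fun saveCompiled(compiled: Serializable, file: File) {
    ObjectOutputStream(file.outputStream()).use { it.writeObject(compiled) }
}

// Read the object back; the caller casts it to CompiledScript.
fun loadCompiled(file: File): Any =
    ObjectInputStream(file.inputStream()).use { it.readObject() }
```

On the loading side, the restored object would be cast back to `CompiledScript` and passed to the evaluator, with the base classloader set in the evaluation configuration (`jvm { baseClassLoader(...) }`).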
l
Let me explain the reason why I raised this question. As I said before, I’m working on a project which integrates Kotlin scripting into Apache Flink in order to achieve some sort of dynamic characteristics. Generally, a Flink application is written like a normal Java/Kotlin program, as follows, and is built and packaged as a jar file.
```kotlin
import org.apache.flink.streaming.api.datastream.DataStreamSource
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment

fun main(args: Array<String>) {
    val env = StreamExecutionEnvironment.getExecutionEnvironment()
    dataflow(env)
    env.execute("Flink App")
}

fun dataflow(env: StreamExecutionEnvironment) {
    val WORDS: List<String> = listOf(
        "To be, or not to be,--that is the question:--",
        "Whether 'tis nobler in the mind to suffer",
        "The slings and arrows of outrageous fortune",
        "..."
    )
    val texts: DataStreamSource<String> = env.fromCollection(WORDS)
    val lengths = texts.map { it.length }

    lengths.print()
}
```
To execute the Flink application, you just submit the jar to a Flink cluster via the Flink CLI or the web interface, which acts as a Flink client. During the submission procedure, the Flink client loads the application jar and invokes the main method to generate the JobGraph (the execution plan), which is also submitted to the Flink cluster. The Flink cluster then receives the JobGraph, loads the jar, and runs the job. One of the dynamic characteristics I want to achieve is to divide the Flink application above into two parts: the unchanged Flink *application backbone* and the *dynamic dataflow script*.
```kotlin
// application backbone
fun main(args: Array<String>) {
    val env = StreamExecutionEnvironment.getExecutionEnvironment()
    KotlinScriptEngine.eval(File("dataflow-script.kts"), env)
    env.execute("Flink App")
}
```
```kotlin
// dataflow-script.kts
val WORDS: List<String> = listOf(
    "To be, or not to be,--that is the question:--",
    "Whether 'tis nobler in the mind to suffer",
    "The slings and arrows of outrageous fortune",
    "..."
)
val texts: DataStreamSource<String> = env.fromCollection(WORDS)
val lengths = texts.map { it.length }

lengths.print()
```
To execute the dynamic-version Flink application, as I plan to implement it, the *application backbone* compiles the *dataflow script* dynamically and generates the class files, which should be submitted to the Flink cluster along with the application backbone jar. The Flink cluster will then load and execute the backbone jar together with the dynamically generated dataflow class files. As you can see, the dataflow class files are loaded by the classloader of the Flink cluster. I’m not sure whether Kotlin scripting is suitable for this kind of scenario. @ilya.chernikov @Nikky Any suggestion? Thanks!
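For context on how a script can see the `env` value that `KotlinScriptEngine.eval(file, env)` passes in: the scripting API supports declaring it as a provided property in a script definition. A sketch, assuming kotlin-scripting and Flink on the classpath; the names `DataflowScript` and `DataflowScriptConfiguration` are mine, not part of any API:

```kotlin
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment
import kotlin.script.experimental.annotations.KotlinScript
import kotlin.script.experimental.api.KotlinType
import kotlin.script.experimental.api.ScriptCompilationConfiguration
import kotlin.script.experimental.api.providedProperties

// Hypothetical script definition: every dataflow script compiled against it
// sees a StreamExecutionEnvironment named `env`, supplied by the host.
@KotlinScript(
    fileExtension = "kts",
    compilationConfiguration = DataflowScriptConfiguration::class
)
abstract class DataflowScript

object DataflowScriptConfiguration : ScriptCompilationConfiguration({
    providedProperties("env" to KotlinType(StreamExecutionEnvironment::class))
})
```

At evaluation time the host would then pass the actual value via the evaluation configuration, e.g. `ScriptEvaluationConfiguration { providedProperties("env" to env) }`.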
i
The intended way to use it in a similar situation is to have a script evaluator on the cluster side and use it, with any desired classloader, on a `CompiledScript` prepared on the client, to get an evaluated instance of the script class. If this is not acceptable, then you need to hack your own evaluator similar to `BasicJvmScriptEvaluator`. (Many of its complications are due to the import support, so maybe you can cut that too.) Then you’ll probably also want to extract the plain class bytes on the client, to reduce the dependencies on the scripting infrastructure. But this approach will put you outside of the supported paths, so you have to take that into account.
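If you do go the plain-class-bytes route, the cluster-side loading is ordinary `ClassLoader` work. A minimal sketch, independent of the scripting infrastructure (`MemoryClassLoader` is my own name for illustration):

```kotlin
// Defines classes from in-memory bytes, e.g. class files extracted from a
// compiled script on the client and shipped to the cluster.
class MemoryClassLoader(
    private val classBytes: Map<String, ByteArray>,
    parent: ClassLoader? = null
) : ClassLoader(parent) {
    override fun findClass(name: String): Class<*> {
        val bytes = classBytes[name] ?: throw ClassNotFoundException(name)
        return defineClass(name, bytes, 0, bytes.size)
    }
}
```

In the Flink scenario the parent would presumably be the cluster's user-code classloader, so the script classes can resolve Flink and backbone types from it.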
l
@ilya.chernikov I couldn't agree more. Thanks a lot!