Lorin
02/21/2019, 3:30 AM

Nikky
02/21/2019, 7:42 AM

Nikky
02/21/2019, 7:43 AM

ilya.chernikov
02/21/2019, 8:19 AM
CompiledScript, which you can save and use later with an evaluator. You can also provide a base classloader for the evaluator via the evaluation configuration.
There is a caching API (Nikky pointed above to the sample implementation of it) that uses this to avoid recompilation.
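A minimal sketch of that compile-once / evaluate-later flow against the experimental kotlin-scripting API (class and function names are from the kotlin.script.experimental packages of that era and may shift between versions; the helpers compileOnce and evaluate are mine, not part of the API):

```kotlin
import java.io.File
import kotlinx.coroutines.runBlocking
import kotlin.script.experimental.api.*
import kotlin.script.experimental.host.toScriptSource
import kotlin.script.experimental.jvm.*
import kotlin.script.experimental.jvmhost.JvmScriptCompiler

// Compile the script once; the resulting CompiledScript can be cached
// and evaluated any number of times, possibly with different configurations.
fun compileOnce(scriptFile: File): CompiledScript {
    val compilationConfiguration = ScriptCompilationConfiguration {
        jvm {
            // Make the host classpath visible to the script.
            dependenciesFromCurrentContext(wholeClasspath = true)
        }
    }
    return runBlocking {
        JvmScriptCompiler()(scriptFile.toScriptSource(), compilationConfiguration)
    }.valueOrThrow()
}

// Evaluate a previously compiled script, supplying the base classloader
// mentioned above via the evaluation configuration.
fun evaluate(compiled: CompiledScript, classLoader: ClassLoader): EvaluationResult {
    val evaluationConfiguration = ScriptEvaluationConfiguration {
        jvm {
            baseClassLoader(classLoader)
        }
    }
    return runBlocking {
        BasicJvmScriptEvaluator()(compiled, evaluationConfiguration)
    }.valueOrThrow()
}
```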
If you need the bare bytes: internally it is implemented in KJvmCompiledScript
(https://github.com/JetBrains/kotlin/blob/master/libraries/scripting/jvm-host/src/kotlin/script/experimental/jvmhost/impl/KJvmCompiledScript.kt#L31), which currently holds the serialized class bytes (see KJvmCompiledModule). But please take into account that these are implementation details for now.

Lorin
02/22/2019, 3:40 AM
fun main(args: Array<String>) {
    val env = StreamExecutionEnvironment.getExecutionEnvironment()
    dataflow(env)
    env.execute("Flink App")
}

fun dataflow(env: StreamExecutionEnvironment) {
    val WORDS: List<String> = listOf(
        "To be, or not to be,--that is the question:--",
        "Whether 'tis nobler in the mind to suffer",
        "The slings and arrows of outrageous fortune",
        "..."
    )
    val texts: DataStreamSource<String> = env.fromCollection(WORDS)
    val lengths = texts.map { it.length }
    lengths.print()
}
To execute the Flink application, you just submit the jar to a Flink cluster via the Flink CLI or the web interface, either of which acts as a Flink client. During submission, the Flink client loads the application jar and evaluates the main method to generate the JobGraph (the execution plan), which is submitted to the Flink cluster as well. The Flink cluster then receives the JobGraph, loads the jar, and runs the job.
One of the dynamic characteristics I want to achieve is to divide the Flink application above into two parts: the unchanged Flink application backbone and the dynamic dataflow script.
// application backbone
fun main(args: Array<String>) {
    val env = StreamExecutionEnvironment.getExecutionEnvironment()
    KotlinScriptEngine.eval(File("dataflow-script.kts"), env)
    env.execute("Flink App")
}
// dataflow-script.kts
val WORDS: List<String> = listOf(
    "To be, or not to be,--that is the question:--",
    "Whether 'tis nobler in the mind to suffer",
    "The slings and arrows of outrageous fortune",
    "..."
)
val texts: DataStreamSource<String> = env.fromCollection(WORDS)
val lengths = texts.map { it.length }
lengths.print()
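The KotlinScriptEngine in the backbone above is a placeholder, not an existing class. One minimal way to back it, assuming the kotlin-scripting-jsr223 engine is on the classpath, is the standard javax.script API. Note that with stock JSR-223 the script cannot reference env directly as written above; it has to read it from the bindings (an implicit env receiver would need a custom script definition instead):

```kotlin
import java.io.File
import javax.script.ScriptEngineManager
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment

// Hypothetical sketch of the KotlinScriptEngine used in the backbone,
// built on plain JSR-223 (requires kotlin-scripting-jsr223 at runtime).
object KotlinScriptEngine {
    fun eval(script: File, env: StreamExecutionEnvironment) {
        val engine = ScriptEngineManager().getEngineByExtension("kts")
            ?: error("Kotlin JSR-223 script engine not found on the classpath")
        // Pass the execution environment to the script as a named binding.
        engine.put("env", env)
        // With plain JSR-223 the script must fetch it back explicitly, e.g.:
        //   val env = bindings["env"] as StreamExecutionEnvironment
        engine.eval(script.readText())
    }
}
```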
To execute the dynamic version of the Flink application, as I plan to implement it, the application backbone compiles the dataflow script dynamically and generates class files, which should be submitted to the Flink cluster along with the application backbone jar. The Flink cluster will then load and execute the backbone jar and the dynamically generated dataflow class files.
As you can see, the dataflow class files are loaded by the classloader of the Flink cluster. I'm not sure whether kotlin-scripting is suitable for this kind of scenario. @ilya.chernikov @Nikky Any suggestions? Thanks!

ilya.chernikov
02/22/2019, 10:29 AM
You can use the CompiledScript prepared on a client to get an evaluated instance of the script class.
If this is not acceptable, then you need to hack your own evaluator, similar to BasicJvmScriptEvaluator. (Many of its complications are due to the import support, so maybe you can cut that too.) Then you'll probably also want to extract the plain class bytes on the client, to reduce the dependencies on the scripting infrastructure. But this approach will put you outside of the supported paths, so you have to take that into account.

Lorin
02/22/2019, 4:07 PM
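For completeness, the "hack your own evaluator" route Ilya describes mostly amounts to shipping the script's class bytes to the cluster and defining them in a local classloader there. A dependency-free sketch of the cluster-side half (the map-of-class-bytes shape and the constructor-argument convention are assumptions drawn from the implementation details linked earlier, not a supported API):

```kotlin
// Define class bytes extracted on the client in a cluster-side classloader
// and instantiate the script class. In the JVM scripting implementation the
// script body runs in the generated class's constructor (implementation detail).
class MapBackedClassLoader(
    private val classBytes: Map<String, ByteArray>,
    parent: ClassLoader?
) : ClassLoader(parent) {
    override fun findClass(name: String): Class<*> =
        classBytes[name]?.let { defineClass(name, it, 0, it.size) }
            ?: throw ClassNotFoundException(name)
}

fun runScriptClass(
    scriptClassName: String,       // hypothetical: fully-qualified name of the generated script class
    classBytes: Map<String, ByteArray>,
    parent: ClassLoader?,
    vararg ctorArgs: Any?
): Any {
    val loader = MapBackedClassLoader(classBytes, parent)
    val scriptClass = loader.loadClass(scriptClassName)
    // Constructor parameters correspond to the script's declared
    // parameters/receivers -- again, an implementation detail.
    return scriptClass.constructors.single().newInstance(*ctorArgs)
}
```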