# kotlin-spark
e
So I downgraded to JDK 11 and made sure Hive 3.2.2 is the only version in the deps, and everything works as expected both in `local[*]` and against a cluster on docker-compose. But then I began to save in `delta` format instead of `parquet`, and now I get
```
cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.sql.catalyst.expressions.ScalaUDF.f of type scala.Function1 in instance of org.apache.spark.sql.catalyst.expressions.ScalaUDF
```
when I run against the cluster (works on local). Any idea how I may debug this?
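(For reference, the parquet-to-delta switch in question, as a minimal sketch assuming the standard `DataFrameWriter` API and a hypothetical dataset `df` and output path:)

```kotlin
// before: plain parquet output
df.write().mode("overwrite").parquet("/data/out")

// after: same data written as a Delta table (needs the delta-spark
// dependency and the Delta extensions available on the cluster)
df.write().mode("overwrite").format("delta").save("/data/out")
```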
j
Looks like some lambda can't be serialized properly. You can try to set the Kotlin compiler option
```kotlin
freeCompilerArgs.add("-Xlambdas=class")
```
to see if that makes a difference. That has helped me in the past.
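(In Gradle that option goes on the Kotlin compile tasks; a minimal sketch, assuming a recent Kotlin Gradle plugin with the `compilerOptions` DSL:)

```kotlin
// build.gradle.kts
tasks.withType<org.jetbrains.kotlin.gradle.tasks.KotlinCompile>().configureEach {
    compilerOptions {
        // compile Kotlin lambdas as anonymous classes instead of invokedynamic,
        // sidestepping SerializedLambda-based (de)serialization
        freeCompilerArgs.add("-Xlambdas=class")
    }
}
```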
🙏 1
e
will check and let u know
how do I add `freeCompilerArgs` in Maven?
j
so:
```xml
<arg>-Xlambdas=class</arg>
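```

(A minimal sketch of where that `<arg>` lands in the pom, assuming the standard `kotlin-maven-plugin` setup; the version property is a placeholder:)

```xml
<plugin>
    <groupId>org.jetbrains.kotlin</groupId>
    <artifactId>kotlin-maven-plugin</artifactId>
    <version>${kotlin.version}</version>
    <configuration>
        <args>
            <!-- compile lambdas as anonymous classes instead of invokedynamic -->
            <arg>-Xlambdas=class</arg>
        </args>
    </configuration>
</plugin>
```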
🙌 1
e
ok, so it didn't help actually, but now I'm kinda convinced the issue is that my docker-compose stand-alone cluster has JDK 17 while I'm compiling for JDK 11 (which is used on Dataproc)
j
That's also very much possible. There may have been a change in the JDK to how lambdas are stored and serialized.
🙏 1
e
found the issue, I forgot to remove a `LocalDate` from the parameters:
```kotlin
// the LocalDate `now` is still referenced inside the lambda
sports.toDS()
    .let { addDate(it, now.year, now.monthValue, now.dayOfMonth) }
```
instead of extracting the values and passing those:
```kotlin
// extract the plain Int values first...
val now = LocalDate.now()
val year = now.year
val month = now.monthValue
val day = now.dayOfMonth

// ...so the lambda only references Ints
sports.toDS()
    .let { addDate(it, year, month, day) }
```
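(For context, a hypothetical sketch of what an `addDate` helper like this might look like; the real signature isn't shown in the thread, so the column names and the `withColumn` approach are assumptions:)

```kotlin
import org.apache.spark.sql.Dataset
import org.apache.spark.sql.Row
import org.apache.spark.sql.functions.lit

// Hypothetical helper: it only ever receives plain Ints, so no java.time
// object can end up captured in a closure that Spark has to serialize.
fun <T> addDate(ds: Dataset<T>, year: Int, month: Int, day: Int): Dataset<Row> =
    ds.withColumn("year", lit(year))
        .withColumn("month", lit(month))
        .withColumn("day", lit(day))
```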
👍 1