holgerbrandl
12/14/2022, 9:44 AM
Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3) (192.168.217.128 executor 0): java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
... very long stacktrace ....
I was under the impression that all the serialization in this expression should be very simple, since every job is just returning a simple string.
What am I doing wrong?
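For context, a job of the shape described (each task only maps to a plain String) looks roughly like this with the Kotlin Spark API; the data below is a made-up stand-in, not the actual project:

```kotlin
import org.jetbrains.kotlinx.spark.api.*

// Minimal stand-in for the failing job: the only thing each task produces is a String.
// In the original setup this ran against a standalone cluster rather than local[*].
fun main() = withSpark {
    dsOf(1, 2, 3, 4, 5)
        .map { "result-$it" }
        .show()
}
```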
Jolan Rensen [JB]
12/14/2022, 11:53 AM
Try defining SimConfig outside the function. Spark has some trouble detecting inner classes.
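A sketch of that suggestion: declare the class at file level instead of inside the function that builds the Dataset (the fields of SimConfig here are invented for illustration):

```kotlin
import org.jetbrains.kotlinx.spark.api.*

// Top-level data class: Spark's encoder machinery can handle this directly.
data class SimConfig(val runs: Int, val label: String)

fun main() = withSpark {
    dsOf(SimConfig(10, "a"), SimConfig(20, "b"))
        .map { "${it.label}: ${it.runs} runs" } // each task still returns a plain String
        .show()
}

// Pattern to avoid: declaring SimConfig (or whatever class the lambda captures)
// inside a function or as an inner class, which drags the enclosing scope into
// serialization and tends to fail only once the job runs on a real cluster.
```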
Jolan Rensen [JB]
12/14/2022, 1:39 PM
Running it with local[*] works fine. But after a lot of different setups, running it on the master gets me stuck at "SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0". So... I can't reproduce it 😕 Could you maybe find an example with local[*] where you get the same error?
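A minimal sketch of pinning the master to local[*] directly from code, to look for a self-contained reproducer outside the standalone cluster:

```kotlin
import org.jetbrains.kotlinx.spark.api.*

// Force an in-process local master; if the ClassCastException shows up here too,
// it becomes an easy-to-share reproducer.
fun main() = withSpark(master = "local[*]", appName = "serialization-repro") {
    dsOf(1, 2, 3)
        .map { "value = $it" }
        .show()
}
```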