# kotlin-spark
r
hi, the following throws exceptions
```kotlin
import org.jetbrains.kotlinx.spark.api.*

fun main() {
    withSpark {
        dsOf(
            listOf(mapOf("a" to "b", "x" to "y")),
            listOf(mapOf("a" to "b", "x" to "y")),
            listOf(mapOf("a" to "b", "x" to "y"))
        ).showDS().printSchema()
    }
}
```
j
I don't think Spark can even handle datasets of lists of maps. No idea how that would even be encoded in a Column/Row structure. Any particular reason you need it?
r
Spark can handle it if, for example, I return a list of maps from a UDF and specify a custom/explicit DataType. The use case is to store a list of attribute sets:
[ { attr1: v1, attr2: v2 }, { attrA: va, attrB: vb, attrX: vx } ]
where the attribute names/keys can differ between sets and may even be unknown up front.
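The shape described above can be modeled in plain Kotlin, independent of Spark; a minimal sketch, where the name `attributeSets` and the sample keys/values are illustrative:

```kotlin
// A list of attribute sets: each set is a Map<String, String>,
// and the keys may differ from one set to the next, matching the
// shape described in the thread (names here are illustrative).
val attributeSets: List<Map<String, String>> = listOf(
    mapOf("attr1" to "v1", "attr2" to "v2"),
    mapOf("attrA" to "va", "attrB" to "vb", "attrX" to "vx"),
)

fun main() {
    // Collect every distinct attribute name across all sets.
    // Because the key set is open-ended, a fixed-column Row
    // encoding does not fit, which is why a map-typed column
    // is the natural Spark representation.
    val allKeys = attributeSets.flatMap { it.keys }.toSet()
    println(allKeys)
}
```

In Spark SQL terms, a hedged sketch of the matching explicit type would be `DataTypes.createArrayType(DataTypes.createMapType(DataTypes.StringType, DataTypes.StringType))` from `org.apache.spark.sql.types` (assumption: spark-sql is on the classpath).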
a
Wowzer
Completely missed this
@razvandragut fixed it, going to release soon-ish
🦜 1
r
Thanks !! 😉