# server
j
Hey everyone! Has anyone here had any experience processing large CSV files in Kotlin? Could you explain how? Thanks
e
You mean big CSV files? I'm not sure I understood the question... you'd like something "Kotlin-style" (e.g. using destructuring for column values) or something that Just Works™️?
j
That's right, I'm talking about large CSV files
p
Hi, how big? I personally used Jackson CSV, but not for dealing with large files. I can't speak to its performance, but since Jackson is fairly performant on its own, maybe it could be a good fit for you too.
j
Right. I'm talking about files with 200 thousand lines, for example. I'm testing Jackson CSV with coroutines.
e
I think this may be an XY problem. What are you trying to achieve overall? Dealing with huge CSV files shouldn't be a problem, because every line is an entry, so you can just use an input stream and process it line by line. Something like:
```kotlin
import java.io.File

// `use` closes the reader even if processing throws
File(path).bufferedReader().use { reader ->
    reader.lineSequence().forEach { line ->
        process(parse(line))
    }
}
```
This way you keep memory usage low by discarding each entry after processing it.
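And if you want the "Kotlin-style" destructuring I mentioned, here's a minimal sketch. The `Row` type and column names are made up for illustration, and the naive `split(',')` won't handle quoted fields containing commas, so treat it as a starting point:

```kotlin
import java.io.File

// Hypothetical row type; your real columns will differ.
data class Row(val id: String, val name: String, val amount: String)

fun parse(line: String): Row {
    // Destructure the column values via List's componentN extensions.
    // Note: a naive split breaks on quoted commas; use a real CSV
    // parser (e.g. Jackson CSV) for messy input.
    val (id, name, amount) = line.split(',')
    return Row(id, name, amount)
}

fun main() {
    File("data.csv").bufferedReader().use { reader ->
        reader.lineSequence()
            .drop(1) // skip the header row
            .forEach { line -> println(parse(line)) }
    }
}
```

Same idea as above: the sequence is lazy, so only one line is in memory at a time.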