#feed

doyaaaaaaken

08/23/2019, 12:05 PM
Hi, all. I just released a CSV reader/writer library! 🚀 (This is inspired by scala-csv.) And I’ll be very grateful if you can give me feedback. Thank you. https://github.com/doyaaaaaken/kotlin-csv
👍 5
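(For context, a minimal self-contained sketch of the kind of "easy to use" DSL such a library aims for. All names here, `csvReader`, `CsvReaderConfig`, `SimpleCsvReader`, are hypothetical illustrations, not necessarily kotlin-csv's actual API.)
```kotlin
// Sketch of a DSL-style CSV reader with hypothetical names.
class CsvReaderConfig {
    var delimiter: Char = ','
}

class SimpleCsvReader(private val config: CsvReaderConfig) {
    // Naive split-based parsing: no quoting or embedded-newline handling,
    // which is exactly the kind of gap discussed later in this thread.
    fun readAll(text: String): List<List<String>> =
        text.lineSequence()
            .filter { it.isNotBlank() }
            .map { it.split(config.delimiter) }
            .toList()
}

fun csvReader(configure: CsvReaderConfig.() -> Unit = {}): SimpleCsvReader =
    SimpleCsvReader(CsvReaderConfig().apply(configure))

fun main() {
    val rows = csvReader { delimiter = ',' }.readAll("a,b,c\nd,e,f")
    println(rows) // [[a, b, c], [d, e, f]]
}
```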

altavir

08/23/2019, 12:17 PM
The library would make sense if it used kotlinx.io underneath so it could be easily migrated to multiplatform. Otherwise, I do not understand what it adds on top of, say, https://commons.apache.org/proper/commons-csv/
👆 4
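(For comparison, reading a CSV with Apache Commons CSV from Kotlin looks roughly like the sketch below; it assumes the org.apache.commons:commons-csv dependency is on the classpath, and note that withFirstRecordAsHeader() is deprecated in favor of a builder in newer releases.)
```kotlin
import org.apache.commons.csv.CSVFormat
import java.io.StringReader

fun main() {
    val input = "name,age\nalice,30\nbob,25"
    // Treat the first record as the header row and iterate over records.
    val records = CSVFormat.DEFAULT
        .withFirstRecordAsHeader()
        .parse(StringReader(input))
    for (record in records) {
        println("${record.get("name")} is ${record.get("age")}")
    }
}
```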

doyaaaaaaken

08/23/2019, 10:14 PM
Thank you for your feedback. 🙇 Actually, I’m considering migrating to multiplatform, so I’ll check kotlinx.io, thanks. Another merit of this library, I think, is that there is demand for a CSV library that is easy to use. In some cases users don’t need a highly functional library; they just want something that works out of the box. That’s why I think scala-csv (https://github.com/tototoshi/scala-csv) got many users.

altavir

08/24/2019, 7:36 AM
OK, I will wait for multiplatform then. On the JVM you can just add a convenience Kotlin layer on top of existing Java libraries, so you can keep both a bug-proof core and a simple top layer. But for MPP you need to write Java-free code. We have plans for IO and tables in kmath: https://github.com/mipt-npm/kmath/issues/43 (I am not sure about tables; they will probably be implemented in another project). A multiplatform CSV library would be useful.
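(A sketch of that layering idea, assuming Commons CSV as the underlying parser; `readCsvRows` is a made-up helper name for illustration, not an existing API.)
```kotlin
import org.apache.commons.csv.CSVFormat
import java.io.File

// Thin Kotlin convenience layer over a mature Java parser: the Java library
// keeps the battle-tested parsing, the Kotlin layer adds ergonomics.
fun File.readCsvRows(format: CSVFormat = CSVFormat.DEFAULT): List<Map<String, String>> =
    bufferedReader().use { reader ->
        format.withFirstRecordAsHeader()
            .parse(reader)
            .map { record -> record.toMap() }
    }

fun main() {
    val rows = File("people.csv").readCsvRows()
    rows.forEach { println(it["name"]) }
}
```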

doyaaaaaaken

08/24/2019, 8:46 AM
OK, I’ll try multiplatform.
you can keep both a bug-proof core and a simple top layer. But for MPP you need to write Java-free code
Valuable feedback, thanks!

DALDEI

09/01/2019, 9:43 PM
+1^10 for layering an 'easy to use' DSL on top of an existing, tried-and-true, mature CSV library. "CSV" is notoriously subtle -- the scare quotes are because the subtlety is really a lack of specification, and a lack of conformance to specification 'in the field'. I encourage you not to underestimate the range of common real-world difficulties with CSV, and not to lump the libraries that have grown to handle these real issues in with 'users don't need highly functional libraries'. The issues tend to be intertwined, so it's not obvious which features are 'fancy but unnecessary' versus 'you really don't want to know WHY we had to do this -- but trust me, you need it'. I encourage you to focus on the value your library exposes (the 'simple DSL' use case) but to wrap that on top of one or more (possibly pluggable) mature CSV libraries.

Why pluggable? The 'native' discussion is relevant, but even on the JVM there are reasons for programs to stick to one 'family' of data serialization/parsing/data mapping where possible. One example: consistent mapping of column headers to data field/property names, and consistent ways to configure things so that they can interoperate. For example, I'm working on an app that has to read and write externally-defined CSV files (actually TSV). I also need to data-map these into Kotlin classes to expose to internal business logic as well as 3rd-party tools (such as a GUI and a DB). It is an incredible pain to get consistent end-to-end treatment of 'simple' things like a column heading such as "Record Number".

More subtle issues: Are TSV (tab-separated) files supposed to include quoted values or not? Is the order of headers relevant (can they be detected out of order on reads? what about writes? can they be remapped?)? Do read and write have to produce identical content, or simply logically equivalent content? Can a schema be used for data validation (like csv-schema)? Can the data be queried by column name? Can there be additional or missing columns? Are empty values nulls or ""? Are values typed (i.e. are invalid numbers rejected or passed through)? Are escapes used, and if so, what style? Can embedded newlines be present? These are only some of the issues. If you look into the check-in history of any major CSV library you can see the history of pain, solved bit by bit.

From a library author's perspective, I would highly recommend against writing your own parser for any general-purpose case. From a library consumer's perspective, I would stay away from any CSV library that is not based on an older, mature code base -- regardless of additional value, it's not worth the risk. (The past 30 years are paved with naive attempts at "this one should be good enough".)
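(A sketch of the pluggable-backend idea suggested above: the simple DSL talks to a small interface, and a JVM adapter delegates the hard parts -- quoting, embedded newlines, escapes -- to a mature library. The interface and class names are hypothetical; the example assumes Commons CSV on the classpath.)
```kotlin
import org.apache.commons.csv.CSVFormat
import java.io.StringReader

// Hypothetical plug-in point the simple DSL would target.
interface CsvBackend {
    fun parse(text: String): List<List<String>>
}

// JVM adapter delegating to Apache Commons CSV for the tricky cases.
class CommonsCsvBackend(private val format: CSVFormat = CSVFormat.DEFAULT) : CsvBackend {
    override fun parse(text: String): List<List<String>> =
        format.parse(StringReader(text)).map { record -> record.toList() }
}

fun main() {
    val backend: CsvBackend = CommonsCsvBackend()
    // A quoted cell containing a comma and a newline: the mature parser
    // handles it, where a naive split() would not.
    val rows = backend.parse("a,\"b, with comma\nand newline\"\nd,e")
    println(rows)
}
```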

doyaaaaaaken

09/02/2019, 7:41 AM
Thank you for your feedback! 🙇 Making it possible to plug in a mature CSV library and use it as the default on the JVM is a great idea, and I agree with it. I’ll focus on these values: 1. a simple DSL interface, 2. multiplatform. And I won’t focus on writing CSV parsing logic except for the Kotlin/Native case.

DALDEI

09/17/2019, 2:16 AM
I have found that many CSV libraries are also associated with data-mapping libraries. I believe this is because the same use cases that call for CSV reading also call for database access or document parsing -- the common 'other side of the coin' for CSV. The result is, for example, Jackson data binding (ObjectMapper) + the Jackson CSV module, or SimpleFlatMapper with its JDBC and CSV mappers -- all with associated mapping, or integration with common data-mapping libraries (such as JDBC-based mappers). I mention this because if you use a pluggable CSV library you may want to be able to 'pair' it with a pluggable object/data mapper. This allows people to use a common class definition and annotation set that works across multiple serializations.
👍 1
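(As an illustration of that CSV-plus-mapper pairing, a sketch using Jackson's CSV module to bind rows directly to a Kotlin class; it assumes the jackson-dataformat-csv and jackson-module-kotlin dependencies, and the Person class is just an example.)
```kotlin
import com.fasterxml.jackson.dataformat.csv.CsvMapper
import com.fasterxml.jackson.dataformat.csv.CsvSchema
import com.fasterxml.jackson.module.kotlin.registerKotlinModule

// Example target class for data binding (hypothetical).
data class Person(val name: String = "", val age: Int = 0)

fun main() {
    val csv = "name,age\nalice,30\nbob,25"
    val mapper = CsvMapper().registerKotlinModule()
    // Use the header row to map columns onto Person properties.
    val schema = CsvSchema.emptySchema().withHeader()
    val people: List<Person> = mapper
        .readerFor(Person::class.java)
        .with(schema)
        .readValues<Person>(csv)
        .readAll()
    println(people)
}
```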