Currently I am using compressed CSV: 20 years of minute-interval stock bars, each symbol about 20 MB, and I have to process 2000+ symbols. In Python, loading one of those .csv.gz files takes 2.6+ seconds (Kotlin takes 800+ ms), but a more efficient format would be much faster. I used to use the Feather format (produced/loaded from Python), and loading the same amount of data took < 100 ms. For my use case, what I need most is a good charting library; after that, an efficient data format for large-dataset I/O would be nice to have.