# kotlindl
р
@zaleslaw are you guys able to manipulate ONNX models?
z
This functionality is not released yet, but it will be in 0.3. Also, an interesting project called KInference exists; they wrote their own runtime.
р
So you rely on this, or ONNX Java API?
z
For our current goals the Java API is enough, but it has some limitations related to the model format.
We will probably use KInference if it becomes more mature and supports more ops, or we will contribute to the project, because it's currently focused on a subset of NLP models.
р
Sounds great thanks
At KInference they execute everything on the JVM, or am I missing something?
z
I'm not sure, but it looks like it's JVM-side only
р
I asked Anastasia (who unfortunately is not in this Slack)
they plan to get into KMP and support GPU on native and JS
currently on JVM without GPU
z
Great plans. It will take a lot of work to build such a library, but it will be a good and useful solution
I'd like yet another library, cubool or something like it, for boolean algebra on the GPU
Interesting solution from the architecture point of view
р
yes that looks impressive 👍 thanks for pointing that out
Unfortunately, as of now PyTorch does not officially support importing ONNX modules, although some solutions exist. I will support execution of TorchScript modules in Kotlin (including training).
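For context, the Python side of that workflow can be sketched with the standard `torch.jit` API (a minimal sketch; the module name and file name are illustrative, not from the project):

```python
import torch

# A tiny model built in Python, as usual
class TinyNet(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = torch.nn.Linear(4, 2)

    def forward(self, x):
        return torch.relu(self.fc(x))

# Compile to TorchScript and serialize; the resulting file is
# self-contained and can be loaded by any TorchScript runtime
# (libtorch in C++, or JVM bindings), not just by Python.
scripted = torch.jit.script(TinyNet())
scripted.save("tiny_net.pt")

# Round-trip check from Python
loaded = torch.jit.load("tiny_net.pt")
out = loaded(torch.randn(1, 4))
print(list(out.shape))  # [1, 2]
```

A runtime on the JVM side would then load `tiny_net.pt` and drive training on the serialized module rather than on Python objects.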
@zaleslaw lucky you to be on holidays 😄 I now have a prototype for training TorchScript models from Kotlin: https://github.com/mipt-npm/kmath/tree/feature/noa/kmath-noa I will add more optimisers (beyond Adam) very soon, as well as algorithms for Bayesian deep learning, such as: https://github.com/grinisrit/noa/blob/master/docs/ghmc/bayesian_deep_learning.ipynb
z
Great job @Ролан! It looks cool; there is a lot of work here
Is it related to the PyTorch wrapper in kmath (for tensor ops), or is it a separate project?
р
Yes, it's the same project, because I have a definite opinion about how the wrapper should work. A faithful binding can be found here: https://github.com/bytedeco/javacpp-presets/tree/master/pytorch
I just believe that people should do as much of the model building as possible in Python. But then they should be able to train their Python-cooked models in Kotlin easily, without extra effort, including the most experimental ones (and not only perform inference; this is important in reinforcement learning, online learning, etc., which is getting everywhere now). At the same time, it's really hard to chase all the architectures in the API (new layers, various training patterns, different loss functions, etc.). Basically, users should be able to write their own extensions with raw tensors (including in C++/CUDA) for the Python API and be able to use them in Kotlin straight away; that's already possible with the current state of the prototype.
The only place where I will need to keep up with PyTorch is the optimisers, but they don't evolve very fast (there are only 7, I think). On top of that I also write my own (for Bayesian optimisation). But we need to think more about the API for those with @altavir and other kmath contributors.
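To illustrate the "extensions with raw tensors" point above: a custom op defined purely in Python can be compiled to TorchScript and serialized on its own, so it is no longer tied to the Python process (a hedged sketch using standard `torch.jit` calls; the function and file names are made up):

```python
import torch

# A custom activation written only with raw tensor ops,
# compiled to TorchScript by the decorator.
@torch.jit.script
def tanh_gelu(x: torch.Tensor) -> torch.Tensor:
    # tanh-based GELU approximation
    return 0.5 * x * (1.0 + torch.tanh(0.79788456 * (x + 0.044715 * x ** 3)))

# Serialized TorchScript can be loaded by any runtime that
# understands the format, independent of Python.
torch.jit.save(tanh_gelu, "tanh_gelu.pt")

print(float(tanh_gelu(torch.zeros(1))[0]))  # 0.0
```

The same mechanism covers C++/CUDA extensions registered as custom TorchScript ops, which is how experimental architectures can reach a JVM runtime without the wrapper tracking every new layer.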
a
Sorry, guys, I don't have time to read the whole thread right now. But the last time I worked on KMath I was porting my old code for very advanced multivariate optimizers and their API. I think it should be much more advanced than anything in Torch (with a more limited field of application, of course). Ping me if you have any specific applications in mind so I can move this work up in my schedule.
z
It sounds reasonable in many cases; it's great that Kotlin users will have different ways to solve their problems
р
Yes, we would like to focus on bringing more functionality into kmath than what you can do with PyTorch in Python. Once you break your deep learning model down into TorchScript modules correctly, you will be able to access the more sophisticated optimisers and Bayesian samplers in kmath, beyond classical training with stochastic gradient descent of all sorts.