# mathematics
Is it possible to use KMath for deep learning inference on Kotlin/Native (I think KMath is multiplatform, right?), e.g. via an ONNX model?
KMath currently does not have tooling for ML. But if you want to operate on primitives like matrices, then yes, you can. There is an example here: https://github.com/mipt-npm/kmath/blob/master/examples/src/main/kotlin/space/kscience/kmath/tensors/neuralNetwork.kt.
The tensor module should work on Native. Why do you need Native, anyway?
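To make the "operate on primitives like matrices" point concrete, here is a minimal sketch in plain Kotlin (deliberately with no KMath dependency, so it runs anywhere, including Native). It shows the kind of primitive operations a forward inference pass reduces to; all function names here are illustrative, not part of any library's API:

```kotlin
// Matrix-vector product: the core primitive of a dense layer.
fun matVec(w: Array<DoubleArray>, x: DoubleArray): DoubleArray =
    DoubleArray(w.size) { i -> w[i].indices.sumOf { j -> w[i][j] * x[j] } }

// Elementwise ReLU activation.
fun relu(v: DoubleArray): DoubleArray =
    DoubleArray(v.size) { i -> maxOf(0.0, v[i]) }

// One dense layer: y = relu(W x + b).
fun dense(w: Array<DoubleArray>, b: DoubleArray, x: DoubleArray): DoubleArray {
    val wx = matVec(w, x)
    return relu(DoubleArray(wx.size) { i -> wx[i] + b[i] })
}

fun main() {
    val w = arrayOf(doubleArrayOf(1.0, -1.0), doubleArrayOf(0.5, 0.5))
    val b = doubleArrayOf(0.0, 1.0)
    val x = doubleArrayOf(2.0, 1.0)
    println(dense(w, b, x).toList())  // [1.0, 2.5]
}
```

A library such as KMath's tensor module would supply these primitives (with proper shapes and optimized backends) instead of hand-rolled arrays; the structure of the computation stays the same.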
Ok thank you.
> Why do you need Native, anyway?
We have C++ apps that currently call our Kotlin backend for AI processing/logic. We would like to support offline users by providing a Kotlin Multiplatform library in the future. We don't want to ask each C++ app to handle the inference itself (adding a C++ runtime and doing the AI planning logic on its own), which is why we'd encapsulate the logic in a library. Therefore we'd need an inference library that supports Native. The project is probably still 6+ months away, but I'd like to build an intuition as soon as possible. There are probably a lot of ways to go about this; if ONNX has a C API, maybe we can use Kotlin C interop, but the simpler the better. Ideally we'd find a Native-compliant library and simply use it, which is why I was curious whether KMath was Native-friendly.
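On the "ONNX has a C API" idea: ONNX Runtime does ship a C API (`onnxruntime_c_api.h`, entry point `OrtGetApiBase`), which Kotlin/Native can bind via cinterop. A hypothetical `.def` file might look like the sketch below; the header and library paths are placeholders that depend on your ONNX Runtime installation, and this is an assumption about how such a binding could be set up, not a tested integration:

```
# onnxruntime.def — hypothetical Kotlin/Native cinterop definition.
# Paths below are illustrative placeholders, not real defaults.
headers = onnxruntime_c_api.h
headerFilter = onnxruntime_c_api.h
compilerOpts = -I/path/to/onnxruntime/include
linkerOpts = -L/path/to/onnxruntime/lib -lonnxruntime
```

With that in place, the generated Kotlin bindings would expose the C API (e.g. `OrtGetApiBase()`), and the planning logic could live in common Kotlin code on top of it.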
Great, keep me informed please. We can benefit from integration. You can check @Iaroslav Postovalov’s integration with GSL as a reference: https://github.com/mipt-npm/kmath-gsl.
Perfect, will do. Thanks for your help.
The integration with Native is still tedious, but doable. At least the basic things are working fine.
By the way, I couldn't even build kmath-gsl with a newer Kotlin compiler because the commonizer started failing.
So it is a really painful activity.
I believe it would be much easier for a project that does not need to cover all the platforms we do.