# mathematics
d
Meta just open-sourced an automatic differentiation framework for Kotlin! (https://diffkt.org/). It can compute higher-order and symbolic derivatives, supports user-defined types, and has compile-time shape checking. Examples include machine learning and physical simulations.
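For anyone unfamiliar with the idea, here is a minimal sketch of what forward-mode automatic differentiation does, using dual numbers. This is just the underlying concept, not DiffKt's actual API:

```kotlin
// Minimal sketch of forward-mode autodiff via dual numbers.
// NOT DiffKt's API — only the core idea such frameworks build on.
data class Dual(val value: Double, val deriv: Double) {
    operator fun plus(other: Dual) =
        Dual(value + other.value, deriv + other.deriv)
    operator fun times(other: Dual) = Dual(
        value * other.value,
        deriv * other.value + value * other.deriv // product rule
    )
}

// Differentiate f at x by seeding the derivative slot with 1.
fun derivative(f: (Dual) -> Dual, x: Double): Double =
    f(Dual(x, 1.0)).deriv

fun main() {
    // d/dx (x * x + x) at x = 3 is 2*3 + 1 = 7
    println(derivative({ x -> x * x + x }, 3.0))
}
```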
a
Interesting, reading it now. Could you explain why you need to include C++ parts in the core?
I am happy to hear that you decided to use a library approach similar to KMath's instead of a compiler plugin.
I can't even build the Kotlin part; it lacks the shape module.
d
1) The C++ part is for performance. There are multiple options there; I think they were actually experiments comparing the performance of different backends. 2) ShapeKt should be open-sourced any moment now.
a
Does it make sense to include performance optimization in an automatic differentiation library? In my opinion, you should not mix differentiation logic with tensor performance optimization. Currently, it seems like you are trying to replicate Torch functionality.
👍 1
Rolling it all out in one package will significantly complicate development and deployment without obvious gains. It would probably make much more sense to build a Torch connector from your library, like @Ролан did.
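Roughly what I mean by separating the two, as a made-up sketch (none of these names come from DiffKt, KMath, or Torch):

```kotlin
// Hypothetical design sketch: differentiation logic kept apart from
// the tensor backend. All names here are invented for illustration.
interface TensorBackend<T> {
    fun add(a: T, b: T): T
    fun mul(a: T, b: T): T
}

// A value together with its tangent (forward-mode).
class Tracked<T>(val primal: T, val tangent: T)

// The AD layer only talks to the backend interface, so a pure-JVM
// backend, a C++ backend, or a Torch connector can be swapped in
// underneath without touching the differentiation rules.
class ForwardAd<T>(private val backend: TensorBackend<T>) {
    fun add(a: Tracked<T>, b: Tracked<T>) = Tracked(
        backend.add(a.primal, b.primal),
        backend.add(a.tangent, b.tangent)
    )
    fun mul(a: Tracked<T>, b: Tracked<T>) = Tracked(
        backend.mul(a.primal, b.primal),
        // product rule, expressed only through backend ops
        backend.add(
            backend.mul(a.tangent, b.primal),
            backend.mul(a.primal, b.tangent)
        )
    )
}
```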
d
I think when they started on the project, a year before I joined it, they were looking at the idea of a Kotlin deep neural network package to support data science in Kotlin. Look in the kotlin/api/model directory. Later they became more interested in scientific computing and physical simulation, which is where the user-defined types came from. I think applications in physics-based graphical modeling (think Metaverse) are the current direction. There are a number of mass-spring examples and triangular finite element examples in the repo that underlie problems in physics-based graphical modeling.
a
I looked through the examples. Being a physicist, I usually do not use tensors at all, so that part is not interesting for me. Automatic differentiation is useful, though, for optimization problems, like in this example: https://github.com/mipt-npm/kmath/blob/a1267d84ac43ca18d543c44c81514a73f990d50c/examples/src/main/kotlin/space/kscience/kmath/fit/chiSquared.kt#[…]7. Currently KMath supports autodiff only for numbers, but it is possible to add it for generic algebra elements as well.
But still, the differentiation and the computation are separated, and I think that is how it should be.
Context receivers would also make it easy to add differentiable operators without inheritance.
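Something like this (made-up names, not KMath's API; context receivers are still experimental behind the -Xcontext-receivers compiler flag):

```kotlin
// Sketch: dual numbers over any Field<T>, so autodiff works for
// generic algebra elements, not only for Double. The names are
// invented for illustration.
interface Field<T> {
    fun add(a: T, b: T): T
    fun multiply(a: T, b: T): T
}

data class GenericDual<T>(val value: T, val deriv: T)

// Operators are defined against the algebra via a context receiver,
// so GenericDual does not have to inherit from anything.
context(Field<T>)
operator fun <T> GenericDual<T>.plus(other: GenericDual<T>) =
    GenericDual(add(value, other.value), add(deriv, other.deriv))

context(Field<T>)
operator fun <T> GenericDual<T>.times(other: GenericDual<T>) =
    GenericDual(
        multiply(value, other.value),
        // product rule expressed through the algebra's operations
        add(multiply(deriv, other.value), multiply(value, other.deriv))
    )
```

With a `Field<Double>` instance brought into scope via `with(...)`, `x * x + x` then works on `GenericDual<Double>` values with no inheritance involved.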