#mathematics

breandan

01/05/2021, 1:56 AM
Did some experiments with type classes. It's a nice pattern for abstract algebra: https://github.com/breandan/kotlingrad/blob/master/core/src/main/kotlin/edu/umontreal/kotlingrad/typelevel/TypeClassing.kt
❤️ 1
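For readers following along, here is a minimal sketch of the type-class pattern for abstract algebra (names are illustrative, not the actual contents of the linked file): operations live in an interface, instances are objects, and generic code receives the instance explicitly.

```kotlin
// Type class: the operations of a group, detached from any concrete type.
interface Group<T> {
    val unit: T
    fun combine(a: T, b: T): T
}

// An instance of the type class for Int under addition.
object IntAddition : Group<Int> {
    override val unit = 0
    override fun combine(a: Int, b: Int) = a + b
}

// Generic code works for any T that has a Group instance.
fun <T> sumWith(g: Group<T>, xs: List<T>): T = xs.fold(g.unit, g::combine)

fun main() {
    println(sumWith(IntAddition, listOf(1, 2, 3)))  // 6
}
```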

altavir

01/05/2021, 5:45 AM
Seems like kmath algebras in miniature. Are there any differences, aside from constructors?

breandan

01/05/2021, 10:07 AM
You guys should look into Church encoding; there are some nice ideas to explore there
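For context, a minimal Church-numeral sketch in Kotlin (the encoding choices here are illustrative): a number n is represented as the function that applies f, n times, so arithmetic becomes function composition.

```kotlin
// A Church numeral n is "apply f to x, n times".
typealias Church<T> = ((T) -> T) -> (T) -> T

fun <T> zero(): Church<T> = { _ -> { x -> x } }

// succ(n) applies f one more time than n does.
fun <T> succ(n: Church<T>): Church<T> = { f -> { x -> f(n(f)(x)) } }

// plus(m, n) applies f, n times, then m more times.
fun <T> plus(m: Church<T>, n: Church<T>): Church<T> = { f -> { x -> m(f)(n(f)(x)) } }

// Convert back to Int by counting applications of { it + 1 } starting from 0.
fun toInt(n: Church<Int>): Int = n({ it + 1 })(0)

fun main() {
    val two = succ(succ(zero<Int>()))
    val three = succ(two)
    println(toInt(plus(two, three)))  // 5
}
```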

altavir

01/05/2021, 11:50 AM
@breandan lambda calculus is not friendly to performance on the JVM, and I am not sure what it would give in this case. The idea is to separate mathematical operations from the structures those operations are defined on. In lambda calculus you still need to get your operations from somewhere. Languages like Haskell or Scala usually achieve this by implicitly binding functions to a type, which is the thing I especially want to avoid, since several sets of operations could exist for the same type.
Multiple receivers would allow us to solve the binding problem in an explicit, Kotlin-style way without a loss of clarity
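An illustrative sketch of that point (names are mine, not kmath's): the operations are bound to an algebra object rather than to the type itself, so several sets of operations can coexist for `Int`, and the choice of algebra is always explicit at the call site.

```kotlin
// Operations live on the algebra, not on the element type.
interface Semigroup<T> {
    fun combine(a: T, b: T): T
}

// Two different algebras for the same type, Int.
object IntSum : Semigroup<Int> {
    override fun combine(a: Int, b: Int) = a + b
}

object IntMax : Semigroup<Int> {
    override fun combine(a: Int, b: Int) = maxOf(a, b)
}

// Generic code picks up operations from whichever algebra it is called on.
fun <T> Semigroup<T>.reduce(xs: List<T>): T = xs.reduce(::combine)

fun main() {
    val xs = listOf(3, 1, 4, 1, 5)
    println(IntSum.reduce(xs))  // 14
    println(IntMax.reduce(xs))  // 5
}
```

The explicit receiver (`IntSum.reduce(...)` vs `IntMax.reduce(...)`) is what implicit type-directed binding à la Haskell cannot give you when two instances exist for one type.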

breandan

01/05/2021, 12:14 PM
The idea is you can replace more abstract operations with more efficient implementations as you specialize. The idea is more fundamental than the λ-calculus: Moses Schönfinkel showed you can do it with an even simpler system called the SKI calculus. If you think about it, this is a really beautiful idea. You don't need exponentiation, multiplication, or even addition; in fact, all of these can be expressed in terms of a single primitive: function application. In practice, you would never use it for efficiency, just like you would never do multiplication by repeated addition when more efficient implementations are available. But by providing the ability to express these ideas in a simpler language, you can translate between many languages and runtimes without needing to rewrite everything.
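A tiny sketch of that reduction (using plain `Int`s rather than a real Church/SKI encoding): addition, multiplication, and exponentiation each fall out as iterated application of the previous operation, so a single "apply n times" primitive suffices.

```kotlin
// One primitive: apply f to x, n times.
fun <T> iterate(n: Int, f: (T) -> T): (T) -> T = { x ->
    var acc = x
    repeat(n) { acc = f(acc) }
    acc
}

val succ = { x: Int -> x + 1 }

// Addition: n successors of m.
fun add(m: Int, n: Int): Int = iterate(n, succ)(m)

// Multiplication: n additions of m, starting from 0.
fun mul(m: Int, n: Int): Int {
    val addM = { x: Int -> add(x, m) }
    return iterate(n, addM)(0)
}

// Exponentiation: n multiplications by m, starting from 1.
fun pow(m: Int, n: Int): Int {
    val mulM = { x: Int -> mul(x, m) }
    return iterate(n, mulM)(1)
}

fun main() {
    println(add(2, 3))  // 5
    println(mul(2, 3))  // 6
    println(pow(2, 3))  // 8
}
```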

altavir

01/05/2021, 12:17 PM
The problem is not providing specialized, optimized operations - that part is easy. The problem is propagating those operations between algebras. For example, if you need to sum nd-arrays, you need to be able to sum their elements. And you still need to keep a balance between performance and generics. We mostly managed to do it in kmath, but there are some hacks included; for example, I've added some manual optimizations for Doubles.
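A toy sketch of that propagation (names are illustrative, not kmath's actual API): the algebra for a vector is constructed from the algebra for its elements, so element-wise operations flow from the element type up to the structure.

```kotlin
// Element-level algebra.
interface Monoid<T> {
    val empty: T
    fun combine(a: T, b: T): T
}

object DoubleAdd : Monoid<Double> {
    override val empty = 0.0
    override fun combine(a: Double, b: Double) = a + b
}

// Structure-level algebra, derived from the element algebra:
// summing vectors is defined in terms of summing their elements.
class ListMonoid<T>(private val elem: Monoid<T>, private val size: Int) : Monoid<List<T>> {
    override val empty: List<T> = List(size) { elem.empty }
    override fun combine(a: List<T>, b: List<T>): List<T> =
        a.zip(b) { x, y -> elem.combine(x, y) }
}

fun main() {
    val vec = ListMonoid(DoubleAdd, 3)
    println(vec.combine(listOf(1.0, 2.0, 3.0), listOf(10.0, 20.0, 30.0)))
    // [11.0, 22.0, 33.0]
}
```

The generic path boxes every `Double`, which is exactly why a manual fast path for primitive arrays ends up being necessary in practice.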

breandan

01/05/2021, 12:31 PM
The main benefit of the type system and algebras, as far as I can tell, is user friendliness and a unifying API, but when it comes to actually executing these things on a machine, you don't want to be calling functions or even using the JVM. At a low level, it all gets optimized into binary arithmetic, but the goal should be to have a robust type system to check the user-level program, then throw everything away: no lambdas, no generics, no types, no invokedynamic, no virtual functions; everything runs as fused sparse BLAS primitives on a native interface to the C/G/T/IPU. Then the question becomes which intermediate representation is the most efficient way to access the performance you need for various workloads. This is an interesting problem, but I'm more concerned about the semantics and expressiveness of the API

altavir

01/05/2021, 12:33 PM
You need to balance the API with performance. There are a lot of ways to make the same KMath API more expressive, but then it won't be able to provide lossless implementations.