altavir
10/04/2022, 7:34 AM
Replacing postfix calls like tensor.lup() with prefix notation like Algebra::lup(tensor) allows much more robust use of contexts and extensions and, in my opinion, is much more readable. In NumPy/Torch, postfix notation is used mostly because it could not be done any other way. The only thing we lose is easy chaining (it could be brought back via scope functions). Does anybody know a real case of complicated chaining on tensors?
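A minimal sketch of the idea, using a hypothetical TensorAlgebra interface (the names and operations here are placeholders, not KMath's actual API): operations are defined as prefix-style functions on an algebra context, and Kotlin scope functions like with and let restore the chaining ergonomics of postfix notation.

```kotlin
// Hypothetical algebra interface; tensors are stand-ins (plain lists)
// just to illustrate the prefix-notation call shape.
interface TensorAlgebra {
    fun lup(t: List<Double>): List<Double>       // placeholder for a decomposition
    fun transpose(t: List<Double>): List<Double> // placeholder operation
}

object DoubleTensorAlgebra : TensorAlgebra {
    override fun lup(t: List<Double>) = t.sorted()          // stand-in implementation
    override fun transpose(t: List<Double>) = t.reversed()  // stand-in implementation
}

fun main() {
    val tensor = listOf(3.0, 1.0, 2.0)
    // Prefix notation inside an algebra context; `let` brings back chaining:
    val result = with(DoubleTensorAlgebra) {
        lup(tensor).let { transpose(it) }
    }
    println(result)
}
```

The context receiver (with(DoubleTensorAlgebra) { ... }) is what makes the prefix style composable: the same call sites can be reused with a different algebra simply by changing the context.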