It's the best place to ask, I suppose, @Alexandre Brown. First of all, I should say that we will ship experimental ONNX support in the 0.3 release.
I have been following the kinference project with interest for the last year, watching what features are being added there. As far as I know, the project primarily serves internal purposes and has its own runtime written in pure Kotlin. It is an interesting idea and implementation, especially its solution to the problem of parsing ONNX models.
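To give a sense of what that parsing amounts to: an ONNX file is just a serialized `ModelProto` protobuf message, so any JVM language can walk the graph with classes generated from the official `onnx.proto`. The sketch below is my own illustration, not kinference's actual API; the `onnx.Onnx` outer class name follows protoc's defaults and may differ in your setup, and `model.onnx` is a placeholder path.

```kotlin
import onnx.Onnx
import java.io.File

fun main() {
    // An ONNX file is a serialized ModelProto message.
    val model = Onnx.ModelProto.parseFrom(File("model.onnx").readBytes())
    println("IR version: ${model.irVersion}, opsets: ${model.opsetImportList.map { it.version }}")

    // The graph lists nodes in topological order; each node names its operator.
    for (node in model.graph.nodeList) {
        println("${node.opType}: ${node.inputList} -> ${node.outputList}")
    }
}
```

Once you have the graph in hand like this, the hard part is implementing the operators themselves, which is exactly where the projects diverge.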
However, as it seems to me, the project's goal is to support only the operators used in certain NLP models, which makes integration difficult at the current stage. But you're right: there needs to be some clarity here, both for the community and between the frameworks. I will try to contact the frameworks' authors to clarify their development plans.
Their experience implementing some operators in pure Kotlin, in particular LSTM/GRU for inference, also looks interesting; a rough sketch of what that involves is below.
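For a sense of what a pure-Kotlin GRU inference implementation looks like, here is a minimal single-step GRU cell following the standard gate equations. This is purely my own illustrative sketch, not kinference's or KotlinDL's code; all names, the row-major weight layout, and the shapes are assumptions.

```kotlin
import kotlin.math.exp
import kotlin.math.tanh

// Hypothetical minimal GRU cell; names and weight layout are illustrative.
class GruCell(
    private val inputSize: Int,
    private val hiddenSize: Int,
    // Input weights, each [hiddenSize, inputSize], flattened row-major.
    private val wz: FloatArray, private val wr: FloatArray, private val wh: FloatArray,
    // Recurrent weights, each [hiddenSize, hiddenSize], flattened row-major.
    private val uz: FloatArray, private val ur: FloatArray, private val uh: FloatArray,
    // Biases, each of length hiddenSize.
    private val bz: FloatArray, private val br: FloatArray, private val bh: FloatArray
) {
    private fun sigmoid(x: Float) = 1f / (1f + exp(-x))

    // y = W * x, where W is [rows, cols] row-major.
    private fun matVec(w: FloatArray, x: FloatArray, rows: Int, cols: Int): FloatArray {
        val y = FloatArray(rows)
        for (i in 0 until rows) {
            var acc = 0f
            for (j in 0 until cols) acc += w[i * cols + j] * x[j]
            y[i] = acc
        }
        return y
    }

    // One inference step: consumes x_t and h_{t-1}, returns h_t.
    fun step(x: FloatArray, h: FloatArray): FloatArray {
        val zIn = matVec(wz, x, hiddenSize, inputSize)
        val zRec = matVec(uz, h, hiddenSize, hiddenSize)
        val rIn = matVec(wr, x, hiddenSize, inputSize)
        val rRec = matVec(ur, h, hiddenSize, hiddenSize)
        val z = FloatArray(hiddenSize)
        val rGated = FloatArray(hiddenSize)
        for (i in 0 until hiddenSize) {
            z[i] = sigmoid(zIn[i] + zRec[i] + bz[i])                 // update gate
            rGated[i] = sigmoid(rIn[i] + rRec[i] + br[i]) * h[i]     // reset gate applied to h
        }
        val cIn = matVec(wh, x, hiddenSize, inputSize)
        val cRec = matVec(uh, rGated, hiddenSize, hiddenSize)
        val hNew = FloatArray(hiddenSize)
        for (i in 0 until hiddenSize) {
            val candidate = tanh(cIn[i] + cRec[i] + bh[i])
            hNew[i] = (1f - z[i]) * candidate + z[i] * h[i]          // blend old and new state
        }
        return hNew
    }
}
```

Running a sequence is then just folding `step` over the inputs; a full ONNX-compatible GRU would additionally have to handle attributes like direction and linear_before_reset, plus batching, which is where most of the real implementation effort goes.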