# serialization
k
Wanted to know what’s the future of kotlinx-serialization-protobuf. Are we planning to build a multiplatform protobuf library? If yes, why do we need it? Protobuf was meant to be language agnostic, so a contract written once can be used by different languages to generate encoding/decoding code. What benefits will it provide over using some language-specific library? Any plans for supporting proto3 syntax?
s
uh, it is a "language-specific" library, as it's built for Kotlin, not for JavaScript or Java or C++ or C. if you want to share the definition with other languages/libraries, use the schema generator
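For context, kotlinx-serialization ships an experimental schema generator that emits a proto2 `.proto` text from `@Serializable` classes. A minimal sketch, assuming a made-up `UserEvent` contract class (the class and field names here are purely illustrative, not from the thread):

```kotlin
@file:OptIn(ExperimentalSerializationApi::class)

import kotlinx.serialization.ExperimentalSerializationApi
import kotlinx.serialization.Serializable
import kotlinx.serialization.protobuf.ProtoNumber
import kotlinx.serialization.protobuf.schema.ProtoBufSchemaGenerator

// Hypothetical contract class, for illustration only
@Serializable
data class UserEvent(
    @ProtoNumber(1) val userId: Long,
    @ProtoNumber(2) val action: String,
)

fun main() {
    // Generates proto2 schema text that other languages can feed to protoc
    val schema = ProtoBufSchemaGenerator.generateSchemaText(
        listOf(UserEvent.serializer().descriptor)
    )
    println(schema)
}
```

The generated text can then be shared with non-Kotlin teams, which is the sharing path being suggested above.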
j
On the proto version point @solonovamax: the schema generator only supports proto2, and I think proto3 came out in 2016. Not sure on the language-specific point, though. kotlinx.serialization, as you've mentioned, is language specific, i.e. Kotlin.
k
kotlinx-serialization-protobuf is a multiplatform library, which means you can build something with it that supports JVM, Android, iOS, macOS, JS, etc. If you consider macOS/iOS as a platform, the programming language for that platform is Swift/Objective-C, but since we’re using KMP, we managed to write code that works for that platform in Kotlin.

Protobuf is a language-agnostic schema language. You define your contract in a proto file, and wrappers to encode/decode messages for that contract are auto-generated for each language. I.e., you can encode a message using Kotlin code and someone on the other end, say using Swift, can decode the same message from raw bytes.

Now, say we built a Kotlin library using kotlinx-serialization-protobuf for some proto definitions: we can generate a jar / aar / Podfile (Swift) to use our library on multiple platforms. In the same way, we could have generated a Swift library using swift-protobuf. My question is: why would we use a KMP library targeting Swift over an actual Swift library when it comes to protobuf serialization?
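To make the wire-compatibility point concrete, here is a minimal round-trip sketch (the `Greeting` class and its field numbers are invented for illustration). The bytes produced are plain protobuf wire format, so any language's generated code for the matching contract could decode them:

```kotlin
@file:OptIn(ExperimentalSerializationApi::class)

import kotlinx.serialization.ExperimentalSerializationApi
import kotlinx.serialization.Serializable
import kotlinx.serialization.decodeFromByteArray
import kotlinx.serialization.encodeToByteArray
import kotlinx.serialization.protobuf.ProtoBuf
import kotlinx.serialization.protobuf.ProtoNumber

// Corresponds to: message Greeting { required int32 id = 1; required string text = 2; }
@Serializable
data class Greeting(
    @ProtoNumber(1) val id: Int,
    @ProtoNumber(2) val text: String,
)

fun main() {
    val bytes: ByteArray = ProtoBuf.encodeToByteArray(Greeting(1, "hello"))
    // The same bytes could equally be decoded by swift-protobuf or protoc-generated Java
    val decoded = ProtoBuf.decodeFromByteArray<Greeting>(bytes)
    check(decoded == Greeting(1, "hello"))
}
```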
g
My current use-case is a project with a lot of business @Serializable classes that have:
• some fields transient (not to be serialized) or computed
• inheritance between interfaces and classes
• code using those classes that relies on the interfaces for business logic

I cannot define those constraints in a proto file, so generated code won't be a replacement for those classes. I'm considering some options:
• I could only use the generated files and rework my business logic => that's a very bad architecture, since my business logic would be driven by protobuf capabilities.
• I could have mappers: every class generated from a protobuf file also comes with a generated method (I'd have to write the tricky generator in KSP) to instantiate & fill a business class => business classes are no longer @Serializable and I've made a split between DTO & business layers. Feels good on an architectural level, but given the project it means a lot of maintenance just to support this specific serialization. 🤔
• I could write a KSP plugin to keep proto files up to date from Kotlin classes (copying comments and other build-time information if needed; it would also have to handle proto upgrades), so that KMM developers don't spend time maintaining proto files and simply don't care about this serialization mode. Proto files are shared with teams that don't use our KMM library and use other languages (in my case, the backend). That's what I'm trying currently, and it means I'd rely on the current protobuf serialization capability.

Also, say you have a KMM project: if you use proto to generate Obj-C and Java/Kotlin files for iOS/Android, you still need a layer of expect/actual to be able to use them from your KMM project. So here a KMP protobuf library makes sense for me, even if I agree it's not 100% of the use cases.
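To illustrate the first constraint mentioned above (transient and computed fields, which have no direct counterpart in a class generated from a proto file), a minimal sketch with a hypothetical `Invoice` class:

```kotlin
import kotlinx.serialization.Serializable
import kotlinx.serialization.Transient

// Hypothetical business class, for illustration only
@Serializable
data class Invoice(
    val id: Long,
    val amountCents: Long,
) {
    // Excluded from serialization; @Transient properties must have a default value
    @Transient
    val cacheKey: String = "invoice-$id"

    // Computed property with no backing field: never serialized
    val amountEuros: Double
        get() = amountCents / 100.0
}
```

Neither `cacheKey` nor `amountEuros` appears on the wire, so a proto-generated replacement class would lose this behavior unless it is reimplemented by hand or by a mapper.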