# multiplatform
l

louiscad

08/28/2020, 8:00 AM
Hello, I'm advertising the #library-development channel that we have on this Slack. We can use it to chat about our pain points and solutions as library developers (like overcoming the 405/409 http errors from bintray that make publishing workflows flaky 😬). If you make libraries or plan to do it, consider joining that channel!
m

mbonnin

08/28/2020, 9:09 AM
Ooooh, is there a discussion about this already? I'm super interested in getting these to work reliably.
l

louiscad

08/28/2020, 9:13 AM
@mbonnin I was discussing this in DM with @Arkadii Ivanov since I saw we had the same issue, and in the meantime, I discovered that having publish=1 when uploading to bintray is the culprit that triggers unreliability like HTTP 405 & 409 errors. The good news is that you can trigger publishing after all of that with a single HTTP call: https://bintray.com/docs/api/#_publish_discard_uploaded_content
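For reference, a minimal sketch of what that single publish call could look like with OkHttp, going by the Bintray REST API docs linked above (the coordinates and credential handling are placeholders, not code from this thread):

```kotlin
import okhttp3.Credentials
import okhttp3.OkHttpClient
import okhttp3.Request
import okhttp3.RequestBody.Companion.toRequestBody

// Triggers publication of everything already uploaded for the given version.
// subject/repo/pkg/version are whatever your Bintray package uses.
fun publishBintrayVersion(
    user: String,
    apiKey: String,
    subject: String,
    repo: String,
    pkg: String,
    version: String,
) {
    val request = Request.Builder()
        .url("https://api.bintray.com/content/$subject/$repo/$pkg/$version/publish")
        .header("Authorization", Credentials.basic(user, apiKey))
        .post("".toRequestBody()) // empty body: publish all uploaded content
        .build()
    OkHttpClient().newCall(request).execute().use { response ->
        check(response.isSuccessful) { "Publish failed: HTTP ${response.code}" }
    }
}
```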
a

Arkadii Ivanov

08/28/2020, 9:15 AM
Yep, this flag should be disabled. It also helps to not have many concurrent uploads and to upload all the metadata in a separate job.
But it won't completely fix the problem.
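To illustrate, disabling that flag with the plain maven-publish plugin could look roughly like this (a hedged sketch; the Bintray coordinates and environment variable names are made up):

```kotlin
// build.gradle.kts — the ;publish=0 matrix parameter keeps uploads unpublished,
// so a single API call can publish them all later, after verification.
publishing {
    repositories {
        maven {
            name = "bintray"
            setUrl("https://api.bintray.com/maven/some-subject/some-repo/some-package/;publish=0")
            credentials {
                username = System.getenv("BINTRAY_USER")
                password = System.getenv("BINTRAY_API_KEY")
            }
        }
    }
}
```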
m

mbonnin

08/28/2020, 9:16 AM
Damn, why not?
a

Arkadii Ivanov

08/28/2020, 9:17 AM
To me it looks like bintray doesn't like concurrent uploads. Also, the same files could be uploaded from different jobs, I'm not sure. But it will significantly reduce flakiness.
l

louiscad

08/28/2020, 9:18 AM
BTW, it's possible that there's also network unreliability on GitHub Actions that would be the cause of the timeouts. I'll implement what I was talking about this weekend and see the results. If there are network issues when uploading from GitHub Actions, I'll limit it to publishing Windows and Linux exclusive targets, and publish the rest locally from my Mac.
It really worked wonders in parallel without the publish flag enabled; I think it's more concurrent publishing that it doesn't like.
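One common way to split publishing per host looks roughly like this (a hedged sketch; the target and publication names are illustrative):

```kotlin
// build.gradle.kts — each machine only runs the publish tasks for the
// publications listed for its OS, so CI jobs upload their host-exclusive
// targets and the rest can be published from a Mac.
import org.gradle.api.publish.maven.tasks.AbstractPublishToMaven

val osName: String = System.getProperty("os.name")
val publicationsForThisHost: List<String> = when {
    osName.startsWith("Windows") -> listOf("mingwX64")
    osName.startsWith("Linux") -> listOf("linuxX64")
    else -> listOf("kotlinMultiplatform", "jvm", "js", "macosX64", "iosArm64", "iosX64")
}

tasks.withType<AbstractPublishToMaven>().configureEach {
    onlyIf { publicationsForThisHost.any { prefix -> publication.name.startsWith(prefix) } }
}
```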
a

Arkadii Ivanov

08/28/2020, 9:21 AM
Also, a successful upload does not really mean all the required files are there. I had issues like metadata files not being uploaded after a Gradle upgrade, or JS artifacts corrupted for an unknown reason. So a verification step is definitely required before publication.
m

mbonnin

08/28/2020, 9:21 AM
When you say verification, it's manual verification, right? There's no process like sonatype's "close" process that checks the artifacts?
(which is a bummer because with mpp it's hard to keep track of all the artifacts...)
l

louiscad

08/28/2020, 9:26 AM
@mbonnin He means having a project that adds the bintray repo with credentials to see unpublished content, and that depends on all the modules there for all supported targets. Having this step run after upload and before publishing allows automatic verification.
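To make that concrete, such a check module's build script could look roughly like this (hedged sketch; repo URL, coordinates and env var names are placeholders):

```kotlin
// build.gradle.kts of a hypothetical "publication-check" module: it resolves
// the freshly uploaded (still unpublished) artifacts from Bintray using
// credentials, for every supported target, so a missing or corrupted file
// fails the build before anything gets published.
plugins {
    kotlin("multiplatform")
}

repositories {
    maven {
        setUrl("https://dl.bintray.com/some-subject/some-repo")
        credentials {
            username = System.getenv("BINTRAY_USER")
            password = System.getenv("BINTRAY_API_KEY")
        }
    }
    mavenCentral()
}

kotlin {
    jvm(); js(); linuxX64(); mingwX64(); macosX64() // list all supported targets here
    sourceSets.getByName("commonMain").dependencies {
        implementation("com.example:some-published-module:${project.version}")
    }
}
```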
m

mbonnin

08/28/2020, 9:28 AM
I see, thanks. That's more involved than what I hoped for, but good to know 👍
Obviously not tested yet because this requires a release, but I'll test it for the next one.
We only run a macOS job at the moment (no Windows), so it's easy to run everything in the same job.
a

Arkadii Ivanov

08/28/2020, 9:59 AM
Don't you need the override flag still?
m

mbonnin

08/28/2020, 9:59 AM
Do I?
l

louiscad

08/28/2020, 9:59 AM
@mbonnin You might still need override=1 in case you have to retry the job because of a network issue.
m

mbonnin

08/28/2020, 10:00 AM
Ah yes, if there's a network issue, override is still required.
Or discard-and-retry is another option; this is how sonatype would do it.
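For reference, override=1 would just be one more matrix parameter appended to the upload URL from the sketch above (;publish=0;override=1), and discarding appears to go through the same publish endpoint, per my reading of the Bintray docs. A hedged sketch of the discard call, with placeholder coordinates:

```kotlin
import okhttp3.Credentials
import okhttp3.MediaType.Companion.toMediaType
import okhttp3.OkHttpClient
import okhttp3.Request
import okhttp3.RequestBody.Companion.toRequestBody

// Discards everything uploaded (but not yet published) for the given version,
// so a failed job can be retried from a clean slate.
fun discardBintrayVersion(user: String, apiKey: String, subject: String, repo: String, pkg: String, version: String) {
    val body = """{"discard": true}""".toRequestBody("application/json".toMediaType())
    val request = Request.Builder()
        .url("https://api.bintray.com/content/$subject/$repo/$pkg/$version/publish")
        .header("Authorization", Credentials.basic(user, apiKey))
        .post(body)
        .build()
    OkHttpClient().newCall(request).execute().use { response ->
        check(response.isSuccessful) { "Discard failed: HTTP ${response.code}" }
    }
}
```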
a

Arkadii Ivanov

08/28/2020, 10:00 AM
Unless you have no concurrent jobs. Otherwise the same files can be uploaded concurrently.
m

mbonnin

08/28/2020, 10:01 AM
Yea, I only have one job, that's easier
l

louiscad

08/28/2020, 10:01 AM
Also, you leak the OkHttp connection (it misses use { … } or a call to close), though it doesn't matter much since the VM should be shut down afterwards.
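The fix is just to scope the response, e.g. (a minimal, self-contained illustration, not the code being discussed):

```kotlin
import okhttp3.OkHttpClient
import okhttp3.Request

// `use { }` closes the Response (and lets OkHttp reclaim the connection)
// even if an exception is thrown, instead of relying on the VM shutting down.
fun fetch(url: String): String {
    val client = OkHttpClient()
    val request = Request.Builder().url(url).build()
    return client.newCall(request).execute().use { response ->
        check(response.isSuccessful) { "HTTP ${response.code}" }
        response.body?.string() ?: ""
    }
}
```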
m

mbonnin

08/28/2020, 10:01 AM
Did we reach out to bintray about all this?
a

Arkadii Ivanov

08/28/2020, 10:02 AM
Then should be ok. And even not flaky.
l

louiscad

08/28/2020, 10:02 AM
We didn't reach out to bintray, but we can spread the word about our findings once we can confirm all of that.
m

mbonnin

08/28/2020, 10:04 AM
"Then should be ok. And even not flaky"
I've definitely seen my fair share of 409/405 (and also 443). I also used to run gradle with --parallel, so maybe that's another reason.
If the next release fails, I'll reach out to bintray, they usually respond quite fast.
a

Arkadii Ivanov

08/28/2020, 10:05 AM
Please share their response in case 😀
👍 1
l

louiscad

08/28/2020, 10:05 AM
@mbonnin 443 wasn't an HTTP error status code but the HTTPS port where the connection failed at some point (often a timeout; the cause might be GitHub: https://github.com/actions/virtual-environments/issues/1187)
@mbonnin I'm following up: here's the commit where I add a new workflow that includes an uploaded-dependencies check as @Arkadii Ivanov suggested, and finally publishes via Bintray's API, with retries for up to an hour: https://github.com/LouisCAD/Splitties/commit/e0f8c596e3d738316bcb720b9ef3062bf9caa7cf That commit just misses the .gitignore file for the new module, which I added just after, here 😅: https://github.com/LouisCAD/Splitties/commit/0125b2bdbca70c2e54ed4551da82e2fc932cbfa7
🔥 1
❤️ 2
… and it failed because of GitHub Actions' unreliable network…
But no 405 or 409 HTTP errors from Bintray, and it worked flawlessly locally.
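For the curious, the retry idea boils down to something like this (a hedged sketch, not the actual workflow code; see the commit linked above for the real thing):

```kotlin
// Re-runs the given action until it succeeds or the time budget (an hour by
// default) runs out, sleeping a bit between attempts.
fun retryFor(maxMinutes: Long = 60, delayMillis: Long = 30_000, action: () -> Unit) {
    val deadline = System.currentTimeMillis() + maxMinutes * 60_000
    var attempt = 0
    while (true) {
        attempt++
        try {
            action()
            return
        } catch (e: Exception) {
            if (System.currentTimeMillis() > deadline) throw e
            println("Attempt #$attempt failed (${e.message}), retrying in ${delayMillis / 1000}s…")
            Thread.sleep(delayMillis)
        }
    }
}

// Usage, reusing the publish sketch from earlier in the thread:
// retryFor { publishBintrayVersion(user, apiKey, subject, repo, pkg, version) }
```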
m

mbonnin

08/31/2020, 3:38 PM
😬
So in these cases, you're relaunching the GA job?
And waiting 20 minutes again?
l

louiscad

08/31/2020, 3:39 PM
This adds up to GitHub's server and networking costs; I'll keep on retrying until it succeeds. If it keeps on failing often, I might automate retries, or switch to another CI provider.
m

mbonnin

08/31/2020, 3:39 PM
I see
l

louiscad

08/31/2020, 3:39 PM
Yes, that's what I'm doing. But I'll wait while doing something else, like complaining here 😁
m

mbonnin

08/31/2020, 3:40 PM
Ahah, I was going to say: this costs me attention time too, I don't like doing something else while a release is ongoing.
If it takes hours, I prefer doing it manually for the time being.
l

louiscad

08/31/2020, 3:41 PM
Until we get sub-minute build and release times, it's always worth doing something else during the process.
m

mbonnin

08/31/2020, 3:42 PM
Also, slightly related: did you benchmark the upload speed of bintray vs sonatype?
I'm under the impression that bintray is significantly slower, but I only have anecdotal evidence.
l

louiscad

08/31/2020, 3:45 PM
Nope, I'm not uploading to sonatype (yet?), but since I disabled publish on every upload, it's no longer a bottleneck. The publishing after upload took less than 8 minutes, but I have so many artifacts (I'll try to tell you how many if I can catch the CI mid-success) that I'm neither surprised nor bothered by it.
I have 1688 files uploaded for the current dev version of Splitties. Currently praying for the publication check to not fail again because of a networking issue, as I'm writing a ticket for them to consider.
m

mbonnin

08/31/2020, 4:07 PM
1688 😮
l

louiscad

08/31/2020, 5:29 PM
Hallelujah!

https://www.youtube.com/watch?v=beWJ_MdvyLQ

After disabling Gradle's configure-on-demand and using bash on Windows (instead of the default PowerShell, which parses dots differently), the whole process works wonders! I'm so happy! Thank you so much to both of you, @Arkadii Ivanov and @mbonnin, for the inspiration, the help (including in DM), the support and all. Here's the CI run; the most interesting part is the last job, which shows the publish API call retries at play: https://github.com/LouisCAD/Splitties/runs/1052298465?check_suite_focus=true
🎉 2