# squarelibraries
c
I got a requirement to be able to retry a failed 500 network call up to 5 times (with a 5 second delay). I know that a network interceptor will be my friend here, but is the retry count something that I should just try to persist somewhere, or is there some way to know that I'm retrying the same request without having to create some sort of singleton or something? It also seems like OkHttp might retry on my behalf anyway. Is there a way to disable that so I can follow my retry specs directly?
y
Retry count is countable from the chain of responses:
response.priorResponse
On the OkHttpClient.Builder you can configure retry behaviour.
With 5 second retries, I'd be tempted to make those external to the call and do them in app code, mainly because the interceptors are synchronous.
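
As a rough illustration of both points (not code from the thread): the sketch below uses an application interceptor that retries only on HTTP 500, up to 5 times with a 5-second pause, counting attempts with a local counter, and turns off OkHttp's own transparent connection-failure retries on the builder. RetryOn500Interceptor is a made-up name.

// Sketch only, assuming OkHttp 4.x Kotlin APIs.
import okhttp3.Interceptor
import okhttp3.OkHttpClient
import okhttp3.Response

class RetryOn500Interceptor(
  private val maxRetries: Int = 5,
  private val delayMillis: Long = 5_000L,
) : Interceptor {
  override fun intercept(chain: Interceptor.Chain): Response {
    var response = chain.proceed(chain.request())
    var attempt = 0
    while (response.code == 500 && attempt < maxRetries) {
      response.close()          // release the failed response before retrying
      Thread.sleep(delayMillis) // blocks whichever thread the call runs on
      response = chain.proceed(chain.request())
      attempt++
    }
    return response
  }
}

val client = OkHttpClient.Builder()
  .retryOnConnectionFailure(false) // turn off OkHttp's transparent connection-failure retries
  .addInterceptor(RetryOn500Interceptor())
  .build()

The Thread.sleep is the synchronous cost mentioned above: the whole 5-second wait ties up the thread the call is running on.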
e
Yeah I’m wondering why this is a concern of OkHttp 🤔, should be sufficient to just
repeat(5) {...}
your api call with delay until it's complete or it passes the success criteria
c
@yschimke oh nice! I didn't know retry count was something you could count. That solves the problem of "storing" the current retry value. I didn't know that you can configure retry behavior on the builder. I wonder if I can have retries only for statusCode == 500.
With 5 second retries, I'd be tempted to make those external to the call and do them in app code, mainly because the interceptors are synchronous.
That's an interesting take. So using coroutines and retrofit, it'd just be like a try catch, where the catch just tries again. That's what I would have likely tried if I didn't know about interceptors. Interceptors still feel like the right spot to do them. Hm. "Interceptors are synchronous" but all of that happens on a background thread, so it's not like it affects the main thread or anything right?
@efemoney
, should be sufficient to just
repeat(5) {...}
your api call with delay until it's complete or it passes the success criteria
That repeat method is on a Flowable? (sorry, just trying to understand how that would plug into my code). Or actually, maybe repeat is just a plain ol' Kotlin function. Let me try to look it up!
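
A rough sketch of that call-level try/catch idea (not from the thread), assuming a suspending Retrofit service where a 500 surfaces as retrofit2.HttpException; the helper name retryOn500 is made up:

import kotlinx.coroutines.delay
import retrofit2.HttpException

// Sketch only: retry a suspending Retrofit call up to 5 times total, pausing
// 5 seconds between attempts, and only when the server answered with a 500.
suspend fun <T> retryOn500(block: suspend () -> T): T {
  repeat(4) { // attempts 1-4; attempt 5 happens after the loop
    try {
      return block()
    } catch (e: HttpException) {
      if (e.code() != 500) throw e // anything other than a 500 is not retried
    }
    delay(5_000L)
  }
  return block() // final attempt; let any failure propagate to the caller
}

From a coroutine this could be used as retryOn500 { myApi.call(query = param) }.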
y
Interceptors will be borrowing a thread from the dispatcher with enqueue, or using the calling thread with execute. Those threads will get exhausted, and you can't safely call suspending functions within an interceptor; only chain.proceed or other blocking calls like execute.
c
Oh! Interesting! I thought a suspend function with a delay would be the trick to meet this task.
I'll try to do this with just a try catch at the call level and repeat there. BUT I will also try this at the interceptor level since my team is convinced that that's the way to go. Maybe interceptors have some docs about long-running work I can point my team to.
e
@Colton Idle the repeat is the plain old Kotlin function, at the application level. Again, personally I don't see the reason to want to have this inside an interceptor specifically, but if that's a constraint you have then consider what Yuri mentioned. As an example, this is what we use to retry X times. Ours is exponential delay, but there is no reason why you cannot do something different.
import kotlin.time.Duration
import kotlin.time.Duration.Companion.seconds
import kotlinx.coroutines.delay

// Retries `block` until `until(result)` is true, up to `times` attempts,
// doubling the delay between attempts.
internal suspend inline fun <T> retry(
  times: Int = Int.MAX_VALUE,
  until: (T) -> Boolean,
  delay: Duration = 1.seconds,
  block: () -> T,
): T {
  var currentDelay = delay

  repeat(times - 1) {
    val result = block()
    if (until(result)) return result

    delay(currentDelay)
    currentDelay *= 2 // exponential delay of x, 2x, 4x, 8x, 16x ...
  }

  return block() // last attempt, returned regardless of `until`
}
which can be used like so:
val result = retry(
  times = 7,
  block = {
    runCatching { myApi.call(query = param) }
    // OR
    // myApi.call(query = param) which returns retrofit2.Response
    //  and you check response.isSuccessful
  },
  until = { it.isSuccess },
)

if (result.isSuccess) doSuccess() else reportFailure()
c
If you're trying to build up resilience in your app, you might take a look at some existing libs that handle this, even if it's just for ideas, e.g. https://github.com/michaelbull/kotlin-retry
I haven't used that library, but it appears to handle additional details like backoff and jitter, etc.
I have a custom thing I built to handle all 500 errors, with Fibonacci backoff and a fuzz factor. It can be a fun exercise on its own.
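
For a sense of what that looks like, here's a rough sketch (not the poster's actual implementation) of Fibonacci delays with a random fuzz factor applied to each wait; all names here are made up.

import kotlin.random.Random
import kotlinx.coroutines.CancellationException
import kotlinx.coroutines.delay

// Sketch only: Fibonacci backoff (1s, 1s, 2s, 3s, 5s, ...) with a +/-20% fuzz
// factor on each wait. shouldRetry decides which failures are worth retrying.
suspend fun <T> retryWithFibonacciBackoff(
  attempts: Int = 5,
  fuzz: Double = 0.2,
  shouldRetry: (Throwable) -> Boolean = { true },
  block: suspend () -> T,
): T {
  var previous = 0L
  var current = 1L // delay in seconds: 1, 1, 2, 3, 5, ...
  repeat(attempts - 1) {
    try {
      return block()
    } catch (t: Throwable) {
      if (t is CancellationException) throw t // never swallow coroutine cancellation
      if (!shouldRetry(t)) throw t
    }
    val jitter = if (fuzz > 0) 1.0 + Random.nextDouble(-fuzz, fuzz) else 1.0
    delay((current * 1_000 * jitter).toLong())
    val next = previous + current
    previous = current
    current = next
  }
  return block() // final attempt; any failure propagates
}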