# kotlindl

Javokhir Savriev

03/15/2024, 5:23 AM
Hi, I'm getting a fatal `OutOfMemoryError` for bigger ONNX/ORT models on Android.
I'm following the ONNX Runtime Mobile Examples on Android and they all work well. I'm using the same implementation to run my custom ONNX/ORT model on Android, but I get the following exception:

```
E/AndroidRuntime: FATAL EXCEPTION: main
Process: ai.onnxruntime.example.imageclassifier, PID: 21599
java.lang.OutOfMemoryError: Failed to allocate a 361332168 byte allocation with 8388608 free bytes and 164MB until OOM, target footprint 372920392, growth limit 536870912
    at java.util.Arrays.copyOf(
    at ai.onnxruntime.example.imageclassifier.MainActivity$readModel$2.invokeSuspend(MainActivity.kt:154)
    at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
    at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:571)
    at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(CoroutineScheduler.kt:738)
    at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(CoroutineScheduler.kt:678)
    at kotlinx.coroutines.scheduling.CoroutineScheduler$
```

Here are some details of my development environment and model:
1. Custom model: model size 340 MB

Is the larger model size causing the problem?
```kotlin
private val onnxModel: OnnxInferenceModel by lazy {
    val modelBytes = context.resources
```

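The `Arrays.copyOf` frame in the trace suggests the model bytes are being read into a buffer that grows as it fills: each growth step briefly needs both the old array and the new, larger one, so a 340 MB model can trigger a single ~345 MB allocation on top of what is already held. One way to sidestep that in app code is to stream the model to a file with a fixed-size buffer instead of accumulating a `byte[]`. A minimal sketch (the `copyModelToFile` helper is hypothetical; on Android the `InputStream` would come from something like `resources.openRawResource(...)`):

```kotlin
import java.io.File
import java.io.InputStream

// Stream the model to a file using a fixed 8 KB buffer, so no
// intermediate byte[] ever has to grow (no Arrays.copyOf doubling).
// Returns the number of bytes copied.
fun copyModelToFile(input: InputStream, dest: File): Long =
    input.use { src ->
        dest.outputStream().use { out ->
            src.copyTo(out, bufferSize = 8 * 1024)
        }
    }
```

The resulting file path can then be handed to a file-based model loader, keeping peak heap usage independent of the model size.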
Nikita Ermolenko

03/15/2024, 1:03 PM
A few related issues have been reported on the ONNX Runtime GitHub page. In one of them it was suggested to load the model from a file instead of from bytes, although another issue mentions that reading from a file may also result in out-of-memory errors. You can attempt to load the model from a file; if this method does not work, further investigation may be required to resolve the issue, potentially on onnxruntime's side.
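For reference, ONNX Runtime's Java API exposes a file-path overload of `createSession`, which lets the runtime read the model from disk itself rather than the app first materializing a full `byte[]` copy. A minimal sketch (the path argument is a placeholder, and whether this avoids the OOM on a given device is not guaranteed):

```kotlin
import ai.onnxruntime.OrtEnvironment
import ai.onnxruntime.OrtSession

// Create a session from a model file path instead of a byte array.
// onnxruntime reads the file internally, so the app heap never holds
// a second full copy of the model.
fun createSessionFromFile(modelPath: String): OrtSession {
    val env = OrtEnvironment.getEnvironment()
    return env.createSession(modelPath, OrtSession.SessionOptions())
}
```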
👍 1