I spent around 5 years as the lead developer of
Pigment for Android (until we gave up trying to figure out how to make money with a coloring app on Android 😓).
Pigment was a native Android app that used OpenGL for rendering, so we could use GL shaders for high-quality, realistic output. Everything it rendered lived on the app side: we managed textures, lines, points, etc. in app memory and transferred the data to the GPU to render. This worked out quite well and scaled to handle coloring projects that spanned hours or days.
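To make that shape of architecture concrete, here's a rough sketch in Kotlin. All the names are made up for illustration (this is not Pigment's actual code): the app owns the stroke geometry, and the GPU only ever sees flattened vertex buffers.

```kotlin
import android.opengl.GLES20
import java.nio.ByteBuffer
import java.nio.ByteOrder
import java.nio.FloatBuffer

// Hypothetical stroke model: the app is the source of truth for geometry.
data class Point(val x: Float, val y: Float, val pressure: Float)
class Stroke(val points: MutableList<Point> = mutableListOf())

class StrokeUploader {
    // Pack (x, y, pressure) triples into a native-order FloatBuffer
    // so it can be handed straight to glBufferData.
    fun flatten(strokes: List<Stroke>): FloatBuffer {
        val floatCount = strokes.sumOf { it.points.size } * 3
        val buf = ByteBuffer.allocateDirect(floatCount * 4)
            .order(ByteOrder.nativeOrder())
            .asFloatBuffer()
        for (s in strokes) for (p in s.points) {
            buf.put(p.x).put(p.y).put(p.pressure)
        }
        buf.position(0)
        return buf
    }

    // Upload to a GL vertex buffer object; call from the GL thread.
    fun upload(vboId: Int, data: FloatBuffer) {
        GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, vboId)
        GLES20.glBufferData(
            GLES20.GL_ARRAY_BUFFER,
            data.capacity() * 4,     // size in bytes (4 bytes per float)
            data,
            GLES20.GL_DYNAMIC_DRAW   // stroke data changes as the user draws
        )
    }
}
```

Because the source of truth lives app-side, you can re-tessellate, re-render at a different resolution, or serialize the project without ever reading anything back from the GPU.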
While that's not the Canvas approach you mentioned (which I also have lots of experience with), I share this because it's very similar, and I believe it can scale quite far. You can start with a simple live-canvas approach, then add optimizations over time: tiling, Picture-based snapshots for super fast undo/redo support, and so on. If Android had had access to Skia shaders at the time (and if I had known what I know now about this stuff), I would have even considered building Pigment on the Canvas APIs!
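On the Picture-based snapshot point: android.graphics.Picture records a stream of Canvas drawing commands once and replays them with a single drawPicture call, which makes whole-canvas checkpoints cheap to keep on an undo stack. Here's a minimal sketch of the idea (hypothetical names, not from any real app):

```kotlin
import android.graphics.Canvas
import android.graphics.Picture

// Each checkpoint is a Picture recording of the full drawing at that
// moment, taken after a stroke is committed, so undo/redo never has
// to replay individual strokes.
class SnapshotHistory(private val width: Int, private val height: Int) {
    private val undoStack = ArrayDeque<Picture>()
    private val redoStack = ArrayDeque<Picture>()

    // Record the current drawing into a Picture checkpoint.
    // `drawScene` is whatever normally paints the live canvas.
    fun checkpoint(drawScene: (Canvas) -> Unit) {
        val pic = Picture()
        drawScene(pic.beginRecording(width, height))
        pic.endRecording()
        undoStack.addLast(pic)
        redoStack.clear() // a new edit invalidates the redo branch
    }

    // Returns the state to redraw, or null for a blank page.
    fun undo(): Picture? {
        val current = undoStack.removeLastOrNull() ?: return null
        redoStack.addLast(current)
        return undoStack.lastOrNull()
    }

    fun redo(): Picture? {
        val pic = redoStack.removeLastOrNull() ?: return null
        undoStack.addLast(pic)
        return pic
    }
}

// In onDraw, the restored state replays at full fidelity in one call:
//   history.undo()?.let { canvas.drawPicture(it) }
```

The nice property is that undo/redo cost stays flat no matter how long the stroke history gets: you replay one recorded Picture instead of re-running every stroke since the beginning.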
If you haven't seen it, I'd recommend checking out Dan Sandler's Markers project on GitHub. Sure, it's very old, but it demonstrates a lot of really good techniques that are easy to translate to a larger, more modern system.