Rendering to a Texture in a background thread using OpenGL ES 2, framebuffers and Android's Canvas implementation of Skia
My goal is to leverage the GPU in a background thread to create an offscreen image that is rendered using the Skia Canvas class. Ideally I would have liked to use Compose Multiplatform and its Skiko library to do this, but due to the limitations of its DirectContext class, it's not possible to write an EGL surface into its Canvas. As a consequence, I've had to use native functionality to implement this. I have it working on the desktop using the LWJGL library.

For Android I follow these steps, running from an event queue on a background thread (a code sketch of this sequence appears after the framebuffer steps below):

1. I use EGL to get the default display.
2. I use that display to initialize the EGL library.
3. I create a config defining a colour buffer that complies with OpenGL ES 2 and is based on a pbuffer.
4. I use the eglChooseConfig function, passing the display and the config attributes, to find the nearest config to the one I've requested.
5. I create a pbuffer from the display and the returned config, specifying the width and height I want in the buffer.
6. I create an EGL rendering context from the display and config.
7. I make that context current by passing in the display, the pbuffer surface, and the context.

So far, so good. If I use GLES commands I can verify that the pbuffer is updated, and I can dump its pixels into a bitmap to confirm the colours are correct. But I want to use Android's implementation of Skia's Canvas class to avoid having to write shader programs, etc. Android's Skia API will let me create a Canvas from a Surface, if the Surface is initialized with a SurfaceTexture. In order to get access to the texture, and to be able to use glReadPixels to read its values, the texture must be bound to a framebuffer. So the next stage in my processing is to create a framebuffer and attach a texture to it as a colour buffer. Here are the steps I followed (also sketched below):

1. I create a framebuffer using GLES20's glGenFramebuffers call.
2. I bind the generated framebuffer to GLES20.GL_FRAMEBUFFER.
3. I verify that the framebuffer was built successfully.
4. I then make GLES20's GL_TEXTURE0 unit active.
5. I use GLES20's glGenTextures function to create the texture.
6. I bind the id of the texture to the GL_TEXTURE_2D target.
7. I use GLES20's glTexImage2D method to allocate the texture, giving its width, height, and image format.
8. I verify that the texture was built successfully.
9. I then bind the texture to the framebuffer's colour attachment point using GLES20.glFramebufferTexture2D, with the parameters set this way:
   a. target is set to GL_FRAMEBUFFER,
   b. attachment is set to GL_COLOR_ATTACHMENT0,
   c. textarget is set to GL_TEXTURE_2D,
   d. texture is set to the texture id (a uint) that was generated when the texture was created,
   e. level is set to 0, as only the base image is needed.
10. I then run the glCheckFramebufferStatus command to verify that the framebuffer is well-formed.

To verify that the wiring of the texture to the framebuffer has worked, I use GLES calls to draw to the texture and then dump the pixels to a png. The png image is coloured correctly, so the connection to the texture via the framebuffer appears solid.
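To make the EGL sequence concrete, here is a minimal sketch of steps 1-7 using the EGL14 bindings. It mirrors what my code does, but the attribute values and the shape of the helper are illustrative rather than lifted from the attached source:

```kotlin
import android.opengl.EGL14
import android.opengl.EGLConfig
import android.opengl.EGLContext
import android.opengl.EGLDisplay
import android.opengl.EGLSurface

// Sketch of the pbuffer setup (steps 1-7); error handling abbreviated.
fun createPbufferContext(width: Int, height: Int): Triple<EGLDisplay, EGLContext, EGLSurface> {
    // 1-2. Get the default display and initialize EGL on it.
    val display = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY)
    val version = IntArray(2)
    check(EGL14.eglInitialize(display, version, 0, version, 1)) { "eglInitialize failed" }

    // 3-4. Ask for an ES 2-capable, pbuffer-backed RGBA config.
    val configAttribs = intArrayOf(
        EGL14.EGL_RENDERABLE_TYPE, EGL14.EGL_OPENGL_ES2_BIT,
        EGL14.EGL_SURFACE_TYPE, EGL14.EGL_PBUFFER_BIT,
        EGL14.EGL_RED_SIZE, 8, EGL14.EGL_GREEN_SIZE, 8,
        EGL14.EGL_BLUE_SIZE, 8, EGL14.EGL_ALPHA_SIZE, 8,
        EGL14.EGL_NONE
    )
    val configs = arrayOfNulls<EGLConfig>(1)
    val numConfigs = IntArray(1)
    check(EGL14.eglChooseConfig(display, configAttribs, 0, configs, 0, 1, numConfigs, 0)
            && numConfigs[0] > 0) { "no matching EGLConfig" }
    val config = configs[0]!!

    // 5. Create an offscreen pbuffer surface of the requested size.
    val surfaceAttribs = intArrayOf(EGL14.EGL_WIDTH, width, EGL14.EGL_HEIGHT, height, EGL14.EGL_NONE)
    val pbuffer = EGL14.eglCreatePbufferSurface(display, config, surfaceAttribs, 0)

    // 6. Create an ES 2 context.
    val contextAttribs = intArrayOf(EGL14.EGL_CONTEXT_CLIENT_VERSION, 2, EGL14.EGL_NONE)
    val context = EGL14.eglCreateContext(display, config, EGL14.EGL_NO_CONTEXT, contextAttribs, 0)

    // 7. Make the context current on this (background) thread.
    check(EGL14.eglMakeCurrent(display, pbuffer, pbuffer, context)) { "eglMakeCurrent failed" }
    return Triple(display, context, pbuffer)
}
```

A glClearColor/glClear pair followed by a pixel dump is enough to confirm the pbuffer is live.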
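And a matching sketch of the framebuffer and texture wiring from steps 1-10; again the exact texture parameters are illustrative:

```kotlin
import android.opengl.GLES20

// Sketch of the framebuffer/texture wiring (steps 1-10).
// Returns the framebuffer id and the texture id.
fun createFboBackedTexture(width: Int, height: Int): Pair<Int, Int> {
    // 1-2. Generate a framebuffer and bind it.
    val fbo = IntArray(1)
    GLES20.glGenFramebuffers(1, fbo, 0)
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fbo[0])

    // 4-6. Make texture unit 0 active, generate a texture, bind it to GL_TEXTURE_2D.
    GLES20.glActiveTexture(GLES20.GL_TEXTURE0)
    val tex = IntArray(1)
    GLES20.glGenTextures(1, tex, 0)
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, tex[0])

    // 7. Allocate storage for the base level (no initial pixel data).
    GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, width, height, 0,
        GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null)
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR)
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR)

    // 9. Attach the texture to the framebuffer's colour attachment, level 0.
    GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
        GLES20.GL_TEXTURE_2D, tex[0], 0)

    // 10. Verify the framebuffer is complete before using it.
    val status = GLES20.glCheckFramebufferStatus(GLES20.GL_FRAMEBUFFER)
    check(status == GLES20.GL_FRAMEBUFFER_COMPLETE) { "framebuffer incomplete: $status" }
    return Pair(fbo[0], tex[0])
}
```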
Now comes my attempt to use Android's API to use the texture to create a surface. I followed the advice I found online (see the code itself for the URL giving the advice):

```kotlin
val surfaceTexture = SurfaceTexture(_texId, true)   // true = single-buffered mode
surfaceTexture.setDefaultBufferSize(width, height)
val surface = Surface(surfaceTexture)
val canvas = surface.lockHardwareCanvas()
```
When I use the debugger, I can see the status of the calls and the nature of the objects returned (see attached image). I call canvas.isHardwareAccelerated() to verify that the Canvas is alive and set up to run on the GPU; this comes back as true.

My first surprise is that the canvas that comes back is a RecordingCanvas; I would have expected a standard Canvas. From what I understand, a RecordingCanvas just adds drawing instructions to a display list and won't release them to the GPU to be rendered into the texture until they are somehow submitted. I can't find any examples of how to use this type of Canvas when it is not attached to a RenderNode, so from here I am guessing.
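For comparison, the only documented RecordingCanvas pattern I can find goes through a RenderNode (API 29+). A minimal sketch, noting that no Surface or texture is involved here, which is why it doesn't obviously apply to my case:

```kotlin
import android.graphics.RenderNode

// The documented RecordingCanvas pattern: record into a RenderNode.
// endRecording() is what finalizes the display list. The node would
// normally be drawn into another hardware canvas via drawRenderNode().
val node = RenderNode("offscreen")
node.setPosition(0, 0, 1000, 1000)   // example dimensions
val recordingCanvas = node.beginRecording()
try {
    recordingCanvas.drawRGB(168, 133, 56)
} finally {
    node.endRecording()
}
```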
Here is my best-guess attempt to render into the texture using the canvas, a blue circle on a tan background:

```kotlin
// Fill the background with tan.
canvas.drawRGB(168, 133, 56)

// Draw an anti-aliased, solid blue circle.
val paint = Paint()
paint.setARGB(255, 55, 111, 168)
paint.isAntiAlias = true
paint.style = Paint.Style.FILL
canvas.drawCircle(500.0f, 500.0f, 200.0f, paint)

// My guesses at forcing the recorded drawing to land in the texture.
GLES20.glFlush()
GLES20.glFinish()
surfaceTexture.updateTexImage()
surface.unlockCanvasAndPost(canvas)
```
None of these calls throws an exception. I have tried adding and removing the last four calls in various combinations; I assumed they would release the drawing instructions to the GPU, and maybe that assumption is what's wrong. I test the results of the rendering by calling my writeBufferToBitmap method:
```kotlin
writeBufferToBitmap(width, height, "testSurfaceTexture")
```
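(For readers without the attached source: writeBufferToBitmap is essentially a glReadPixels dump of the currently bound framebuffer into a png. A rough sketch of such a method, with an illustrative output path:)

```kotlin
import android.graphics.Bitmap
import android.opengl.GLES20
import java.io.FileOutputStream
import java.nio.ByteBuffer
import java.nio.ByteOrder

// Sketch only: dump the currently bound framebuffer to a png.
fun writeBufferToBitmap(width: Int, height: Int, name: String) {
    val buffer = ByteBuffer.allocateDirect(width * height * 4).order(ByteOrder.nativeOrder())
    GLES20.glReadPixels(0, 0, width, height, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, buffer)
    buffer.rewind()
    val bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888)
    bitmap.copyPixelsFromBuffer(buffer)
    // Note: GL rows are bottom-up relative to Bitmap's top-down layout,
    // so the png will be vertically flipped unless corrected.
    FileOutputStream("/sdcard/$name.png").use { out ->   // path is illustrative
        bitmap.compress(Bitmap.CompressFormat.PNG, 100, out)
    }
}
```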
The image that comes out is pure black; it is as if writing with the Canvas has no effect. If I instead use GLES commands to draw to the texture, it renders the solid colour I set. So it seems likely that something in the block of code above isn't right, most likely to do with the Canvas being a RecordingCanvas, or with my guesses at how to release the drawing instructions to the GPU.

I have spent a couple of months now learning Kotlin, Android Studio, and OpenGL, and this is the last piece I need in order to build my app. If someone could help me get over what I think is the last hurdle, I would be most appreciative, and I will publish the resulting code so it can serve as a resource for others who want to optimize rendering. I've attached the full source for more details.

Cheers,
Tom Cuthill