OpenGL ES: Render A 2D texture in Android App

Chirag Prajapati
Mindful Engineering
6 min read · May 17, 2022
Photo: Canva

This story is the second part of the OpenGL ES rendering series. If you didn’t read the previous one, you can start from here.

In the previous story

  • We got a basic understanding of GPU, OpenGL, and OpenGL ES.
  • Project setup and implementation for shape rendering.
  • A brief look at GLSurfaceView, vertex shaders, fragment shaders, the mapping from world coordinates to the OpenGL coordinate system, and the draw function.

Texture rendering:

  • A texture is an OpenGL object that contains one or more images, all with the same image format. To draw a texture on a square shape, we need to tell each vertex of the square which part of the texture it corresponds to, so each vertex of the shape has a texture coordinate associated with it.
  • In the texture coordinate system, X and Y range from 0 to 1. Texture coordinates start at (0, 0) for the lower-left corner of the texture and go to (1, 1) for the upper-right corner of the texture image. The following image shows how we map texture coordinates to the square.
  • Here, we define 4 texture coordinates for the square. We want to attach the bottom-left side of the texture to the bottom-left side of the square, so we use the (0, 0) texture coordinate for the square’s bottom-left vertex. The same applies with (1, 0) for the bottom-right side, (1, 1) for the top-right side, and (0, 1) for the top-left side of the square.
  • Texture coordinates would look like this:
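A minimal sketch of what those arrays might look like in Kotlin. The exact vertex values and ordering are assumptions; the array names match the buffers (quadrantCoordinatesBuffer, textureCoordinatesBuffer) used later in this post:

```kotlin
// Square (quadrant) vertices in OpenGL world coordinates (x, y, z),
// listed as: top-left, bottom-left, bottom-right, top-right.
val quadrantCoordinates = floatArrayOf(
    -0.5f,  0.5f, 0.0f,   // top-left
    -0.5f, -0.5f, 0.0f,   // bottom-left
     0.5f, -0.5f, 0.0f,   // bottom-right
     0.5f,  0.5f, 0.0f    // top-right
)

// Matching texture coordinates (s, t) in the 0..1 range,
// pairing each corner of the image with the same corner of the square.
val textureCoordinates = floatArrayOf(
    0.0f, 1.0f,   // top-left
    0.0f, 0.0f,   // bottom-left
    1.0f, 0.0f,   // bottom-right
    1.0f, 1.0f    // top-right
)
```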

Load shaders and define the pipeline

  • First, create vertex and fragment shader files and store them in the assets folder :
    Vertex Shader -> `src/main/assets/shaders/vertexShader.vert`
    Fragment Shader -> `src/main/assets/shaders/fragmentShader.frag`
  • If you don’t have an idea about shaders, kindly check the previous blog (Shape Rendering) of this series.
  • We’ll discuss the contents of these two files later in this blog. First, we need to create a pipeline and attach the shaders. Check the below-defined code for that implementation.
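A rough sketch of the pipeline setup, assuming the asset paths above. Error checking is omitted for brevity, and loadShaderSource / compileShader are illustrative helper names, not part of any Android API:

```kotlin
import android.content.Context
import android.opengl.GLES20

// Reads a shader source file from the assets folder as a String.
fun loadShaderSource(context: Context, path: String): String =
    context.assets.open(path).bufferedReader().use { it.readText() }

// Compiles a single shader of the given type (vertex or fragment).
fun compileShader(type: Int, source: String): Int {
    val shader = GLES20.glCreateShader(type)
    GLES20.glShaderSource(shader, source)
    GLES20.glCompileShader(shader)
    return shader
}

// Creates the program (pipeline), attaches both shaders, and links them.
fun createProgram(context: Context): Int {
    val vertexShader = compileShader(
        GLES20.GL_VERTEX_SHADER,
        loadShaderSource(context, "shaders/vertexShader.vert")
    )
    val fragmentShader = compileShader(
        GLES20.GL_FRAGMENT_SHADER,
        loadShaderSource(context, "shaders/fragmentShader.frag")
    )
    return GLES20.glCreateProgram().also { program ->
        GLES20.glAttachShader(program, vertexShader)
        GLES20.glAttachShader(program, fragmentShader)
        GLES20.glLinkProgram(program)
    }
}
```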

Define attributes and uniforms

  • Define attribute and uniform handles in the onSurfaceCreated function. These handles will be used to pass data into the shaders.
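A sketch of those lookups, assuming `program` is the linked program from the pipeline step and using the shader variable names that appear later in this post (a_Position, uVPMatrix, u_Texture); a_TexCoord is an assumed attribute name:

```kotlin
import android.opengl.GLES20

// Attribute handles: per-vertex inputs to the vertex shader.
val quadPositionHandle = GLES20.glGetAttribLocation(program, "a_Position")
val texPositionHandle = GLES20.glGetAttribLocation(program, "a_TexCoord")

// Uniform handles: values that stay constant for the whole draw call.
val vpMatrixHandle = GLES20.glGetUniformLocation(program, "uVPMatrix")
val textureHandle = GLES20.glGetUniformLocation(program, "u_Texture")
```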

Loading texture

  • Store the image that we want to render as a 2D texture in the assets folder:
    src/main/assets/models/mind.png
  • Here, we’re using an image file in .png format and converting it to a Bitmap.
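A minimal sketch of loading that bitmap from assets, assuming the path above:

```kotlin
import android.content.Context
import android.graphics.Bitmap
import android.graphics.BitmapFactory

// Decodes models/mind.png from the assets folder into a Bitmap.
fun loadTextureBitmap(context: Context): Bitmap =
    context.assets.open("models/mind.png").use { stream ->
        BitmapFactory.decodeStream(stream)
    }
```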

Creating and binding texture

  • A texture unit is the location in the GPU to which a texture is bound.
  • The glActiveTexture function is used to activate a texture unit; we need to activate a texture unit before binding a texture. Texture unit GL_TEXTURE0 is activated by default, so it’s OK if we don’t activate the first texture unit explicitly. OpenGL ES guarantees at least 16 texture units, which we can activate using GL_TEXTURE0 through GL_TEXTURE15.
  • The glGenTextures function takes as its first parameter the number of textures we want to generate, and stores the generated texture names in an IntArray.
  • The glBindTexture function is used to bind the texture to the currently activated texture unit. After that, we can start generating the texture using image data.
  • In the onDrawFrame function, call the glUniform1i function, which assigns a texture-unit value to the texture sampler so we can use multiple textures at once in a fragment shader. We’ll get a better idea of this in the Fragment Shader section.
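Putting those calls together, a sketch might look like this (textureHandle is the assumed handle for the sampler uniform):

```kotlin
import android.opengl.GLES20

val textureIds = IntArray(1)

// Generate one texture name and store it in textureIds[0].
GLES20.glGenTextures(1, textureIds, 0)

// Activate texture unit 0 (the default), then bind our texture to it.
GLES20.glActiveTexture(GLES20.GL_TEXTURE0)
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureIds[0])

// Later, in onDrawFrame: point the sampler uniform at texture unit 0.
GLES20.glUniform1i(textureHandle, 0)
```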

Generating texture

  • After binding the texture, we can start generating a texture from the previously loaded image data (textureBitmap) by using the texImage2D function.
  • The first argument defines the texture target; setting this to GL_TEXTURE_2D means this operation will generate a texture on the currently bound texture object at that target (so a texture bound to another target, such as GL_TEXTURE_CUBE_MAP, will not be affected).
  • The glGenerateMipmap function generates a chain of progressively smaller textures from the high-resolution one; these are sampled for objects that are far away from the viewer, which reduces aliasing and makes texture sampling more cache-friendly.
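A sketch of the upload and mipmap generation, using Android’s GLUtils.texImage2D helper; the filtering parameters shown are assumptions:

```kotlin
import android.opengl.GLES20
import android.opengl.GLUtils

// Filtering: sample between mipmap levels when the texture is shrunk,
// and use linear filtering when it is magnified.
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
    GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR_MIPMAP_LINEAR)
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
    GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR)

// Upload the bitmap to the currently bound GL_TEXTURE_2D target.
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, textureBitmap, 0)

// Generate the smaller mipmap levels from the uploaded image.
GLES20.glGenerateMipmap(GLES20.GL_TEXTURE_2D)

// The bitmap has been copied to GPU memory; free the CPU copy.
textureBitmap.recycle()
```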

Apply view projection matrix

  • As we discussed in the previous blog, calculate the view and projection matrices and multiply them together. The final matrix is passed to the vertex shader by using a uniform handle.
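A sketch under the usual GLSurfaceView.Renderer layout; the frustum and camera values are assumptions:

```kotlin
import android.opengl.GLES20
import android.opengl.Matrix

val projectionMatrix = FloatArray(16)
val viewMatrix = FloatArray(16)
val vpMatrix = FloatArray(16)

// In onSurfaceChanged: projection maps the scene to the screen
// while keeping proportions on any aspect ratio.
fun updateProjection(width: Int, height: Int) {
    GLES20.glViewport(0, 0, width, height)
    val ratio = width.toFloat() / height
    Matrix.frustumM(projectionMatrix, 0, -ratio, ratio, -1f, 1f, 3f, 7f)
}

// In onDrawFrame: camera at z = 3 looking at the origin, +Y up.
fun updateVpMatrix() {
    Matrix.setLookAtM(viewMatrix, 0, 0f, 0f, 3f, 0f, 0f, 0f, 0f, 1f, 0f)
    // Final matrix = projection * view, sent through the uniform handle.
    Matrix.multiplyMM(vpMatrix, 0, projectionMatrix, 0, viewMatrix, 0)
    GLES20.glUniformMatrix4fv(vpMatrixHandle, 1, false, vpMatrix, 0)
}
```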

Pass quadrant position to the shader

  • Here, quadPositionHandle is the attribute handle for the quadrant vertex data.
  • COORDINATES_PER_VERTEX is the number of coordinates in one vertex.
  • VERTEX_STRIDE is the number of bytes required for one vertex. We’re using float values for coordinates, so we need to multiply the number of coordinates per vertex by the size of a float (4 bytes).
  • quadrantCoordinatesBuffer is the FloatBuffer built from the quadrant float array.
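The corresponding glVertexAttribPointer call might look like this:

```kotlin
import android.opengl.GLES20

const val COORDINATES_PER_VERTEX = 3                  // x, y, z
const val VERTEX_STRIDE = COORDINATES_PER_VERTEX * 4  // 4 bytes per float

// Point the position attribute at the quadrant vertex buffer.
GLES20.glVertexAttribPointer(
    quadPositionHandle,
    COORDINATES_PER_VERTEX,
    GLES20.GL_FLOAT,
    false,                     // values are not normalized
    VERTEX_STRIDE,
    quadrantCoordinatesBuffer
)
```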

Pass texture position to the shader

  • Here, texPositionHandle is the attribute handle for the texture vertex data.
  • textureCoordinatesBuffer is the FloatBuffer built from the texture float array.
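And the matching call for the texture coordinates, which use 2 floats (s, t) per vertex:

```kotlin
import android.opengl.GLES20

// Point the texture-coordinate attribute at the texture buffer.
GLES20.glVertexAttribPointer(
    texPositionHandle,
    2,                         // (s, t) per vertex
    GLES20.GL_FLOAT,
    false,                     // values are not normalized
    2 * 4,                     // stride: 2 floats * 4 bytes
    textureCoordinatesBuffer
)
```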

Enable attribute handlers

  • To use the attribute indexes (handles) of the quadrant and texture positions in the shader files, call the glEnableVertexAttribArray function. It enables the attribute array for the given index; if an attribute is not enabled, it will not be used during rendering.
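For example:

```kotlin
import android.opengl.GLES20

// Enable both attribute arrays before issuing the draw call.
GLES20.glEnableVertexAttribArray(quadPositionHandle)
GLES20.glEnableVertexAttribArray(texPositionHandle)
```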

Draw Primitive

  • The glDrawElements function is designed to use data in a format similar to an indexed face set: OpenGL pulls position data from the enabled arrays in the order defined by a list of vertex indices (DRAW_ORDER).
  • Here, the first parameter specifies what kind of primitives to render.
  • The second parameter specifies the number of elements to be rendered.
  • The third parameter specifies the type of the values in indices.
  • The fourth parameter is the short buffer of the draw order short array.
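A sketch of the call, assuming a DRAW_ORDER index array covering the square with two triangles (drawOrderBuffer is the ShortBuffer built from it):

```kotlin
import android.opengl.GLES20

// Two triangles that cover the square, indexing into the 4 vertices.
val DRAW_ORDER = shortArrayOf(0, 1, 2, 0, 2, 3)

GLES20.glDrawElements(
    GLES20.GL_TRIANGLES,       // kind of primitive to render
    DRAW_ORDER.size,           // number of elements to render
    GLES20.GL_UNSIGNED_SHORT,  // type of the values in the indices
    drawOrderBuffer            // ShortBuffer of the draw-order array
)
```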

Disable attribute handlers

  • The glDisableVertexAttribArray function disables the generic vertex attribute array specified by its index. By default, all generic vertex attribute arrays are disabled, so after drawing the primitive we need to disable the previously enabled vertex attributes.
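For example:

```kotlin
import android.opengl.GLES20

// Return the attribute arrays to their default (disabled) state.
GLES20.glDisableVertexAttribArray(quadPositionHandle)
GLES20.glDisableVertexAttribArray(texPositionHandle)
```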

Vertex Shader (vertexShader.vert)

  • uVPMatrix is the matrix generated by multiplying the view and projection matrices.
  • Calculate gl_Position by multiplying uVPMatrix with the quadrant position (a_Position) to keep the shape’s proportions correct on any device screen.
  • As we know, the image coordinate system starts at the top-left, but the OpenGL texture coordinate system starts at the bottom-left, so we need to invert the Y value of the texture position. The final texture coordinate (v_TexCoord) is passed to the fragment shader, declared as a varying variable.
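Based on the description above, vertexShader.vert might look like the following sketch (a_TexCoord is an assumed attribute name):

```glsl
uniform mat4 uVPMatrix;      // view * projection matrix
attribute vec4 a_Position;   // quadrant vertex position
attribute vec2 a_TexCoord;   // texture coordinate for this vertex
varying vec2 v_TexCoord;     // passed on to the fragment shader

void main() {
    gl_Position = uVPMatrix * a_Position;
    // Flip Y: image coordinates start at the top-left,
    // texture coordinates at the bottom-left.
    v_TexCoord = vec2(a_TexCoord.x, 1.0 - a_TexCoord.y);
}
```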

Fragment Shader (fragmentShader.frag)

  • u_Texture is a sampler2D uniform which is bound to a texture unit. Remember that in the glUniform1i call we pass the texture handle and the texture unit; that call binds the sampler to the texture unit.
  • v_TexCoord is the texture coordinate data passed from the vertex shader.
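Based on the description above, fragmentShader.frag might look like this sketch:

```glsl
precision mediump float;

uniform sampler2D u_Texture; // bound to a texture unit via glUniform1i
varying vec2 v_TexCoord;     // interpolated from the vertex shader

void main() {
    // Sample the texture at this fragment's coordinate.
    gl_FragColor = texture2D(u_Texture, v_TexCoord);
}
```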

Output:

So in the end, we’ll have output something like this.

Congratulations!! You now understand how to draw a 2D texture using OpenGL ES on Android.

Download the sample from GitHub, with the full code of the sample shown above:

References:

https://developer.android.com/guide/topics/graphics/opengl

Like what you read? Don’t forget to share this post and clap. Stay tuned for later updates. Thank you!
