Develop your first AR application with Huawei AR Engine

Melikeeroglu
Huawei Developers - Türkiye
13 min read · Jun 29, 2020

Hello,

In this article, we will build a simple AR application from scratch using Huawei AR Engine. We will develop it in Android Studio with Java.

First, let's talk about what AR (Augmented Reality) is. AR combines the physical, real-world environment with computer-generated data; in short, virtual objects are overlaid on top of real images. Although AR has gained importance today and is used in many areas of our lives, its foundations were actually laid back in the 1990s, and the use cases of this technology are too many to count.

Huawei introduced HUAWEI AR Engine to its users in 2019. It currently provides motion tracking, plane detection, light estimation and hit testing, hand gesture recognition and hand skeleton tracking, human body pose recognition and body skeleton tracking, image tracking, and facial expression tracking. In this article, we will develop a simple AR application using the plane detection and hit testing features.

Creating the Project

Let's open Android Studio and create our project.

Let's add the following dependencies to the app-level build.gradle file.

implementation fileTree(include: ['*.aar'], dir: 'libs')
implementation 'de.javagl:obj:0.3.0'

Let's add the Huawei AR Engine SDK to our project's libs folder.
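As a quick sketch of what that looks like on disk (the AAR file name below is only illustrative; the real name depends on the AR Engine SDK version you download from the HUAWEI Developer site):

app/
 └── libs/
      └── huawei-arengine-sdk.aar   <- the fileTree dependency above picks up every *.aar placed under libs/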

Let's add the camera permission to our AndroidManifest.xml file.

<uses-permission android:name="android.permission.CAMERA" />

Now we can start coding.

First, let's add the classes we will use for rendering. To keep things organized, we can create a package named rendering and put the rendering-related classes in it.

1. BackgroundRenderer

This class will be used to create and draw the camera background texture.

import android.content.Context;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;

import com.huawei.ardemo.R;
import com.huawei.hiar.ARFrame;

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class BackgroundRenderer {
    private static final String TAG = BackgroundRenderer.class.getSimpleName();
    private static final int COORDS_PER_VERTEX = 3;
    private static final int TEXCOORDS_PER_VERTEX = 2;
    private static final int FLOAT_SIZE = 4;

    private FloatBuffer mQuadVertices;
    private FloatBuffer mQuadTexCoord;
    private FloatBuffer mQuadTexCoordTransformed;

    private int mQuadProgram;
    private int mQuadPositionParam;
    private int mQuadTexCoordParam;
    private int mTextureId = -1;
    private int mTextureTarget = GLES11Ext.GL_TEXTURE_EXTERNAL_OES;

    public BackgroundRenderer() {
    }

    public int getTextureId() {
        return mTextureId;
    }

    public void createOnGlThread(Context context) {
        int textures[] = new int[1];
        GLES20.glGenTextures(1, textures, 0);
        mTextureId = textures[0];
        GLES20.glBindTexture(mTextureTarget, mTextureId);
        GLES20.glTexParameteri(mTextureTarget, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(mTextureTarget, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(mTextureTarget, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
        GLES20.glTexParameteri(mTextureTarget, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST);

        int numVertices = 4;
        if (numVertices != QUAD_COORDS.length / COORDS_PER_VERTEX) {
            throw new RuntimeException("Unexpected number of vertices in BackgroundRenderer.");
        }

        ByteBuffer bbVertices = ByteBuffer.allocateDirect(QUAD_COORDS.length * FLOAT_SIZE);
        bbVertices.order(ByteOrder.nativeOrder());
        mQuadVertices = bbVertices.asFloatBuffer();
        mQuadVertices.put(QUAD_COORDS);
        mQuadVertices.position(0);

        ByteBuffer bbTexCoords = ByteBuffer.allocateDirect(
                numVertices * TEXCOORDS_PER_VERTEX * FLOAT_SIZE);
        bbTexCoords.order(ByteOrder.nativeOrder());
        mQuadTexCoord = bbTexCoords.asFloatBuffer();
        mQuadTexCoord.put(QUAD_TEXCOORDS);
        mQuadTexCoord.position(0);

        ByteBuffer bbTexCoordsTransformed = ByteBuffer.allocateDirect(
                numVertices * TEXCOORDS_PER_VERTEX * FLOAT_SIZE);
        bbTexCoordsTransformed.order(ByteOrder.nativeOrder());
        mQuadTexCoordTransformed = bbTexCoordsTransformed.asFloatBuffer();

        int vertexShader = ShaderHelper.loadGLShader(TAG, context,
                GLES20.GL_VERTEX_SHADER, R.raw.background_vertex);
        int fragmentShader = ShaderHelper.loadGLShader(TAG, context,
                GLES20.GL_FRAGMENT_SHADER, R.raw.background_fragment_oes);

        mQuadProgram = GLES20.glCreateProgram();
        GLES20.glAttachShader(mQuadProgram, vertexShader);
        GLES20.glAttachShader(mQuadProgram, fragmentShader);
        GLES20.glLinkProgram(mQuadProgram);
        GLES20.glUseProgram(mQuadProgram);
        ShaderHelper.checkGLError(TAG, "program creation");

        mQuadPositionParam = GLES20.glGetAttribLocation(mQuadProgram, "a_Position");
        mQuadTexCoordParam = GLES20.glGetAttribLocation(mQuadProgram, "a_TexCoord");
        ShaderHelper.checkGLError(TAG, "program parameters");
    }

    public void draw(ARFrame frame) {
        ShaderHelper.checkGLError(TAG, "before draw");
        if (frame == null) {
            return;
        }
        if (frame.hasDisplayGeometryChanged()) {
            frame.transformDisplayUvCoords(mQuadTexCoord, mQuadTexCoordTransformed);
        }
        GLES20.glDisable(GLES20.GL_DEPTH_TEST);
        GLES20.glDepthMask(false);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, mTextureId);
        GLES20.glUseProgram(mQuadProgram);
        GLES20.glVertexAttribPointer(
                mQuadPositionParam, COORDS_PER_VERTEX, GLES20.GL_FLOAT, false, 0, mQuadVertices);
        GLES20.glVertexAttribPointer(mQuadTexCoordParam, TEXCOORDS_PER_VERTEX,
                GLES20.GL_FLOAT, false, 0, mQuadTexCoordTransformed);
        GLES20.glEnableVertexAttribArray(mQuadPositionParam);
        GLES20.glEnableVertexAttribArray(mQuadTexCoordParam);
        GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
        GLES20.glDisableVertexAttribArray(mQuadPositionParam);
        GLES20.glDisableVertexAttribArray(mQuadTexCoordParam);
        GLES20.glDepthMask(true);
        GLES20.glEnable(GLES20.GL_DEPTH_TEST);
        ShaderHelper.checkGLError(TAG, "after draw");
    }

    public static final float[] QUAD_COORDS = new float[]{
            -1.0f, -1.0f, 0.0f,
            -1.0f, +1.0f, 0.0f,
            +1.0f, -1.0f, 0.0f,
            +1.0f, +1.0f, 0.0f,
    };

    public static final float[] QUAD_TEXCOORDS = new float[]{
            0.0f, 1.0f,
            0.0f, 0.0f,
            1.0f, 1.0f,
            1.0f, 0.0f,
    };
}

2. DisplayRotationHelper

This class listens for display changes and makes the required adjustments.

import android.content.Context;
import android.hardware.display.DisplayManager;
import android.hardware.display.DisplayManager.DisplayListener;
import android.view.Display;
import android.view.WindowManager;

import com.huawei.hiar.ARSession;

public class DisplayRotationHelper implements DisplayListener {
    private boolean mViewportChanged;
    private int mViewportWidth;
    private int mViewportHeight;
    private final Context mContext;
    private final Display mDisplay;

    public DisplayRotationHelper(Context context) {
        this.mContext = context;
        mDisplay = context.getSystemService(WindowManager.class).getDefaultDisplay();
    }

    public void onResume() {
        mContext.getSystemService(DisplayManager.class).registerDisplayListener(this, null);
    }

    public void onPause() {
        mContext.getSystemService(DisplayManager.class).unregisterDisplayListener(this);
    }

    public void onSurfaceChanged(int width, int height) {
        mViewportWidth = width;
        mViewportHeight = height;
        mViewportChanged = true;
    }

    public void updateSessionIfNeeded(ARSession session) {
        if (mViewportChanged) {
            int displayRotation = mDisplay.getRotation();
            session.setDisplayGeometry(displayRotation, mViewportWidth, mViewportHeight);
            mViewportChanged = false;
        }
    }

    public int getRotation() {
        return mDisplay.getRotation();
    }

    @Override
    public void onDisplayAdded(int displayId) {
    }

    @Override
    public void onDisplayRemoved(int displayId) {
    }

    @Override
    public void onDisplayChanged(int displayId) {
        mViewportChanged = true;
    }
}

3. ShaderHelper

This class loads and compiles the shaders we will use while drawing, and checks for OpenGL errors.

import android.content.Context;
import android.opengl.GLES20;
import android.util.Log;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

public class ShaderHelper {
    public static int loadGLShader(String tag, Context context, int type, int resId) {
        String code = readRawTextFile(context, resId);
        int shader = GLES20.glCreateShader(type);
        GLES20.glShaderSource(shader, code);
        GLES20.glCompileShader(shader);

        final int[] status = new int[1];
        GLES20.glGetShaderiv(shader, GLES20.GL_COMPILE_STATUS, status, 0);
        if (status[0] == 0) {
            Log.e(tag, "Error compiling shader: " + GLES20.glGetShaderInfoLog(shader));
            GLES20.glDeleteShader(shader);
            shader = 0;
        }
        if (shader == 0) {
            throw new RuntimeException("Error creating shader.");
        }
        return shader;
    }

    public static void checkGLError(String tag, String label) {
        int error;
        while ((error = GLES20.glGetError()) != GLES20.GL_NO_ERROR) {
            Log.e(tag, label + ": glError " + error);
            throw new RuntimeException(label + ": glError " + error);
        }
    }

    private static String readRawTextFile(Context context, int resId) {
        InputStream inputStream = context.getResources().openRawResource(resId);
        try {
            BufferedReader reader = new BufferedReader(new InputStreamReader(inputStream));
            StringBuilder sb = new StringBuilder();
            String line;
            while ((line = reader.readLine()) != null) {
                sb.append(line).append("\n");
            }
            reader.close();
            return sb.toString();
        } catch (IOException e) {
            e.printStackTrace();
        }
        return null;
    }
}

4. VirtualObjectRenderer

This class is responsible for rendering the virtual object we mentioned earlier.

import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.opengl.GLES20;
import android.opengl.GLUtils;
import android.opengl.Matrix;

import com.huawei.ardemo.R;
import com.huawei.ardemo.UtilsCommon;

import java.io.IOException;
import java.io.InputStream;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import java.nio.IntBuffer;
import java.nio.ShortBuffer;

import de.javagl.obj.Obj;
import de.javagl.obj.ObjData;
import de.javagl.obj.ObjReader;
import de.javagl.obj.ObjUtils;

public class VirtualObjectRenderer {
    private static final String TAG = VirtualObjectRenderer.class.getSimpleName();
    private static final int COORDS_PER_VERTEX = 3;
    private static final float[] LIGHT_DIRECTION = new float[]{0.0f, 1.0f, 0.0f, 0.0f};
    private float[] mViewLightDirection = new float[4];

    private int mVertexBufferId;
    private int mVerticesBaseAddress;
    private int mTexCoordsBaseAddress;
    private int mNormalsBaseAddress;
    private int mIndexBufferId;
    private int mIndexCount;

    private int mProgram;
    private int[] mTextures = new int[1];

    private int mModelViewUniform;
    private int mModelViewProjectionUniform;
    private int mPositionAttribute;
    private int mNormalAttribute;
    private int mTexCoordAttribute;
    private int mTextureUniform;
    private int mLightingParametersUniform;
    private int mMaterialParametersUniform;
    private int mColorUniform;

    private float[] mModelMatrix = new float[UtilsCommon.MAX_TRACKING_ANCHOR_NUM];
    private float[] mModelViewMatrix = new float[UtilsCommon.MAX_TRACKING_ANCHOR_NUM];
    private float[] mModelViewProjectionMatrix = new float[UtilsCommon.MAX_TRACKING_ANCHOR_NUM];

    private float mAmbient = 0.5f;
    private float mDiffuse = 1.0f;
    private float mSpecular = 1.0f;
    private float mSpecularPower = 4.0f;

    public VirtualObjectRenderer() {
    }

    public void createOnGlThread(Context context, String objAssetName, String diffuseTextureAssetName) throws IOException {
        Bitmap textureBitmap = BitmapFactory.decodeStream(
                context.getAssets().open(diffuseTextureAssetName));
        GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
        GLES20.glGenTextures(mTextures.length, mTextures, 0);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTextures[0]);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR_MIPMAP_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, textureBitmap, 0);
        GLES20.glGenerateMipmap(GLES20.GL_TEXTURE_2D);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, 0);
        textureBitmap.recycle();
        ShaderHelper.checkGLError(TAG, "load texture");

        InputStream objInputStream = context.getAssets().open(objAssetName);
        Obj obj = ObjReader.read(objInputStream);
        obj = ObjUtils.convertToRenderable(obj);

        IntBuffer wideIndices = ObjData.getFaceVertexIndices(obj, 3);
        FloatBuffer vertices = ObjData.getVertices(obj);
        FloatBuffer texCoords = ObjData.getTexCoords(obj, 2);
        FloatBuffer normals = ObjData.getNormals(obj);

        ShortBuffer indices = ByteBuffer.allocateDirect(2 * wideIndices.limit())
                .order(ByteOrder.nativeOrder()).asShortBuffer();
        while (wideIndices.hasRemaining()) {
            indices.put((short) wideIndices.get());
        }
        indices.rewind();

        int[] buffers = new int[2];
        GLES20.glGenBuffers(2, buffers, 0);
        mVertexBufferId = buffers[0];
        mIndexBufferId = buffers[1];

        mVerticesBaseAddress = 0;
        mTexCoordsBaseAddress = mVerticesBaseAddress + 4 * vertices.limit();
        mNormalsBaseAddress = mTexCoordsBaseAddress + 4 * texCoords.limit();
        final int totalBytes = mNormalsBaseAddress + 4 * normals.limit();

        GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, mVertexBufferId);
        GLES20.glBufferData(GLES20.GL_ARRAY_BUFFER, totalBytes, null, GLES20.GL_STATIC_DRAW);
        GLES20.glBufferSubData(
                GLES20.GL_ARRAY_BUFFER, mVerticesBaseAddress, 4 * vertices.limit(), vertices);
        GLES20.glBufferSubData(
                GLES20.GL_ARRAY_BUFFER, mTexCoordsBaseAddress, 4 * texCoords.limit(), texCoords);
        GLES20.glBufferSubData(
                GLES20.GL_ARRAY_BUFFER, mNormalsBaseAddress, 4 * normals.limit(), normals);
        GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, 0);

        GLES20.glBindBuffer(GLES20.GL_ELEMENT_ARRAY_BUFFER, mIndexBufferId);
        mIndexCount = indices.limit();
        GLES20.glBufferData(
                GLES20.GL_ELEMENT_ARRAY_BUFFER, 2 * mIndexCount, indices, GLES20.GL_STATIC_DRAW);
        GLES20.glBindBuffer(GLES20.GL_ELEMENT_ARRAY_BUFFER, 0);
        ShaderHelper.checkGLError(TAG, "obj buffer load");

        final int vertexShader = ShaderHelper.loadGLShader(TAG, context,
                GLES20.GL_VERTEX_SHADER, R.raw.virtualobject_vertex);
        final int fragmentShader = ShaderHelper.loadGLShader(TAG, context,
                GLES20.GL_FRAGMENT_SHADER, R.raw.virtualobject_fragment);

        mProgram = GLES20.glCreateProgram();
        GLES20.glAttachShader(mProgram, vertexShader);
        GLES20.glAttachShader(mProgram, fragmentShader);
        GLES20.glLinkProgram(mProgram);
        GLES20.glUseProgram(mProgram);
        ShaderHelper.checkGLError(TAG, "program creation");

        mModelViewUniform = GLES20.glGetUniformLocation(mProgram, "u_ModelView");
        mModelViewProjectionUniform = GLES20.glGetUniformLocation(mProgram, "u_ModelViewProjection");
        mPositionAttribute = GLES20.glGetAttribLocation(mProgram, "a_Position");
        mNormalAttribute = GLES20.glGetAttribLocation(mProgram, "a_Normal");
        mTexCoordAttribute = GLES20.glGetAttribLocation(mProgram, "a_TexCoord");
        mTextureUniform = GLES20.glGetUniformLocation(mProgram, "u_Texture");
        mLightingParametersUniform = GLES20.glGetUniformLocation(mProgram, "u_LightingParameters");
        mMaterialParametersUniform = GLES20.glGetUniformLocation(mProgram, "u_MaterialParameters");
        mColorUniform = GLES20.glGetUniformLocation(mProgram, "u_ObjColor");
        ShaderHelper.checkGLError(TAG, "Program parameters");

        Matrix.setIdentityM(mModelMatrix, 0);
    }

    public void updateModelMatrix(float[] modelMatrix, float scaleFactor) {
        float[] scaleMatrix = new float[UtilsCommon.MAX_TRACKING_ANCHOR_NUM];
        Matrix.setIdentityM(scaleMatrix, 0);
        scaleMatrix[0] = scaleFactor;
        scaleMatrix[5] = scaleFactor;
        scaleMatrix[10] = scaleFactor;
        Matrix.multiplyMM(mModelMatrix, 0, modelMatrix, 0, scaleMatrix, 0);
        Matrix.rotateM(mModelMatrix, 0, 315.0f, 0f, 1f, 0f);
    }

    public void draw(float[] cameraView, float[] cameraPerspective, float lightIntensity, float[] objColor) {
        ShaderHelper.checkGLError(TAG, "before draw");
        Matrix.multiplyMM(mModelViewMatrix, 0, cameraView, 0, mModelMatrix, 0);
        Matrix.multiplyMM(mModelViewProjectionMatrix, 0, cameraPerspective, 0, mModelViewMatrix, 0);
        GLES20.glUseProgram(mProgram);

        Matrix.multiplyMV(mViewLightDirection, 0, mModelViewMatrix, 0, LIGHT_DIRECTION, 0);
        normalizeVec3(mViewLightDirection);
        GLES20.glUniform4f(mLightingParametersUniform,
                mViewLightDirection[0], mViewLightDirection[1], mViewLightDirection[2], lightIntensity);

        // Set the material lighting properties.
        GLES20.glUniform4f(mMaterialParametersUniform, mAmbient, mDiffuse, mSpecular, mSpecularPower);

        GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTextures[0]);
        GLES20.glUniform1i(mTextureUniform, 0);

        GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, mVertexBufferId);
        GLES20.glVertexAttribPointer(
                mPositionAttribute, COORDS_PER_VERTEX, GLES20.GL_FLOAT, false, 0, mVerticesBaseAddress);
        GLES20.glVertexAttribPointer(
                mNormalAttribute, 3, GLES20.GL_FLOAT, false, 0, mNormalsBaseAddress);
        GLES20.glVertexAttribPointer(
                mTexCoordAttribute, 2, GLES20.GL_FLOAT, false, 0, mTexCoordsBaseAddress);
        GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, 0);

        GLES20.glUniformMatrix4fv(
                mModelViewUniform, 1, false, mModelViewMatrix, 0);
        GLES20.glUniformMatrix4fv(
                mModelViewProjectionUniform, 1, false, mModelViewProjectionMatrix, 0);

        GLES20.glEnableVertexAttribArray(mPositionAttribute);
        GLES20.glEnableVertexAttribArray(mNormalAttribute);
        GLES20.glEnableVertexAttribArray(mTexCoordAttribute);

        GLES20.glBindBuffer(GLES20.GL_ELEMENT_ARRAY_BUFFER, mIndexBufferId);
        GLES20.glDrawElements(GLES20.GL_TRIANGLES, mIndexCount, GLES20.GL_UNSIGNED_SHORT, 0);
        GLES20.glBindBuffer(GLES20.GL_ELEMENT_ARRAY_BUFFER, 0);

        GLES20.glDisableVertexAttribArray(mPositionAttribute);
        GLES20.glDisableVertexAttribArray(mNormalAttribute);
        GLES20.glDisableVertexAttribArray(mTexCoordAttribute);

        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, 0);
        ShaderHelper.checkGLError(TAG, "after draw");
    }

    public static void normalizeVec3(float[] v) {
        float length = 1.0f / (float) Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
        v[0] *= length;
        v[1] *= length;
        v[2] *= length;
    }
}

With that, we have created the classes required for rendering.

Before creating the activity, let's create two final classes: UtilsCommon, and CameraPermissionHelper for handling the camera permission. Add the following constant to UtilsCommon.

public class UtilsCommon {
    public static final int MAX_TRACKING_ANCHOR_NUM = 16;
}

CameraPermissionHelper

import android.Manifest;
import android.app.Activity;
import android.content.pm.PackageManager;

import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;

import java.util.ArrayList;
import java.util.List;


class CameraPermissionHelper {

    private static final String[] permissionsArray = new String[]{
            Manifest.permission.CAMERA};

    // permission list to request
    private static List<String> permissionsList = new ArrayList<>();

    // return code
    public static final int REQUEST_CODE_ASK_PERMISSIONS = 1;

    // check permission
    public static boolean hasPermission(final Activity activity) {
        for (String permission : permissionsArray) {
            if (ContextCompat.checkSelfPermission(activity, permission) != PackageManager.PERMISSION_GRANTED) {
                return false;
            }
        }
        return true;
    }

    // request permission
    public static void requestPermission(final Activity activity) {
        for (String permission : permissionsArray) {
            if (ContextCompat.checkSelfPermission(activity, permission) != PackageManager.PERMISSION_GRANTED) {
                permissionsList.add(permission);
            }
        }
        ActivityCompat.requestPermissions(activity,
                permissionsList.toArray(new String[permissionsList.size()]), REQUEST_CODE_ASK_PERMISSIONS);
    }
}

Now we can create our activity.

Our activity must implement the GLSurfaceView.Renderer interface, which performs the rendering work using OpenGL. After implementing the interface, we must implement the following methods:

- onSurfaceCreated (called when the surface is first created or recreated)

- onSurfaceChanged (called when the surface size changes)

- onDrawFrame (called to draw the current frame)

A minimal skeleton of the activity therefore looks like the sketch below.
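This sketch only shows the renderer callbacks to make the structure concrete; the class name is the same MainActivity we will flesh out further below, and the bodies are placeholders.

import android.opengl.GLSurfaceView;

import androidx.appcompat.app.AppCompatActivity;

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

public class MainActivity extends AppCompatActivity implements GLSurfaceView.Renderer {

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        // Called when the surface is created or recreated: create GL resources here
        // (textures, shader programs, vertex buffers).
    }

    @Override
    public void onSurfaceChanged(GL10 unused, int width, int height) {
        // Called when the surface size changes: update the GL viewport here.
    }

    @Override
    public void onDrawFrame(GL10 unused) {
        // Called for every frame: update the AR session and draw the scene here.
    }
}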

In our activity's layout we will use a GLSurfaceView and a TextView. The layout will look like this.

<?xml version="1.0" encoding="utf-8"?><RelativeLayout    xmlns:android="http://schemas.android.com/apk/res/android"    xmlns:tools="http://schemas.android.com/tools"    android:layout_width="match_parent"    android:layout_height="match_parent"    tools:context="com.huawei.ardemo.MainActivity">    <android.opengl.GLSurfaceView        android:id="@+id/surfaceview"        android:layout_width="fill_parent"        android:layout_height="fill_parent"        android:layout_gravity="top" />    <TextView        android:id="@+id/searchingTextView"        android:layout_width="match_parent"        android:layout_height="47dp"        android:layout_alignParentBottom="true"        android:layout_alignParentStart="true"        android:layout_marginStart="0dp"        android:background="@android:color/darker_gray"        android:text="Searching for surface..."        tools:layout_editor_absoluteX="0dp"        tools:layout_editor_absoluteY="512dp"        android:layout_marginLeft="0dp"        android:layout_alignParentLeft="true" /></RelativeLayout>

Next, we need to add shaders for the background and for the object.

There are two kinds of shaders: vertex shaders and fragment shaders. A vertex shader transforms vertex positions into 3D drawing coordinates. A fragment shader computes the color and other attributes of each pixel of a shape.

We will now add our shaders: four in total, two for the background and two for the object. We can put them into the raw folder under res.

1. background_fragment_oes.shader

#extension GL_OES_EGL_image_external : require
precision mediump float;
varying vec2 v_TexCoord;
uniform samplerExternalOES sTexture;

void main() {
    gl_FragColor = texture2D(sTexture, v_TexCoord);
}

2. background_vertex.shader

attribute vec4 a_Position;
attribute vec2 a_TexCoord;
varying vec2 v_TexCoord;

void main() {
    gl_Position = a_Position;
    v_TexCoord = a_TexCoord;
}

3. virtualobject_fragment.shader

precision mediump float;

uniform sampler2D u_Texture;
uniform vec4 u_LightingParameters;
uniform vec4 u_MaterialParameters;

varying vec3 v_ViewPosition;
varying vec3 v_ViewNormal;
varying vec2 v_TexCoord;

uniform vec4 u_ObjColor;

void main() {
    const float kGamma = 0.4545454;
    const float kInverseGamma = 2.2;

    vec3 viewLightDirection = u_LightingParameters.xyz;
    float lightIntensity = u_LightingParameters.w;

    float materialAmbient = u_MaterialParameters.x;
    float materialDiffuse = u_MaterialParameters.y;
    float materialSpecular = u_MaterialParameters.z;
    float materialSpecularPower = u_MaterialParameters.w;

    vec3 viewFragmentDirection = normalize(v_ViewPosition);
    vec3 viewNormal = normalize(v_ViewNormal);

    vec4 objectColor = texture2D(u_Texture, vec2(v_TexCoord.x, 1.0 - v_TexCoord.y));
    if (u_ObjColor.a >= 255.0) {
        float intensity = objectColor.r;
        objectColor.rgb = u_ObjColor.rgb * intensity / 255.0;
    }
    objectColor.rgb = pow(objectColor.rgb, vec3(kInverseGamma));

    float ambient = materialAmbient;
    float diffuse = lightIntensity * materialDiffuse *
            0.5 * (dot(viewNormal, viewLightDirection) + 1.0);
    vec3 reflectedLightDirection = reflect(viewLightDirection, viewNormal);
    float specularStrength = max(0.0, dot(viewFragmentDirection, reflectedLightDirection));
    float specular = lightIntensity * materialSpecular *
            pow(specularStrength, materialSpecularPower);

    gl_FragColor.a = objectColor.a;
    gl_FragColor.rgb = pow(objectColor.rgb * (ambient + diffuse) + specular, vec3(kGamma));
}

4. virtualobject_vertex.shader

uniform mat4 u_ModelView;
uniform mat4 u_ModelViewProjection;

attribute vec4 a_Position;
attribute vec3 a_Normal;
attribute vec2 a_TexCoord;

varying vec3 v_ViewPosition;
varying vec3 v_ViewNormal;
varying vec2 v_TexCoord;

void main() {
    v_ViewPosition = (u_ModelView * a_Position).xyz;
    v_ViewNormal = (u_ModelView * vec4(a_Normal, 0.0)).xyz;
    v_TexCoord = a_TexCoord;
    gl_Position = u_ModelViewProjection * a_Position;
}

Finally, we need to create an assets folder and add the 3D object we want to draw, along with the texture file that will cover it. You can pick one of the many free 3D models that are easy to find online. Once you have added the object and its texture to the assets folder, edit the activity as shown below and, in the createOnGlThread call, pass your .obj file as the objAssetName parameter and your texture file as the diffuseTextureAssetName parameter, as in the short snippet below.
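As a small illustration (the asset file names here are placeholders; the sample activity below uses "deer.obj" and "Diffuse.jpg"):

// Inside onSurfaceCreated(); the file names refer to files placed under app/src/main/assets/
mVirtualObject.createOnGlThread(this, "your_model.obj", "your_texture.jpg");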

public class MainActivity extends AppCompatActivity implements GLSurfaceView.Renderer {

    private static final String TAG = MainActivity.class.getSimpleName();
    private ARSession mSession;
    private GLSurfaceView mSurfaceView;
    private GestureDetector mGestureDetector;
    private DisplayRotationHelper mDisplayRotationHelper;

    private BackgroundRenderer mBackgroundRenderer = new BackgroundRenderer();
    private VirtualObjectRenderer mVirtualObject = new VirtualObjectRenderer();

    private final float[] mAnchorMatrix = new float[UtilsCommon.MAX_TRACKING_ANCHOR_NUM];
    private static final float[] DEFAULT_COLOR = new float[] {0f, 0f, 0f, 0f};

    private static class ColoredARAnchor {
        public final ARAnchor anchor;
        public final float[] color;

        public ColoredARAnchor(ARAnchor a, float[] color4f) {
            this.anchor = a;
            this.color = color4f;
        }
    }

    private ArrayBlockingQueue<MotionEvent> mQueuedSingleTaps = new ArrayBlockingQueue<>(2);
    private ArrayList<ColoredARAnchor> mAnchors = new ArrayList<>();

    private float mScaleFactor = 0.15f;
    private boolean installRequested;
    private TextView mSearchingTextView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        mSearchingTextView = findViewById(R.id.searchingTextView);
        mSurfaceView = findViewById(R.id.surfaceview);
        mDisplayRotationHelper = new DisplayRotationHelper(this);

        mGestureDetector = new GestureDetector(this, new GestureDetector.SimpleOnGestureListener() {
            @Override
            public boolean onSingleTapUp(MotionEvent e) {
                onSingleTap(e);
                return true;
            }

            @Override
            public boolean onDown(MotionEvent e) {
                return true;
            }
        });
        mSurfaceView.setOnTouchListener(new View.OnTouchListener() {
            @Override
            public boolean onTouch(View v, MotionEvent event) {
                return mGestureDetector.onTouchEvent(event);
            }
        });

        // Set up renderer.
        mSurfaceView.setPreserveEGLContextOnPause(true);
        mSurfaceView.setEGLContextClientVersion(2);
        mSurfaceView.setEGLConfigChooser(8, 8, 8, 8, 16, 0); // Alpha used for plane blending.
        mSurfaceView.setRenderer(this);
        mSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_CONTINUOUSLY);

        installRequested = false;
    }

    @Override
    protected void onResume() {
        super.onResume();
        Exception exception = null;
        String message = null;
        if (null == mSession) {
            try {
                AREnginesSelector.AREnginesAvaliblity enginesAvaliblity = AREnginesSelector.checkAllAvailableEngines(this);
                if ((enginesAvaliblity.ordinal() &
                        AREnginesSelector.AREnginesAvaliblity.HWAR_ENGINE_SUPPORTED.ordinal()) != 0) {

                    AREnginesSelector.setAREngine(AREnginesSelector.AREnginesType.HWAR_ENGINE);

                    switch (AREnginesApk.requestInstall(this, !installRequested)) {
                        case INSTALL_REQUESTED:
                            installRequested = true;
                            return;
                        case INSTALLED:
                            break;
                    }
                    if (!CameraPermissionHelper.hasPermission(this)) {
                        CameraPermissionHelper.requestPermission(this);
                        return;
                    }
                    mSession = new ARSession(this);
                } else {
                    message = "This device does not support Huawei AR Engine ";
                }
                ARConfigBase config = new ARWorldTrackingConfig(mSession);
                mSession.configure(config);

            } catch (ARUnavailableServiceNotInstalledException e) {
                message = "Please install HuaweiARService.apk";
                exception = e;
            } catch (ARUnavailableServiceApkTooOldException e) {
                message = "Please update HuaweiARService.apk";
                exception = e;
            } catch (ARUnavailableClientSdkTooOldException e) {
                message = "Please update this app";
                exception = e;
            } catch (ARUnavailableDeviceNotCompatibleException e) {
                message = "This device does not support Huawei AR Engine ";
                exception = e;
            } catch (ARUnavailableEmuiNotCompatibleException e) {
                message = "Please update EMUI version";
                exception = e;
            } catch (ARUnavailableUserDeclinedInstallationException e) {
                message = "Please agree to install!";
                exception = e;
            } catch (ARUnSupportedConfigurationException e) {
                message = "The configuration is not supported by the device!";
                exception = e;
            } catch (Exception e) {
                message = "Exception thrown";
                exception = e;
            }
            if (message != null) {
                Toast.makeText(this, message, Toast.LENGTH_LONG).show();
                Log.e(TAG, "Creating session", exception);
                if (mSession != null) {
                    mSession.stop();
                    mSession = null;
                }
                return;
            }
        }

        mSession.resume();
        mSurfaceView.onResume();
        mDisplayRotationHelper.onResume();
    }

    @Override
    protected void onPause() {
        super.onPause();
        if (mSession != null) {
            mDisplayRotationHelper.onPause();
            mSurfaceView.onPause();
            mSession.pause();
        }
    }

    @Override
    public void onWindowFocusChanged(boolean hasFocus) {
        super.onWindowFocusChanged(hasFocus);
        if (hasFocus) {
            getWindow().getDecorView().setSystemUiVisibility(
                    View.SYSTEM_UI_FLAG_LAYOUT_STABLE
                            | View.SYSTEM_UI_FLAG_LAYOUT_HIDE_NAVIGATION
                            | View.SYSTEM_UI_FLAG_LAYOUT_FULLSCREEN
                            | View.SYSTEM_UI_FLAG_HIDE_NAVIGATION
                            | View.SYSTEM_UI_FLAG_FULLSCREEN
                            | View.SYSTEM_UI_FLAG_IMMERSIVE_STICKY);
            getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
        }
    }

    private void onSingleTap(MotionEvent e) {
        mQueuedSingleTaps.offer(e);
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        mBackgroundRenderer.createOnGlThread(/*context=*/this);

        try {
            mVirtualObject.createOnGlThread(this, "deer.obj", "Diffuse.jpg");
        } catch (IOException e) {
            e.printStackTrace();
            Log.d(TAG, "Failed to read plane texture");
        }
    }

    @Override
    public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] results) {
        if (!CameraPermissionHelper.hasPermission(this)) {
            Toast.makeText(this,
                    "This application needs camera permission.", Toast.LENGTH_LONG).show();
            finish();
        }
    }

    @Override
    public void onSurfaceChanged(GL10 unused, int width, int height) {
        GLES20.glViewport(0, 0, width, height);
        mDisplayRotationHelper.onSurfaceChanged(width, height);
    }

    @Override
    public void onDrawFrame(GL10 unused) {
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);

        if (null == mSession) {
            return;
        }
        mDisplayRotationHelper.updateSessionIfNeeded(mSession);

        try {
            mSession.setCameraTextureName(mBackgroundRenderer.getTextureId());
            ARFrame frame = mSession.update();
            ARCamera camera = frame.getCamera();

            handleTap(frame, camera);

            mBackgroundRenderer.draw(frame);

            if (camera.getTrackingState() == ARTrackable.TrackingState.PAUSED) {
                return;
            }

            float[] projmtx = new float[UtilsCommon.MAX_TRACKING_ANCHOR_NUM];
            camera.getProjectionMatrix(projmtx, 0, 0.1f, 100.0f);

            float[] viewmtx = new float[UtilsCommon.MAX_TRACKING_ANCHOR_NUM];
            camera.getViewMatrix(viewmtx, 0);

            ARLightEstimate le = frame.getLightEstimate();
            float lightIntensity = 1;
            if (le.getState() != ARLightEstimate.State.NOT_VALID) {
                lightIntensity = le.getPixelIntensity();
            }
            ARPointCloud arPointCloud = frame.acquirePointCloud();
            arPointCloud.release();

            if (mSearchingTextView != null) {
                for (ARPlane plane : mSession.getAllTrackables(ARPlane.class)) {
                    if (plane.getType() != ARPlane.PlaneType.UNKNOWN_FACING &&
                            plane.getTrackingState() == ARTrackable.TrackingState.TRACKING) {
                        hideLoadingMessage();
                        break;
                    }
                }
            }

            Iterator<ColoredARAnchor> ite = mAnchors.iterator();
            while (ite.hasNext()) {
                ColoredARAnchor coloredAnchor = ite.next();
                if (coloredAnchor.anchor.getTrackingState() == ARTrackable.TrackingState.STOPPED) {
                    ite.remove();
                } else if (coloredAnchor.anchor.getTrackingState() == ARTrackable.TrackingState.TRACKING) {
                    coloredAnchor.anchor.getPose().toMatrix(mAnchorMatrix, 0);
                    mVirtualObject.updateModelMatrix(mAnchorMatrix, mScaleFactor);
                    mVirtualObject.draw(viewmtx, projmtx, lightIntensity, coloredAnchor.color);
                }
            }

        } catch (Throwable t) {
            Log.e(TAG, "Exception on the OpenGL thread", t);
        }
    }

    private void hideLoadingMessage() {
        runOnUiThread(new Runnable() {
            @Override
            public void run() {
                if (mSearchingTextView != null) {
                    mSearchingTextView.setVisibility(View.GONE);
                    mSearchingTextView = null;
                }
            }
        });
    }

    @Override
    protected void onDestroy() {
        if (mSession != null) {
            mSession.stop();
            mSession = null;
        }
        super.onDestroy();
    }

    // Handle only one tap per frame, as taps are usually low frequency compared to frame rate.
    private void handleTap(ARFrame frame, ARCamera camera) {
        MotionEvent tap = mQueuedSingleTaps.poll();
        if (tap != null && camera.getTrackingState() == ARTrackable.TrackingState.TRACKING) {
            for (ARHitResult hit : frame.hitTest(tap)) {
                // Check if any plane was hit, and if it was hit inside the plane polygon
                ARTrackable trackable = hit.getTrackable();
                // Creates an anchor if a plane or an oriented point was hit.
                if ((trackable instanceof ARPlane
                        && ((ARPlane) trackable).isPoseInPolygon(hit.getHitPose()))
                        // && (PlaneRenderer.calculateDistanceToPlane(hit.getHitPose(), camera.getPose()) > 0))
                        || (trackable instanceof ARPoint
                        && ((ARPoint) trackable).getOrientationMode() == ARPoint.OrientationMode.ESTIMATED_SURFACE_NORMAL)) {
                    // Hits are sorted by depth. Consider only the closest hit on a plane or oriented point.
                    // Cap the number of objects created. This avoids overloading both the
                    // rendering system and AR Engine.
                    if (mAnchors.size() >= UtilsCommon.MAX_TRACKING_ANCHOR_NUM) {
                        mAnchors.get(0).anchor.detach();
                        mAnchors.remove(0);
                    }

                    // Assign a color to the object for rendering based on the trackable type
                    // this anchor is attached to. For AR_TRACKABLE_POINT it is blue, and
                    // for AR_TRACKABLE_PLANE it is green.
                    float[] objColor;
                    if (trackable instanceof ARPoint) {
                        objColor = new float[] {66.0f, 133.0f, 244.0f, 255.0f};
                    } else if (trackable instanceof ARPlane) {
                        objColor = new float[] {139.0f, 195.0f, 74.0f, 255.0f};
                    } else {
                        objColor = DEFAULT_COLOR;
                    }

                    // Adding an anchor tells AR Engine that it should track this position in
                    // space. This anchor is created on the plane to place the 3D model
                    // in the correct position relative both to the world and to the plane.
                    mAnchors.add(new ColoredARAnchor(hit.createAnchor(), objColor));
                    break;
                }
            }
        }
    }
}

You can now run the project on a device that supports Huawei AR Engine. Once the app is running and surface detection has completed, tap the screen to place and see your object.

Thank you for reading this far.
I hope this article helps you build your own AR app.
