Working with Vuforia Engine and Android


How to Replace Datasets in the Android Image Targets Sample

The ImageTargets sample provided with Vuforia Engine shows how to augment image targets using the StonesAndChips device database (dataset).

Follow these steps to replace the sample dataset StonesAndChips.xml with your own dataset:

Generating Image Target Databases

  1. Log in to the Target Manager at https://developer.vuforia.com/targetmanager/
  2. Create a Device Database.
  3. Add one or more Image Targets to your database.
  4. In your Device Database page, select the targets to download with your database, then download the database for the Android platform. The database is delivered as a ZIP file.
  5. Unzip the downloaded database. The archive contains two files: MyImageTargets.dat and MyImageTargets.xml.
  6. Copy the MyImageTargets.xml and MyImageTargets.dat files into the assets/ directory of your Image Targets sample project (the same directory that contains StonesAndChips.xml).

Code changes - C++ sample

  1. See the ImageTargetsNative sample.
  2. Edit the source code in ImageTargets.cpp (under the /jni directory of your project):
    Set the name of your dataset XML file in the line that loads the dataset, replacing StonesAndChips.xml with MyImageTargets.xml, as shown in the following example.
if (!dataSetStonesAndChips->load("MyImageTargets.xml",
                                 Vuforia::DataSet::STORAGE_APPRESOURCE))
{
    LOG("Failed to load data set.");
    return 0;
}
  3. Open a Cygwin console and rebuild the native part of the project by running ndk-build.
  4. When the build is successful, refresh the Java project in Eclipse. To refresh, right-click the project and click Refresh, or press F5.
  5. Run the application on your device.

Code changes - Java samples

  1. See the VuforiaSamples project, available with Vuforia Engine 2.8 and above.
  2. Edit the source code by replacing the following line, found in the onCreate() method in ImageTargets.java under the folder \VuforiaSamples-x.x.x\src\main\java\com\vuforia\samples\VuforiaSamples\app\ImageTargets:
 mDatasetStrings.add("StonesAndChips.xml");

with this line:

 mDatasetStrings.add("MyImageTargets.xml");
  3. Run the application on your device.

Note: If the sample app still detects the chips and stones targets but does not detect your custom image targets, you must manually uninstall the ImageTargets sample app from your device and repeat the steps above.


How To Add Textures to the Native Android Sample

This article describes how to add textures to the native Android samples and how to swap out the textures at run time.

Add textures at startup

Adding new textures at startup is straightforward. Use any of the sample applications and follow these steps:

  1. Add the texture images to the assets folder.
  2. In the main activity class (for example, ImageTargets.java), find the loadTextures() method. Add a call to mTextures.add for each of your image files. The order in which the textures are added determines their order in the native texture array.
  3. In the native renderFrame() method (for example, in ImageTargets.cpp), find the point at which the texture object is obtained (textures[textureIndex]). Change the texture index to select the desired texture. The indices start at 0, where index 0 is the first texture added in the loadTextures() Java method. See the sketch after this list.
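For example, a minimal loadTextures() sketch; MyNewTexture.png is a hypothetical image placed in the assets folder, and TextureTeapotBrass.png stands in for a texture already shipped with the sample:

  private void loadTextures()
  {
      // Index 0: a texture that ships with the sample
      mTextures.add(Texture.loadTextureFromApk("TextureTeapotBrass.png",
                                               getAssets()));
      // Index 1: the new texture; select it natively via textures[1]
      mTextures.add(Texture.loadTextureFromApk("MyNewTexture.png",
                                               getAssets()));
  }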

Add textures at run time

It might be preferable to create textures at runtime, for example, when the texture image is downloaded from a server. In this case, do the following:

  1. We need a method to load a texture from a bitmap, rather than from the APK. The loadTextureFromApk() method in Texture.java can be modified for this purpose:
  public static Texture loadTextureFromBitmap(Bitmap bitMap)
  {
      int[] data = new int[bitMap.getWidth() * bitMap.getHeight()];
      bitMap.getPixels(data, 0, bitMap.getWidth(), 0, 0,
                          bitMap.getWidth(), bitMap.getHeight());
      // Convert packed ARGB ints to the RGBA byte order expected by OpenGL:
      byte[] dataBytes = new byte[bitMap.getWidth() *
                                 bitMap.getHeight() * 4];
      for (int p = 0; p < bitMap.getWidth() * bitMap.getHeight(); ++p)
      {
          int colour = data[p];
          dataBytes[p * 4]        = (byte)(colour >>> 16);    // R
          dataBytes[p * 4 + 1]    = (byte)(colour >>> 8);     // G
          dataBytes[p * 4 + 2]    = (byte) colour;            // B
          dataBytes[p * 4 + 3]    = (byte)(colour >>> 24);    // A
      }
      Texture texture = new Texture();
      texture.mWidth      = bitMap.getWidth();
      texture.mHeight     = bitMap.getHeight();
      texture.mChannels   = 4;
      texture.mData       = dataBytes;
      return texture;
  }
  2. We need a native method for creating the OpenGL texture. Here is a version that can be added to ImageTargets.cpp:
  Texture* myTexture = NULL;
  JNIEXPORT void JNICALL
  Java_com_vuforia_samples_ImageTargets_ImageTargetsRenderer_createGLTextureNative(
                     JNIEnv* env, jobject obj, 
                     jobject textureObject)
  {
      if (textureObject != NULL)
      {
          myTexture = Texture::create(env, textureObject);
          glGenTextures(1, &(myTexture->mTextureID));
          glBindTexture(GL_TEXTURE_2D, myTexture->mTextureID);
          glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
          glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
          glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
          glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
          glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, myTexture->mWidth,
                       myTexture->mHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE,
                       (GLvoid*) myTexture->mData);
      }
  }
  3. Add the corresponding method signature to ImageTargetsRenderer.java:
  public native void createGLTextureNative(Texture texture);
  public void createGLTexture(Texture texture)
  {
    createGLTextureNative(texture);
  }
  4. Be sure to call this method on the GL thread. To create a texture from an event that occurs on a different thread (for example, on image download), you can use the GLView queueEvent method:
  mGlView.queueEvent(new Runnable() {
      public void run() {
          mRenderer.createGLTexture(Texture.loadTextureFromBitmap(bmp));
      }
  });
  5. When you are done with the native texture, be sure to call glDeleteTextures to delete the GL texture. Also delete the Texture object.
  6. You may also need to free the Android Bitmap by calling Bitmap.recycle(). See the cleanup sketch after this list.
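A minimal Java-side cleanup sketch, assuming the GL texture id generated in the native code has been reported back to Java; the destroyGLTexture helper and its textureId parameter are hypothetical:

  public void destroyGLTexture(final int textureId)
  {
      // GL calls must be issued on the GL thread:
      mGlView.queueEvent(new Runnable() {
          public void run() {
              int[] ids = { textureId };
              GLES20.glDeleteTextures(1, ids, 0);
          }
      });
  }

After the GL texture is deleted, drop your reference to the Texture object and recycle the source Bitmap.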

How To Capture the AR View on Android

This article describes how to capture the augmented reality view on the Android platform, including the video background and the augmentation (for example, 3D models) that is rendered on top of it. The Image Targets sample is used as the reference sample.

  1. In ImageTargetsRenderer.java, you must create methods to:
  • Grab pixels from the OpenGL view
  • Create a bitmap based on the pixels captured in the previous step
  • Save the bitmap onto the device storage
  private void saveScreenShot(int x, int y, int w, int h, String filename) 
{
    Bitmap bmp = grabPixels(x, y, w, h);
 
    try {
        String path = Environment.getExternalStorageDirectory() + "/" + filename;
        DebugLog.LOGD(path);
        File file = new File(path);
        file.createNewFile();
        FileOutputStream fos = new FileOutputStream(file);
        bmp.compress(CompressFormat.PNG, 100, fos);
        fos.flush();
        fos.close();
    } catch (Exception e) {
        DebugLog.LOGD(e.toString());
    }
}
 
private Bitmap grabPixels(int x, int y, int w, int h) 
{
    int b[] = new int[w * (y + h)];
    int bt[] = new int[w * h];
 
    IntBuffer ib = IntBuffer.wrap(b);
    ib.position(0);
    GLES20.glReadPixels(x, 0, w, y + h,
                        GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, ib);
 
    for (int i = 0, k = 0; i < h; i++, k++) {
        for (int j = 0; j < w; j++) {
            int pix = b[i * w + j];
            int pb = (pix >> 16) & 0xff;
            int pr = (pix << 16) & 0x00ff0000;
            int pix1 = (pix & 0xff00ff00) | pr | pb;
            bt[(h - k - 1) * w + j] = pix1;
        }
    }
 
    Bitmap sb = Bitmap.createBitmap(bt, w, h, Bitmap.Config.ARGB_8888);
    return sb;
}
  2. Define two member variables to store the current view width and height:
 private int mViewWidth = 0; 
 private int mViewHeight = 0;

The mViewWidth and mViewHeight variables are updated in the onSurfaceChanged() method, as shown in the code below:

 public void onSurfaceChanged(GL10 gl, int width, int height) {
     DebugLog.LOGD("GLRenderer::onSurfaceChanged");

     // Call native function to update rendering when render surface
     // parameters have changed:
     updateRendering(width, height);

     mViewWidth = width;
     mViewHeight = height;

     // Call Vuforia function to handle render surface size changes:
     Vuforia.onSurfaceChanged(width, height);
 }
  3. Call the saveScreenShot() method from the onDrawFrame() method, right after calling the native renderFrame() function:
  public void onDrawFrame(GL10 gl)
  {
      if (!mIsActive)
          return;
   
      // Update render view (projection matrix and viewport) if needed:
      mActivity.updateRenderView();
   
      // Call our native function to render content
      renderFrame();
   
      // Make sure the OpenGL rendering is finalized
      GLES20.glFinish();
   
      if ( some_condition ) {
          saveScreenShot(0, 0, mViewWidth, mViewHeight, "test.png");
      }
  }
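The some_condition placeholder is left to the application. One common pattern, sketched here with a hypothetical mTakeScreenshot flag, is to request the capture from the UI thread and consume the request on the GL thread, so that exactly one frame is saved:

  // Hypothetical flag, set from the UI thread (e.g., a button handler):
  private volatile boolean mTakeScreenshot = false;

  public void requestScreenshot()
  {
      mTakeScreenshot = true;
  }

  // In onDrawFrame(), in place of some_condition:
  if (mTakeScreenshot) {
      mTakeScreenshot = false;  // capture exactly one frame
      saveScreenShot(0, 0, mViewWidth, mViewHeight, "test.png");
  }

These snippets assume the usual imports in ImageTargetsRenderer.java: android.graphics.Bitmap, android.graphics.Bitmap.CompressFormat, android.opengl.GLES20, android.os.Environment, java.io.File, java.io.FileOutputStream, and java.nio.IntBuffer.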

NOTE: To enable writing to the external storage, you must edit the AndroidManifest.xml file and add the following permission:

  <uses-permission
         android:name="android.permission.WRITE_EXTERNAL_STORAGE"
         android:maxSdkVersion="18" />

How To Display Toast on Target Detection and Open Website

This article describes how to display a Toast when a trackable is detected and then open a webpage.

  1. Start with the ImageTargets sample project. Open ImageTargets.cpp, which is located in the jni folder.
  2. Replace the for loop in the renderFrameForView method with the following C++ code.
// Did we find any trackables this frame?
for (int tIdx = 0; tIdx < state.getNumTrackableResults(); tIdx++)
{
    // Get the trackable:
    const Vuforia::TrackableResult* result = state.getTrackableResult(tIdx);
    const Vuforia::Trackable& trackable = result->getTrackable();

    // Compare this trackable's id to a globally stored id.
    // If this is a new trackable, find the displayMessage
    // Java method and call it with the trackable's name:
    if (trackable.getId() != lastTrackableId)
    {
        jstring js = env->NewStringUTF(trackable.getName());
        jclass javaClass = env->GetObjectClass(obj);
        jmethodID method = env->GetMethodID(javaClass, "displayMessage",
                                            "(Ljava/lang/String;)V");
        env->CallVoidMethod(obj, method, js);
        lastTrackableId = trackable.getId();
    }
}
  3. Define the lastTrackableId variable before the renderFrameForView function (C++):
int lastTrackableId = -1;
  4. Add the following code to ImageTargetsRenderer.java:
  public void displayMessage(String text)
  {
      // We use a handler because this thread cannot
      // change the UI
      Message message = new Message();
      message.obj = text;
      mainActivityHandler.sendMessage(message);
  }
  5. Add the following to the beginning of the ImageTargetsRenderer.java class definition:
  public static Handler mainActivityHandler;
  6. Add the Handler to ImageTargets.java in the onResume() method:
ImageTargetsRenderer.mainActivityHandler = new Handler() {
  @Override
  public void handleMessage(Message msg) {
      String text = (String) msg.obj;
  
      // The Toast displays the name of the detected
      // trackable on the screen
      Context context = getApplicationContext();
      int duration = Toast.LENGTH_SHORT;
      Toast toast = Toast.makeText(context, text, duration);
      toast.show();
              
      // The following opens a pre-defined URL based on the
      // name of trackable detected
      if (text.equalsIgnoreCase("stones")) {
          Uri uriUrl = Uri.parse("http://www.thingworx.com/");
          Intent launchBrowser = new Intent(Intent.ACTION_VIEW, uriUrl);
          startActivity(launchBrowser);
      }
      if (text.equalsIgnoreCase("chips")) {
          Uri uriUrl = Uri.parse("http://developer.vuforia.com");
          Intent launchBrowser = new Intent(Intent.ACTION_VIEW, uriUrl);
          startActivity(launchBrowser);
      }
  }
};
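These snippets assume the corresponding imports in ImageTargets.java: android.os.Handler, android.os.Message, android.content.Context, android.content.Intent, android.net.Uri, and android.widget.Toast.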

How To Update Native Android Samples to Use Native Activities

The Vuforia Unity samples use the Android NativeActivity class on devices that support it. It is possible to update the native Android samples to also use Native Activities. This allows the rendering loop to run entirely natively, possibly improving performance.

Follow these steps to use a NativeActivity with the ImageTargets sample:

  1. Set the Project Build Target to Android 2.3 at a minimum.
  2. In ImageTargets.java, set the class to extend NativeActivity, rather than Activity.
  3. Add the native-activity library to the static initializer block, like the following example:
static
{
    loadLibrary(NATIVE_LIB_QCAR);
    loadLibrary(NATIVE_LIB_SAMPLE);
    loadLibrary("native-activity");
}
  4. Remove all references to mGlView and mRenderer in ImageTargets.java, since these will be handled natively.
  5. Add the following code to ImageTargets.java:
private native void initRendering();
private native void updateRendering(int width, int height);
private native void cacheJNIVars();
private native void setAnimating(int value);
// Called from native
public void onGLInitialized()
{
    // Call native function to initialize rendering:
    initRendering();
    // Call QCAR function to (re)initialize rendering after first use
    // or after OpenGL ES context was lost (e.g. after onPause/onResume):
    QCAR.onSurfaceCreated();
}
// SurfaceHolder callback
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height)
{
    super.surfaceChanged(holder, format, width, height);
    onSurfaceChanged(width, height);
}
public void onSurfaceChanged(int width, int height)
{
    // Call native function to update rendering when render surface parameters have changed:
    updateRendering(width, height);
    // Call QCAR function to handle render surface size changes:
    QCAR.onSurfaceChanged(width, height);
}
  6. Comment out the call to initApplicationNative(mScreenWidth, mScreenHeight) in the initApplicationAR method. Instead, call it after the loadTextures() call in the onCreate method, as shown in the following example:
    mTextures = new Vector<Texture>();
    loadTextures();
    initApplicationNative(0, 0);
  7. Add the following code to the end of the onCreate method:
    cacheJNIVars();
  8. Call onSurfaceChanged after the call to onQCARInitializedNative() in updateApplicationStatus:
// Native post initialization:
    onQCARInitializedNative();
    onSurfaceChanged(mScreenWidth, mScreenHeight);
  9. Turn animating on and off when the camera is started or stopped:
case APPSTATUS_CAMERA_STOPPED:
    // Call the native function to stop the camera
    stopCamera();
    setAnimating(0);
    break;
case APPSTATUS_CAMERA_RUNNING:
    // Call the native function to start the camera
    startCamera();
    setProjectionMatrix();
    setAnimating(1);
    break;
  10. Set android:hasCode="true" in the Android manifest. Also, add the following metadata just before the intent filters:
<meta-data android:name="android.app.lib_name" android:value="native-activity" />
<intent-filter>
     <action android:name="android.intent.action.MAIN" />
     <category android:name="android.intent.category.LAUNCHER" />
</intent-filter>
  11. Find the native-activity sample in the Android NDK distribution. Copy main.c to the jni folder of the ImageTargets project and rename it main.cpp.
  12. Add the following code to the end of the Android.mk file:
  include $(CLEAR_VARS)
  LOCAL_MODULE := native-activity
  LOCAL_SRC_FILES := main.cpp
  LOCAL_LDLIBS := -llog -landroid -lEGL -lGLESv2
  LOCAL_STATIC_LIBRARIES := android_native_app_glue
  LOCAL_SHARED_LIBRARIES := QCAR-prebuilt ImageTargets
  LOCAL_ARM_MODE := arm
  include $(BUILD_SHARED_LIBRARY)
  $(call import-module,android/native_app_glue)
  13. In main.cpp, move the engine declaration from the android_main method to global scope:
  struct engine engine;
  14. Add the following code to main.cpp:
#ifdef __cplusplus
extern "C"
{
#endif
    JavaVM* javaVM = NULL;
    jclass activityClass;
    jobject activityObj;
    extern void renderFrame();
    JNIEXPORT void JNICALL
Java_com_vuforia_samples_ImageTargets_ImageTargets_cacheJNIVars(JNIEnv *env, jobject jobj)
    {
        env->GetJavaVM(&javaVM);
        jclass cls = env->GetObjectClass(jobj);
        activityClass = (jclass) env->NewGlobalRef(cls);
        activityObj = env->NewGlobalRef(jobj);
    }
    JNIEXPORT void JNICALL
    Java_com_vuforia_samples_ImageTargets_ImageTargets_setAnimating(JNIEnv *, jobject, jint animating)
    {
        engine.animating = animating;
    }
    void onGLInitialized()
    {
        if (javaVM == NULL)
        {
            LOGI("Error: javaVM is NULL");
            return;
        }
        JNIEnv *env;
        javaVM->AttachCurrentThread(&env, NULL);
        jmethodID method = env->GetMethodID(activityClass, "onGLInitialized", "()V");
        if (method == NULL)
        {
            LOGI("Error: could not find onGLInitialized method");
            return;
        }
        env->CallVoidMethod(activityObj, method);
    }
#ifdef __cplusplus
}
#endif
  15. Replace the engine_init_display method with the following code:
static int engine_init_display(struct engine* engine) {
    // initialize OpenGL ES and EGL
    const EGLint attribs[] = {
            EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
            EGL_SURFACE_TYPE, EGL_WINDOW_BIT,
            EGL_RED_SIZE, 5,
            EGL_GREEN_SIZE, 6,
            EGL_BLUE_SIZE, 5,
            //EGL_ALPHA_SIZE, 8, // assuming we don't need this
            EGL_DEPTH_SIZE, 16,
            EGL_STENCIL_SIZE, 0,
            EGL_NONE
    };
    EGLint w, h, dummy, format;
    EGLint numConfigs;
    EGLConfig config;
    EGLSurface surface;
    EGLContext context;
    EGLDisplay display = eglGetDisplay(EGL_DEFAULT_DISPLAY);
    eglInitialize(display, 0, 0);
    /* Here, the application chooses the configuration it desires. In this
     * sample, we have a very simplified selection process, where we pick
     * the first EGLConfig that matches our criteria */
    eglChooseConfig(display, attribs, &config, 1, &numConfigs);
    /* EGL_NATIVE_VISUAL_ID is an attribute of the EGLConfig that is
     * guaranteed to be accepted by ANativeWindow_setBuffersGeometry().
     * As soon as we picked a EGLConfig, we can safely reconfigure the
     * ANativeWindow buffers to match, using EGL_NATIVE_VISUAL_ID. */
    eglGetConfigAttrib(display, config, EGL_NATIVE_VISUAL_ID, &format);
    ANativeWindow_setBuffersGeometry(engine->app->window, 0, 0, format);
    surface = eglCreateWindowSurface(display, config, engine->app->window, NULL);
    const EGLint attrib_list_gl20[] = { EGL_CONTEXT_CLIENT_VERSION, 2, EGL_NONE };
    context = eglCreateContext(display, config, EGL_NO_CONTEXT, attrib_list_gl20);
    if (eglMakeCurrent(display, surface, surface, context) == EGL_FALSE) {
        LOGW("Unable to eglMakeCurrent");
        return -1;
    }
    eglQuerySurface(display, surface, EGL_WIDTH, &w);
    eglQuerySurface(display, surface, EGL_HEIGHT, &h);
    engine->display = display;
    engine->context = context;
    engine->surface = surface;
    engine->width = w;
    engine->height = h;
    engine->state.angle = 0;
    // Initialize GL state.
    //glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_FASTEST);
    glEnable(GL_CULL_FACE);
    //glShadeModel(GL_SMOOTH);
    glDisable(GL_DEPTH_TEST);
    onGLInitialized();
    return 0;
}
  16. Replace the engine_draw_frame method with the following:
static void engine_draw_frame(struct engine* engine) {
    if (engine->display == NULL) {
        // No display.
        return;
    }
    // Call the ImageTargets.cpp renderFrame method
    renderFrame();
    eglSwapBuffers(engine->display, engine->surface);
}
  17. Comment out the following line in engine_handle_input:
  engine->animating = 1;
  18. In ImageTargets.cpp, change the signatures of the following methods:
  JNIEXPORT void JNICALL
  Java_com_vuforia_samples_ImageTargets_ImageTargetsRenderer_renderFrame(
                          JNIEnv *, jobject)
  // change to
  void renderFrame()
  JNIEXPORT void JNICALL
  Java_com_vuforia_samples_ImageTargets_ImageTargetsRenderer_initRendering(
                          JNIEnv* env, jobject obj)
  // change to
  JNIEXPORT void JNICALL
  Java_com_vuforia_samples_ImageTargets_ImageTargets_initRendering(
                          JNIEnv* env, jobject obj)
  JNIEXPORT void JNICALL
  Java_com_vuforia_samples_ImageTargets_ImageTargetsRenderer_updateRendering(
                          JNIEnv* env, jobject obj, jint width, jint height)
  // change to
  JNIEXPORT void JNICALL
  Java_com_vuforia_samples_ImageTargets_ImageTargets_updateRendering(
                          JNIEnv* env, jobject obj, jint width, jint height)

How To Communicate Between Java and C++ using the JNI

This article describes how to use the JNI (Java Native Interface) to communicate between C++ and Java in the Android Vuforia SDK.

The Vuforia Android samples are set up so that application lifecycle events are handled in Java, but tracking events and rendering are handled natively in C++. Users can leverage Android SDK functionality, such as touch handling or networking logic, while doing the low-level graphics work natively. This functionality requires a way to communicate between Java and C++. This communication is provided by the JNI.

For a practical example of using the JNI to respond to native tracking events in Java, see How To Display Toast on Target Detection and Open Website, above.

Calling native methods from Java

All Vuforia Engine samples make a few JNI calls out of the box. In ImageTargets.java, look for method declarations starting with public native. For example: public native int initTracker();

This method is defined in ImageTargets.cpp as follows (C++):

#include <jni.h>

#ifdef __cplusplus
extern "C"
{
#endif

JNIEXPORT int JNICALL
Java_com_vuforia_samples_ImageTargets_ImageTargets_initTracker(JNIEnv *, jobject)
{
    ...
}

#ifdef __cplusplus
}
#endif

First, note that all JNI methods exposed in C++ are wrapped in an extern "C" block. The function return type must be surrounded by the JNIEXPORT and JNICALL macros, and the function name takes the form Java_package_class_function. We are effectively implementing the method that we declared in Java.

From Java, you can call this method like any other Java method:
int result = initTracker();

Calling Java methods from native

Compared to calling a native method from Java, going in the other direction takes a little more work. The following method exists in the ImageTargets.java class:
public int getTextureCount() { return mTextures.size(); }

Note that this method does not include any special JNI syntax, and you can call it from native code: look up the class that the ImageTargets Java object belongs to, and then look up the getTextureCount method in that class, as follows (C++):

JNIEXPORT void JNICALL
Java_com_vuforia_samples_ImageTargets_ImageTargets_initApplicationNative(
                            JNIEnv* env, jobject obj, jint width, jint height)
{
    ...
    jclass activityClass = env->GetObjectClass(obj);
    jmethodID getTextureCountMethodID = env->GetMethodID(activityClass,
                                                    "getTextureCount", "()I");
    if (getTextureCountMethodID == 0)
    {
        LOG("Function getTextureCount() not found.");
        return;
    }
    textureCount = env->CallIntMethod(obj, getTextureCountMethodID);
    ...
}

The last argument to the GetMethodID call, "()I", needs some explanation. It is a JNI type signature:

  • Place the argument types, in order, inside the parentheses.
  • After the parentheses, place the return type.
  • So (IF)Z would represent a method that takes an int and a float and returns a boolean.
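As an illustration, here are a few hypothetical Java methods alongside the descriptor that GetMethodID would need for each (the class and method bodies are illustrative only):

public class DescriptorExamples
{
    public int getTextureCount() { return 0; }                // "()I"
    public boolean setScale(int i, float s) { return true; }  // "(IF)Z"
    public void displayMessage(String text) { }               // "(Ljava/lang/String;)V"
}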

For more information and a table of the field descriptors, see the JNI documentation here: http://java.sun.com/docs/books/jni/html/types.html.

Storing JNI references for later

To call Java methods from C++, we need a handle on the JNIEnv object and the jobject, which in this case is the ImageTargets activity object. In the Vuforia Engine samples, getTextureCount is called directly from a method that is called from Java (initApplicationNative).

You automatically get the JNIEnv and jobject as the first two arguments to this function. However, to call a Java method at a later time, you need to cache these values globally. It is somewhat dangerous to store the JNIEnv globally, mainly because it is not thread safe. The following code illustrates a safe way to obtain the JNIEnv for the current thread at a later time; AttachCurrentThread simply returns the existing JNIEnv if the thread is already attached. Note the use of NewGlobalRef to make safe global references to the jclass and jobject values.

JavaVM* javaVM = NULL;
jclass activityClass;
jobject activityObj;
JNIEXPORT void JNICALL
Java_com_vuforia_samples_ImageTargets_ImageTargets_initApplicationNative(
                            JNIEnv* env, jobject obj, jint width, jint height)
{
    env->GetJavaVM(&javaVM);
    jclass cls = env->GetObjectClass(obj);
    activityClass = (jclass) env->NewGlobalRef(cls);
    activityObj = env->NewGlobalRef(obj);
}
void myNativeMethod()
{
    JNIEnv *env;
    javaVM->AttachCurrentThread(&env, NULL);
    jmethodID method = env->GetMethodID(activityClass, "myJavaMethod", "()V");
    env->CallVoidMethod(activityObj, method);
}
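On the Java side, the method looked up above is an ordinary public method of the activity class, for example:

public void myJavaMethod()
{
    // Invoked from C++ via env->CallVoidMethod(activityObj, method)
}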