
java - How to use a native C library in Android Studio


A few years ago I created an app based on https://ikaruga2.wordpress.com/2011/06/15/video-live-wallpaper-part-1/. The project was built in the version of Eclipse that Google provided directly at the time, and it worked fine with a copy of the compiled ffmpeg libraries created under my app's name.

Now I'm trying to create a new app based on my old one. Since Google no longer supports Eclipse, I downloaded Android Studio and imported my project. With a few tweaks I was able to compile the old version of the project successfully. So I changed the name, copied a new set of ".so" files into app\src\main\jniLibs\armeabi (where I believe they're supposed to go), and tried running the app on my phone again, with absolutely nothing else changed.

The NDK doesn't throw any errors, and Gradle compiles without errors and installs the app on my phone. The app appears in my live wallpaper list, and I can click it to bring up the preview. But instead of video I get an error, and logcat reports:

02-26 21:50:31.164  18757-18757/? E/AndroidRuntime﹕ FATAL EXCEPTION: main
java.lang.ExceptionInInitializerError
at com.nightscapecreations.anim3free.VideoLiveWallpaper.onSharedPreferenceChanged(VideoLiveWallpaper.java:165)
at com.nightscapecreations.anim3free.VideoLiveWallpaper.onCreate(VideoLiveWallpaper.java:81)
at android.app.ActivityThread.handleCreateService(ActivityThread.java:2273)
at android.app.ActivityThread.access$1600(ActivityThread.java:127)
at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1212)
at android.os.Handler.dispatchMessage(Handler.java:99)
at android.os.Looper.loop(Looper.java:137)
at android.app.ActivityThread.main(ActivityThread.java:4441)
at java.lang.reflect.Method.invokeNative(Native Method)
at java.lang.reflect.Method.invoke(Method.java:511)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:823)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:590)
at dalvik.system.NativeStart.main(Native Method)
Caused by: java.lang.UnsatisfiedLinkError: Cannot load library: link_image[1936]: 144 could not load needed library '/data/data/com.nightscapecreations.anim1free/lib/libavutil.so' for 'libavcore.so' (load_library[1091]: Library '/data/data/com.nightscapecreations.anim1free/lib/libavutil.so' not found)
at java.lang.Runtime.loadLibrary(Runtime.java:370)
at java.lang.System.loadLibrary(System.java:535)
at com.nightscapecreations.anim3free.NativeCalls.<clinit>(NativeCalls.java:64)
at com.nightscapecreations.anim3free.VideoLiveWallpaper.onSharedPreferenceChanged(VideoLiveWallpaper.java:165)
at com.nightscapecreations.anim3free.VideoLiveWallpaper.onCreate(VideoLiveWallpaper.java:81)
at android.app.ActivityThread.handleCreateService(ActivityThread.java:2273)
at android.app.ActivityThread.access$1600(ActivityThread.java:127)
at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1212)
at android.os.Handler.dispatchMessage(Handler.java:99)
at android.os.Looper.loop(Looper.java:137)
at android.app.ActivityThread.main(ActivityThread.java:4441)
at java.lang.reflect.Method.invokeNative(Native Method)
at java.lang.reflect.Method.invoke(Method.java:511)

I'm new to Android/Java/C++ development and I'm not sure what this error means, but Google leads me to believe my new libraries aren't being found. In my Eclipse project I had this set of libraries in "libs\armeabi", plus another copy of them in a more complex folder structure at "jni\ffmpeg-android\build\ffmpeg\armeabi\lib". Apart from renaming "libs" to "jniLibs", Android Studio appears to have kept everything the same, yet I'm hitting this error and can't work out how to proceed.
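In other words, the relevant layout of the old Eclipse project was roughly the following (reconstructed from the paths above, so treat it as a sketch rather than an exact listing):

<project>/
├── libs/armeabi/                                 (compiled .so files packaged with the app)
└── jni/
    ├── Android.mk
    ├── video.c
    └── ffmpeg-android/build/ffmpeg/armeabi/lib/  (prebuilt ffmpeg .so files)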

How can I compile this new app under its new name with Android Studio?

In case it helps, here is my Android.mk file:

LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)
MY_LIB_PATH := ffmpeg-android/build/ffmpeg/armeabi/lib
LOCAL_MODULE := bambuser-libavcore
LOCAL_SRC_FILES := $(MY_LIB_PATH)/libavcore.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := bambuser-libavformat
LOCAL_SRC_FILES := $(MY_LIB_PATH)/libavformat.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := bambuser-libavcodec
LOCAL_SRC_FILES := $(MY_LIB_PATH)/libavcodec.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := bambuser-libavfilter
LOCAL_SRC_FILES := $(MY_LIB_PATH)/libavfilter.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := bambuser-libavutil
LOCAL_SRC_FILES := $(MY_LIB_PATH)/libavutil.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := bambuser-libswscale
LOCAL_SRC_FILES := $(MY_LIB_PATH)/libswscale.so
include $(PREBUILT_SHARED_LIBRARY)

#local_PATH := $(call my-dir)

include $(CLEAR_VARS)

LOCAL_CFLAGS := -DANDROID_NDK \
                -DDISABLE_IMPORTGL

LOCAL_MODULE := video
LOCAL_SRC_FILES := video.c

LOCAL_C_INCLUDES := \
    $(LOCAL_PATH)/include \
    $(LOCAL_PATH)/ffmpeg-android/ffmpeg \
    $(LOCAL_PATH)/freetype/include/freetype2 \
    $(LOCAL_PATH)/freetype/include \
    $(LOCAL_PATH)/ftgl/src \
    $(LOCAL_PATH)/ftgl
LOCAL_LDLIBS := -L$(NDK_PLATFORMS_ROOT)/$(TARGET_PLATFORM)/arch-arm/usr/lib -L$(LOCAL_PATH) -L$(LOCAL_PATH)/ffmpeg-android/build/ffmpeg/armeabi/lib/ -lGLESv1_CM -ldl -lavformat -lavcodec -lavfilter -lavutil -lswscale -llog -lz -lm

include $(BUILD_SHARED_LIBRARY)
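For completeness: an Android.mk like this is normally accompanied by an Application.mk that selects the target ABI for ndk-build. A minimal sketch, assuming armeabi is the only ABI these prebuilts exist for and that android-9 is an acceptable platform level, would be:

# Application.mk (hypothetical values; match APP_PLATFORM to your minSdkVersion)
APP_ABI := armeabi
APP_PLATFORM := android-9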

And here is my NativeCalls.java:

package com.nightscapecreations.anim3free;

public class NativeCalls {
    //ffmpeg
    public static native void initVideo();
    public static native void loadVideo(String fileName); //
    public static native void prepareStorageFrame();
    public static native void getFrame(); //
    public static native void freeConversionStorage();
    public static native void closeVideo();//
    public static native void freeVideo();//
    //opengl
    public static native void initPreOpenGL(); //
    public static native void initOpenGL(); //
    public static native void drawFrame(); //
    public static native void closeOpenGL(); //
    public static native void closePostOpenGL();//
    //wallpaper
    public static native void updateVideoPosition();
    public static native void setSpanVideo(boolean b);
    //getters
    public static native int getVideoHeight();
    public static native int getVideoWidth();
    //setters
    public static native void setWallVideoDimensions(int w,int h);
    public static native void setWallDimensions(int w,int h);
    public static native void setScreenPadding(int w,int h);
    public static native void setVideoMargins(int w,int h);
    public static native void setDrawDimensions(int drawWidth,int drawHeight);
    public static native void setOffsets(int x,int y);
    public static native void setSteps(int xs,int ys);
    public static native void setScreenDimensions(int w, int h);
    public static native void setTextureDimensions(int tx, int ty);
    public static native void setOrientation(boolean b);
    public static native void setPreviewMode(boolean b);
    public static native void setTonality(int t);
    public static native void toggleGetFrame(boolean b);
    //fps
    public static native void setLoopVideo(boolean b);

    static {
        System.loadLibrary("avcore");
        System.loadLibrary("avformat");
        System.loadLibrary("avcodec");
        //System.loadLibrary("avdevice");
        System.loadLibrary("avfilter");
        System.loadLibrary("avutil");
        System.loadLibrary("swscale");
        System.loadLibrary("video");
    }

}
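One detail worth noting about the static initializer: the stack trace above shows the very first System.loadLibrary("avcore") call failing, because the linker looks for libavutil.so at an absolute path baked into the prebuilt library, and that path still names the old package (com.nightscapecreations.anim1free rather than anim3free). With ffmpeg builds of this vintage, a common workaround is to load each library before the libraries that depend on it, so the linker can satisfy dependencies from what is already in the process. A sketch of that ordering, assuming the usual ffmpeg dependency chain, would be:

static {
    // load in dependency order: libavutil has no ffmpeg dependencies,
    // while avcore/avcodec/avformat/avfilter/swscale build on top of it
    System.loadLibrary("avutil");
    System.loadLibrary("avcore");
    System.loadLibrary("avcodec");
    System.loadLibrary("avformat");
    System.loadLibrary("avfilter");
    System.loadLibrary("swscale");
    System.loadLibrary("video"); // the JNI wrapper library, loaded last
}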

EDIT

Here is the first part of my video.c file:

#include <GLES/gl.h>
#include <GLES/glext.h>

#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>

#include <stdlib.h>
#include <time.h>

#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
#include <libswscale/swscale.h>

#include <jni.h>
#include <string.h>
#include <stdio.h>
#include <android/log.h>

//#include <FTGL/ftgl.h>

//ffmpeg video variables
int initializedVideo=0;
int initializedFrame=0;
AVFormatContext *pFormatCtx=NULL;
int videoStream;
AVCodecContext *pCodecCtx=NULL;
AVCodec *pCodec=NULL;
AVFrame *pFrame=NULL;
AVPacket packet;
int frameFinished;
float aspect_ratio;

//ffmpeg video conversion variables
AVFrame *pFrameConverted=NULL;
int numBytes;
uint8_t *bufferConverted=NULL;

//opengl
int textureFormat=PIX_FMT_RGBA; // PIX_FMT_RGBA PIX_FMT_RGB24
int GL_colorFormat=GL_RGBA; // Must match the colorspace specified for textureFormat
int textureWidth=256;
int textureHeight=256;
int nTextureHeight=-256;
int textureL=0, textureR=0, textureW=0;
int frameTonality;

//GLuint textureConverted=0;
GLuint texturesConverted[2] = { 0,1 };
GLuint dummyTex = 2;
static int len=0;


static const char* BWVertexSrc =
    "attribute vec4 InVertex;\n"
    "attribute vec2 InTexCoord0;\n"
    "attribute vec2 InTexCoord1;\n"
    "uniform mat4 ProjectionModelviewMatrix;\n"
    "varying vec2 TexCoord0;\n"
    "varying vec2 TexCoord1;\n"

    "void main()\n"
    "{\n"
    " gl_Position = ProjectionModelviewMatrix * InVertex;\n"
    " TexCoord0 = InTexCoord0;\n"
    " TexCoord1 = InTexCoord1;\n"
    "}\n";

static const char* BWFragmentSrc =
    "#version 110\n"
    "uniform sampler2D Texture0;\n"
    "uniform sampler2D Texture1;\n"

    "varying vec2 TexCoord0;\n"
    "varying vec2 TexCoord1;\n"

    "void main()\n"
    "{\n"
    " vec3 color = texture2D(Texture0, TexCoord0).rgb;\n"
    " float gray = (color.r + color.g + color.b) / 3.0;\n"
    " vec3 grayscale = vec3(gray);\n"

    " gl_FragColor = vec4(grayscale, 1.0);\n"
    "}";
static GLuint shaderProgram;


//// Create a pixmap font from a TrueType file.
//FTGLPixmapFont font("/home/user/Arial.ttf");
//// Set the font size and render a small text.
//font.FaceSize(72);
//font.Render("Hello World!");

//screen dimensions
int screenWidth = 50;
int screenHeight= 50;
int screenL=0, screenR=0, screenW=0;
int dPaddingX=0,dPaddingY=0;
int drawWidth=50,drawHeight=50;

//wallpaper
int wallWidth = 50;
int wallHeight = 50;
int xOffSet, yOffSet;
int xStep, yStep;
jboolean spanVideo = JNI_TRUE;

//video dimensions
int wallVideoWidth = 0;
int wallVideoHeight = 0;
int marginX, marginY;
jboolean isScreenPortrait = JNI_TRUE;
jboolean isPreview = JNI_TRUE;
jboolean loopVideo = JNI_TRUE;
jboolean isGetFrame = JNI_TRUE;

//file
const char * szFileName;

#define max( a, b ) ( ((a) > (b)) ? (a) : (b) )
#define min( a, b ) ( ((a) < (b)) ? (a) : (b) )

//test variables
#define RGBA8(r, g, b) (((r) << (24)) | ((g) << (16)) | ((b) << (8)) | 255)
int sPixelsInited=JNI_FALSE;
uint32_t *s_pixels=NULL;

int s_pixels_size() {
    return (sizeof(uint32_t) * textureWidth * textureHeight * 5);
}

void render_pixels1(uint32_t *pixels, uint32_t c) {
    int x, y;
    /* fill in a square of 5 x 5 at s_x, s_y */
    for (y = 0; y < textureHeight; y++) {
        for (x = 0; x < textureWidth; x++) {
            int idx = x + y * textureWidth;
            pixels[idx++] = RGBA8(255, 255, 0);
        }
    }
}

void render_pixels2(uint32_t *pixels, uint32_t c) {
    int x, y;
    /* fill in a square of 5 x 5 at s_x, s_y */
    for (y = 0; y < textureHeight; y++) {
        for (x = 0; x < textureWidth; x++) {
            int idx = x + y * textureWidth;
            pixels[idx++] = RGBA8(0, 0, 255);
        }
    }
}

void Java_com_nightscapecreations_anim3free_NativeCalls_initVideo (JNIEnv * env, jobject this) {
    initializedVideo = 0;
    initializedFrame = 0;
}

/* list of things that get loaded: */
/* buffer */
/* pFrameConverted */
/* pFrame */
/* pCodecCtx */
/* pFormatCtx */
void Java_com_nightscapecreations_anim3free_NativeCalls_loadVideo (JNIEnv * env, jobject this, jstring fileName) {
    jboolean isCopy;
    szFileName = (*env)->GetStringUTFChars(env, fileName, &isCopy);
    //debug
    __android_log_print(ANDROID_LOG_DEBUG, "NDK: ", "NDK:LC: [%s]", szFileName);
    // Register all formats and codecs
    av_register_all();
    // Open video file
    if(av_open_input_file(&pFormatCtx, szFileName, NULL, 0, NULL)!=0) {
        __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Couldn't open file");
        return;
    }
    __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Successfully loaded file");
    // Retrieve stream information
    if(av_find_stream_info(pFormatCtx)<0) {
        __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Couldn't find stream information");
        return;
    }
    __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Found stream info");
    // Find the first video stream
    videoStream=-1;
    int i;
    for(i=0; i<pFormatCtx->nb_streams; i++)
        if(pFormatCtx->streams[i]->codec->codec_type==CODEC_TYPE_VIDEO) {
            videoStream=i;
            break;
        }
    if(videoStream==-1) {
        __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Didn't find a video stream");
        return;
    }
    __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Found video stream");
    // Get a pointer to the codec context for the video stream
    pCodecCtx=pFormatCtx->streams[videoStream]->codec;
    // Find the decoder for the video stream
    pCodec=avcodec_find_decoder(pCodecCtx->codec_id);
    if(pCodec==NULL) {
        __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Unsupported codec");
        return;
    }
    // Open codec
    if(avcodec_open(pCodecCtx, pCodec)<0) {
        __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Could not open codec");
        return;
    }
    // Allocate video frame (decoded pre-conversion frame)
    pFrame=avcodec_alloc_frame();
    // keep track of initialization
    initializedVideo = 1;
    __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Finished loading video");
}

//for this to work, you need to set the scaled video dimensions first
void Java_com_nightscapecreations_anim3free_NativeCalls_prepareStorageFrame (JNIEnv * env, jobject this) {
    // Allocate an AVFrame structure
    pFrameConverted=avcodec_alloc_frame();
    // Determine required buffer size and allocate buffer
    numBytes=avpicture_get_size(textureFormat, textureWidth, textureHeight);
    bufferConverted=(uint8_t *)av_malloc(numBytes*sizeof(uint8_t));
    if ( pFrameConverted == NULL || bufferConverted == NULL )
        __android_log_print(ANDROID_LOG_DEBUG, "prepareStorage>>>>", "Out of memory");
    // Assign appropriate parts of buffer to image planes in pFrameRGB
    // Note that pFrameRGB is an AVFrame, but AVFrame is a superset
    // of AVPicture
    avpicture_fill((AVPicture *)pFrameConverted, bufferConverted, textureFormat, textureWidth, textureHeight);
    __android_log_print(ANDROID_LOG_DEBUG, "prepareStorage>>>>", "Created frame");
    __android_log_print(ANDROID_LOG_DEBUG, "prepareStorage>>>>", "texture dimensions: %dx%d", textureWidth, textureHeight);
    initializedFrame = 1;
}

jint Java_com_nightscapecreations_anim3free_NativeCalls_getVideoWidth (JNIEnv * env, jobject this) {
    return pCodecCtx->width;
}

jint Java_com_nightscapecreations_anim3free_NativeCalls_getVideoHeight (JNIEnv * env, jobject this) {
    return pCodecCtx->height;
}

void Java_com_nightscapecreations_anim3free_NativeCalls_getFrame (JNIEnv * env, jobject this) {
    // keep reading packets until we hit the end or find a video packet
    while(av_read_frame(pFormatCtx, &packet)>=0) {
        static struct SwsContext *img_convert_ctx;
        // Is this a packet from the video stream?
        if(packet.stream_index==videoStream) {
            // Decode video frame
            /* __android_log_print(ANDROID_LOG_DEBUG, */
            /*     "video.c", */
            /*     "getFrame: Try to decode frame" */
            /* ); */
            avcodec_decode_video(pCodecCtx, pFrame, &frameFinished, packet.data, packet.size);
            // Did we get a video frame?
            if(frameFinished) {
                if(img_convert_ctx == NULL) {
                    /* get/set the scaling context */
                    int w = pCodecCtx->width;
                    int h = pCodecCtx->height;
                    img_convert_ctx = sws_getContext(w, h, pCodecCtx->pix_fmt, textureWidth, textureHeight, textureFormat, SWS_FAST_BILINEAR, NULL, NULL, NULL);
                    if(img_convert_ctx == NULL) {
                        return;
                    }
                }
                /* if img convert null */
                /* finally scale the image */
                /* __android_log_print(ANDROID_LOG_DEBUG, */
                /*     "video.c", */
                /*     "getFrame: Try to scale the image" */
                /* ); */

                //pFrameConverted = pFrame;
                sws_scale(img_convert_ctx, pFrame->data, pFrame->linesize, 0, pCodecCtx->height, pFrameConverted->data, pFrameConverted->linesize);
                //av_picture_crop(pFrameConverted->data, pFrame->data, 1, pCodecCtx->height, pCodecCtx->width);
                //av_picture_crop();
                //avfilter_vf_crop();

                /* do something with pFrameConverted */
                /* ... see drawFrame() */
                /* We found a video frame, did something with it, now free up
                   packet and return */
                av_free_packet(&packet);
                // __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrame.age: %d", pFrame->age);
                // __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrame.buffer_hints: %d", pFrame->buffer_hints);
                // __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrame.display_picture_number: %d", pFrame->display_picture_number);
                // __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrame.hwaccel_picture_private: %d", pFrame->hwaccel_picture_private);
                // __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrame.key_frame: %d", pFrame->key_frame);
                // __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrame.palette_has_changed: %d", pFrame->palette_has_changed);
                // __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrame.pict_type: %d", pFrame->pict_type);
                // __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrame.qscale_type: %d", pFrame->qscale_type);
                // __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrameConverted.age: %d", pFrameConverted->age);
                // __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrameConverted.buffer_hints: %d", pFrameConverted->buffer_hints);
                // __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrameConverted.display_picture_number: %d", pFrameConverted->display_picture_number);
                // __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrameConverted.hwaccel_picture_private: %d", pFrameConverted->hwaccel_picture_private);
                // __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrameConverted.key_frame: %d", pFrameConverted->key_frame);
                // __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrameConverted.palette_has_changed: %d", pFrameConverted->palette_has_changed);
                // __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrameConverted.pict_type: %d", pFrameConverted->pict_type);
                // __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrameConverted.qscale_type: %d", pFrameConverted->qscale_type);
                return;
            } /* if frame finished */
        } /* if packet video stream */
        // Free the packet that was allocated by av_read_frame
        av_free_packet(&packet);
    } /* while */
    //reload video when you get to the end
    av_seek_frame(pFormatCtx, videoStream, 0, AVSEEK_FLAG_ANY);
}

void Java_com_nightscapecreations_anim3free_NativeCalls_setLoopVideo (JNIEnv * env, jobject this, jboolean b) {
    loopVideo = b;
}

void Java_com_nightscapecreations_anim3free_NativeCalls_closeVideo (JNIEnv * env, jobject this) {
    if ( initializedFrame == 1 ) {
        // Free the converted image
        av_free(bufferConverted);
        av_free(pFrameConverted);
        initializedFrame = 0;
        __android_log_print(ANDROID_LOG_DEBUG, "closeVideo>>>>", "Freed converted image");
    }
    if ( initializedVideo == 1 ) {
        // Free the YUV frame
        av_free(pFrame);
        // Close the codec
        avcodec_close(pCodecCtx);
        // Close the video file
        av_close_input_file(pFormatCtx);
        initializedVideo = 0;
        __android_log_print(ANDROID_LOG_DEBUG, "closeVideo>>>>", "Freed video structures");
    }
}

void Java_com_nightscapecreations_anim3free_NativeCalls_freeVideo (JNIEnv * env, jobject this) {
    if ( initializedVideo == 1 ) {
        // Free the YUV frame
        av_free(pFrame);
        // Close the codec
        avcodec_close(pCodecCtx);
        // Close the video file
        av_close_input_file(pFormatCtx);
        __android_log_print(ANDROID_LOG_DEBUG, "closeVideo>>>>", "Freed video structures");
        initializedVideo = 0;
    }
}

void Java_com_nightscapecreations_anim3free_NativeCalls_freeConversionStorage (JNIEnv * env, jobject this) {
    if ( initializedFrame == 1 ) {
        // Free the converted image
        av_free(bufferConverted);
        av_freep(&pFrameConverted);
        initializedFrame = 0;
    }
}

/*--- END OF VIDEO ----*/

/* disable these capabilities. */
static GLuint s_disable_options[] = {
    GL_FOG,
    GL_LIGHTING,
    GL_CULL_FACE,
    GL_ALPHA_TEST,
    GL_BLEND,
    GL_COLOR_LOGIC_OP,
    GL_DITHER,
    GL_STENCIL_TEST,
    GL_DEPTH_TEST,
    GL_COLOR_MATERIAL,
    0
};

// For stuff that opengl needs to work with,
// like the bitmap containing the texture
void Java_com_nightscapecreations_anim3free_NativeCalls_initPreOpenGL (JNIEnv * env, jobject this) {

}
...
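Note that the exported JNI function names encode the Java package and class (Java_com_nightscapecreations_anim3free_NativeCalls_...), so the native code is tied to the application's package name: if the package is renamed, these entry points have to be renamed to match and libvideo.so rebuilt, otherwise loadLibrary will succeed but the first call into a native method will throw UnsatisfiedLinkError.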

Best Answer

If you just want to reuse your previous libraries rather than compile anything with the NDK, you can simply put all your .so files inside jniLibs/<abi>.
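For example, with the armeabi binaries named in the question, the layout would look something like this (assuming the usual app module name):

app/src/main/jniLibs/
└── armeabi/
    ├── libavcodec.so
    ├── libavcore.so
    ├── libavfilter.so
    ├── libavformat.so
    ├── libavutil.so
    ├── libswscale.so
    └── libvideo.so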

Otherwise, because your ndk-build depends on prebuilt libraries, you can't configure it properly through the gradle ndk{} configuration directly. In any case, since the built-in ndk support is deprecated for now, the cleanest way to get this working is to have gradle call ndk-build, using your existing Makefiles:

import org.apache.tools.ant.taskdefs.condition.Os

...

android {
    ...
    sourceSets.main {
        jniLibs.srcDir 'src/main/libs' // set .so files location to libs instead of jniLibs
        jni.srcDirs = [] // disable automatic ndk-build call
    }

    // add a task that calls regular ndk-build(.cmd) script from app directory
    task ndkBuild(type: Exec) {
        if (Os.isFamily(Os.FAMILY_WINDOWS)) {
            commandLine 'ndk-build.cmd', '-C', file('src/main').absolutePath
        } else {
            commandLine 'ndk-build', '-C', file('src/main').absolutePath
        }
    }

    // add this task as a dependency of Java compilation
    tasks.withType(JavaCompile) {
        compileTask -> compileTask.dependsOn ndkBuild
    }
}
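With this setup, ndk-build reads the existing jni/Android.mk under src/main and writes its output to src/main/libs/<abi> (plus intermediates under src/main/obj), which is why jniLibs is redirected to src/main/libs above. It also assumes ndk-build(.cmd) is on your PATH; if it isn't, a variant of the task can build the full path itself. A sketch, assuming a plugin version that exposes android.ndkDirectory:

// hypothetical variant of the ndkBuild task with an explicit NDK path
task ndkBuild(type: Exec) {
    def ndkBuildName = Os.isFamily(Os.FAMILY_WINDOWS) ? 'ndk-build.cmd' : 'ndk-build'
    commandLine new File(android.ndkDirectory, ndkBuildName).absolutePath,
            '-C', file('src/main').absolutePath
}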

On the topic of java - How to use a native C library in Android Studio, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/28756913/
