
cocoa-touch - OpenGL texture from a CALayer (AVPlayerLayer)

Reposted · Author: 可可西里 · Updated: 2023-11-01 03:58:36

I have an AVPlayerLayer that I would like to create an OpenGL texture from. I'm comfortable with OpenGL textures, including converting a CGImageRef into one. It seems to me the code below should work, but all I get is plain black. What am I doing wrong? Do I need to set any properties on the CALayer/AVPlayerLayer first?

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    int width  = (int)[layer bounds].size.width;
    int height = (int)[layer bounds].size.height;

    CGContextRef context = CGBitmapContextCreate(NULL,
                                                 width,
                                                 height,
                                                 8,
                                                 width * 4,
                                                 colorSpace,
                                                 kCGImageAlphaPremultipliedLast);

    CGColorSpaceRelease(colorSpace);

    if (context == NULL) {
        ofLog(OF_LOG_ERROR, "getTextureFromLayer: failed to create context 1");
        return;
    }

    [[layer presentationLayer] renderInContext:context];

    CGImageRef cgImage = CGBitmapContextCreateImage(context);

    int bytesPerPixel = CGImageGetBitsPerPixel(cgImage) / 8;
    if (bytesPerPixel == 3) bytesPerPixel = 4;

    GLubyte *pixels = (GLubyte *)malloc(width * height * bytesPerPixel);

    CGContextRelease(context);
    context = CGBitmapContextCreate(pixels,
                                    width,
                                    height,
                                    CGImageGetBitsPerComponent(cgImage),
                                    width * bytesPerPixel,
                                    CGImageGetColorSpace(cgImage),
                                    kCGImageAlphaPremultipliedLast);

    if (context == NULL) {
        ofLog(OF_LOG_ERROR, "getTextureFromLayer: failed to create context 2");
        CGImageRelease(cgImage);   // don't leak the image on the error path
        free(pixels);
        return;
    }

    CGContextDrawImage(context, CGRectMake(0.0, 0.0, width, height), cgImage);

    int glMode;
    switch (bytesPerPixel) {
        case 1:
            glMode = GL_LUMINANCE;
            break;
        case 3:
            glMode = GL_RGB;
            break;
        case 4:
        default:
            glMode = GL_RGBA;
            break;
    }

    if (texture.bAllocated() == false || texture.getWidth() != width || texture.getHeight() != height) {
        NSLog(@"getTextureFromLayer: allocating texture %i, %i\n", width, height);
        texture.allocate(width, height, glMode, true);
    }

    // test texture
    // for (int i = 0; i < width * height * 4; i++) pixels[i] = ofRandomuf() * 255;

    texture.loadData(pixels, width, height, glMode);

    CGContextRelease(context);
    CGImageRelease(cgImage);
    free(pixels);
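The bytes-per-pixel to GL-format mapping in the listing above can be isolated into a small helper. A minimal sketch in plain C, with the standard GL enum values stubbed in so it stands alone (in real code they come from the GL headers):

```c
/* Standard OpenGL pixel-format enum values, stubbed here so the
 * sketch is self-contained; real code gets them from the GL headers. */
#define GL_LUMINANCE 0x1909
#define GL_RGB       0x1907
#define GL_RGBA      0x1908

/* Map bytes-per-pixel to a GL upload format, mirroring the switch in
 * the listing above. The 3-byte case should not actually occur here,
 * because the code promotes 24-bit images to 4 bytes per pixel. */
static int gl_mode_for_bpp(int bytesPerPixel)
{
    switch (bytesPerPixel) {
        case 1:  return GL_LUMINANCE;
        case 3:  return GL_RGB;
        case 4:
        default: return GL_RGBA;
    }
}
```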

P.S. The variable 'texture' is a C++ OpenGL(-ES compatible) texture object that I know works. If I uncomment the "test texture" for-loop to fill the texture with random noise, I can see it, so the problem is definitely earlier.
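Before resorting to random noise, a cheaper sanity check is to ask whether the rendered pixels are all zero, i.e. whether renderInContext: drew anything at all. A throwaway helper, sketched in plain C:

```c
#include <stdbool.h>
#include <stddef.h>

/* Return true if every byte of the pixel buffer is zero, meaning the
 * bitmap context still holds the transparent black it was created with. */
static bool buffer_is_blank(const unsigned char *pixels, size_t len)
{
    for (size_t i = 0; i < len; i++) {
        if (pixels[i] != 0)
            return false;
    }
    return true;
}
```

If this reports a blank buffer right after renderInContext:, the texture-upload half of the code is off the hook and the layer rendering itself is the suspect.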

Update

In response to Nick Weaver's reply I tried a different approach, and now I always get NULL back from copyNextSampleBuffer, with status == 3 (AVAssetReaderStatusFailed). Am I missing something?

Variables

    AVPlayer                 *videoPlayer;
    AVPlayerLayer            *videoLayer;
    AVAssetReader            *videoReader;
    AVAssetReaderTrackOutput *videoOutput;

Initialization

    videoPlayer = [[AVPlayer alloc] initWithURL:[NSURL fileURLWithPath:[NSString stringWithUTF8String:videoPath.c_str()]]];

    if (videoPlayer == nil) {
        NSLog(@"videoPlayer == nil ERROR LOADING %s\n", videoPath.c_str());
    } else {
        NSLog(@"videoPlayer: %@", videoPlayer);
        videoLayer = [[AVPlayerLayer playerLayerWithPlayer:videoPlayer] retain];
        videoLayer.frame = [ThreeDView instance].bounds;
        // [[ThreeDView instance].layer addSublayer:videoLayer]; // test to see if it's loading and running

        AVAsset *asset = videoPlayer.currentItem.asset;
        NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo];
        NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                                  [NSNumber numberWithInt:kCVPixelFormatType_32BGRA],
                                  (NSString *)kCVPixelBufferPixelFormatTypeKey, nil];

        videoReader = [[AVAssetReader alloc] initWithAsset:asset error:nil];
        videoOutput = [[AVAssetReaderTrackOutput alloc] initWithTrack:[tracks objectAtIndex:0] outputSettings:settings];
        [videoReader addOutput:videoOutput];
        [videoReader startReading];
    }

Draw loop

    if (videoPlayer == 0) {
        ofLog(OF_LOG_WARNING, "Shot::drawVideo: videoPlayer == 0");
        return;
    }

    if (videoOutput == 0) {
        ofLog(OF_LOG_WARNING, "Shot::drawVideo: videoOutput == 0");
        return;
    }

    CMSampleBufferRef sampleBuffer = [videoOutput copyNextSampleBuffer];

    if (sampleBuffer == 0) {
        ofLog(OF_LOG_ERROR, "Shot::drawVideo: sampleBuffer == 0, status: %i", videoReader.status);
        return;
    }

    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    unsigned char *pixels = (unsigned char *)CVPixelBufferGetBaseAddress(imageBuffer);

    int width  = (int)CVPixelBufferGetWidth(imageBuffer);
    int height = (int)CVPixelBufferGetHeight(imageBuffer);

    if (videoTexture.bAllocated() == false || videoTexture.getWidth() != width || videoTexture.getHeight() != height) {
        NSLog(@"Shot::drawVideo() allocating texture %i, %i\n", width, height);
        videoTexture.allocate(width, height, GL_RGBA, true);
    }

    videoTexture.loadData(pixels, width, height, GL_BGRA);

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // The image buffer is owned by the sample buffer, so release the
    // sample buffer only after the pixels are no longer needed.
    CFRelease(sampleBuffer);
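Two details in the loop above are easy to trip over: CVPixelBufferGetBytesPerRow() can be larger than width * 4 (Core Video pads rows), and not every GLES texture path accepts a GL_BGRA upload format. If either bites, a stride-aware CPU swizzle is one way out. A sketch in plain C, assuming the kCVPixelFormatType_32BGRA byte layout (B, G, R, A):

```c
#include <stddef.h>

/* Copy a possibly row-padded BGRA buffer into a tightly packed RGBA
 * buffer. src rows are srcBytesPerRow bytes apart (the value returned
 * by CVPixelBufferGetBytesPerRow); dst rows are exactly width * 4,
 * which is what a plain GL_RGBA loadData expects. */
static void bgra_to_rgba(const unsigned char *src, size_t srcBytesPerRow,
                         unsigned char *dst, int width, int height)
{
    for (int y = 0; y < height; y++) {
        const unsigned char *s = src + (size_t)y * srcBytesPerRow;
        unsigned char *d = dst + (size_t)y * (size_t)width * 4;
        for (int x = 0; x < width; x++) {
            d[0] = s[2]; /* R */
            d[1] = s[1]; /* G */
            d[2] = s[0]; /* B */
            d[3] = s[3]; /* A */
            s += 4;
            d += 4;
        }
    }
}
```

videoTexture.loadData(dst, width, height, GL_RGBA) would then upload the swizzled copy. The extra per-frame copy costs CPU time, so the GL_BGRA path is preferable whenever the driver supports it.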

Best Answer

I think iOS4: how do I use video file as an OpenGL texture? will help with your problem.

Regarding cocoa-touch - OpenGL texture from a CALayer (AVPlayerLayer), see the similar question on Stack Overflow: https://stackoverflow.com/questions/5773197/
