
c++ - iOS OpenCV callback "processImage" resolution does not match the ImageView

Reposted · Author: 太空宇宙 · Updated: 2023-11-03 23:17:42

I'm using OpenCV 3 on iOS, with the following code to capture video and process each frame:

videoCamera = [[CvVideoCamera alloc] initWithParentView:_imageView];
videoCamera.defaultAVCaptureDevicePosition = AVCaptureDevicePositionBack;
videoCamera.delegate = self;
videoCamera.defaultAVCaptureSessionPreset = AVCaptureSessionPreset640x480;
videoCamera.defaultAVCaptureVideoOrientation = AVCaptureVideoOrientationPortrait;
videoCamera.defaultFPS = 30;

and the callback:

- (void)processImage:(cv::Mat &)image {}

However, the image I receive is 640 rows by 480 columns, which is strange: if I put the image into an ImageView that is 640 wide by 480 high, it fits perfectly. Since OpenCV's Mat is row-major, the Mat should be 480 rows by 640 columns. I need to process it as a 480 × 640 Mat. Is there any solution?

I also tried transposing it, but then it looked strange when displayed in the ImageView. Perhaps OpenCV has already rotated the Mat implicitly internally?
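For the record, a transpose alone mirrors the image, which is probably why it looked strange: a true 90° rotation in OpenCV is a transpose followed by a flip (or, in OpenCV 3.x, a single `cv::rotate` with `cv::ROTATE_90_CLOCKWISE`). A minimal plain-C++ sketch of the row-major index arithmetic involved, independent of OpenCV:

```cpp
#include <cassert>
#include <vector>

// Rotate a row-major image buffer 90 degrees clockwise.
// Equivalent to cv::transpose followed by cv::flip(..., 1) -- i.e. what
// turns a 640x480 landscape frame into a 480x640 portrait one.
std::vector<int> rotate90cw(const std::vector<int>& src, int rows, int cols) {
    std::vector<int> dst(src.size());
    // The destination has `cols` rows and `rows` columns.
    for (int r = 0; r < rows; ++r)
        for (int c = 0; c < cols; ++c)
            dst[c * rows + (rows - 1 - r)] = src[r * cols + c];
    return dst;
}
```

Transposing alone would write `dst[c * rows + r]` instead, producing a mirrored image.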

Best Answer

Although it looks like you have already fixed it, I have a feeling you will run into similar problems later. Here is how to avoid all of them (I'm just pasting code from another answer of mine that happens to apply to this case as well):

You didn't specify whether the window in question is a view or a layer, whether it is live video or saved to a file, or whether the video was recorded through OpenCV or by some other means.

So I've included code snippets for each contingency; if you're familiar with the basics of OpenCV and iOS view programming, it should be obvious which ones you need (incidentally, in my case I used all of them):

- (void)viewDidLayoutSubviews {
    [super viewDidLayoutSubviews];

    switch ([UIDevice currentDevice].orientation) {
        case UIDeviceOrientationPortraitUpsideDown:
            self.videoCamera.defaultAVCaptureVideoOrientation = AVCaptureVideoOrientationPortraitUpsideDown;
            break;
        case UIDeviceOrientationLandscapeLeft:
            self.videoCamera.defaultAVCaptureVideoOrientation = AVCaptureVideoOrientationLandscapeLeft;
            break;
        case UIDeviceOrientationLandscapeRight:
            self.videoCamera.defaultAVCaptureVideoOrientation = AVCaptureVideoOrientationLandscapeRight;
            break;
        default:
            self.videoCamera.defaultAVCaptureVideoOrientation = AVCaptureVideoOrientationPortrait;
            break;
    }

    [self refresh];
}
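The intent of the switch is that every orientation without an explicit case (face up, face down, unknown) falls back to portrait. Stated as a pure function (the enums here are hypothetical stand-ins for `UIDeviceOrientation` and `AVCaptureVideoOrientation`, not real API types):

```cpp
#include <cassert>

// Hypothetical stand-ins for UIDeviceOrientation and AVCaptureVideoOrientation.
enum class DeviceOrientation {
    Portrait, PortraitUpsideDown, LandscapeLeft, LandscapeRight,
    FaceUp, FaceDown, Unknown
};
enum class VideoOrientation { Portrait, PortraitUpsideDown, LandscapeLeft, LandscapeRight };

// Mirrors the switch above: only the three explicit cases map directly;
// everything else (face up, face down, unknown) falls back to portrait.
VideoOrientation captureOrientationFor(DeviceOrientation d) {
    switch (d) {
        case DeviceOrientation::PortraitUpsideDown: return VideoOrientation::PortraitUpsideDown;
        case DeviceOrientation::LandscapeLeft:      return VideoOrientation::LandscapeLeft;
        case DeviceOrientation::LandscapeRight:     return VideoOrientation::LandscapeRight;
        default:                                    return VideoOrientation::Portrait;
    }
}
```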

- (void)processImage:(cv::Mat &)mat {
    if (self.videoCamera.running) {
        switch (self.videoCamera.defaultAVCaptureVideoOrientation) {
            case AVCaptureVideoOrientationLandscapeLeft:
            case AVCaptureVideoOrientationLandscapeRight:
                // The landscape video is captured upside-down.
                // Rotate it by 180 degrees.
                cv::flip(mat, mat, -1);
                break;
            default:
                break;
        }
    }
}


- (void)convertBlendSrcMatToWidth:(int)dstW height:(int)dstH {
    double dstAspectRatio = dstW / (double)dstH;

    int srcW = originalBlendSrcMat.cols;
    int srcH = originalBlendSrcMat.rows;
    double srcAspectRatio = srcW / (double)srcH;
    cv::Mat subMat;
    if (srcAspectRatio < dstAspectRatio) {
        int subMatH = (int)(srcW / dstAspectRatio);
        int startRow = (srcH - subMatH) / 2;
        int endRow = startRow + subMatH;
        subMat = originalBlendSrcMat.rowRange(startRow, endRow);
    } else {
        int subMatW = (int)(srcH * dstAspectRatio);
        int startCol = (srcW - subMatW) / 2;
        int endCol = startCol + subMatW;
        subMat = originalBlendSrcMat.colRange(startCol, endCol);
    }
    cv::resize(subMat, convertedBlendSrcMat, cv::Size(dstW, dstH), 0.0, 0.0, cv::INTER_LANCZOS4);
}
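The crop logic above can be isolated into a small pure function: it selects the largest centered sub-rectangle of the source that matches the destination aspect ratio, trimming rows when the source is too tall and columns when it is too wide. A plain C++ sketch (`CropRect` and `centerCrop` are illustrative names, not part of the original code):

```cpp
#include <cassert>

// x, y: top-left corner of the crop; w, h: its size.
struct CropRect { int x, y, w, h; };

// Compute the centered sub-rectangle of a srcW x srcH image that has the
// same aspect ratio as dstW x dstH -- the crop taken before resizing.
CropRect centerCrop(int srcW, int srcH, int dstW, int dstH) {
    double dstAspect = dstW / (double)dstH;
    double srcAspect = srcW / (double)srcH;
    if (srcAspect < dstAspect) {
        // Source is too tall: keep the full width, trim rows top and bottom.
        int h = (int)(srcW / dstAspect);
        return {0, (srcH - h) / 2, srcW, h};
    }
    // Source is too wide: keep the full height, trim columns left and right.
    int w = (int)(srcH * dstAspect);
    return {(srcW - w) / 2, 0, w, srcH};
}
```

For example, cropping a 640×480 landscape frame for a 480×640 portrait target keeps the full 480-pixel height and a centered 360-pixel-wide band of columns.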



- (int)imageWidth {
    AVCaptureVideoDataOutput *output = [self.captureSession.outputs lastObject];
    NSDictionary *videoSettings = [output videoSettings];
    int videoWidth = [[videoSettings objectForKey:@"Width"] intValue];
    return videoWidth;
}

- (int)imageHeight {
    AVCaptureVideoDataOutput *output = [self.captureSession.outputs lastObject];
    NSDictionary *videoSettings = [output videoSettings];
    int videoHeight = [[videoSettings objectForKey:@"Height"] intValue];
    return videoHeight;
}

- (void)updateSize {
    // Do nothing.
}

- (void)layoutPreviewLayer {
    if (self.parentView != nil) {

        // Center the video preview.
        self.customPreviewLayer.position = CGPointMake(0.5 * self.parentView.frame.size.width, 0.5 * self.parentView.frame.size.height);

        // Find the video's aspect ratio.
        CGFloat videoAspectRatio = self.imageWidth / (CGFloat)self.imageHeight;

        // Scale the video preview while maintaining its aspect ratio.
        CGFloat boundsW;
        CGFloat boundsH;
        if (self.imageHeight > self.imageWidth) {
            if (self.letterboxPreview) {
                boundsH = self.parentView.frame.size.height;
                boundsW = boundsH * videoAspectRatio;
            } else {
                boundsW = self.parentView.frame.size.width;
                boundsH = boundsW / videoAspectRatio;
            }
        } else {
            if (self.letterboxPreview) {
                boundsW = self.parentView.frame.size.width;
                boundsH = boundsW / videoAspectRatio;
            } else {
                boundsH = self.parentView.frame.size.height;
                boundsW = boundsH * videoAspectRatio;
            }
        }
        self.customPreviewLayer.bounds = CGRectMake(0.0, 0.0, boundsW, boundsH);
    }
}
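The four branches in layoutPreviewLayer collapse to a single test: scale to the parent's height when the video's portrait-ness matches the letterbox flag, otherwise scale to its width. A plain C++ sketch of that arithmetic (`Bounds` and `previewBounds` are illustrative names):

```cpp
#include <cassert>

// Width and height of the preview layer's bounds.
struct Bounds { double w, h; };

// Mirrors layoutPreviewLayer's branching: the (portrait video, letterbox)
// and (landscape video, fill) cases scale to the parent's height; the
// other two cases scale to the parent's width.
Bounds previewBounds(double parentW, double parentH,
                     int imageW, int imageH, bool letterbox) {
    double aspect = imageW / (double)imageH;
    bool scaleToHeight = (imageH > imageW) == letterbox;
    if (scaleToHeight) {
        return {parentH * aspect, parentH};
    }
    return {parentW, parentW / aspect};
}
```

For a 480×640 portrait video in a 320×568 parent view, letterboxing yields bounds of 426×568, while filling yields 320 points wide and correspondingly taller.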


There is a lot here, and you have to know where each piece goes. If anything is unclear, let me know.

Regarding c++ - iOS OpenCV callback "processImage" resolution does not match the ImageView, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/36468593/
