iOS – How to capture frame-by-frame images from iPhone video recording in real time
Idea http://i62.tinypic.com/2af9zia.png
I am following Apple's this guide. I updated the code to use ARC, and my view controller is an AVCaptureVideoDataOutputSampleBufferDelegate, but I don't know how to actually start capturing data — that is, how to start the camera and get real input.
Here is my code:
#import "ViewController.h"

@interface ViewController ()

@property (nonatomic, strong) AVCaptureSession *session;
@property (nonatomic, strong) AVCaptureVideoPreviewLayer *previewLayer;

@end

@implementation ViewController

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    [self setupCaptureSession];
}

- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

// Create and configure a capture session and start it running
- (void)setupCaptureSession
{
    NSError *error = nil;

    // Create the session
    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    // Configure the session to produce lower resolution video frames, if your
    // processing algorithm can cope. We'll specify medium quality for the
    // chosen device.
    session.sessionPreset = AVCaptureSessionPresetMedium;

    // Find a suitable AVCaptureDevice
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    // Create a device input with the device and add it to the session.
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        // Handle the error appropriately.
    }
    [session addInput:input];

    // Create a VideoDataOutput and add it to the session
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    [session addOutput:output];

    // Configure your output.
    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];

    // Specify the pixel format
    output.videoSettings = [NSDictionary dictionaryWithObject:
                                [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                       forKey:(id)kCVPixelBufferPixelFormatTypeKey];

    // Start the session running to start the flow of data
    [self startCapturingWithSession:session];

    // Assign session to an ivar.
    [self setSession:session];
}

- (void)startCapturingWithSession:(AVCaptureSession *)captureSession
{
    //----- DISPLAY THE PREVIEW LAYER -----
    // Display it full screen under our view controller's existing controls
    NSLog(@"Display the preview layer");
    CGRect layerRect = [[[self view] layer] bounds];
    [self.previewLayer setBounds:layerRect];
    [self.previewLayer setPosition:CGPointMake(CGRectGetMidX(layerRect), CGRectGetMidY(layerRect))];
    //[[[self view] layer] addSublayer:[[self CaptureManager] self.previewLayer]];
    // We use this instead so it goes on a layer behind our UI controls (avoids us
    // having to manually bring each control to the front):
    UIView *cameraView = [[UIView alloc] init];
    [[self view] addSubview:cameraView];
    [self.view sendSubviewToBack:cameraView];
    [[cameraView layer] addSublayer:self.previewLayer];

    //----- START THE CAPTURE SESSION RUNNING -----
    [captureSession startRunning];
}

// Delegate routine that is called when a sample buffer was written
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Create a UIImage from the sample buffer data
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
}

// Create a UIImage from sample buffer data
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
        colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // Free up the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return image;
}

@end
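One caveat worth checking before debugging further: on iOS 7 and later the session will silently produce no frames if the app lacks camera permission. This is not from the original post; it is a minimal sketch using the standard AVCaptureDevice authorization API, assuming NSCameraUsageDescription is present in Info.plist (required on iOS 10+):

```objectivec
// Sketch (not part of the original answer): request camera access first,
// then start the capture setup on the main queue once access is granted.
[AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo
                         completionHandler:^(BOOL granted) {
    dispatch_async(dispatch_get_main_queue(), ^{
        if (granted) {
            [self setupCaptureSession];
        } else {
            NSLog(@"Camera access denied; no frames will be delivered");
        }
    });
}];
```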
The working version differs from the code above in two places: startCapturingWithSession: actually creates the AVCaptureVideoPreviewLayer from the session (in the original code self.previewLayer was never initialized, so nothing was ever displayed), and the delegate callback sets the video orientation on the connection. The rest of the class, including imageFromSampleBuffer:, is unchanged:

- (void)startCapturingWithSession:(AVCaptureSession *)captureSession
{
    NSLog(@"Adding video preview layer");
    [self setPreviewLayer:[[AVCaptureVideoPreviewLayer alloc] initWithSession:captureSession]];
    [self.previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];

    //----- DISPLAY THE PREVIEW LAYER -----
    // Display it full screen under our view controller's existing controls
    NSLog(@"Display the preview layer");
    CGRect layerRect = [[[self view] layer] bounds];
    [self.previewLayer setBounds:layerRect];
    [self.previewLayer setPosition:CGPointMake(CGRectGetMidX(layerRect), CGRectGetMidY(layerRect))];
    // Put the preview on a layer behind our UI controls (avoids having to
    // manually bring each control to the front):
    UIView *cameraView = [[UIView alloc] init];
    [[self view] addSubview:cameraView];
    [self.view sendSubviewToBack:cameraView];
    [[cameraView layer] addSublayer:self.previewLayer];

    //----- START THE CAPTURE SESSION RUNNING -----
    [captureSession startRunning];
}

// Delegate routine that is called when a sample buffer was written
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Fix the frame orientation, then create a UIImage from the sample buffer data
    [connection setVideoOrientation:AVCaptureVideoOrientationLandscapeLeft];
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
}
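If you only need raw pixel values rather than a UIImage, you can read them straight out of the locked pixel buffer inside the delegate callback. This is a hedged sketch, not from the original post; it assumes the kCVPixelFormatType_32BGRA setting configured above, and the coordinates are made up for illustration. Note that rows may be padded, so pixel (x, y) must be located with bytesPerRow, not width * 4:

```objectivec
// Sketch: read one BGRA pixel directly from the sample buffer.
CVImageBufferRef buffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(buffer, kCVPixelBufferLock_ReadOnly);

uint8_t *base = CVPixelBufferGetBaseAddress(buffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(buffer); // may exceed width * 4 (row padding)

size_t x = 10, y = 20; // hypothetical coordinates
uint8_t *pixel = base + y * bytesPerRow + x * 4;
uint8_t blue  = pixel[0];
uint8_t green = pixel[1];
uint8_t red   = pixel[2];
uint8_t alpha = pixel[3];

CVPixelBufferUnlockBaseAddress(buffer, kCVPixelBufferLock_ReadOnly);
```

Locking with kCVPixelBufferLock_ReadOnly is cheaper than a full read/write lock when you only inspect the data.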