This post asks how to stream the camera feed from one iOS device to another using Multipeer Connectivity. The asker is already capturing frames and converting each sample buffer to a UIImage with the following delegate code:
```objectivec
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Create a UIImage from the sample buffer data
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
}

// Create a UIImage from sample buffer data
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get the base address and the number of bytes per row of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);

    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
                                                 colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);

    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // Free up the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return image;
}
```
Can we use Multipeer Connectivity to send the sample buffer data directly to another device, or is there any efficient way to stream this data to other iOS devices?
Thanks.
The peer that sends the stream uses this code in the captureOutput delegate method:
```objectivec
NSData *imageData = UIImageJPEGRepresentation(cgBackedImage, 0.2);

// maybe not always the correct input? just using this to send the current FPS...
AVCaptureInputPort *inputPort = connection.inputPorts[0];
AVCaptureDeviceInput *deviceInput = (AVCaptureDeviceInput *)inputPort.input;
CMTime frameDuration = deviceInput.device.activeVideoMaxFrameDuration;

NSDictionary *dict = @{ @"image": imageData,
                        @"timestamp": timestamp,
                        @"framesPerSecond": @(frameDuration.timescale) };
NSData *data = [NSKeyedArchiver archivedDataWithRootObject:dict];

[_session sendData:data
           toPeers:_session.connectedPeers
          withMode:MCSessionSendDataReliable
             error:nil];
```
On the receiving side:
```objectivec
- (void)session:(MCSession *)session didReceiveData:(NSData *)data fromPeer:(MCPeerID *)peerID
{
    // NSLog(@"(%@) Read %d bytes", peerID.displayName, data.length);
    NSDictionary *dict = (NSDictionary *)[NSKeyedUnarchiver unarchiveObjectWithData:data];
    UIImage *image = [UIImage imageWithData:dict[@"image"] scale:2.0];
    NSNumber *framesPerSecond = dict[@"framesPerSecond"];
}
```
Since we receive the FPS value along with each frame, we can use it to configure how we display the streamed images.
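As a minimal sketch of that idea (the property names `displayTimer`, `imageView`, and `latestImage` are assumptions for illustration, not from the original answer), the received framesPerSecond value could drive a timer that refreshes a UIImageView at the sender's frame rate:

```objectivec
// Hypothetical receiver-side helper: redraw the most recently received frame
// at the sender's reported frame rate. `self.displayTimer`, `self.imageView`,
// and `self.latestImage` are assumed properties, not part of the original answer.
- (void)startDisplayTimerWithFramesPerSecond:(NSNumber *)framesPerSecond
{
    NSTimeInterval interval = 1.0 / framesPerSecond.doubleValue;
    [self.displayTimer invalidate];
    self.displayTimer = [NSTimer scheduledTimerWithTimeInterval:interval
                                                         target:self
                                                       selector:@selector(refreshImageView)
                                                       userInfo:nil
                                                        repeats:YES];
}

- (void)refreshImageView
{
    // UIKit must only be touched on the main thread; NSTimer fires on the
    // run loop it was scheduled on, so schedule this timer on the main thread.
    self.imageView.image = self.latestImage;
}
```

In the didReceiveData callback above, the decoded image would be stored into `latestImage` instead of being displayed directly, decoupling the network arrival rate from the display rate.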
Hope it helps.
Thanks.