I can't answer the specific question posed, but I've successfully recorded video and grabbed frames at the same time by using:

- AVCaptureSession and AVCaptureVideoDataOutput to route frames into my own code
- AVAssetWriter, AVAssetWriterInput and AVAssetWriterInputPixelBufferAdaptor to write the frames out to an H.264-encoded movie file

That's without investigating audio. I end up getting CMSampleBuffers from the capture session and then pushing them into the pixel buffer adaptor.

EDIT: my code looks more or less like the following, with the bits you shouldn't have any problem with skimmed over and scoping issues ignored:
/* to ensure I'm given incoming CMSampleBuffers */
AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
captureSession.sessionPreset = AVCaptureSessionPreset640x480; // or your preferred preset

AVCaptureDevice *captureDevice =
    [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *deviceInput =
    [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:NULL];
[captureSession addInput:deviceInput];

AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
output.videoSettings = [NSDictionary dictionaryWithObject:
        [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
    forKey:(id)kCVPixelBufferPixelFormatTypeKey];
[output setSampleBufferDelegate:self queue:dispatch_queue_create("video capture", NULL)];
[captureSession addOutput:output];
/* to prepare for output; I'll output 640x480 in H.264, via an asset writer */
NSDictionary *outputSettings =
[NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:640], AVVideoWidthKey,
[NSNumber numberWithInt:480], AVVideoHeightKey,
AVVideoCodecH264, AVVideoCodecKey,
nil];
AVAssetWriterInput *assetWriterInput = [AVAssetWriterInput
assetWriterInputWithMediaType:AVMediaTypeVideo
outputSettings:outputSettings];
/* I'm going to push pixel buffers to it, so will need an
AVAssetWriterInputPixelBufferAdaptor, expecting the same 32BGRA input as I've
asked the AVCaptureVideoDataOutput to supply */
AVAssetWriterInputPixelBufferAdaptor *pixelBufferAdaptor =
[[AVAssetWriterInputPixelBufferAdaptor alloc]
initWithAssetWriterInput:assetWriterInput
sourcePixelBufferAttributes:
[NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kCVPixelFormatType_32BGRA],
kCVPixelBufferPixelFormatTypeKey,
nil]];
/* that's going to go somewhere, I imagine you've got the URL for that sorted,
so create a suitable asset writer; we'll put our H.264 within the normal
MPEG4 container */
NSError *writerError = nil;
AVAssetWriter *assetWriter = [[AVAssetWriter alloc]
    initWithURL:URLFromSomewhere
    fileType:AVFileTypeMPEG4
    error:&writerError]; // check writerError; this example is too lazy to
[assetWriter addInput:assetWriterInput];
/* we need to warn the input to expect real time data incoming, so that it tries
to avoid being unavailable at inopportune moments */
assetWriterInput.expectsMediaDataInRealTime = YES;
... eventually ...
[assetWriter startWriting];
[assetWriter startSessionAtSourceTime:kCMTimeZero];
[captureSession startRunning];
... elsewhere ...
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // a very dense way to keep track of the time at which this frame
    // occurs relative to the output stream, but it's just an example!
    static int64_t frameNumber = 0;
    if (assetWriterInput.readyForMoreMediaData) {
        [pixelBufferAdaptor appendPixelBuffer:imageBuffer
                         withPresentationTime:CMTimeMake(frameNumber, 25)];
    }
    frameNumber++;
}
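As an aside: rather than counting frames and assuming a fixed 25 fps, you can take the presentation time straight from the sample buffer. A minimal sketch, assuming `assetWriter`, `assetWriterInput` and `pixelBufferAdaptor` are the same instance variables as above, and that you drop the `startSessionAtSourceTime:kCMTimeZero` call in favour of starting the session at the first frame's capture time:

```objc
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CMTime presentationTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);

    // start the writer session at the first frame's capture time, so output
    // timestamps line up with the capture clock instead of a frame counter
    static BOOL sessionStarted = NO;
    if (!sessionStarted) {
        [assetWriter startSessionAtSourceTime:presentationTime];
        sessionStarted = YES;
    }

    if (assetWriterInput.readyForMoreMediaData) {
        [pixelBufferAdaptor appendPixelBuffer:imageBuffer
                         withPresentationTime:presentationTime];
    }
}
```

This way dropped frames don't distort the output timeline, and variable frame rates are handled for free.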
... and, to stop, ensuring the output file is finished properly ...
[captureSession stopRunning];
[assetWriter finishWriting];
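Note that `finishWriting` blocks the calling thread and was deprecated in later SDKs; on iOS 6 / OS X 10.9 and newer, the asynchronous variant is preferred. A sketch of the teardown under that assumption:

```objc
// stop delivering frames before finishing the writer
[captureSession stopRunning];
[assetWriterInput markAsFinished];
[assetWriter finishWritingWithCompletionHandler:^{
    // the movie file at the writer's outputURL is now complete
}];
```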
_"Video can be recorded directly to a file with AVCaptureMovieFileOutput. However, this class has no displayable data and **cannot be used** simultaneously with AVCaptureVideoDataOutput."_ Found here: [link](https://developer.xamarin.com/api/type/MonoTouch.AVFoundation.AVCaptureSession/) .. just to clarify the actual cause of the problem – Csharpest