
Memory warnings and app crashes. #2

Open

5ysourcesafe opened this issue Nov 1, 2017 · 11 comments

5ysourcesafe commented Nov 1, 2017

Hi bottotl,

Can you please help with the memory warnings and app crashes below?

The app crashes in this case:

Got memory pressure notification (critical)

2017-11-01 18:50:48.364536+0530 Edovi[21650:6364194] [MC] Invalidating cache
2017-11-01 18:51:03.619378+0530 Edovi[21650:6351902] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2017-11-01 18:51:03.646973+0530 Edovi[21650:6351902] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2017-11-01 18:51:03.688721+0530 Edovi[21650:6351902] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2017-11-01 18:51:21.358407+0530 Edovi[21650:6351901] [GatekeeperXPC] Connection to assetsd was interrupted or assetsd died
2017-11-01 18:51:36.579644+0530 Edovi[21650:6347783] Got memory pressure notification (non-critical)
2017-11-01 18:51:43.767365+0530 Edovi[21650:6347783] System is no longer under (non-critical) memory pressure.
2017-11-01 18:52:15.053472+0530 Edovi[21650:6373407] [MC] Invalidating cache
2017-11-01 18:52:15.315994+0530 Edovi[21650:6351902] [MC] Invalidating cache
2017-11-01 18:53:10.168759+0530 Edovi[21650:6347783] Got memory pressure notification (non-critical)
2017-11-01 18:53:34.551883+0530 Edovi[21650:6364194] [MC] Invalidating cache
2017-11-01 18:53:34.748212+0530 Edovi[21650:6364194] [MC] Invalidating cache
2017-11-01 18:54:20.383428+0530 Edovi[21650:6347783] System is no longer under (non-critical) memory pressure.
2017-11-01 18:54:40.657309+0530 Edovi[21650:6388455] [MC] Invalidating cache
2017-11-01 18:54:40.865958+0530 Edovi[21650:6351903] [MC] Invalidating cache
2017-11-01 19:00:03.209629+0530 Edovi[21650:6351903] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2017-11-01 19:00:03.236189+0530 Edovi[21650:6351903] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2017-11-01 19:00:03.257170+0530 Edovi[21650:6351903] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2017-11-01 19:00:03.276643+0530 Edovi[21650:6351903] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2017-11-01 19:00:03.295550+0530 Edovi[21650:6351903] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2017-11-01 19:00:03.316632+0530 Edovi[21650:6351903] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2017-11-01 19:00:03.338264+0530 Edovi[21650:6351903] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2017-11-01 19:00:03.357406+0530 Edovi[21650:6351903] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2017-11-01 19:00:03.376734+0530 Edovi[21650:6351903] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2017-11-01 19:00:03.395740+0530 Edovi[21650:6351903] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2017-11-01 19:00:03.420811+0530 Edovi[21650:6351903] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2017-11-01 19:00:03.440474+0530 Edovi[21650:6351903] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2017-11-01 19:00:03.459381+0530 Edovi[21650:6351903] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2017-11-01 19:00:08.752881+0530 Edovi[21650:6351903] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.


bottotl commented Nov 1, 2017 via email


bottotl commented Nov 2, 2017

Check these things.

Follow these practices for best performance:
Don’t create a CIContext object every time you render.
Contexts store a lot of state information; it’s more efficient to reuse them.
Evaluate whether your app needs color management. Don't use it unless you need it. See Does Your App Need Color Management? https://developer.apple.com/library/content/documentation/GraphicsImaging/Conceptual/CoreImaging/ci_performance/ci_performance.html#//apple_ref/doc/uid/TP30001185-CH10-SW7.
Avoid Core Animation animations while rendering CIImage objects with a GPU context.
If you need to use both simultaneously, you can set up both to use the CPU.
Make sure images don’t exceed CPU and GPU limits.
Image size limits for CIContext objects differ depending on whether Core Image uses the CPU or GPU. Check the limit by using the methods inputImageMaximumSize https://developer.apple.com/documentation/coreimage/cicontext/1620425-inputimagemaximumsize and outputImageMaximumSize https://developer.apple.com/documentation/coreimage/cicontext/1620335-outputimagemaximumsize.
Use smaller images when possible.
Performance scales with the number of output pixels. You can have Core Image render into a smaller view, texture, or framebuffer. Allow Core Animation to upscale to display size.
Use Core Graphics or Image I/O functions to crop or downsample, such as the functions CGImageCreateWithImageInRect https://developer.apple.com/documentation/coregraphics/1454683-cgimagecreatewithimageinrect or CGImageSourceCreateThumbnailAtIndex https://developer.apple.com/documentation/imageio/1465099-cgimagesourcecreatethumbnailatin.
The UIImageView class works best with static images.
If your app needs to get the best performance, use lower-level APIs.
Avoid unnecessary texture transfers between the CPU and GPU.
Render to a rectangle that is the same size as the source image before applying a contents scale factor.
Consider using simpler filters that can produce results similar to algorithmic filters.
For example, CIColorCube can produce output similar to CISepiaTone, and do so more efficiently.
Take advantage of the support for YUV images in iOS 6.0 and later.
Camera pixel buffers are natively YUV, but most image processing algorithms expect RGBA data. There is a cost to converting between the two. Core Image supports reading YUV from CVPixelBuffer objects and applying the appropriate color transform.
options = @{ (id)kCVPixelBufferPixelFormatTypeKey :
             @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange) };
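To illustrate the first point above (reusing a CIContext rather than creating one per render), here is a minimal sketch; `JFTRenderer` and its method names are hypothetical, and the color-management option assumes the app does not need it:

```objc
#import <CoreImage/CoreImage.h>

// Hypothetical renderer that owns one long-lived CIContext.
@interface JFTRenderer : NSObject
- (void)renderImage:(CIImage *)image toBuffer:(CVPixelBufferRef)buffer;
@end

@implementation JFTRenderer {
    CIContext *_ciContext; // created once, reused for every frame
}

- (instancetype)init {
    if (self = [super init]) {
        // Disable color management when the app does not need it
        // (see "Does Your App Need Color Management?" above).
        _ciContext = [CIContext contextWithOptions:
            @{ kCIContextWorkingColorSpace : [NSNull null] }];
    }
    return self;
}

- (void)renderImage:(CIImage *)image toBuffer:(CVPixelBufferRef)buffer {
    // No per-frame context creation: the cached context keeps its state.
    [_ciContext render:image toCVPixelBuffer:buffer];
}
@end
```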

See the linked documentation for more information.

Here is some code:

- (CVPixelBufferRef)finishPassthroughCompositionRequest:(AVAsynchronousVideoCompositionRequest *)request error:(NSError **)errOut {
    @autoreleasepool {
        JFTAVCustomVideoCompositionInstruction *instruction = request.videoCompositionInstruction;
        JFTAVCustomVideoCompositionLayerInstruction *simpleLayerInstruction = instruction.simpleLayerInstructions.firstObject;
        CGSize renderSize = _renderContext.size;
        CVPixelBufferRef pixelBuffer = [_renderContext newPixelBuffer];
        // Check the output buffer before rendering into it, not after.
        if (!pixelBuffer) {
            if (errOut) {
                *errOut = [NSError errorWithDomain:@"finishPassthroughCompositionRequest error unknown"
                                              code:1000
                                          userInfo:nil];
            }
            return NULL;
        }
        if (!request.sourceTrackIDs.count) {
            NSLog(@"request.sourceTrackIDs.count does not exist");

            // No source track: fill the frame with black.
            CIImage *emptyImage = [CIImage imageWithColor:[CIColor colorWithCGColor:[UIColor blackColor].CGColor]];
            emptyImage = [emptyImage imageByCroppingToRect:CGRectMake(0, 0, renderSize.width, renderSize.height)];
            [_ciContext render:emptyImage toCVPixelBuffer:pixelBuffer];

            return pixelBuffer;
        }
        CMPersistentTrackID trackID = simpleLayerInstruction ? simpleLayerInstruction.trackID : request.sourceTrackIDs[0].intValue;
        CVPixelBufferRef sourcePixels = [request sourceFrameByTrackID:trackID];
        if (!sourcePixels) {
            CVPixelBufferRelease(pixelBuffer); // avoid leaking the output buffer
            return NULL;
        }

        CIImage *sourceImage = [CIImage imageWithCVPixelBuffer:sourcePixels];
        if (simpleLayerInstruction) {
            sourceImage = [sourceImage imageByApplyingTransform:[self transformFix:simpleLayerInstruction.transform extent:sourceImage.extent]];
        }

        if (simpleLayerInstruction.videoItem.filter) {
            [simpleLayerInstruction.videoItem.filter setValue:sourceImage forKey:kCIInputImageKey];
            sourceImage = simpleLayerInstruction.videoItem.filter.outputImage;
        }

        [_ciContext render:sourceImage toCVPixelBuffer:pixelBuffer];

        return pixelBuffer;
    }
}


bottotl commented Nov 2, 2017

The code you replied with in the email has some problems:

CIContext *context = [CIContext contextWithOptions:nil];
CIImage *inputCIImage = [CIImage imageWithCGImage:[image CGImage]];
[filter setValue:inputCIImage forKeyPath:kCIInputImageKey];
CIImage *outputImage = [filter outputImage];
CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
UIImage *img = [UIImage imageWithCGImage:cgimg];
CGImageRelease(cgimg);
return img;

It is like playing back video by doing the following:

  1. Decode video --> get UIImage
  2. add filter to UIImage
  3. show UIImage to user

This does not look like a good approach. Use CVPixelBufferRef instead of UIImage.

Using AVFoundation for playback is very simple.
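As a hedged sketch of that approach: AVFoundation can apply a Core Image filter during playback via AVVideoComposition's CIFilter handler, so frames stay in pixel buffers and are never round-tripped through UIImage (here `videoURL` is an assumed file URL and the filter choice is arbitrary):

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>

// videoURL is assumed to point at a local video file.
AVAsset *asset = [AVAsset assetWithURL:videoURL];
CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"];

// Frames arrive as CIImages backed by pixel buffers; no UIImage involved.
AVVideoComposition *composition =
    [AVVideoComposition videoCompositionWithAsset:asset
                     applyingCIFiltersWithHandler:^(AVAsynchronousCIImageFilteringRequest *request) {
        [filter setValue:request.sourceImage forKey:kCIInputImageKey];
        // Crop back to the source extent in case the filter grows the image.
        CIImage *output = [filter.outputImage imageByCroppingToRect:request.sourceImage.extent];
        [request finishWithImage:output context:nil];
    }];

AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
item.videoComposition = composition;
AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
[player play];
```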


bottotl commented Nov 2, 2017 via email


5ysourcesafe commented Nov 2, 2017

Thanks for your response, bottotl.

I am trying to merge multiple videos, applying different filters and transitions to the individual videos the user adds to a project.
But when I try to export a 20-30 minute video, the app crashes, even though the app's own memory usage is only 60-70 MB while another process uses around 700-800 MB. Can you please help with this? I am new to the AVFoundation framework.

Is it possible to export videos of unlimited length?


bottotl commented Nov 2, 2017 via email

@5ysourcesafe

Yes, I am using JFTAVAssetExportSession.


bottotl commented Nov 2, 2017 via email


bottotl commented Nov 2, 2017

It works fine when I try to export a video of about 2 GB on my iPhone 7 running iOS 11.1. Instruments shows no leaks and the app does not crash. I did find some small leaks, but nothing serious; I will fix them this weekend.

There is a huge leak problem when using CIContext in the simulator. I found that others have the same problem.
It looks like an Apple bug that has remained unfixed since iOS 9.
I will try to figure out whether there is any way to prevent it.


bottotl commented Nov 4, 2017

@5ysourcesafe
Considering that AVAssetReader may post too many frames to the compositor, there are two ways to solve this problem:

  1. Slow down the post operation.
  2. Stop rendering until objects have been released.

I added some protection for when a memory warning happens.
The custom compositor observes UIApplicationDidReceiveMemoryWarningNotification; when it receives this warning, it lets the render queue sleep for 1 second.
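That protection could be sketched roughly like this (an illustration, not the project's exact code; `_renderQueue` is an assumed serial dispatch queue owned by the compositor):

```objc
#import <UIKit/UIKit.h>

// On a memory warning, block the render queue for 1 second so frames
// already in flight can drain and their buffers can be released.
[[NSNotificationCenter defaultCenter]
    addObserverForName:UIApplicationDidReceiveMemoryWarningNotification
                object:nil
                 queue:nil
            usingBlock:^(NSNotification *note) {
    // _renderQueue: assumed serial queue on which frames are rendered.
    dispatch_async(_renderQueue, ^{
        // Sleeping here pauses frame delivery without dropping work.
        [NSThread sleepForTimeInterval:1.0];
    });
}];
```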


bottotl commented Nov 6, 2017

My co-worker tells me that, since CIContext does have memory issues (inside its black box), using OpenGL to apply filters to images can solve the memory problem.
But I'm not familiar with OpenGL, so I will give it a try.
