Imagine this scenario:
I have several image views initialized with the same frame image (an instance of YYFrameImage), and I add them to a view simultaneously. Apparently, the more image views there are, the slower the animations become, because decoding the images drives CPU usage up. So is adding an array to YYFrameImage to cache the decoded images a correct approach, like the code below?
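The snippet the question refers to is not reproduced here. As a rough illustration only (the class name, ivar, and override below are my own sketch, not code from YYImage or from the original post), such a cache inside a YYFrameImage subclass might look like this:

```objc
// Hypothetical sketch: a YYFrameImage subclass that decodes each frame once
// and reuses the decoded bitmap across all image views sharing this image.
// Note: access to the dictionary is not synchronized here; YYAnimatedImageView
// may call this method off the main thread, so real code would need a lock.
@interface CachedFrameImage : YYFrameImage
@end

@implementation CachedFrameImage {
    NSMutableDictionary<NSNumber *, UIImage *> *_decodedFrames; // index -> decoded frame
}

- (UIImage *)animatedImageFrameAtIndex:(NSUInteger)index {
    UIImage *frame = _decodedFrames[@(index)];
    if (!frame) {
        if (!_decodedFrames) _decodedFrames = [NSMutableDictionary dictionary];
        frame = [super animatedImageFrameAtIndex:index]; // decode once
        if (frame) _decodedFrames[@(index)] = frame;     // keep the decoded bitmap alive
    }
    return frame;
}
@end
```

The trade-off, as the reply below notes, is that this pins every decoded frame in memory for the lifetime of the image.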
Different image views may be at different playback positions, so you would have to cache all frames in memory. You could use [UIImage animatedImageWithImages:duration:] instead.
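For reference, `+[UIImage animatedImageWithImages:duration:]` is a standard UIKit factory method; a minimal usage sketch (the frame asset names are placeholders) could look like:

```objc
// Decode the frames once, then share a single animated UIImage across views.
// "frame0" / "frame1" are placeholder asset names.
NSArray<UIImage *> *frames = @[[UIImage imageNamed:@"frame0"],
                               [UIImage imageNamed:@"frame1"]];
UIImage *animated = [UIImage animatedImageWithImages:frames duration:0.5];

// Every UIImageView assigned this image animates without re-decoding.
UIImageView *imageView = [[UIImageView alloc] initWithImage:animated];
```

The limitation is that all views sharing the image use the same fixed timing and loop indefinitely, with no per-view playback control.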
OK, I see. In my scenario it may be better to use cached images, because many image views showing the same frame image can be on screen at the same time, and with the current implementation this causes very high CPU usage (around 90% on average). I will subclass UIImage and conform to <YYAnimatedImage> to cache the frame images.
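A sketch of that direction might look like the following. The method names are taken from the YYAnimatedImage protocol as I understand it from the YYImage headers (verify against your version of the framework); the class name and initializer are illustrative:

```objc
// Hedged sketch: a UIImage subclass holding pre-decoded frames in memory and
// vending them through <YYAnimatedImage>, so the view never decodes twice.
@interface CachedAnimatedImage : UIImage <YYAnimatedImage>
- (instancetype)initWithImages:(NSArray<UIImage *> *)images
                     durations:(NSArray<NSNumber *> *)durations;
@end

@implementation CachedAnimatedImage {
    NSArray<UIImage *> *_frames;     // already-decoded frames, retained for the image's lifetime
    NSArray<NSNumber *> *_durations; // per-frame durations in seconds
}

- (instancetype)initWithImages:(NSArray<UIImage *> *)images
                     durations:(NSArray<NSNumber *> *)durations {
    if (self = [super init]) {
        _frames = [images copy];
        _durations = [durations copy];
    }
    return self;
}

- (NSUInteger)animatedImageFrameCount { return _frames.count; }
- (NSUInteger)animatedImageLoopCount { return 0; } // 0 = loop forever
- (NSUInteger)animatedImageBytesPerFrame {
    CGImageRef first = _frames.firstObject.CGImage;
    return first ? CGImageGetBytesPerRow(first) * CGImageGetHeight(first) : 0;
}
- (UIImage *)animatedImageFrameAtIndex:(NSUInteger)index {
    return index < _frames.count ? _frames[index] : nil;
}
- (NSTimeInterval)animatedImageDurationAtIndex:(NSUInteger)index {
    return index < _durations.count ? _durations[index].doubleValue : 0;
}
@end
```

Because `animatedImageFrameAtIndex:` only reads from an immutable array, it stays cheap no matter how many image views share one instance.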
What's more, there are other possible features, such as a loop mode (repeat or reverse) and a loop range (playing from one frame to another rather than from start to end). I will open a pull request if the features described above are suitable for this framework.
Thanks for your reply and this great framework. 👍
And we could also add methods to handle memory warnings and background mode.
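One possible shape for the memory-warning part (the `sharedFrameCache` name is a hypothetical NSCache holding the decoded frames, not an existing API): listen for the system notification and drop the cache, letting frames be re-decoded lazily on the next request.

```objc
// Purge decoded frames under memory pressure; they are re-decoded on demand.
[[NSNotificationCenter defaultCenter]
    addObserverForName:UIApplicationDidReceiveMemoryWarningNotification
                object:nil
                 queue:[NSOperationQueue mainQueue]
            usingBlock:^(NSNotification *note) {
    [sharedFrameCache removeAllObjects]; // assumed NSCache of decoded frames
}];
```

Using NSCache in the first place is worth considering here, since it already evicts its contents automatically when the system is under memory pressure.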