Prerequisites
Please answer the following questions for yourself before submitting an issue.
Expected Behavior
MiniCPM-V 2.6 memory leak? Expected: memory usage stays stable across repeated inferences instead of growing.
Current Behavior
I am testing minicpm-v-2.6; here is part of my test code:
When I run the code, memory usage grows by about 10 MB per inference. Using memory_profiler, I traced the growth to
llama-cpp-python/llama_cpp/llama_chat_format.py
Line 2839 in 2bc1d97
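For reference, this is the kind of per-iteration measurement that shows the growth. It is a self-contained sketch: a deliberately leaky dummy function stands in for the real chat-completion call, and the standard-library tracemalloc is used instead of memory_profiler so the snippet runs without extra dependencies.

```python
import tracemalloc

_cache = []  # stands in for whatever keeps a reference alive between calls


def leaky_inference():
    # Stand-in for an inference call that retains its image embed:
    # each call appends ~1 MB that is never released.
    _cache.append(bytearray(1024 * 1024))


def measure_growth(fn, iterations=5):
    """Call fn repeatedly and return net traced bytes after each call."""
    tracemalloc.start()
    sizes = []
    for _ in range(iterations):
        fn()
        current, _peak = tracemalloc.get_traced_memory()
        sizes.append(current)
    tracemalloc.stop()
    return sizes


sizes = measure_growth(leaky_inference)
# A strictly increasing sequence means something holds a reference across
# iterations — the same signature I see with the real inference loop.
print(all(b > a for a, b in zip(sizes, sizes[1:])))  # → True
```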
At first I thought the variable "embed" wasn't being released, so I added code to manually release it at line 2856, but the program then reported an error.
Then I replaced "embed" with "self._last_image_embed", but the memory leak continued.
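A generic way to check whether an object like the embed is actually being released is to hold only a weak reference to it and see whether it dies once the strong references should be gone. This is a minimal sketch using stand-in classes, not the llama-cpp-python API itself; the "holder" dict mimics the caching done via self._last_image_embed.

```python
import gc
import weakref


class Embed:
    """Stand-in for the image-embed object under suspicion."""


holder = {}  # mimics self._last_image_embed keeping the object alive


def run_once(keep_last=True):
    embed = Embed()
    if keep_last:
        holder["last"] = embed  # cache a strong reference, as the handler does
    return weakref.ref(embed)  # weak ref lets us observe collection


ref = run_once(keep_last=True)
gc.collect()
print(ref() is not None)  # cached → still alive → True

ref = run_once(keep_last=False)
gc.collect()
print(ref() is None)  # not cached → collected → True
```

If the weak reference stays alive after the cache slot is cleared, some other code path still holds the object, which would explain why swapping "embed" for "self._last_image_embed" did not stop the leak.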
Environment and Context
Please provide detailed information about your computer setup. This is important in case the issue is not reproducible except under certain specific conditions.
Example environment info: