Classic mirrored queue with 24 replicas: one of the mirrors consumes more memory than others #12089
Replies: 3 comments
-
Please carefully read the community support guidelines: In your case:
-
Hi, When you see this behavior you can do the following:
-
@MrQiudaoyu classic mirrored queues were removed from RabbitMQ and will not get any community support from the core team; see https://www.rabbitmq.com/docs/ha. But one thing very obviously stands out in the described setup: have you read that doc's section on how many mirrors are optimal? In my 14 years as a RabbitMQ contributor, I have never seen a justification for running 24 replicas.
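As an illustrative sketch of what the linked doc recommends (a small, fixed replica count instead of mirroring to every node), a policy of this shape could replace `ha-all`. The policy name `ha-two` and the match-all pattern are examples, not taken from this thread:

```shell
# Hypothetical replacement policy: mirror each matching queue to
# 2 nodes total instead of all 24 ("exactly" mode with ha-params=2).
rabbitmqctl set_policy ha-two "^" \
  '{"ha-mode":"exactly","ha-params":2,"ha-sync-mode":"automatic"}' \
  --apply-to queues
```

Note that since classic mirroring is removed in current releases, quorum queues are the supported replicated queue type going forward.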
-
Describe the bug
We are running a 24-node cluster with a total of 18,000 queues, the mirroring policy set to ha-all, and no expiration configured for queues or messages.
While using the mirrored queues we ran into the following issue. Our publishing rate is very low, under 100 messages per second. After continuous publishing and consumption, memory on each node gradually increases and does not come back down, even after publishing and consuming are stopped.

Examining the memory details of the queue processes (as shown in the image below), we found that after messages are consumed, the mirror queue process keeps calling a function, "process next msg", which continues to occupy heap and stack memory. Additionally, roughly every two months one of our nodes experiences a sudden memory spike that triggers an alert. We are quite puzzled by this and usually resolve it by restarting the affected node when the alert fires. We hope the RabbitMQ team can help clarify this issue. Thank you very much!
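To narrow down which processes and categories actually hold the memory, RabbitMQ's own diagnostics can be used before resorting to a restart. The node name below is an assumed example:

```shell
# Per-category memory breakdown for one node (queue processes,
# binaries, connection processes, etc.); node name is an example.
rabbitmq-diagnostics memory_breakdown -n rabbit@node1

# Per-queue memory, to spot outlier queue processes.
rabbitmqctl list_queues name messages memory
```

Comparing the breakdown on the spiking node against a healthy node usually shows whether the growth sits in queue processes, binary heaps, or elsewhere.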
Reproduction steps
As described above
Expected behavior
Queue process memory grows continually, causing memory usage on the whole server node to increase without being released.
Additional context
No response