It could be beneficial to introduce metrics at the granularity of individual AppWrappers.
For metrics such as consumed memory, this would let us aggregate by AppWrapper, yielding the total memory consumption across all nodes for a given AppWrapper, or aggregate by node to identify overloaded nodes.
Assuming we can relate AppWrappers to users, it would also give us resource consumption metrics per user.
For example:

Aggregating by AppWrapper:

sum by (appwrapper) (used_memory)
used_memory{appwrapper="aw1"} 600MiB
used_memory{appwrapper="aw2"} 900MiB
Aggregating by Node:
sum by (node) (used_memory)
used_memory{node="worker1"} 500MiB
used_memory{node="worker2"} 200MiB
used_memory{node="worker3"} 800MiB
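The two aggregations above can be sketched with plain Python. This is a minimal illustration of what `sum by (...)` computes, using hypothetical per-pod samples whose totals match the figures shown (aw1 = 600 MiB, aw2 = 900 MiB; worker1 = 500 MiB, worker2 = 200 MiB, worker3 = 800 MiB):

```python
from collections import defaultdict

# Hypothetical per-pod samples: (appwrapper, node, used_memory in MiB).
samples = [
    ("aw1", "worker1", 400),
    ("aw1", "worker2", 200),
    ("aw2", "worker1", 100),
    ("aw2", "worker3", 800),
]

def sum_by(samples, key_index):
    """Sum used_memory grouped by one label: 0 = appwrapper, 1 = node."""
    totals = defaultdict(int)
    for sample in samples:
        totals[sample[key_index]] += sample[2]
    return dict(totals)

print(sum_by(samples, 0))  # totals per AppWrapper: aw1=600, aw2=900
print(sum_by(samples, 1))  # totals per node: worker1=500, worker2=200, worker3=800
```

The point is that a single `used_memory` series labeled with both `appwrapper` and `node` supports both views; Prometheus does the grouping at query time.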
However, a potential concern is high label cardinality if there are many AppWrappers. One mitigation would be to put this feature behind a feature flag, so that only operators who actually need per-AppWrapper metrics pay that cost.
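The flag-gating idea could look roughly like the sketch below: the high-cardinality `appwrapper` label is only attached when the flag is on. Note that the flag name `ENABLE_APPWRAPPER_METRICS` and the helper are hypothetical, just to illustrate the shape; the real implementation would live in the controller's metrics setup.

```python
import os

# Hypothetical operator feature flag (name is an assumption, not an existing option).
PER_APPWRAPPER_METRICS = os.environ.get("ENABLE_APPWRAPPER_METRICS", "false") == "true"

def metric_labels(appwrapper, node, per_appwrapper=PER_APPWRAPPER_METRICS):
    # Always label by node; add the high-cardinality appwrapper label
    # only when the feature flag is enabled.
    labels = {"node": node}
    if per_appwrapper:
        labels["appwrapper"] = appwrapper
    return labels

print(metric_labels("aw1", "worker1", per_appwrapper=False))
print(metric_labels("aw1", "worker1", per_appwrapper=True))
```

With the flag off, the time-series count stays bounded by the number of nodes; with it on, it grows with the number of live AppWrappers.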
@tardieu @dgrove-oss WDYT?
cc @eranra @rachelt44 @maia-iyer