The problem is that we cannot properly get the maximum number of volumes allowed per node. This is an important value used by the autoscaler to know when it has to scale up and scale down. It also matters because DigitalOcean has a limit of only 7 attached volumes per node (https://docs.digitalocean.com/products/volumes/details/limits/), and this is a low number.
@rberrelleza oh, sorry for not giving context. It is a bit tricky. That setting is a percentage, meaning it has to scale up when the volume percentage reaches 6 and scale down when the volume percentage is less than 5.
In the specific case of DO, since we cannot get the maximum number of volumes, our default is always 100. This causes it to scale up when 6 volumes are mounted and scale down when there are 5 or fewer.
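To make the arithmetic concrete, here is a minimal sketch (hypothetical helper names, not the autoscaler's actual code) of how percentage thresholds interact with the defaulted maximum of 100 volumes:

```python
def should_scale_up(attached_volumes: int, max_volumes: int, up_pct: float) -> bool:
    # Scale up when the attached-volume percentage reaches the upper threshold.
    return attached_volumes / max_volumes * 100 >= up_pct

def should_scale_down(attached_volumes: int, max_volumes: int, down_pct: float) -> bool:
    # Scale down when the attached-volume percentage falls below the lower threshold.
    return attached_volumes / max_volumes * 100 < down_pct

# Because the DO limit cannot be detected, max_volumes defaults to 100,
# so a 6% / 5% pair behaves as absolute counts of "6 volumes" and "fewer than 5".
print(should_scale_up(6, 100, 6))    # True
print(should_scale_down(4, 100, 5))  # True
```

With the real DO limit of 7 volumes per node, the same percentage thresholds would behave very differently, which is the crux of this issue.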
So yes, in "human language" it means that it will scale down when there are fewer than 5 volumes and scale up when there are 6 volumes, but only for the specific case of DigitalOcean, given the particularities mentioned above.
Currently, there is a limitation with DigitalOcean and our autoscaler regarding the volumes configuration: https://www.okteto.com/docs/self-hosted/administration/configuration/#autoscaler.
We can document this specific case in the DigitalOcean installation guide and in the autoscaler setting docs, and recommend setting the following configuration:
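The exact configuration was not included in this thread. As a sketch only, based on the thresholds discussed above (scale up at 6 attached volumes, scale down below 5, against the defaulted maximum of 100), a recommended values fragment might look like this; the key names here are hypothetical and should be checked against the autoscaler section of the Okteto self-hosted configuration docs linked above:

```yaml
# Hypothetical Okteto self-hosted values fragment for DigitalOcean clusters.
# With the per-node volume maximum defaulting to 100, percentage thresholds
# of 6 and 5 behave as absolute volume counts (DO allows at most 7 per node).
autoscaler:
  volumes:
    up: 6
    down: 5
```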