
Commit eee696e

Update autoscaler-lab.md
ajeetraina authored May 16, 2024
1 parent 1f956be commit eee696e
Showing 1 changed file with 2 additions and 1 deletion.
3 changes: 2 additions & 1 deletion Autoscaler101/autoscaler-lab.md
@@ -120,6 +120,7 @@ Watch the pods, and you will see that the resource limits are reached, after whi
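Since the lab's VPA manifest itself falls outside this hunk, the following is a minimal sketch (not the lab's exact file) of the general shape such an object takes, assuming the VPA custom resources are installed in the cluster and a hypothetical target Deployment named `nginx`:

```yaml
# Minimal VerticalPodAutoscaler sketch (not the lab's exact manifest).
# Requires the VPA components/CRDs to be installed in the cluster.
apiVersion: autoscaling.k8s.io/v1
kind: VerticalPodAutoscaler
metadata:
  name: nginx-vpa
spec:
  targetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: nginx            # hypothetical target Deployment
  updatePolicy:
    updateMode: "Auto"     # let the VPA apply its resource recommendations
```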

Now that we have taken a complete look at the vertical pod autoscaler, let's take a look at the HPA. Create a file called `nginx-hpa.yml` and paste the contents below into it.

```yaml
apiVersion: autoscaling/v2beta2
kind: HorizontalPodAutoscaler
metadata:
# ...the rest of the manifest is collapsed in this diff view
```

@@ -177,4 +178,4 @@ You should be able to see the memory limit getting reached, after which the numb
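Note that the `autoscaling/v2beta2` API used above was removed in Kubernetes 1.26 in favor of the stable `autoscaling/v2`. Since the rest of the manifest is collapsed above, the following is a rough sketch (not the lab's exact file) of what a complete, current equivalent could look like, assuming a hypothetical target Deployment named `nginx` and memory-based scaling, which is what the lab watches for:

```yaml
# Minimal HorizontalPodAutoscaler sketch (not the lab's exact manifest).
apiVersion: autoscaling/v2           # stable successor to v2beta2
kind: HorizontalPodAutoscaler
metadata:
  name: nginx-hpa
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: nginx                      # hypothetical target Deployment
  minReplicas: 1
  maxReplicas: 5
  metrics:
    - type: Resource
      resource:
        name: memory
        target:
          type: Utilization
          averageUtilization: 50     # scale out once usage passes 50% of the memory request
```

Applying the file with `kubectl apply -f nginx-hpa.yml` and then watching `kubectl get hpa -w` should show the replica count climb once the memory target is crossed.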

## Conclusion

That sums up the lab on autoscalers. Here, we discussed the two most commonly used built-in autoscalers, the HPA and the VPA, and took a hands-on look at how they work. This is just the tip of the iceberg when it comes to scaling, however; the subject of custom scalers that can scale on metrics other than memory and CPU is vast. If you are interested in more advanced scaling techniques, take a look at the [KEDA section](../Keda101/what-is-keda.md) to get an idea of the KEDA autoscaler.
