Metrics Server is a scalable, efficient source of container resource metrics for Kubernetes' built-in autoscaling pipelines. These metrics drive the scaling behavior of our deployments. We will install it from the official Kubernetes Metrics Server release manifest.
kubectl apply -f https://github.com/kubernetes-sigs/metrics-server/releases/download/v0.6.1/components.yaml
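Before checking the APIService, you can confirm that the Deployment itself rolled out. The official manifest installs metrics-server into the kube-system namespace, which is assumed here:

```shell
# Wait for the metrics-server Deployment rollout to complete
# (installed into kube-system by the official manifest).
kubectl -n kube-system rollout status deployment/metrics-server
```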
Let's verify the status of the metrics-server APIService:
kubectl get apiservice v1beta1.metrics.k8s.io -o json | jq '.status'
It may take a minute or so for the metrics service to fully initialize. If you don't get the output shown below on your first attempt, wait a minute and try again.
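The wait-and-retry step can also be scripted. A minimal sketch that polls the APIService's Available condition for up to two minutes (the loop bounds are an arbitrary choice):

```shell
# Poll the metrics-server APIService until it reports Available,
# checking every 5 seconds for up to ~2 minutes.
for i in $(seq 1 24); do
  status=$(kubectl get apiservice v1beta1.metrics.k8s.io \
    -o jsonpath='{.status.conditions[?(@.type=="Available")].status}')
  if [ "$status" = "True" ]; then
    echo "metrics-server is available"
    break
  fi
  echo "waiting for metrics-server... (attempt $i)"
  sleep 5
done
```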
If all is well, you should see a status message similar to the one below in the response:
{
  "conditions": [
    {
      "lastTransitionTime": "2021-12-01T19:22:27Z",
      "message": "all checks passed",
      "reason": "Passed",
      "status": "True",
      "type": "Available"
    }
  ]
}
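As a final sanity check, you can query the Metrics API directly with `kubectl top` (assuming your nodes have had a minute to report their first metrics sample):

```shell
# List CPU and memory usage for cluster nodes via the Metrics API.
kubectl top nodes

# List CPU and memory usage for pods in the current namespace.
kubectl top pods
```

If both commands return usage figures rather than an error, the autoscaling pipeline has the metrics it needs.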