LLM Status
LLM deployments are not always active; they move through various operational statuses over the course of your workstream. The following methods let you check on a deployment's state or block until it is ready for prompting.
llm_deployment.get_status
llm_deployment.get_status()
This method returns the state of your deployment job in Predibase.
Parameters:
None
Returns:
A short description of your deployment status, such as "queued", "updating", "active", etc.
Example Usage:
# Get the status of an LLM deployment (e.g. llm_deployment from the previous page)
llm_deployment = pc.LLM("pb://deployments/deployment-name")
llm_deployment.get_status()  # "queued"
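If you prefer not to block on wait_for_ready, you can poll get_status yourself. Below is a minimal sketch of that polling pattern; the StubDeployment class and the poll_until_active helper are hypothetical stand-ins for illustration (a real handle comes from pc.LLM(...)), and the status strings mirror the ones listed above.

```python
import time

# Hypothetical stub standing in for a Predibase LLM deployment handle.
# The status strings ("queued", "updating", "active") mirror those above.
class StubDeployment:
    def __init__(self):
        self._statuses = iter(["queued", "updating", "active"])

    def get_status(self):
        return next(self._statuses)

def poll_until_active(deployment, interval_s=0.01, max_checks=10):
    """Poll get_status() until the deployment reports "active"."""
    for _ in range(max_checks):
        status = deployment.get_status()
        if status == "active":
            return status
        time.sleep(interval_s)
    raise TimeoutError("deployment did not become active in time")

status = poll_until_active(StubDeployment())
print(status)  # active
```

In practice you would pass your real llm_deployment object in place of the stub and choose a polling interval measured in seconds rather than milliseconds.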
llm_deployment.wait_for_ready
llm_deployment.wait_for_ready()
This method blocks until your deployment is fully ready for use (i.e. both active and scaled up compute-wise).
Parameters:
None
Returns:
None
Example Usage:
# Wait for an LLM deployment (e.g. llm_deployment from the previous page) to be ready
llm_deployment = pc.LLM("pb://deployments/deployment-name")
llm_deployment.wait_for_ready()
# waits.
# waits..
# waits...
# return when ready!
For users looking for finer-grained control and insight into their deployments, there are two more specialized methods for assessing deployment state:
llm_deployment.is_ready
llm_deployment.is_ready()
This method returns whether your deployment is active and its endpoint is responsive.
Parameters:
None
Returns:
True or False
Example Usage:
llm_deployment = pc.LLM("pb://deployments/deployment-name")
llm_deployment.is_ready() # True or False
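Because is_ready returns immediately, it can serve as the building block for a bounded alternative to wait_for_ready, one that gives up after a deadline instead of blocking indefinitely. The sketch below is illustrative: StubDeployment and wait_with_timeout are hypothetical names, and the stub simply simulates an endpoint that becomes responsive after a few checks.

```python
import time

# Hypothetical stub mirroring the is_ready() contract described above:
# returns False until the endpoint is responsive, then True.
class StubDeployment:
    def __init__(self, ready_after=2):
        self._checks = 0
        self._ready_after = ready_after

    def is_ready(self):
        self._checks += 1
        return self._checks > self._ready_after

def wait_with_timeout(deployment, timeout_s=1.0, interval_s=0.01):
    """A bounded alternative to wait_for_ready(): give up after timeout_s."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if deployment.is_ready():
            return True
        time.sleep(interval_s)
    return False

ready = wait_with_timeout(StubDeployment())
print(ready)  # True
```

With your real llm_deployment in place of the stub, a False return lets you surface a timeout error to the caller rather than hanging the workstream.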