status llm
The status llm command displays the status of an LLM job. If the job is RUNNING, the output of this command displays the URLs to which inference requests may be sent. The base URL supports a REST API that lists the LLMs and embeddings models that are currently being served. The endpoint URL supports a REST API for LLM chat completions or embeddings requests.
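As a rough illustration of the base URL's listing API, the sketch below sends a GET request to it; the URL shown is a placeholder, and the exact route and response shape depend on the deployment.

```python
import requests

# Placeholder: substitute the base URL printed by `status llm` for a RUNNING job.
BASE_URL = "http://example-host:8080/llm"

# The base URL exposes a REST API that lists the LLMs and embeddings models
# currently being served by the job.
response = requests.get(BASE_URL, timeout=30)
response.raise_for_status()
print(response.json())
```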
Syntax
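Based on the parameter listed below, the invocation presumably takes the following form (a sketch, not the authoritative grammar):

```
status llm <llm name>
```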
Parameters
Parameter: <llm name>
Description: The name of the LLM job.
Output Fields
Output Field: status
Description: The status of the job, which is one of the following:
- STARTED: The job has just started.
- RUNNING: The job is running normally.
- STOPPED: The job was stopped by the user.
- COMPLETED: The job ran to completion.
- FAILED: The job failed.
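Once the job reports RUNNING, inference requests can be sent to the endpoint URL shown in the command output. The sketch below assumes an OpenAI-style chat completions request body; the URL, model name, and payload fields are placeholders, and the actual schema depends on the served model.

```python
import requests

# Placeholders: substitute the endpoint URL printed by `status llm` and the
# name of an LLM that the job is actually serving.
ENDPOINT_URL = "http://example-host:8080/llm/chat/completions"
payload = {
    "model": "my_llm",
    "messages": [{"role": "user", "content": "Summarize today's support tickets."}],
}

# Assumes an OpenAI-style chat completions schema; consult the job's API for
# the exact request and response format.
response = requests.post(ENDPOINT_URL, json=payload, timeout=60)
response.raise_for_status()
print(response.json())
```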