Cortex Inference

Returns the LLMs available for the current session

GET /api/v2/cortex/models
Returns the LLMs available for the current session.

Response

Code    Description
200     OK
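
As a rough illustration, the sketch below calls this endpoint from Python with the requests library. The account URL, bearer token, and authentication scheme are placeholders rather than values from this page; substitute whatever credentials and token type your Snowflake deployment uses.

```python
# Minimal sketch: list the LLMs available to the current session.
# ACCOUNT_URL and TOKEN are hypothetical placeholders, not values from this page.
import requests

ACCOUNT_URL = "https://<account_identifier>.snowflakecomputing.com"  # placeholder
TOKEN = "<bearer-token>"  # placeholder: e.g. an OAuth token valid for your account

resp = requests.get(
    f"{ACCOUNT_URL}/api/v2/cortex/models",
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Accept": "application/json",
    },
)
resp.raise_for_status()          # expect 200 OK on success
print(resp.json())               # the LLMs available for the current session
```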

Perform LLM text completion inference

POST /api/v2/cortex/inference:complete
Perform LLM text completion inference, similar to snowflake.cortex.Complete.

For more information about arguments, options, privilege requirements, and usage guidelines, see the corresponding SQL command page.

Response

Code    Description
200     OK
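
The sketch below shows one plausible way to call the completion endpoint from Python. The request body fields ("model", "messages") and the model name are assumptions modeled on the snowflake.cortex.Complete counterpart; consult the API reference for the exact request schema, and the account URL and token are placeholders as before.

```python
# Minimal sketch: request a text completion, analogous to snowflake.cortex.Complete.
# The body fields "model" and "messages" are assumed here; verify against the API reference.
import requests

ACCOUNT_URL = "https://<account_identifier>.snowflakecomputing.com"  # placeholder
TOKEN = "<bearer-token>"  # placeholder

resp = requests.post(
    f"{ACCOUNT_URL}/api/v2/cortex/inference:complete",
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
        "Accept": "application/json",
    },
    json={
        # Hypothetical model name; pick one returned by GET /api/v2/cortex/models.
        "model": "<model-name>",
        "messages": [
            {"role": "user", "content": "Summarize the benefits of columnar storage."}
        ],
    },
)
resp.raise_for_status()          # expect 200 OK on success
print(resp.json())               # the completion produced by the model
```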