snowflake.cortex.Complete

snowflake.cortex.Complete(model: Union[str, Column], prompt: Union[str, List[ConversationMessage], Column], *, options: Optional[CompleteOptions] = None, session: Optional[Session] = None, stream: bool = False, timeout: Optional[float] = None, deadline: Optional[float] = None) → Union[str, Iterator[str], Column]

Complete calls into the LLM inference service to perform completion.
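For example, a minimal sketch of a call with string arguments (the model name is illustrative, and an active Snowpark session is assumed to be discoverable from context):

```python
from snowflake.cortex import Complete

# "mistral-large" is an illustrative model name; because no session is
# passed, the active Snowpark session is inferred from context.
response = Complete("mistral-large", "Summarize the benefits of columnar storage.")
print(response)  # a single string response
```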

Parameters:
  • model – The model to use: a string identifier or a Column of strings naming the model.

  • prompt – The prompt to send to the LLM: a string, a list of ConversationMessage, or a Column of prompts.

  • options – An instance of snowflake.cortex.CompleteOptions (see the sketch after this parameter list).

  • session – The Snowpark session to use. Inferred from context if not specified.

  • stream (bool) – Enables streaming. When enabled, an iterator is returned that yields the output as it is received. Each update is a string containing the new text content since the previous update.

  • timeout (float) – Timeout in seconds for retrying failed REST requests.

  • deadline (float) – Time in seconds since the epoch (as returned by time.time()) until which failed REST requests are retried.
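
A minimal sketch of passing options; the max_tokens and temperature fields shown here are assumptions about what CompleteOptions accepts, and the model name and values are illustrative:

```python
from snowflake.cortex import Complete, CompleteOptions

# max_tokens and temperature are assumed CompleteOptions fields;
# the model name and option values are illustrative.
options = CompleteOptions(max_tokens=256, temperature=0.2)
response = Complete(
    "mistral-large",
    "Explain Snowflake micro-partitions in two sentences.",
    options=options,
)
```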

Raises:

ValueError – If an argument is invalid.

Returns:

A string response (or an iterator of string updates when stream=True) for string inputs, or a Column of string responses when Column inputs are used.
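
As a sketch of the Column and streaming forms (the prompts table, its prompt column, the session variable, and the model name are all illustrative):

```python
from snowflake.cortex import Complete
from snowflake.snowpark.functions import col, lit

# Column form: assumes a Snowpark Session object named `session` and a
# hypothetical "prompts" table with a "prompt" column. The call returns a
# Column expression that is evaluated row by row.
df = session.table("prompts")
responses = df.select(
    Complete(lit("mistral-large"), col("prompt")).alias("response")
)
responses.show()

# Streaming form: with stream=True an iterator of incremental string
# updates is returned instead of a single response string.
for update in Complete("mistral-large", "Write a haiku about data.", stream=True):
    print(update, end="")
```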