This documentation describes an older version (1.5.4).

snowflake.cortex.Complete

snowflake.cortex.Complete(model: Union[str, Column], prompt: Union[str, List[ConversationMessage], Column], *, options: Optional[CompleteOptions] = None, session: Optional[Session] = None, use_rest_api_experimental: bool = False, stream: bool = False) Union[str, Iterator[str], Column]

Complete calls into the LLM inference service to perform completion.

Parameters:
  • model – The model to use: either a string naming the model, or a Column of strings naming models.

  • prompt – The prompt to send to the LLM: a string, a list of ConversationMessage objects, or a Column of prompts.

  • options – An instance of snowflake.cortex.CompleteOptions.

  • session – The Snowpark session to use. Inferred from context if not specified.

  • use_rest_api_experimental (bool) – Toggles between the SQL and REST implementations. This feature is experimental and may be removed at any time.

  • stream (bool) – Enables streaming. When enabled, a generator is returned that yields the output as it is received; each update is a string containing only the new text since the previous update. Streaming requires use_rest_api_experimental to be set to True.
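The streaming path described above can be sketched as follows. This is a usage sketch, not part of the reference: it assumes a live Snowflake session, and the model name and prompt are placeholder values. The import is deferred so the helper can be defined without snowflake-ml-python installed.

```python
def stream_completion(session):
    """Sketch: consume streaming output from Complete.

    Assumes a live Snowpark session. "snowflake-arctic" and the
    prompt are placeholders, not requirements of the API.
    """
    from snowflake.cortex import Complete  # requires snowflake-ml-python

    chunks = Complete(
        "snowflake-arctic",
        "Write a haiku about snow.",
        session=session,
        use_rest_api_experimental=True,  # required for streaming in 1.5.4
        stream=True,
    )
    # Each update contains only the new text since the previous update,
    # so concatenating the updates reconstructs the full response.
    text = ""
    for update in chunks:
        text += update
    return text
```

Omitting use_rest_api_experimental=True while passing stream=True raises the ValueError documented below.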

Raises:

ValueError – If stream is set to True and use_rest_api_experimental is set to False.

Returns:

A string response, an iterator of string updates (when stream is True), or a Column of string responses, depending on the input types.
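The scalar and Column return modes can be sketched as below. This is an illustrative sketch, not part of the reference: it assumes a live session for the scalar case, and a DataFrame with string columns named "model" and "prompt" (hypothetical names) for the Column case.

```python
def complete_scalar(session):
    """Sketch: a single completion; string in, string out."""
    from snowflake.cortex import Complete  # requires snowflake-ml-python

    return Complete("snowflake-arctic", "Tell me a joke.", session=session)


def complete_column(df):
    """Sketch: row-wise completion over a Snowpark DataFrame.

    Assumes df has string columns "model" and "prompt"
    (hypothetical column names for illustration).
    """
    from snowflake.cortex import Complete
    from snowflake.snowpark.functions import col

    # Passing Columns yields a Column of string responses.
    return df.select(Complete(col("model"), col("prompt")).alias("response"))
```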