Preparing a local folder with configured Snowflake Native App artifacts

Create a local folder with configured artifacts

The snow app bundle command creates a local directory in your project, populates it with the file structure you specified in the project definition file, and generates CREATE FUNCTION or CREATE PROCEDURE declarations in Snowflake Native App setup scripts from Snowpark Python code that includes decorators (such as @sproc or @udaf). For more information, see the Snowpark Python documentation corresponding to your chosen function decorator, such as snowflake.snowpark.functions.udaf.

The snow app deploy and snow app run commands already use this functionality. However, with the explicit snow app bundle command at your disposal, you can explore this directory before it is uploaded to the stage and verify that the artifacts were created as expected.

To create a local folder with the configured artifacts, do the following:

  1. Create or verify your snowflake.yml project definition file, such as the following:

    definition_version: 1
    
    native_app:
      name: codegen_nativeapp
      artifacts:
        - src: root_files/README.md
          dest: README.md
        - src: root_files/_manifest.yml
          dest: manifest.yml
        - src: root_files/setup_scripts/*
          dest: setup_scripts/
        - src: python/user_gen/echo.py
          dest: user_gen/echo.py
        - src: python/cli_gen/*
          dest: cli_gen/
          processors:
            - snowpark
    
  2. From your project directory, run the snow app bundle command to create the temporary output/deploy directory that contains your configured artifacts.

    snow app bundle
    
  3. Verify that the contents of the output/deploy directory match the rules you specified in the snowflake.yml file, similar to the example layout below. If you invoked Snowpark annotation processing for your Python files, you can see the generated code in the amended setup script in that directory.
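
     Given the example project definition above, the deploy root might look similar to the following layout. The file names under setup_scripts/ and cli_gen/ are placeholders; the actual files depend on your project.

    output/deploy/
    ├── README.md
    ├── manifest.yml
    ├── setup_scripts/
    │   └── setup_script.sql
    ├── user_gen/
    │   └── echo.py
    └── cli_gen/
        └── echo.py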

For more information, see the snow app bundle command.

Generate SQL code using Snowpark annotation processing

As a Snowflake Native App developer with a limited SQL background, you might find it cumbersome to write and maintain setup scripts, which can become quite large and complicated over time. Setup scripts contain all of the application logic that a customer can use with their data, and are therefore a required part of developing a Snowflake Native App. One of the core components of a setup script is the ability to create functions and stored procedures from Snowpark Python extension functions. In addition to writing the Snowpark code in Python, Java, or another Snowpark-supported language, you must also write the corresponding SQL definitions for those functions and procedures in the setup script.

For example, you could create a basic function and stored procedure using Snowpark Python, as shown:

# Example python file "echo.py" that a developer writes

def echo_fn(data):
    return 'echo_fn: ' + data

def echo_proc(session, data):
    return 'echo_proc: ' + data

You would then need to upload the file to a stage and refer to it from the setup script SQL code, similar to the following:

-- Sample setup_script.sql SQL file for a Snowflake Native App

CREATE APPLICATION ROLE IF NOT EXISTS app_instance_role;

CREATE OR ALTER VERSIONED SCHEMA ext_code_schema;
GRANT USAGE ON SCHEMA ext_code_schema TO APPLICATION ROLE app_instance_role;

CREATE OR REPLACE PROCEDURE ext_code_schema.py_echo_proc(DATA string)
  RETURNS STRING
  LANGUAGE PYTHON
  RUNTIME_VERSION=3.8
  PACKAGES=('snowflake-snowpark-python')
  HANDLER='echo.echo_proc'
  IMPORTS=('/echo.py');

GRANT USAGE ON PROCEDURE ext_code_schema.py_echo_proc(string)
  TO APPLICATION ROLE app_instance_role;

-- Wraps a function from a python file
CREATE OR REPLACE FUNCTION ext_code_schema.py_echo_fn(DATA string)
  RETURNS STRING
  LANGUAGE PYTHON
  RUNTIME_VERSION=3.8
  PACKAGES=('snowflake-snowpark-python')
  HANDLER='echo.echo_fn'
  IMPORTS=('/echo.py');

GRANT USAGE ON FUNCTION ext_code_schema.py_echo_fn(string)
  TO APPLICATION ROLE app_instance_role;

Automatic SQL code generation

Note

To take advantage of automatic SQL code generation, you must use Snowpark Python version 1.15.0 or later.

To help alleviate this extra work, Snowflake CLI can automatically generate the necessary SQL for your setup scripts. Snowpark Python supports extension function decorators (@udf, @sproc, @udtf, and @udaf), such as snowflake.snowpark.functions.udf, that let you annotate your Python code. Snowflake CLI can use these decorators to automatically create and validate the necessary SQL code for your setup scripts.

For example, you can use the @udf decorator for the function in the previous example:

# some python file echo.py
from snowflake.snowpark.functions import udf

@udf(name="py_echo_fn")
def echo_fn(data: str) -> str:
    return 'echo_fn: ' + data

Using the @udf decorator tells the snow app bundle command (and other Snowflake CLI commands that invoke it internally) to process the Snowpark Python decorators, generate the corresponding SQL commands, and include them in the setup script automatically, as shown in the following example. You can therefore minimize the amount of SQL code you need to write for your setup script.

-- Sample setup_script.sql SQL file for a Snowflake Native App

-- User-written code
CREATE OR REPLACE APPLICATION ROLE app_instance_role;

CREATE OR ALTER VERSIONED SCHEMA ext_code_schema;
GRANT USAGE ON SCHEMA ext_code_schema TO APPLICATION ROLE app_instance_role;

-- Snowflake CLI generated code
CREATE OR REPLACE FUNCTION ext_code_schema.py_echo_fn(DATA string)
  RETURNS STRING
  LANGUAGE PYTHON
  RUNTIME_VERSION=3.8
  PACKAGES=('snowflake-snowpark-python')
  HANDLER='echo.echo_fn'
  IMPORTS=('/echo.py');

GRANT USAGE ON FUNCTION ext_code_schema.py_echo_fn(string)
  TO APPLICATION ROLE app_instance_role;
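
As the generated example shows, the schema prefix (ext_code_schema) and the GRANT to app_instance_role come from Snowflake Native App parameters on the decorator, described later in this topic. The following is a minimal sketch of how the echo_fn example might declare them; the schema and application role names shown are assumptions and must match objects that your setup script actually creates:

# some python file echo.py
from snowflake.snowpark.functions import udf

@udf(
    name="py_echo_fn",
    native_app_params={
        "schema": "ext_code_schema",                  # must already be created in the setup script
        "application_roles": ["app_instance_role"],   # granted USAGE on the generated function
    },
)
def echo_fn(data: str) -> str:
    return 'echo_fn: ' + data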

Using the Snowpark Python decorators

While the Snowpark decorators in Snowflake CLI work the same as regular Snowpark Python decorators, you should be aware of the following differences when writing Python code files specifically for a Snowflake Native App (a brief example follows this list):

  • You can’t use any Session objects in these files, because Snowflake CLI executes these Python files in a sandbox environment with no connection to Snowflake. As a result, any reference to a Snowpark Session results in an error.

  • You can use only the @udf, @sproc, @udaf and @udtf Snowpark Python decorators.

  • You can’t use these decorators as regular functions to register your code as a Snowflake object. Only code explicitly annotated with the supported decorators is recognized. Therefore, the Python function must be a named function. Lambda functions are not supported.

  • Snowflake CLI always generates CREATE OR REPLACE statements, as recommended by Snowflake, for creating functions and procedures in your setup scripts.
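
The following minimal sketch shows a Python file that satisfies these constraints; the function and decorator argument names are illustrative only:

# Suitable for Snowpark annotation processing: no Session is created or
# referenced, the function is a named function (not a lambda), and only a
# supported decorator (@udf) is used.
from snowflake.snowpark.functions import udf

@udf(name="concat_fn")
def concat_fn(a: str, b: str) -> str:
    return a + b

# Not recognized by annotation processing (shown for contrast only):
#   concat_fn = udf(lambda a, b: a + b)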

More about decorator properties

The following list describes the Python decorator properties and explains how Snowflake CLI uses them.

name (Optional)

  Name of the function or stored procedure that Snowflake CLI uses when generating the SQL statements. If you omit this property, Snowflake CLI reuses the Python function name.

input_types (Required)

  Types of the input parameters of the function or stored procedure. You must provide this information either in this decorator parameter or as type annotations directly in your code. If the information is not available in either location, Snowflake CLI does not generate SQL statements for the function or stored procedure.

return_type (Required)

  Type of the return value of the function or stored procedure. You must provide this information either in this decorator parameter or as a type annotation directly in your code. If the information is not available in either location, Snowflake CLI does not generate SQL statements for the function or stored procedure.

packages (Optional)

  List of packages. You can specify snowflake-snowpark-python with or without a version number. If you provide a version number for this package, Snowflake CLI does not use the version as part of its SQL generation, but it does retain the version numbers of any other packages in the list.

  If you omit this property, Snowflake CLI automatically adds snowflake-snowpark-python as the only package and reflects it in the generated SQL statements.

imports (Optional)

  List of files your Snowflake function or stored procedure needs to import from the stage. You can specify each entry either as a string or as a tuple of strings. If you specify a tuple, Snowflake CLI uses only the string at the 0th index. For an example of using a tuple, see Use external Python files.

  If you do not specify any imports, Snowflake CLI automatically adds an import for the Python file that contains the function or stored procedure for which it generates SQL. The path of the import is determined by the dest parameter of the Python file in the deploy root directory, based on the project definition file.

execute_as (Optional)

  Persona to use when executing a stored procedure. Valid values are caller and owner. If unspecified, Snowflake CLI defaults to owner. This property does not apply to functions.

handler (N/A)

  Handler for the function or stored procedure. Snowflake CLI populates this field automatically.

replace (Unused)

  Snowflake CLI assumes true for code generation.

session (Required)

  Must be None. If omitted, Snowflake CLI throws an error.

is_permanent, stage_location, if_not_exists, strict, secure, immutable (Unused)

  Snowflake CLI does not use these fields for SQL generation.

native_app_params (Optional; for a Snowflake Native App only)

  Python dictionary containing the following Snowflake Native App parameters:

  • schema: Name of the schema to contain the Snowpark function or stored procedure. This schema must already be defined in your setup script. Snowflake recommends setting the value to the name of a versioned schema in your setup script file. Snowflake CLI prefixes this value to the name of the Snowpark function or procedure in the generated SQL statement. Note that Snowflake CLI does not create the schema for you.

  • application_roles: List of application roles to be granted USAGE privileges on the generated Snowpark function or procedure. Snowflake CLI does not create the application roles; it only creates SQL statements like GRANT USAGE ON FUNCTION <schema_name.func_name> TO APPLICATION ROLE <app_role> and adds them to the setup script.

  While technically optional, omitting the native_app_params property might result in an invalid setup script.
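
To illustrate how several of these properties fit together, the following sketch shows a stored procedure decorator that sets most of them explicitly. The schema and application role names are assumed to already exist in the setup script, as in the earlier examples:

from snowflake.snowpark.functions import sproc
from snowflake.snowpark.types import IntegerType

@sproc(
    name="add_sp",                                   # used in the generated CREATE PROCEDURE statement
    return_type=IntegerType(),                       # or use a Python return annotation instead
    input_types=[IntegerType(), IntegerType()],      # or use Python parameter annotations instead
    packages=["snowflake-snowpark-python==1.15.0"],  # version of this package is not used in SQL generation
    execute_as="caller",                             # defaults to owner if omitted
    session=None,                                    # must be explicitly None (see the session property above)
    native_app_params={
        "schema": "ext_code_schema",                 # schema must already be defined in the setup script
        "application_roles": ["app_instance_role"],  # roles granted USAGE on the generated procedure
    },
)
def add_sp(session_, x, y):
    # The session argument is not used; Snowflake CLI processes this file in a
    # sandbox with no connection to Snowflake.
    return x + y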

When uploading your Python files to a destination stage, Snowflake CLI converts the decorators to comments so that these UDFs and stored procedures are not created in your current session. The original source files are not changed, which keeps the snow app bundle command idempotent. Only the copies of the Python files in the deploy root directory contain the comments, because the deploy root is recreated every time you run the snow app bundle command. The following example illustrates how Snowflake CLI comments out the decorators.

# output/deploy/dest_dir1/dest_dir2/echo.py
#: @sproc(
#:    return_type=IntegerType(),
#:    input_types=[IntegerType(), IntegerType()],
#:    packages=["snowflake-snowpark-python==1.15.0"],
#:    native_app_params={
#:        "schema": "ext_code_schema",
#:        "application_roles": ["app_instance_role"],
#:    },
#: )
def add_sp(session_, x, y):
    return x + y

Also, only Python files covered by an artifact entry with a processors property in the project definition file are affected.