Snowpark Migration Accelerator: Deploying the Output Code¶

Running the output code from the SMA depends on your local environment. Here are some recommendations based on the source language.

Spark Scala¶

Before running the migrated Spark source code, there are a couple of things to consider.

Add snowpark and snowpark extensions library reference¶

The snowpark and snowpark extensions libraries must be referenced from the migrated project.

Snowpark Extensions¶

Snowpark Extensions is a support library that extends the standard Snowpark library with functionality that is present in Apache Spark but not currently supported by Snowpark. The goal of this library is to facilitate the conversion of projects from Apache Spark to Snowpark.

Here are the steps to reference the snowpark and snowpark extensions libraries from the migrated code.

Step 1 - Add snowpark and snowpark extensions library references to the project configuration file¶

The tool will try to add these dependencies to the project configuration file. Once the references have been added to the project configuration file, the build tool will take care of resolving them.

Based on the extension of the project configuration file, the tool adds the references as follows:

build.gradle¶

dependencies {
    implementation 'com.snowflake:snowpark:1.6.2'
    implementation 'net.mobilize.snowpark-extensions:snowparkextensions:0.0.9'
    ...
}

build.sbt¶

...
libraryDependencies += "com.snowflake" % "snowpark" % "1.6.2"
libraryDependencies += "net.mobilize.snowpark-extensions" % "snowparkextensions" % "0.0.9"
...

pom.xml¶

<dependencies>
    <dependency>
        <groupId>com.snowflake</groupId>
        <artifactId>snowpark</artifactId>
        <version>1.6.2</version>
    </dependency>
    <dependency>
        <groupId>net.mobilize.snowpark-extensions</groupId>
        <artifactId>snowparkextensions</artifactId>
        <version>0.0.9</version>
    </dependency>
    ...
</dependencies>

Step 2 - Add snowpark extensions library import statements¶

The tool includes these two import statements in all output .scala files.

import com.snowflake.snowpark_extensions.Extensions._
import com.snowflake.snowpark_extensions.Extensions.functions._

Code example¶

In the following code, hex and isin are supported by Spark but not by Snowpark. The migrated code still works because both functions are provided as extensions.

Input code¶

package com.mobilize.spark

import org.apache.spark.sql._

object Main {

   def main(args: Array[String]) : Unit = {

      var languageArray = Array("Java");

      var languageHex = hex(col("language"));

      col("language").isin(languageArray:_*);
   }

}

Output code¶

package com.mobilize.spark

import com.snowflake.snowpark._
import com.snowflake.snowpark_extensions.Extensions._
import com.snowflake.snowpark_extensions.Extensions.functions._

object Main {

   def main(args: Array[String]) : Unit = {

      var languageArray = Array("Java");
      
      // hex does not exist in Snowpark. It is an extension.
      var languageHex = hex(col("language"));
      
      // isin does not exist in Snowpark. It is an extension.
      col("language").isin(languageArray :_*)

   }

}

PySpark¶

Before running the migrated PySpark source code, there are a couple of things to consider.

Install snowpark and snowpark extensions libraries¶

The snowpark and snowpark extensions libraries must be installed in the environment that runs the migrated project.

Snowpark Extensions¶

Snowpark Extensions is a support library that extends the standard Snowpark library with functionality that is present in PySpark but not currently supported by Snowpark. The goal of this library is to facilitate the conversion of projects from PySpark to Snowpark.

Here are the steps to install and reference the snowpark and snowpark extensions libraries from the migrated code.

Step 1 - Install snowpark library¶

pip install snowflake-snowpark-python

Step 2 - Install snowpark extensions library¶

pip install snowpark-extensions
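Before running the migrated code, it can be useful to confirm that both packages are importable in the target environment. Below is a minimal sketch; the `missing_packages` helper is our own illustration, not part of the SMA or Snowpark.

```python
import importlib.util

def missing_packages(names):
    """Return the subset of module names that cannot be imported."""
    missing = []
    for name in names:
        try:
            if importlib.util.find_spec(name) is None:
                missing.append(name)
        except ModuleNotFoundError:
            # Raised when a parent package of a dotted name is absent.
            missing.append(name)
    return missing

# The migrated code needs both the Snowpark library and the extensions.
for name in missing_packages(["snowflake.snowpark", "snowpark_extensions"]):
    print(f"Missing dependency: {name} - install it with pip before running.")
```

If the script prints nothing, both dependencies are resolvable and the migrated code can be executed.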

Step 3 - Add snowpark extensions library import statements¶

The tool includes this import statement in each file that uses PySpark.

import snowpark_extensions
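A single `import snowpark_extensions` line is enough because the library registers its extra functions as a side effect of being imported. As an illustration only, with toy classes rather than the real Snowpark API, an import-time extension can attach a missing method to an existing class like this:

```python
# Toy stand-in for a library class that lacks a method (not the real Snowpark Column).
class Column:
    def __init__(self, name):
        self.name = name

# What an extensions module can do when it is imported:
# define the missing method and attach it to the existing class.
def isin(self, *values):
    return f"{self.name} IN {tuple(values)}"

Column.isin = isin  # after this, every Column instance has .isin

col = Column("language")
print(col.isin("Java", "Scala"))  # prints: language IN ('Java', 'Scala')
```

This is why importing the extensions module must happen before the migrated code runs; the import itself performs the patching.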

Code example¶

In the following code, the create_map function is supported by PySpark but not by Snowpark. The migrated code still works because create_map is one of the functions included in the snowpark extensions library.

Input code¶

# df is an existing DataFrame with "name" and "age" columns
import pyspark.sql.functions as F

df.select(F.create_map('name', 'age').alias("map")).collect()

Output code¶

import snowpark_extensions
import snowflake.snowpark.functions as F

df.select(F.create_map('name', 'age').alias("map")).collect()