Snowpark Migration Accelerator: Issue Codes for Spark - Scala¶
SPRKSCL1126¶
Message: org.apache.spark.sql.functions.covar_pop has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.covar_pop function, which has a workaround.
Input
Below is an example of the org.apache.spark.sql.functions.covar_pop function, first used with column names as the arguments and then with column objects.
Output
The SMA adds the EWI SPRKSCL1126 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent covar_pop function that receives two column objects as arguments. For that reason, the Spark overload that receives two column objects as arguments is directly supported by Snowpark and does not require any changes.
For the overload that receives two string arguments, you can convert the strings into column objects using the com.snowflake.snowpark.functions.col function as a workaround.
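A sketch of that workaround (assuming a Snowpark DataFrame `df` with hypothetical columns `x` and `y`):

```scala
import com.snowflake.snowpark.functions.{col, covar_pop}

// Spark accepted column names directly: df.select(covar_pop("x", "y"))
// Snowpark's covar_pop takes Column objects, so build them with col:
val result = df.select(covar_pop(col("x"), col("y")))
```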
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1112¶
Message: *spark element* is not supported
Category: Conversion error
Description¶
This error appears when the SMA detects the use of a Spark element that is not supported by Snowpark and that does not have its own error code associated with it. This is the generic error code that the SMA uses for any unsupported Spark element.
Scenario¶
Input
Below is an example of a Spark element that is not supported by Snowpark, and therefore generates this EWI.
Output
The SMA adds the EWI SPRKSCL1112 to the output code to let you know that this element is not supported by Snowpark.
Recommended fix
Since this is a generic error code that applies to a range of unsupported functions, there is no single, specific fix. The appropriate action depends on the particular element being used.
Note that even though an element is not supported, it does not necessarily mean that no solution or workaround can be found. It only means that the SMA itself cannot find a solution.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1143¶
Message: An error occurred when loading the symbol table
Category: Conversion error
Description¶
This issue appears when there is an error loading the symbols of the SMA symbol table. The symbol table is part of the underlying architecture of the SMA, allowing for more complex conversions.
Additional recommendations¶
This is unlikely to be an error in the source code itself, but rather is an error in how the SMA processes the source code. The best resolution would be to post an issue in the SMA.
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1153¶
Warning
This issue code has been deprecated since Spark Conversion Core Version 4.3.2
Message: org.apache.spark.sql.functions.max has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.max function, which has a workaround.
Scenario¶
Input
Below is an example of the org.apache.spark.sql.functions.max function, first used with a column name as an argument and then with a column object.
Output
The SMA adds the EWI SPRKSCL1153 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent max function that receives a column object as an argument. For that reason, the Spark overload that receives a column object as an argument is directly supported by Snowpark and does not require any changes.
For the overload that receives a string argument, you can convert the string into a column object using the com.snowflake.snowpark.functions.col function as a workaround.
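A sketch of that workaround (assuming a DataFrame `df` with a hypothetical `value` column):

```scala
import com.snowflake.snowpark.functions.{col, max}

// Spark: df.select(max("value"))  -- string overload, not available in Snowpark
// Workaround: build the Column object explicitly with col:
val result = df.select(max(col("value")))
```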
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1102¶
This issue code has been deprecated since Spark Conversion Core 2.3.22
Message: Explode is not supported
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.explode function, which is not supported by Snowpark.
Scenario¶
Input
Below is an example of the org.apache.spark.sql.functions.explode function used to get the consolidated information of the array fields of the dataset.
Output
The SMA adds the EWI SPRKSCL1102 to the output code to let you know that this function is not supported by Snowpark.
Recommended fix
Since explode is not supported by Snowpark, the flatten function could be used as a substitute.
The following fix performs a flatten of the dfExplode dataframe, reproducing the same query result as in Spark.
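A sketch of that substitution, assuming dfExplode has an array column (here hypothetically named `numbers`); Snowpark's DataFrame.flatten places each array element in a generated VALUE column:

```scala
import com.snowflake.snowpark.functions.col

// Spark: dfExplode.select(explode(col("numbers")).alias("number"))
// Snowpark workaround: flatten the array column and read each element
// back from the generated VALUE column.
val flattened = dfExplode
  .flatten(col("numbers"))
  .select(col("VALUE").alias("number"))
```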
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1136¶
Warning
This issue code has been deprecated since Spark Conversion Core 4.3.2
Message: org.apache.spark.sql.functions.min has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.min function, which has a workaround.
Scenario¶
Input
Below is an example of the org.apache.spark.sql.functions.min function, first used with a column name as an argument and then with a column object.
Output
The SMA adds the EWI SPRKSCL1136 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent min function that receives a column object as an argument. For that reason, the Spark overload that receives a column object as an argument is directly supported by Snowpark and does not require any changes.
For the overload that takes a string argument, you can convert the string into a column object using the com.snowflake.snowpark.functions.col function as a workaround.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1167¶
Message: Project file not found on input folder
Category: Warning
Description¶
This issue appears when the SMA detects that there is no project configuration file in the input folder. The project configuration files supported by the SMA are the following:
build.sbt
build.gradle
pom.xml
Additional recommendations¶
Include a project configuration file in the input folder.
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1147¶
Message: org.apache.spark.sql.functions.tanh has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.tanh function, which has a workaround.
Scenario¶
Input
Below is an example of the org.apache.spark.sql.functions.tanh function, first used with a column name as an argument and then with a column object.
Output
The SMA adds the EWI SPRKSCL1147 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent tanh function that receives a column object as an argument. For that reason, the Spark overload that receives a column object as an argument is directly supported by Snowpark and does not require any changes.
For the overload that receives a string argument, you can convert the string into a column object using the com.snowflake.snowpark.functions.col function as a workaround.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1116¶
Warning
This issue code has been deprecated since Spark Conversion Core Version 2.40.1
Message: org.apache.spark.sql.functions.split has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.split function, which has a workaround.
Scenario¶
Input
Below is an example of the org.apache.spark.sql.functions.split function that generates this EWI.
Output
The SMA adds the EWI SPRKSCL1116 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
For the Spark overload that receives two arguments, you can convert the second argument into a column object using the com.snowflake.snowpark.functions.lit function as a workaround.
The overload that receives three arguments is not yet supported by Snowpark, and there is no workaround.
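A sketch of the two-argument case (assuming a DataFrame `df` with a hypothetical `text` column):

```scala
import com.snowflake.snowpark.functions.{col, lit, split}

// Spark: df.select(split(col("text"), ","))   // delimiter passed as a String
// Snowpark's split expects the pattern as a Column, so wrap it with lit:
val parts = df.select(split(col("text"), lit(",")))
```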
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1122¶
Message: org.apache.spark.sql.functions.corr has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.corr function, which has a workaround.
Scenario¶
Input
Below is an example of the org.apache.spark.sql.functions.corr function, first used with column names as the arguments and then with column objects.
Output
The SMA adds the EWI SPRKSCL1122 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent corr function that receives two column objects as arguments. For that reason, the Spark overload that receives column objects as arguments is directly supported by Snowpark and does not require any changes.
For the overload that receives two string arguments, you can convert the strings into column objects using the com.snowflake.snowpark.functions.col function as a workaround.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1173¶
Message: SQL embedded code cannot be processed.
Category: Warning.
Description¶
This issue appears when the SMA detects SQL-embedded code that cannot be processed. In that case, the SQL-embedded code cannot be converted to Snowflake.
Scenario¶
Input
Below is an example of SQL-embedded code that cannot be processed.
Output
The SMA adds the EWI SPRKSCL1173 to the output code to let you know that the SQL-embedded code cannot be processed.
Recommended fix
Make sure that the SQL-embedded code is a plain string with no interpolations, variables, or string concatenations.
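For example (a sketch, assuming an active Snowpark `session` and a hypothetical table `t1`):

```scala
// Cannot be processed: the query is assembled through string interpolation.
//   val tableName = "t1"
//   val df = session.sql(s"SELECT * FROM $tableName")

// Can be processed: a single plain string literal, with no interpolation,
// variables, or concatenation.
val df = session.sql("SELECT * FROM t1")
```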
Additional recommendations¶
You can find more information about SQL-embedded code here.
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1163¶
Message: The element is not a literal and can't be evaluated.
Category: Conversion error.
Description¶
This issue appears when the element currently being processed is not a literal, so it cannot be evaluated by the SMA.
Scenario¶
Input
Below is an example where the element to be processed is not a literal and cannot be evaluated by the SMA.
Output
The SMA adds the EWI SPRKSCL1163 to the output code to let you know that the format_type parameter is not a literal and cannot be evaluated by the SMA.
Recommended fix
Make sure the value of the variable is valid to avoid unexpected behavior.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1132¶
Message: org.apache.spark.sql.functions.grouping_id has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.grouping_id function, which has a workaround.
Scenario¶
Input
Below is an example of the org.apache.spark.sql.functions.grouping_id function, first used with multiple column names as arguments and then with column objects.
Output
The SMA adds the EWI SPRKSCL1132 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent grouping_id function that receives multiple column objects as arguments. For that reason, the Spark overload that receives multiple column objects as arguments is directly supported by Snowpark and does not require any changes.
For the overload that receives multiple string arguments, you can convert the strings into column objects using the com.snowflake.snowpark.functions.col function as a workaround.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1106¶
Warning
This issue code has been deprecated.
Message: Writer option is not supported.
Category: Conversion error.
Description¶
This issue appears when the tool detects, in a writer statement, the use of an option that is not supported by Snowpark.
Scenario¶
Input
Below is an example of the org.apache.spark.sql.DataFrameWriter.option method used to add options to a writer statement.
Output
The SMA adds the EWI SPRKSCL1106 to the output code to let you know that the option method is not supported by Snowpark.
Recommended fix
There is no recommended fix for this scenario.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1157¶
Message: org.apache.spark.sql.functions.kurtosis has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.kurtosis function, which has a workaround.
Scenario¶
Input
Below is an example of the org.apache.spark.sql.functions.kurtosis function that generates this EWI. In this example, the kurtosis function is used to calculate the kurtosis of the selected column.
Output
The SMA adds the EWI SPRKSCL1157 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent kurtosis function that receives a column object as an argument. For that reason, the Spark overload that receives a column object as an argument is directly supported by Snowpark and does not require any changes.
For the overload that receives a string argument, you can convert the string into a column object using the com.snowflake.snowpark.functions.col function as a workaround.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1146¶
Message: org.apache.spark.sql.functions.tan has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.tan function, which has a workaround.
Scenario¶
Input
Below is an example of the org.apache.spark.sql.functions.tan function, first used with a column name as an argument and then with a column object.
Output
The SMA adds the EWI SPRKSCL1146 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent tan function that receives a column object as an argument. For that reason, the Spark overload that receives a column object as an argument is directly supported by Snowpark and does not require any changes.
For the overload that receives a string argument, you can convert the string into a column object using the com.snowflake.snowpark.functions.col function as a workaround.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1117¶
Warning
This issue code has been deprecated since Spark Conversion Core 2.40.1
Message: org.apache.spark.sql.functions.translate has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.translate function, which has a workaround.
Scenario¶
Input
Below is an example of the org.apache.spark.sql.functions.translate function that generates this EWI. In this example, the translate function is used to replace the characters 'a', 'e' and 'o' in each word with '1', '2' and '3', respectively.
Output
The SMA adds the EWI SPRKSCL1117 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
As a workaround, you can convert the second and third arguments into column objects using the com.snowflake.snowpark.functions.lit function.
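A sketch of that workaround (assuming a DataFrame `df` with a hypothetical `word` column):

```scala
import com.snowflake.snowpark.functions.{col, lit, translate}

// Spark: df.select(translate(col("word"), "aeo", "123"))
// Snowpark's translate takes Columns for the match and replacement strings,
// so wrap the second and third arguments with lit:
val result = df.select(translate(col("word"), lit("aeo"), lit("123")))
```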
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1123¶
Message: org.apache.spark.sql.functions.cos has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.cos function, which has a workaround.
Scenario¶
Input
Below is an example of the org.apache.spark.sql.functions.cos function, first used with a column name as an argument and then with a column object.
Output
The SMA adds the EWI SPRKSCL1123 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent cos function that receives a column object as an argument. For that reason, the Spark overload that receives a column object as an argument is directly supported by Snowpark and does not require any changes.
For the overload that receives a string argument, you can convert the string into a column object using the com.snowflake.snowpark.functions.col function as a workaround.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1172¶
Message: Snowpark does not support StructField with metadata parameter.
Category: Warning
Description¶
This issue appears when the SMA detects a call to org.apache.spark.sql.types.StructField.apply with org.apache.spark.sql.types.Metadata as a parameter. This is because Snowpark does not support the metadata parameter.
Scenario¶
Input
Below is an example of the org.apache.spark.sql.types.StructField.apply function that generates this EWI. In this example, the apply function is used to generate an instance of StructField.
Output
The SMA adds the EWI SPRKSCL1172 to the output code to let you know that the metadata parameter is not supported by Snowflake.
Recommended fix
Snowpark has an equivalent com.snowflake.snowpark.types.StructField.apply function that receives three parameters. As a workaround, you can try to remove the metadata argument.
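A sketch of that workaround (the field name and type here are hypothetical):

```scala
import com.snowflake.snowpark.types.{StringType, StructField}

// Spark, with the unsupported metadata argument:
//   StructField("name", StringType, nullable = true, metadata)
// Snowpark workaround: drop the metadata argument and keep the three
// parameters that Snowpark's StructField.apply accepts.
val field = StructField("name", StringType, nullable = true)
```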
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1162¶
Note
This issue code has been deprecated.
Message: An error occurred when extracting the dbc files.
Category: Warning.
Description¶
This issue appears when a dbc file cannot be extracted. Possible causes of this warning include the file being too heavy, inaccessible, or read-only.
Additional recommendations¶
As a workaround, you can check the size of the file in case it is too heavy to be processed. Also, verify that the tool can access the file to avoid access issues.
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1133¶
Message: org.apache.spark.sql.functions.least has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.least function, which has a workaround.
Scenario¶
Input
Below is an example of the org.apache.spark.sql.functions.least function, first used with multiple column names as arguments and then with column objects.
Output
The SMA adds the EWI SPRKSCL1133 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent least function that receives multiple column objects as arguments. For that reason, the Spark overload that receives multiple column objects as arguments is directly supported by Snowpark and does not require any changes.
For the overload that receives multiple string arguments, you can convert the strings into column objects using the com.snowflake.snowpark.functions.col function as a workaround.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1107¶
Warning
This issue code has been deprecated.
Message: Writer save is not supported.
Category: Conversion error.
Description¶
This issue appears when the tool detects, in a writer statement, the use of the writer save method, which is not supported by Snowpark.
Scenario¶
Input
Below is an example of the org.apache.spark.sql.DataFrameWriter.save method used to save the DataFrame content.
Output
The SMA adds the EWI SPRKSCL1107 to the output code to let you know that the save method is not supported by Snowpark.
Recommended fix
There is no recommended fix for this scenario.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1156¶
Message: org.apache.spark.sql.functions.degrees has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.degrees function, which has a workaround.
Scenario¶
Input
Below is an example of the org.apache.spark.sql.functions.degrees function, first used with a column name as an argument and then with a column object.
Output
The SMA adds the EWI SPRKSCL1156 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent degrees function that receives a column object as an argument. For that reason, the Spark overload that receives a column object as an argument is directly supported by Snowpark and does not require any changes.
For the overload that receives a string argument, you can convert the string into a column object using the com.snowflake.snowpark.functions.col function as a workaround.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1127¶
Message: org.apache.spark.sql.functions.covar_samp has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.covar_samp function, which has a workaround.
Scenario¶
Input
Below is an example of the org.apache.spark.sql.functions.covar_samp function, first used with column names as the arguments and then with column objects.
Output
The SMA adds the EWI SPRKSCL1127 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent covar_samp function that receives two column objects as arguments. For that reason, the Spark overload that receives two column objects as arguments is directly supported by Snowpark and does not require any changes.
For the overload that receives two string arguments, you can convert the strings into column objects using the com.snowflake.snowpark.functions.col function as a workaround.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1113¶
Message: org.apache.spark.sql.functions.next_day has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.next_day function, which has a workaround.
Scenario¶
Input
Below is an example of the org.apache.spark.sql.functions.next_day function, first used with a string as the second argument and then with a column object.
Output
The SMA adds the EWI SPRKSCL1113 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent next_day function that receives two column objects as arguments. For that reason, the Spark overload that receives two column objects as arguments is directly supported by Snowpark and does not require any changes.
For the overload that receives a column object and a string, you can convert the string into a column object using the com.snowflake.snowpark.functions.lit function as a workaround.
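A sketch of that workaround (assuming a DataFrame `df` with a hypothetical `date` column):

```scala
import com.snowflake.snowpark.functions.{col, lit, next_day}

// Spark: df.select(next_day(col("date"), "Mon"))   // day-of-week as a String
// Snowpark's next_day takes two Columns, so wrap the string with lit:
val result = df.select(next_day(col("date"), lit("Mon")))
```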
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1002¶
Message: This code section has recovery from parsing errors *statement*
Category: Parsing error.
Description¶
This issue appears when the SMA detects a statement that it cannot correctly read or understand in the code of a file; this is called a parsing error. The SMA can recover from this parsing error and continue analyzing the code of the file. In this case, the SMA is able to process the code of the file without errors.
Scenario¶
Input
Below is an example of invalid Scala code from which the SMA can recover.
Output
The SMA adds the EWI SPRKSCL1002 to the output code to let you know that the code of the file has parsing errors; however, the SMA can recover from that error and continue analyzing the code of the file.
Recommended fix
Since the message pinpoints the error in the statement, you can try to identify the invalid syntax and remove it, or comment out that statement to avoid the parsing error.
Additional recommendations¶
Make sure the code of the file is valid Scala code.
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1142¶
Message: *spark element* is not defined
Category: Conversion error
Description¶
This issue appears when the SMA could not determine an appropriate mapping status for the given element. This means that the SMA does not yet know whether this element is supported by Snowpark. This is the generic error code that the SMA uses for any element that is not defined.
Scenario¶
Input
Below is an example of a function for which the SMA could not determine an appropriate mapping status, and therefore it generated this EWI. In this case, you should assume that notDefinedFunction() is a valid Spark function and the code runs.
Output
The SMA adds the EWI SPRKSCL1142 to the output code to let you know that this element is not defined.
Recommended fix
To try to identify the problem, verify the following:
Check whether it is a valid Spark element.
Check that the element has the correct syntax and is spelled correctly.
Check that you are using a Spark version supported by the SMA.
If this is a valid Spark element, please report that you encountered a conversion error on that particular element using the Report an Issue option of the SMA and include any additional information that you think may be helpful.
Please note that if an element is not defined by the SMA, it does not necessarily mean that it is not supported by Snowpark. You should check the Snowpark Documentation to verify whether an equivalent element exists.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1152¶
Message: org.apache.spark.sql.functions.variance has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.variance function, which has a workaround.
Scenario¶
Input
Below is an example of the org.apache.spark.sql.functions.variance function, first used with a column name as an argument and then with a column object.
Output
The SMA adds the EWI SPRKSCL1152 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent variance function that receives a column object as an argument. For that reason, the Spark overload that receives a column object as an argument is directly supported by Snowpark and does not require any changes.
For the overload that receives a string argument, you can convert the string into a column object using the com.snowflake.snowpark.functions.col function as a workaround.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1103¶
This issue code has been deprecated.
Message: SparkBuilder method is not supported *method name*
Category: Conversion error
Description¶
This issue appears when the SMA detects a method in a SparkBuilder method chain that is not supported by Snowflake. As a result, it might affect the migration of the reader statement.
The following are the unsupported SparkBuilder methods:
master
appName
enableHiveSupport
withExtensions
Scenario¶
Input
Below is an example of a SparkBuilder method chain with many methods that are not supported by Snowflake.
Output
The SMA adds the EWI SPRKSCL1103 to the output code to let you know that the master, appName and enableHiveSupport methods are not supported by Snowpark. As a result, it might affect the migration of the Spark Session statement.
Recommended fix
To create the session, you need to add the proper Snowflake Snowpark configuration.
In this example, a configs variable is used.
It is also recommended to use a configFile (profile.properties) with the connection information.
And with Session.builder.configFile the session can be created:
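A sketch of both approaches (all connection values below are placeholders):

```scala
import com.snowflake.snowpark.Session

// Option 1: build the configuration in code.
val configs = Map(
  "URL" -> "https://<account>.snowflakecomputing.com:443",
  "USER" -> "<user>",
  "PASSWORD" -> "<password>",
  "ROLE" -> "<role>",
  "WAREHOUSE" -> "<warehouse>",
  "DB" -> "<database>",
  "SCHEMA" -> "<schema>"
)
val session = Session.builder.configs(configs).create

// Option 2: read the same settings from a profile.properties file.
val sessionFromFile = Session.builder.configFile("profile.properties").create
```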
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1137¶
Message: org.apache.spark.sql.functions.sin has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.sin function, which has a workaround.
Scenario¶
Input
Below is an example of the org.apache.spark.sql.functions.sin function, first used with a column name as an argument and then with a column object.
Output
The SMA adds the EWI SPRKSCL1137 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent sin function that receives a column object as an argument. For that reason, the Spark overload that receives a column object as an argument is directly supported by Snowpark and does not require any changes.
For the overload that receives a string argument, you can convert the string into a column object using the com.snowflake.snowpark.functions.col function as a workaround.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1166¶
Note
This issue code has been deprecated.
Message: org.apache.spark.sql.DataFrameReader.format is not supported.
Category: Warning.
Description¶
This issue appears when org.apache.spark.sql.DataFrameReader.format receives an argument that is not supported by Snowpark.
Scenarios¶
There are several scenarios depending on the type of format you are trying to load. It can be a supported or a non-supported format.
Scenario 1¶
Input
The tool analyzes the type of format to load. The supported formats are the following:
csv, json, orc, parquet, text
The below example shows how the tool transforms the format method when passing a csv value.
Output
The tool transforms the format method into a csv method call when the load function has one parameter.
Recommended fix
In this case, the tool does not show an EWI.
Scenario 2¶
Input
The below example shows how the tool transforms the format method when passing a net.snowflake.spark.snowflake value.
Output
The tool shows the EWI SPRKSCL1166 indicating that the value net.snowflake.spark.snowflake is not supported.
Recommended fix
For the non-supported scenarios, there is no specific fix since it depends on the files that are trying to be read.
Scenario 3¶
Input
The below example shows how the tool transforms the format method when passing a csv value, but using a variable instead.
Output
Since the tool cannot determine the value of the variable at runtime, it shows the EWI SPRKSCL1163 indicating that the value is not supported.
Recommended fix
As a workaround, you can check the value of the variable and add it as a string to the format call.
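A sketch of that fix on the Spark side, before migration (the `path` value is hypothetical):

```scala
// Before: the format comes from a variable, so the SMA cannot resolve it.
//   val fileFormat = "csv"
//   val df = spark.read.format(fileFormat).load(path)

// After: pass the literal directly, so the tool can convert format("csv")
// into the corresponding csv reader call.
val df = spark.read.format("csv").load(path)
```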
Additional recommendations¶
The Snowpark location only accepts cloud locations using a Snowflake stage.
The documentation of the methods supported by Snowpark can be found in the documentation.
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1118¶
Message: org.apache.spark.sql.functions.trunc has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.trunc function, which has a workaround.
Scenario¶
Input
Below is an example of the org.apache.spark.sql.functions.trunc function that generates this EWI.
Output
The SMA adds the EWI SPRKSCL1118 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
As a workaround, you can convert the second argument into a column object using the com.snowflake.snowpark.functions.lit function.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1149¶
Message: org.apache.spark.sql.functions.toRadians has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.toRadians function, which has a workaround.
Scenario¶
Input
Below is an example of the org.apache.spark.sql.functions.toRadians function, first used with a column name as an argument and then with a column object.
Output
The SMA adds the EWI SPRKSCL1149 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
As a workaround, you can use the radians function. For the Spark overload that receives a string argument, you additionally have to convert the string into a column object using the com.snowflake.snowpark.functions.col function.
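A sketch of that workaround (assuming a DataFrame `df` with a hypothetical `degrees_col` column):

```scala
import com.snowflake.snowpark.functions.{col, radians}

// Spark: df.select(toRadians("degrees_col"))
// Snowpark workaround: use radians instead; for the string overload,
// additionally wrap the column name with col:
val result = df.select(radians(col("degrees_col")))
```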
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1159¶
Message: org.apache.spark.sql.functions.stddev_samp has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.stddev_samp function, which has a workaround.
Scenario¶
Input
Below is an example of the org.apache.spark.sql.functions.stddev_samp function that generates this EWI. In this example, the stddev_samp function is used to calculate the sample standard deviation of the selected column.
Output
The SMA adds the EWI SPRKSCL1159 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent stddev_samp function that receives a column object as an argument. For that reason, the Spark overload that receives a column object as an argument is directly supported by Snowpark and does not require any changes.
For the overload that receives a string argument, you can convert the string into a column object using the com.snowflake.snowpark.functions.col function as a workaround.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1108¶
注釈
この問題コードは、 廃止されました。
メッセージ: org.apache.spark.sql.DataFrameReader.format is not supported.
カテゴリ: 警告。
説明¶
This issue appears when the org.apache.spark.sql.DataFrameReader.format has an argument that is not supported by Snowpark.
シナリオ¶
There are some scenarios depending on the type of format you are trying to load. It can be a supported, or non-supported format.
シナリオ1¶
入力
このツールは、読み込む形式のタイプを分析します。
csvjsonorcparquettext
The below example shows how the tool transforms the format method when passing a csv value.
Output
The tool transforms the format method into a csv method call when the load function has one parameter.
Recommended fix
In this case, the tool does not show any EWI.
Scenario 2¶
Input
The below example shows how the tool transforms the format method when passing a net.snowflake.spark.snowflake value.
Output
The tool shows the EWI SPRKSCL1108 indicating that the value net.snowflake.spark.snowflake is not supported.
Recommended fix
For unsupported scenarios, there is no specific fix since it depends on the files being read.
Scenario 3¶
Input
The below example shows how the tool transforms the format method when passing csv through a variable instead of a literal value.
Output
Since the tool cannot determine the value of the variable at runtime, it shows the EWI SPRKSCL1163 indicating that the value is not supported.
Recommended fix
As a workaround, you can check the value of the variable and add it as a string to the format call.
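A sketch of the fix (the variable name myFormat and the path are hypothetical):

```scala
// Before: the format is held in a variable, so the SMA cannot resolve it
// val myFormat = "csv"
// val df = spark.read.format(myFormat).load(path)

// After: pass the format as a string literal so the SMA can convert it
val df = spark.read.format("csv").load(path)
```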
Additional recommendations¶
Snowpark only accepts cloud locations using a Snowflake stage.
The methods supported by Snowpark are listed in the Snowpark documentation.
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1128¶
Message: org.apache.spark.sql.functions.exp has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.exp function, which has a workaround.
Scenarios¶
Input
Below is an example of the org.apache.spark.sql.functions.exp function, first used with a column name as an argument and then with a column object.
Output
The SMA adds the EWI SPRKSCL1128 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent exp function that receives a column object as an argument. For that reason, the Spark overload that receives a column object as an argument is directly supported by Snowpark and does not require any changes.
For the overload that receives a string argument, you can convert the string into a column object using the com.snowflake.snowpark.functions.col function as a workaround.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1169¶
Message: *Spark element* is missing on the method chaining.
Category: Warning.
Description¶
This issue appears when the SMA detects that a Spark element call is missing from the method chaining. The SMA needs to know the Spark element in order to analyze the statement.
Scenarios¶
Input
Below is an example where the load function call is missing from the method chaining.
Output
The SMA adds the EWI SPRKSCL1169 to the output code to let you know that the load function call is missing from the method chaining, so the SMA cannot analyze the statement.
Recommended fix
Make sure that all the function calls of the method chaining are within the same statement.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1138¶
Message: org.apache.spark.sql.functions.sinh has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.sinh function, which has a workaround.
Scenarios¶
Input
Below is an example of the org.apache.spark.sql.functions.sinh function, first used with a column name as an argument and then with a column object.
Output
The SMA adds the EWI SPRKSCL1138 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent sinh function that receives a column object as an argument. For that reason, the Spark overload that receives a column object as an argument is directly supported by Snowpark and does not require any changes.
For the overload that receives a string argument, you can convert the string into a column object using the com.snowflake.snowpark.functions.col function as a workaround.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1129¶
Message: org.apache.spark.sql.functions.floor has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.floor function, which has a workaround.
Scenarios¶
Input
Below is an example of the org.apache.spark.sql.functions.floor function, first used with a column name as an argument, then with a column object and finally with two column objects.
Output
The SMA adds the EWI SPRKSCL1129 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent floor function that receives a column object as an argument. For that reason, the Spark overload that receives a column object as an argument is directly supported by Snowpark and does not require any changes.
For the overload that receives a string argument, you can convert the string into a column object using the com.snowflake.snowpark.functions.col function as a workaround.
For the overload that receives a column object and a scale, you can use the callBuiltin function to invoke the Snowflake builtin FLOOR function. To use it, you should pass the string "floor" as the first argument, the column as the second argument and the scale as the third argument.
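A sketch of the three cases (the DataFrame df and the column name "price" are hypothetical):

```scala
import com.snowflake.snowpark.functions.{callBuiltin, col, floor, lit}

// String overload: convert the column name with col()
val f1 = df.select(floor(col("price")))

// Column-object overload: supported directly, no changes needed
val f2 = df.select(floor(col("price")))

// Overload with a scale: invoke the Snowflake built-in FLOOR via callBuiltin
val f3 = df.select(callBuiltin("floor", col("price"), lit(2)))
```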
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1168¶
Message: *Spark element* with argument(s) value(s) *given arguments* is not supported.
Category: Warning.
Description¶
This issue appears when the SMA detects that a Spark element with the given parameters is not supported.
Scenarios¶
Input
Below is an example of a Spark element whose parameters are not supported.
Output
The SMA adds the EWI SPRKSCL1168 to the output code to let you know that the Spark element with the given parameters is not supported.
Recommended fix
There is no specific fix for this scenario.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1139¶
Message: org.apache.spark.sql.functions.sqrt has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.sqrt function, which has a workaround.
Scenarios¶
Input
Below is an example of the org.apache.spark.sql.functions.sqrt function, first used with a column name as an argument and then with a column object.
Output
The SMA adds the EWI SPRKSCL1139 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent sqrt function that receives a column object as an argument. For that reason, the Spark overload that receives a column object as an argument is directly supported by Snowpark and does not require any changes.
For the overload that receives a string argument, you can convert the string into a column object using the com.snowflake.snowpark.functions.col function as a workaround.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1119¶
Message: org.apache.spark.sql.Column.endsWith has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.Column.endsWith function, which has a workaround.
Scenarios¶
Input
Below is an example of the org.apache.spark.sql.Column.endsWith function, first used with a literal string argument and then with a column object argument.
Output
The SMA adds the EWI SPRKSCL1119 to the output code to let you know that this function is not directly supported by Snowpark, but it has a workaround.
Recommended fix
As a workaround, you can use the com.snowflake.snowpark.functions.endswith function, where the first argument is the column whose values will be checked and the second argument is the suffix to check against the column values. Please note that if the argument of the Spark endsWith function is a literal string, you should convert it into a column object using the com.snowflake.snowpark.functions.lit function.
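A sketch of this workaround (the DataFrame df and the column name "filename" are hypothetical):

```scala
import com.snowflake.snowpark.functions.{col, endswith, lit}

// Spark: col("filename").endsWith(".csv")
// Snowpark workaround: use endswith(), wrapping the literal suffix with lit()
val matched = df.select(endswith(col("filename"), lit(".csv")))
```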
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1148¶
Message: org.apache.spark.sql.functions.toDegrees has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.toDegrees function, which has a workaround.
Scenarios¶
Input
Below is an example of the org.apache.spark.sql.functions.toDegrees function, first used with a column name as an argument and then with a column object.
Output
The SMA adds the EWI SPRKSCL1148 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
As a workaround, you can use the degrees function. For the Spark overload that receives a string argument, you additionally have to convert the string into a column object using the com.snowflake.snowpark.functions.col function.
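A sketch of this workaround (the DataFrame df and the column name "angle" are hypothetical):

```scala
import com.snowflake.snowpark.functions.{col, degrees}

// Spark: toDegrees("angle") or toDegrees(col("angle"))
// Snowpark workaround: use degrees(), wrapping string arguments with col()
val result = df.select(degrees(col("angle")))
```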
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1158¶
Message: org.apache.spark.sql.functions.skewness has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.skewness function, which has a workaround.
Scenarios¶
Input
Below is an example of the org.apache.spark.sql.functions.skewness function that generates this EWI. In this example, the skewness function is used to calculate the skewness of the selected column.
Output
The SMA adds the EWI SPRKSCL1158 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent skew function that receives a column object as an argument. For that reason, the Spark overload that receives a column object as an argument is directly supported by Snowpark and does not require any changes.
For the overload that receives a string argument, you can convert the string into a column object using the com.snowflake.snowpark.functions.col function as a workaround.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1109¶
Note
This issue code has been deprecated.
Message: The parameter is not defined for org.apache.spark.sql.DataFrameReader.option
Category: Warning
Description¶
This issue appears when the SMA detects that the given parameter of the org.apache.spark.sql.DataFrameReader.option function is not defined.
Scenarios¶
Input
Below is an example of an undefined parameter for the org.apache.spark.sql.DataFrameReader.option function.
Output
The SMA adds the EWI SPRKSCL1109 to the output code to let you know that the given parameter of the org.apache.spark.sql.DataFrameReader.option function is not defined.
Recommended fix
Check the Snowpark documentation for reader format options here to identify the defined options.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1114¶
Message: org.apache.spark.sql.functions.repeat has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.repeat function, which has a workaround.
Scenarios¶
Input
Below is an example of the org.apache.spark.sql.functions.repeat function that generates this EWI.
Output
The SMA adds the EWI SPRKSCL1114 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
As a workaround, you can convert the second argument into a column object using the com.snowflake.snowpark.functions.lit function.
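A sketch of this workaround (the DataFrame df and the column name "word" are hypothetical):

```scala
import com.snowflake.snowpark.functions.{col, lit, repeat}

// Spark: repeat(col("word"), 3)  -- the count is a plain Int
// Snowpark workaround: convert the second argument into a column with lit()
val result = df.select(repeat(col("word"), lit(3)))
```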
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1145¶
Message: org.apache.spark.sql.functions.sumDistinct has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.sumDistinct function, which has a workaround.
Scenarios¶
Input
Below is an example of the org.apache.spark.sql.functions.sumDistinct function, first used with a column name as an argument and then with a column object.
Output
The SMA adds the EWI SPRKSCL1145 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
As a workaround, you can use the sum_distinct function. For the Spark overload that receives a string argument, you additionally have to convert the string into a column object using the com.snowflake.snowpark.functions.col function.
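A sketch of this workaround (the DataFrame df and the column name "amount" are hypothetical):

```scala
import com.snowflake.snowpark.functions.{col, sum_distinct}

// Spark: sumDistinct("amount") or sumDistinct(col("amount"))
// Snowpark workaround: use sum_distinct(), wrapping string arguments with col()
val result = df.select(sum_distinct(col("amount")))
```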
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1171¶
Message: Snowpark does not support split functions with more than two parameters or containing regex pattern. See the documentation for more info.
Category: Warning.
Description¶
This issue appears when the SMA detects that org.apache.spark.sql.functions.split has more than two parameters or contains a regex pattern.
Scenarios¶
The split function is used to split the given column around matches of the given pattern. This Spark function has three overloads.
Scenario 1¶
Input
Below is an example of the org.apache.spark.sql.functions.split function that generates this EWI. In this example, the split function has two parameters and the second argument is a string, not a regex pattern.
Output
The SMA adds the EWI SPRKSCL1171 to the output code to let you know that this function is not fully supported by Snowpark.
Recommended fix
Snowpark has an equivalent split function that receives a column object as its second argument. For the Spark overload whose second argument is a plain string rather than a regex pattern, you can convert that string into a column object using the com.snowflake.snowpark.functions.lit function as a workaround.
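A sketch of this workaround (the DataFrame df and the column name "csvLine" are hypothetical):

```scala
import com.snowflake.snowpark.functions.{col, lit, split}

// Spark: split(col("csvLine"), ",")  -- plain string delimiter, not a regex
// Snowpark workaround: pass the delimiter as a column object with lit()
val parts = df.select(split(col("csvLine"), lit(",")))
```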
Scenario 2¶
Input
Below is an example of the org.apache.spark.sql.functions.split function that generates this EWI. In this example, the split function has two parameters and the second argument is a regex pattern.
Output
The SMA adds the EWI SPRKSCL1171 to the output code to let you know that this function is not fully supported by Snowpark because regex patterns are not supported by Snowflake.
Recommended fix
Since Snowflake does not support regex patterns in this function, try to replace the regex pattern with a string that is not a regex pattern.
Scenario 3¶
Input
Below is an example of the org.apache.spark.sql.functions.split function that generates this EWI. In this example, the split function has more than two parameters.
Output
The SMA adds the EWI SPRKSCL1171 to the output code to let you know that this function is not fully supported by Snowpark, because Snowflake does not have a split function with more than two parameters.
Recommended fix
Since Snowflake does not support split functions with more than two parameters, try to use a split function that Snowflake supports.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1120¶
Message: org.apache.spark.sql.functions.asin has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.asin function, which has a workaround.
Scenarios¶
Input
Below is an example of the org.apache.spark.sql.functions.asin function, first used with a column name as an argument and then with a column object.
Output
The SMA adds the EWI SPRKSCL1120 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent asin function that receives a column object as an argument. For that reason, the Spark overload that receives a column object as an argument is directly supported by Snowpark and does not require any changes.
For the overload that receives a string argument, you can convert the string into a column object using the com.snowflake.snowpark.functions.col function as a workaround.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1130¶
Message: org.apache.spark.sql.functions.greatest has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.greatest function, which has a workaround.
Scenarios¶
Input
Below is an example of the org.apache.spark.sql.functions.greatest function, first used with multiple column names as arguments and then with multiple column objects.
Output
The SMA adds the EWI SPRKSCL1130 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent greatest function that receives multiple column objects as arguments. For that reason, the Spark overload that receives column objects as arguments is directly supported by Snowpark and does not require any changes.
For the overload that receives multiple string arguments, you can convert the strings into column objects using the com.snowflake.snowpark.functions.col function as a workaround.
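A sketch of this workaround (the DataFrame df and the column names are hypothetical):

```scala
import com.snowflake.snowpark.functions.{col, greatest}

// Spark: greatest("q1", "q2", "q3")  -- string overload
// Snowpark workaround: wrap each column name with col()
val result = df.select(greatest(col("q1"), col("q2"), col("q3")))
```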
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1161¶
Message: Failed to add dependencies.
Category: Conversion error.
Description¶
This issue appears when the SMA detects a Spark version in the project configuration file that is not supported by the SMA. Because of that, the SMA could not add the Snowpark and Snowpark Extensions dependencies to the corresponding project configuration file. If the Snowpark dependencies are not added, the migrated code will not compile.
Scenarios¶
There are three possible scenarios: sbt, gradle, and pom.xml. The SMA tries to process the project configuration file by removing the Spark dependencies and adding the Snowpark and Snowpark Extensions dependencies.
Scenario 1¶
Input
Below is an example of the dependencies section of an sbt project configuration file.
Output
The SMA adds the EWI SPRKSCL1161 to the issues inventory since the Spark version is not supported and keeps the output the same.
Recommended fix
Manually remove the Spark dependencies and add the Snowpark and Snowpark Extensions dependencies to the sbt project configuration file.
Use the Snowpark version that best fits your project requirements.
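A sketch of what the manual fix might look like in build.sbt; the `<version>` placeholders are intentional, use the versions that fit your project:

```scala
// build.sbt -- remove the Spark dependency:
// libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.1.2"

// ...and add the Snowpark and Snowpark Extensions dependencies instead:
libraryDependencies += "com.snowflake" % "snowpark" % "<version>"
libraryDependencies += "net.mobilize.snowpark-extensions" % "snowparkextensions" % "<version>"
```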
Scenario 2¶
Input
Below is an example of the dependencies section of a gradle project configuration file.
Output
The SMA adds the EWI SPRKSCL1161 to the issues inventory since the Spark version is not supported and keeps the output the same.
Recommended fix
Manually remove the Spark dependencies and add the Snowpark and Snowpark Extensions dependencies to the gradle project configuration file.
Make sure the version of the dependencies matches your project needs.
Scenario 3¶
Input
Below is an example of the dependencies section of a pom.xml project configuration file.
Output
The SMA adds the EWI SPRKSCL1161 to the issues inventory since the Spark version is not supported and keeps the output the same.
Recommended fix
Manually remove the Spark dependencies and add the Snowpark and Snowpark Extensions dependencies to the pom.xml project configuration file.
Make sure the version of the dependencies matches your project needs.
Additional recommendations¶
Make sure your input includes a project configuration file:
build.sbt
build.gradle
pom.xml
The Spark version supported by the SMA is Spark 3.1.2 on Scala 2.12.
You can check the latest Snowpark version here.
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1155¶
Warning
This issue code has been deprecated since Spark Conversion Core Version 4.3.2
Message: org.apache.spark.sql.functions.countDistinct has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.countDistinct function, which has a workaround.
Scenarios¶
Input
Below is an example of the org.apache.spark.sql.functions.countDistinct function, first used with column names as arguments and then with column objects.
Output
The SMA adds the EWI SPRKSCL1155 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
As a workaround, you can use the count_distinct function. For the Spark overload that receives string arguments, you additionally have to convert the strings into column objects using the com.snowflake.snowpark.functions.col function.
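A sketch of this workaround (the DataFrame df and the column name "userId" are hypothetical):

```scala
import com.snowflake.snowpark.functions.{col, count_distinct}

// Spark: countDistinct("userId")  -- string overload
// Snowpark workaround: use count_distinct(), wrapping string arguments with col()
val result = df.select(count_distinct(col("userId")))
```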
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1104¶
This issue code has been deprecated.
Message: Spark Session builder option is not supported.
Category: Conversion error.
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.SparkSession.Builder.config function, which is setting an option of the Spark Session and it is not supported by Snowpark.
Scenarios¶
Input
Below is an example of the org.apache.spark.sql.SparkSession.Builder.config function used to set an option in the Spark Session.
Output
The SMA adds the EWI SPRKSCL1104 to the output code to let you know that the config method is not supported by Snowpark. Therefore, it is not possible to set options in the Spark Session via the config function, which might affect the migration of the Spark Session statement.
Recommended fix
To create the session, you need to add the proper Snowflake Snowpark configuration.
In this example, a configs variable is used.
It is also recommended to use a configFile (profile.properties) containing the connection information.
Then the session can be created with Session.builder.configFile:
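A sketch of session creation from a properties file; all connection values are placeholders to be filled in with your own:

```scala
import com.snowflake.snowpark.Session

// profile.properties (placeholder values):
//   URL=https://<account>.snowflakecomputing.com
//   USER=<user>
//   PASSWORD=<password>
//   ROLE=<role>
//   WAREHOUSE=<warehouse>
//   DB=<database>
//   SCHEMA=<schema>

val session = Session.builder.configFile("profile.properties").create
```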
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1124¶
Message: org.apache.spark.sql.functions.cosh has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.cosh function, which has a workaround.
Scenarios¶
Input
Below is an example of the org.apache.spark.sql.functions.cosh function, first used with a column name as an argument and then with a column object.
Output
The SMA adds the EWI SPRKSCL1124 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent cosh function that receives a column object as an argument. For that reason, the Spark overload that receives a column object as an argument is directly supported by Snowpark and does not require any changes.
For the overload that receives a string argument, you can convert the string into a column object using the com.snowflake.snowpark.functions.col function as a workaround.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1175¶
Message: The two-parameter udf function is not supported in Snowpark. It should be converted into a single-parameter udf function. Please check the documentation to learn how to manually modify the code to make it work in Snowpark.
Category: Conversion error.
Description¶
This issue appears when the SMA detects a use of the two-parameter org.apache.spark.sql.functions.udf function in the source code. Because Snowpark does not have an equivalent two-parameter udf function, the output code might not compile.
Scenarios¶
Input
Below is an example of the org.apache.spark.sql.functions.udf function that generates this EWI. In this example, the udf function has two parameters.
Output
The SMA adds the EWI SPRKSCL1175 to the output code to let you know that the udf function is not supported, because it has two parameters.
Recommended fix
Snowpark only supports the single-parameter udf function (without the return type parameter), so you should convert your two-parameter udf function into a single-parameter udf function in order to make it work in Snowpark.
For example, for the sample code above, you need to manually convert it as follows:
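A sketch of the conversion (the lambda shown is a hypothetical example):

```scala
import com.snowflake.snowpark.functions.udf

// Spark (two-parameter udf, with an explicit return type):
//   val doubleUdf = udf((x: Int) => x * 2, IntegerType)

// Snowpark (single-parameter udf; the return type is inferred):
val doubleUdf = udf((x: Int) => x * 2)
```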
Please note that there are some caveats about creating udfs in Snowpark that might require you to make some additional manual changes to your code. Please check the other recommendations here related to creating single-parameter udf functions in Snowpark for more details.
Additional recommendations¶
To learn more about how to create user-defined functions in Snowpark, please refer to the following documentation: Creating User-Defined Functions (UDFs) for DataFrames in Scala
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1001¶
Message: This code section has parsing errors. The parsing error was found at: line *line number*, column *column number*. When trying to parse *statement*. This file was not converted, so it is expected to still have references to the Spark API.
Category: Parsing error.
Description¶
This error appears when the SMA detects a statement in the code of a file that it cannot correctly read or understand; this is called a parsing error. This issue appears when a file has one or more parsing errors.
Scenarios¶
Input
Below is an example of invalid Scala code.
Output
The SMA adds the EWI SPRKSCL1001 to the output code to let you know that the code of the file has parsing errors. Therefore, the SMA is not able to process a file with this error.
Recommended fix
Since the message pinpoints the erroneous statement, you can identify the invalid syntax and remove it, or comment out the statement to avoid the parsing error.
Additional recommendations¶
Make sure the code of the file is valid Scala code.
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1141¶
Message: org.apache.spark.sql.functions.stddev_pop has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.stddev_pop function, which has a workaround.
Scenarios¶
Input
Below is an example of the org.apache.spark.sql.functions.stddev_pop function, first used with a column name as an argument and then with a column object.
Output
The SMA adds the EWI SPRKSCL1141 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent stddev_pop function that receives a column object as an argument. For that reason, the Spark overload that receives a column object as an argument is directly supported by Snowpark and does not require any changes.
For the overload that receives a string argument, you can convert the string into a column object using the com.snowflake.snowpark.functions.col function as a workaround.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1110¶
Note
This issue code has been deprecated.
Message: Reader method not supported *method name*.
Category: Warning
Description¶
This issue appears when the SMA detects a method in the DataFrameReader method chaining that is not supported by Snowflake. In that case, it might affect the migration of the reader statement.
Scenarios¶
Input
Below is an example of a DataFrameReader method chaining where the load method is not supported by Snowflake.
Output
The SMA adds the EWI SPRKSCL1110 to the output code to let you know that the load method is not supported by Snowpark. This might affect the migration of the reader statement.
Recommended fix
Check the Snowpark documentation for the reader here to learn which methods are supported by Snowflake.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1100¶
This issue code has been deprecated since Spark Conversion Core 2.3.22
Message: Repartition is not supported.
Category: Parsing error.
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.DataFrame.repartition function, which is not supported by Snowpark. Snowflake manages the storage and the workload on the clusters making repartition operation inapplicable.
Scenarios¶
Input
Below is an example of the org.apache.spark.sql.DataFrame.repartition function used to return a new DataFrame partitioned by the given partitioning expressions.
Output
The SMA adds the EWI SPRKSCL1100 to the output code to let you know that this function is not supported by Snowpark.
Recommended fix
Snowflake manages the storage and the workload on the clusters, making the repartition operation inapplicable. This means there is no need to use repartition before a join at all.
Additional recommendations¶
The Snowflake architecture guide provides insight into Snowflake storage management.
The Snowpark DataFrame reference could be useful for adapting a particular scenario without the need for repartition.
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1151¶
Message: org.apache.spark.sql.functions.var_samp has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.var_samp function, which has a workaround.
Scenarios¶
Input
Below is an example of the org.apache.spark.sql.functions.var_samp function, first used with a column name as an argument and then with a column object.
Output
The SMA adds the EWI SPRKSCL1151 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent var_samp function that receives a column object as an argument. For that reason, the Spark overload that receives a column object as an argument is directly supported by Snowpark and does not require any changes.
For the overload that receives a string argument, you can convert the string into a column object using the com.snowflake.snowpark.functions.col function as a workaround.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1165¶
Message: Reader format on DataFrameReader method chaining can't be defined
Category: Warning
Description¶
This issue appears when the SMA detects that the format of the reader in the DataFrameReader method chaining is not one of the following formats supported by Snowpark: avro, csv, json, orc, parquet, and xml. Therefore, the SMA cannot determine whether the setting options are defined.
Scenarios¶
Input
Below is an example of a DataFrameReader method chaining where the SMA cannot determine the format of the reader.
Output
The SMA adds the EWI SPRKSCL1165 to the output code to let you know that the format of the reader cannot be determined in the given DataFrameReader method chaining.
Recommended fix
Check the Snowpark documentation here to get more information about the format of the reader.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1134¶
Message: org.apache.spark.sql.functions.log has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.log function, which has a workaround.
Scenarios¶
Input
Below is an example of the org.apache.spark.sql.functions.log function that generates this EWI.
Output
The SMA adds the EWI SPRKSCL1134 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Below are the different workarounds for all the overloads of the log function.
1. def log(base: Double, columnName: String): Column
You can convert the base into a column object using the com.snowflake.snowpark.functions.lit function and convert the column name into a column object using the com.snowflake.snowpark.functions.col function.
2. def log(base: Double, a: Column): Column
You can convert the base into a column object using the com.snowflake.snowpark.functions.lit function.
3. def log(columnName: String): Column
You can pass lit(Math.E) as the first argument, convert the column name into a column object using the com.snowflake.snowpark.functions.col function, and pass it as the second argument.
4. def log(e: Column): Column
You can pass lit(Math.E) as the first argument and the column object as the second argument.
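A sketch covering the four overloads (the DataFrame df, column name "value", and base 10.0 are hypothetical):

```scala
import com.snowflake.snowpark.functions.{col, lit, log}

// 1. log(base: Double, columnName: String) -> wrap both arguments
val l1 = df.select(log(lit(10.0), col("value")))

// 2. log(base: Double, a: Column) -> wrap the base with lit()
val l2 = df.select(log(lit(10.0), col("value")))

// 3. log(columnName: String) -> natural logarithm, pass lit(Math.E) as the base
val l3 = df.select(log(lit(Math.E), col("value")))

// 4. log(e: Column) -> natural logarithm, pass lit(Math.E) as the base
val l4 = df.select(log(lit(Math.E), col("value")))
```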
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1125¶
Warning
This issue code is deprecated since Spark Conversion Core 2.9.0
Message: org.apache.spark.sql.functions.count has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.count function, which has a workaround.
Scenarios¶
Input
Below is an example of the org.apache.spark.sql.functions.count function, first used with a column name as an argument and then with a column object.
Output
The SMA adds the EWI SPRKSCL1125 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent count function that receives a column object as an argument. For that reason, the Spark overload that receives a column object as an argument is directly supported by Snowpark and does not require any changes.
For the overload that receives a string argument, you can convert the string into a column object using the com.snowflake.snowpark.functions.col function as a workaround.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1174¶
Message: The single-parameter udf function is supported in Snowpark but it might require manual intervention. Please check the documentation to learn how to manually modify the code to make it work in Snowpark.
カテゴリ: 警告。
説明¶
This issue appears when the SMA detects an use of the single-parameter org.apache.spark.sql.functions.udf function in the code. Then, it might require a manual intervention.
The Snowpark API provides an equivalent com.snowflake.snowpark.functions.udf function that allows you to create a user-defined function from a lambda or function in Scala, however, there are some caveats about creating udf in Snowpark that might require you to make some manual changes to your code in order to make it work properly.
Scenarios¶
The Snowpark udf function should work as intended for a wide range of cases without requiring manual intervention. However, there are some scenarios that would require you to manually modify your code in order to get it working in Snowpark. Some of those scenarios are listed below:
Scenario 1¶
Input
Below is an example of creating UDFs inside an object with the App trait.
Scala's App trait simplifies creating executable programs by providing a main method that automatically runs the code within the object definition. Extending App delays the initialization of the fields until the main method is executed, which can affect the UDF definitions if they rely on initialized fields. This means that if an object extends App and the udf references an object field, the udf definition uploaded to Snowflake will not include the initialized value of the field. This can result in null values being returned by the udf.
For example, in the following code the variable myValue will resolve to null in the udf definition:
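A hedged sketch of the problematic pattern (the object name, udf name, and DataFrame df are assumptions; myValue comes from the description above):

```scala
import com.snowflake.snowpark.functions.{col, udf}

object Main extends App {
  // Because App delays field initialization until main runs,
  // myValue is not yet initialized when the udf definition is
  // uploaded to Snowflake, so it resolves to null there.
  val myValue = 10
  val myUdf = udf((x: Int) => x + myValue)
  // df.select(myUdf(col("value")))
}
```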
Output
The SMA adds the EWI SPRKSCL1174 to the output code to let you know that the single-parameter udf function is supported in Snowpark but it requires manual intervention.
Recommended fix
To avoid this issue, it is recommended not to extend App and to implement a separate main method for your code. This ensures that object fields are initialized before the udf definitions are created and uploaded to Snowflake.
For more details about this topic, see Caveat About Creating UDFs in an Object With the App Trait.
Scenario 2¶
Input
Below is an example of creating UDFs in a Jupyter Notebook.
Output
The SMA adds the EWI SPRKSCL1174 to the output code to let you know that the single-parameter udf function is supported in Snowpark but it requires manual intervention.
Recommended fix
To create a udf in a Jupyter Notebook, you should define the implementation of your function in a class that extends Serializable. For example, you should manually convert it into this:
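A hedged sketch of that shape (the class, field, and column names are assumptions):

```scala
import com.snowflake.snowpark.functions.{col, udf}

// Define the implementation in a class that extends Serializable:
class UdfFuncs extends Serializable {
  val doubleUdf = udf((x: Int) => x * 2)
}

val funcs = new UdfFuncs()
// df.select(funcs.doubleUdf(col("value")))
```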
For more details about how to create UDFs in Jupyter Notebooks, see Creating UDFs in Jupyter Notebooks.
Additional recommendations¶
To learn more about how to create user-defined functions in Snowpark, please refer to the following documentation: Creating User-Defined Functions (UDFs) for DataFrames in Scala
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1000¶
Message: Source project spark-core version is *version number*, the spark-core version supported by snowpark is 2.12:3.1.2 so there may be functional differences between the existing mappings
Category: Warning
Description¶
This issue appears when the SMA detects a version of spark-core that is not supported by the SMA. Therefore, there may be functional differences in the existing mappings, and the output might have unexpected behaviors.
Additional recommendations¶
The version of spark-core supported by the SMA is 2.12:3.1.2. Consider changing the version of your source code.
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1140¶
Message: org.apache.spark.sql.functions.stddev has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.stddev function, which has a workaround.
Scenarios¶
Input
Below is an example of the org.apache.spark.sql.functions.stddev function, first used with a column name as an argument and then with a column object.
Output
The SMA adds the EWI SPRKSCL1140 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent stddev function that receives a column object as an argument. For that reason, the Spark overload that receives a column object as an argument is directly supported by Snowpark and does not require any changes.
For the overload that receives a string argument, you can convert the string into a column object using the com.snowflake.snowpark.functions.col function as a workaround.
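A minimal sketch of the workaround (df and the column name are assumptions):

```scala
import com.snowflake.snowpark.functions.{col, stddev}

// Spark-style overload with a string argument (not directly supported):
// df.select(stddev("price"))

// Workaround: convert the string into a column object with col:
val result = df.select(stddev(col("price")))
```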
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1111¶
Note
This issue code has been deprecated.
Message: CreateDecimalType is not supported.
Category: Conversion error.
Description¶
This issue appears when the SMA detects a usage of the org.apache.spark.sql.types.DataTypes.CreateDecimalType function.
Scenarios¶
Input
Below is an example of a usage of the org.apache.spark.sql.types.DataTypes.CreateDecimalType function.
Output
The SMA adds the EWI SPRKSCL1111 to the output code to let you know that the CreateDecimalType function is not supported by Snowpark.
Recommended fix
There is no recommended fix yet.
SPRKSCL1104¶
Message: Spark Session builder option is not supported.
Category: Conversion error.
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.SparkSession.Builder.config function, which sets an option of the Spark session that is not supported by Snowpark.
Scenarios¶
Input
Below is an example of the org.apache.spark.sql.SparkSession.Builder.config function used to set an option in the Spark Session.
Output
The SMA adds the EWI SPRKSCL1104 to the output code to let you know that the config method is not supported by Snowpark. As a result, it is not possible to set options in the Spark session via the config function, which might affect the migration of the Spark session statement.
Recommended fix
To create the session, you need to add the proper Snowflake Snowpark configuration.
In this example, a configs variable is used.
It is also recommended to use a configFile (profile.properties) with the connection information.
Then the session can be created with Session.builder.configFile:
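A hedged sketch of both approaches (all connection values are placeholders):

```scala
import com.snowflake.snowpark.Session

// Option 1: build the session from a configs map:
val configs = Map(
  "URL" -> "https://<account>.snowflakecomputing.com:443",
  "USER" -> "<user>",
  "PASSWORD" -> "<password>",
  "ROLE" -> "<role>",
  "WAREHOUSE" -> "<warehouse>",
  "DB" -> "<database>",
  "SCHEMA" -> "<schema>"
)
val session = Session.builder.configs(configs).create

// Option 2: read the connection information from profile.properties:
// val session = Session.builder.configFile("profile.properties").create
```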
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1101¶
This issue code has been deprecated since Spark Conversion Core 2.3.22
Message: Broadcast is not supported
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.broadcast function, which is not supported by Snowpark. This function is not supported because Snowflake does not support broadcast variables.
Scenarios¶
Input
Below is an example of the org.apache.spark.sql.functions.broadcast function used to create a broadcast object to use on each Spark cluster:
Output
The SMA adds the EWI SPRKSCL1101 to the output code to let you know that this function is not supported by Snowpark.
Recommended fix
Since Snowflake manages the storage and the workloads on the clusters, broadcast objects are not applicable. That means broadcast usage is not required at all, but each case requires further analysis.
The recommended approach is to replace a Spark broadcast DataFrame with a regular Snowpark DataFrame, or to use a DataFrame method such as join.
For the proposed input, the fix is to adapt the join to use the DataFrame collegeDF directly, without using broadcast.
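A hedged sketch of the fix (studentDF and the join key are assumptions; collegeDF comes from the text):

```scala
// Spark version using broadcast (not supported by Snowpark):
// val joined = studentDF.join(broadcast(collegeDF), Seq("college_id"))

// Snowpark workaround: join the regular DataFrame directly:
val joined = studentDF.join(collegeDF, studentDF("college_id") === collegeDF("college_id"))
```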
その他の推奨事項¶
Snowflake's architecture guide provides insight into Snowflake storage management.
The Snowpark DataFrame reference could be useful for adapting a particular broadcast scenario.
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1150¶
Message: org.apache.spark.sql.functions.var_pop has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.var_pop function, which has a workaround.
Scenarios¶
Input
Below is an example of the org.apache.spark.sql.functions.var_pop function, first used with a column name as an argument and then with a column object.
Output
The SMA adds the EWI SPRKSCL1150 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent var_pop function that receives a column object as an argument. For that reason, the Spark overload that receives a column object as an argument is directly supported by Snowpark and does not require any changes.
For the overload that receives a string argument, you can convert the string into a column object using the com.snowflake.snowpark.functions.col function as a workaround.
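A minimal sketch of the workaround (df and the column name are assumptions):

```scala
import com.snowflake.snowpark.functions.{col, var_pop}

// Spark-style overload with a string argument (not directly supported):
// df.select(var_pop("price"))

// Workaround: wrap the column name with col:
val result = df.select(var_pop(col("price")))
```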
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1164¶
Note
This issue code has been deprecated.
Message: The parameter is not defined for org.apache.spark.sql.DataFrameReader.option
Category: Warning
Description¶
This issue appears when the SMA detects that the given parameter of org.apache.spark.sql.DataFrameReader.option is not defined.
Scenarios¶
Input
Below is an example of an undefined parameter for the org.apache.spark.sql.DataFrameReader.option function.
Output
The SMA adds the EWI SPRKSCL1164 to the output code to let you know that the parameter given to the org.apache.spark.sql.DataFrameReader.option function is not defined.
Recommended fix
Check the Snowpark documentation for the reader format options here to identify the defined options.
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1135¶
Warning
This issue code is deprecated since Spark Conversion Core 4.3.2
Message: org.apache.spark.sql.functions.mean has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.mean function, which has a workaround.
Scenarios¶
Input
Below is an example of the org.apache.spark.sql.functions.mean function, first used with a column name as an argument and then with a column object.
Output
The SMA adds the EWI SPRKSCL1135 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent mean function that receives a column object as an argument. For that reason, the Spark overload that receives a column object as an argument is directly supported by Snowpark and does not require any changes.
For the overload that receives a string argument, you can convert the string into a column object using the com.snowflake.snowpark.functions.col function as a workaround.
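A minimal sketch of the workaround (df and the column name are assumptions):

```scala
import com.snowflake.snowpark.functions.{col, mean}

// Spark-style overload with a string argument (not directly supported):
// df.select(mean("price"))

// Workaround: wrap the column name with col:
val result = df.select(mean(col("price")))
```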
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1115¶
Warning
This issue code has been deprecated since Spark Conversion Core Version 4.6.0
Message: org.apache.spark.sql.functions.round has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.round function, which has a workaround.
Scenarios¶
Input
Below is an example of the org.apache.spark.sql.functions.round function that generates this EWI.
Output
The SMA adds the EWI SPRKSCL1115 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent round function that receives a column object as an argument. For that reason, the Spark overload that receives a column object as an argument is directly supported by Snowpark and does not require any changes.
For the overload that receives a column object and a scale, you can convert the scale into a column object using the com.snowflake.snowpark.functions.lit function as a workaround.
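A minimal sketch of the workaround (df, the column name, and the scale value are assumptions):

```scala
import com.snowflake.snowpark.functions.{col, lit, round}

// Spark-style call with a numeric scale (not directly supported):
// df.select(round(col("price"), 2))

// Workaround: convert the scale into a column object with lit:
val result = df.select(round(col("price"), lit(2)))
```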
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1144¶
Message: The symbol table could not be loaded
Category: Parsing error
Description¶
This issue appears when there is a critical error in the SMA execution process. Because the symbol table cannot be loaded, the SMA cannot start the assessment or conversion process.
Additional recommendations¶
This is unlikely to be an error in the source code itself, but rather is an error in how the SMA processes the source code. The best resolution would be to post an issue in the SMA.
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1170¶
Note
This issue code has been deprecated.
メッセージ: sparkConfig member key is not supported with platform specific key.
Category: Conversion error
Description¶
If you are using an older version, please upgrade to the latest version.
Additional recommendations¶
Upgrade your application to the latest version.
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1121¶
Message: org.apache.spark.sql.functions.atan has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.atan function, which has a workaround.
Scenarios¶
Input
Below is an example of the org.apache.spark.sql.functions.atan function, first used with a column name as an argument and then with a column object.
Output
The SMA adds the EWI SPRKSCL1121 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent atan function that receives a column object as an argument. For that reason, the Spark overload that receives a column object as an argument is directly supported by Snowpark and does not require any changes.
For the overload that receives a string argument, you can convert the string into a column object using the com.snowflake.snowpark.functions.col function as a workaround.
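A minimal sketch of the workaround (df and the column name are assumptions):

```scala
import com.snowflake.snowpark.functions.{atan, col}

// Spark-style overload with a string argument (not directly supported):
// df.select(atan("angle"))

// Workaround: wrap the column name with col:
val result = df.select(atan(col("angle")))
```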
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1131¶
Message: org.apache.spark.sql.functions.grouping has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.grouping function, which has a workaround.
Scenarios¶
Input
Below is an example of the org.apache.spark.sql.functions.grouping function, first used with a column name as an argument and then with a column object.
Output
The SMA adds the EWI SPRKSCL1131 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent grouping function that receives a column object as an argument. For that reason, the Spark overload that receives a column object as an argument is directly supported by Snowpark and does not require any changes.
For the overload that receives a string argument, you can convert the string into a column object using the com.snowflake.snowpark.functions.col function as a workaround.
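A minimal sketch of the workaround (df and the column name are assumptions):

```scala
import com.snowflake.snowpark.functions.{col, grouping}

// Spark-style overload with a string argument (not directly supported):
// df.cube(col("category")).agg(grouping("category"))

// Workaround: wrap the column name with col:
val result = df.cube(col("category")).agg(grouping(col("category")))
```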
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1160¶
Note
This issue code has been deprecated since Spark Conversion Core 4.1.0
Message: org.apache.spark.sql.functions.sum has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.sum function, which has a workaround.
Scenarios¶
Input
Below is an example of the org.apache.spark.sql.functions.sum function that generates this EWI. In this example, the sum function is used to calculate the sum of a selected column.
Output
The SMA adds the EWI SPRKSCL1160 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent sum function that receives a column object as an argument. For that reason, the Spark overload that receives a column object as an argument is directly supported by Snowpark and does not require any changes.
For the overload that receives a string argument, you can convert the string into a column object using the com.snowflake.snowpark.functions.col function as a workaround.
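A minimal sketch of the workaround (df and the column name are assumptions):

```scala
import com.snowflake.snowpark.functions.{col, sum}

// Spark-style overload with a string argument (not directly supported):
// df.select(sum("amount"))

// Workaround: wrap the column name with col:
val result = df.select(sum(col("amount")))
```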
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1154¶
Message: org.apache.spark.sql.functions.ceil has a workaround, see documentation for more info
Category: Warning
Description¶
This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.ceil function, which has a workaround.
Scenarios¶
Input
Below is an example of the org.apache.spark.sql.functions.ceil function, first used with a column name as an argument, then with a column object and finally with a column object and a scale.
Output
The SMA adds the EWI SPRKSCL1154 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
Recommended fix
Snowpark has an equivalent ceil function that receives a column object as an argument. For that reason, the Spark overload that receives a column object as an argument is directly supported by Snowpark and does not require any changes.
For the overload that receives a string argument, you can convert the string into a column object using the com.snowflake.snowpark.functions.col function as a workaround.
For the overload that receives a column object and a scale, you can use the callBuiltin function to invoke the Snowflake builtin CEIL function. To use it, you should pass the string "ceil" as the first argument, the column as the second argument and the scale as the third argument.
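A hedged sketch of the callBuiltin workaround (df, the column name, and the scale value are assumptions):

```scala
import com.snowflake.snowpark.functions.{callBuiltin, col}

// Invoke the Snowflake builtin CEIL function: the string "ceil" first,
// the column second, and the scale third, per the description above.
val result = df.select(callBuiltin("ceil", col("price"), 2))
```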
Additional recommendations¶
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.
SPRKSCL1105¶
This issue code has been deprecated.
Message: Writer format value is not supported.
Category: Conversion error
Description¶
This issue appears when the org.apache.spark.sql.DataFrameWriter.format function has an argument that is not supported by Snowpark.
Scenarios¶
There are several scenarios depending on the type of format you are trying to save. It can be a supported or an unsupported format.
Scenario 1¶
Input
The tool analyzes the type of format you are trying to save. The supported formats are:
csv, json, orc, parquet, text
Output
The tool transforms the format method into a csv method call when the save function has one parameter.
Recommended fix
In this case, the tool does not show an EWI.
Scenario 2¶
Input
The example below shows how the tool transforms the format method when a net.snowflake.spark.snowflake value is passed.
Output
The tool shows the EWI SPRKSCL1105 indicating that the value net.snowflake.spark.snowflake is not supported.
Recommended fix
For the unsupported scenarios, there is no specific fix since it depends on the files being read.
Scenario 3¶
Input
The example below shows how the tool transforms the format method when csv is passed via a variable instead of a literal.
Output
Since the tool cannot determine the value of the variable at runtime, it shows the EWI SPRKSCL1163 indicating that the value is not supported.
Recommended fix
As a workaround, you can check the value of the variable and add it as a string to the format call.
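A hedged sketch of that workaround (df, myFormat, and path are assumptions for illustration):

```scala
// Original code with a variable whose value the tool cannot resolve:
// val myFormat = "csv"
// df.write.format(myFormat).save(path)

// Workaround: check the variable's value and inline it as a string
// literal, which the tool can then transform:
df.write.format("csv").save(path)
```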
Additional recommendations¶
The Snowpark location only accepts cloud locations using a Snowflake stage.
The documentation of the methods supported by Snowpark can be found in the documentation.
For more support, you can email us at sma-support@snowflake.com or post an issue in the SMA.