Analyzing Queries and Troubleshooting with Snowpark Java

This topic provides some guidelines on analyzing queries and troubleshooting problems when working with the Snowpark library.

Viewing the Execution Plan for a Query in Snowpark

To inspect the evaluation plan of a DataFrame, call the explain method of the DataFrame. This prints the SQL statements used to evaluate the DataFrame. If there is only one SQL statement, the method also prints the logical plan for the statement.
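For example, the following sketch builds a small two-column DataFrame from literal values and prints its plan. The connection profile path, class name, and column names are assumptions for illustration; replace them with your own:

```java
import com.snowflake.snowpark_java.*;
import com.snowflake.snowpark_java.types.*;

public class ExplainExample {
  public static void main(String[] args) {
    // "profile.properties" is an assumed path to a connection
    // configuration file with your account details.
    Session session = Session.builder().configFile("profile.properties").create();

    // Build a small DataFrame from literal values.
    StructType schema = StructType.create(
        new StructField("col %", DataTypes.IntegerType),
        new StructField("col *", DataTypes.IntegerType));
    DataFrame df = session.createDataFrame(
        new Row[] {Row.create(1, 2), Row.create(3, 4)}, schema);

    // Print the SQL statements used to evaluate the DataFrame and,
    // for a single-statement plan, the logical execution plan.
    df.explain();
  }
}
```

Running this prints output similar to the execution plan shown below.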

----------DATAFRAME EXECUTION PLAN----------
Query List:
0.
SELECT
  "_1" AS "col %",
  "_2" AS "col *"
FROM
  ( SELECT
      *
    FROM
      ( VALUES
          (1 :: int, 2 :: int),
          (3 :: int, 4 :: int) AS SN_TEMP_OBJECT_639016133("_1", "_2")
      )
  )
Logical Execution Plan:
1:0     ->Result  SN_TEMP_OBJECT_639016133.COLUMN1, SN_TEMP_OBJECT_639016133.COLUMN2
1:1          ->ValuesClause  (1, 2), (3, 4)
--------------------------------------------


After the execution of a DataFrame has been triggered, you can check the progress of the query on the History tab in the Classic Console.

In the Query Tag column, you can find the name of the function and the line number in your code that triggered this query.

Snowpark request in the History page in the Classic Console


Changing the Logging Settings

By default, the Snowpark library logs INFO level messages to stdout. The library uses the SLF4J SimpleLogger, so to change the logging settings, create a simplelogger.properties file and configure the logger properties in that file. For example, to set the log level to DEBUG:

# simplelogger.properties file (a text file)
# Set the default log level for the SimpleLogger to DEBUG.
org.slf4j.simpleLogger.defaultLogLevel=debug

Put this file on your classpath. If you are using a Maven directory layout, put the file in the src/main/resources/ directory.

java.lang.OutOfMemoryError Exceptions

If a java.lang.OutOfMemoryError exception is thrown, increase the maximum heap size for the JVM (for example, by passing the -Xmx<maximum_size> option to the java command; tools that forward options to the JVM typically accept the -J-Xmx<maximum_size> form).
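As a sketch (the jar names and main class are placeholders, not part of the original text), a 4 GB maximum heap can be requested when launching the application:

```shell
# Raise the JVM maximum heap size to 4 GB for the Snowpark application.
java -Xmx4g -cp "app.jar:snowpark.jar" com.example.App
```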