SnowConvert: Data Migration¶
As part of the end-to-end migration experience, SnowConvert AI can migrate the actual data in your source tables to Snowflake after the database structure has been deployed. This data migration feature ensures your data is transferred efficiently and accurately, completing the migration process.
Migration Process Overview¶
The data migration process varies depending on your source database platform:
Amazon Redshift to Snowflake¶
SnowConvert migrates data from Redshift tables by unloading it to PARQUET files in an S3 bucket, then copying the data directly from those files to the deployed tables in Snowflake.
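Conceptually, this is the same pattern as running Redshift's UNLOAD command followed by a Snowflake COPY INTO. The following is a simplified illustration of that pattern, not SnowConvert's internal commands; the table, bucket, role, and stage names are hypothetical:

```sql
-- Redshift: unload a table as Parquet files to an S3 path (hypothetical names)
UNLOAD ('SELECT * FROM public.orders')
TO 's3://my-migration-bucket/unload/orders/'
IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftUnloadRole'
FORMAT AS PARQUET;

-- Snowflake: copy the Parquet files into the deployed table
-- (assumes an external stage pointing at the same S3 path)
COPY INTO public.orders
FROM @my_s3_stage/unload/orders/
FILE_FORMAT = (TYPE = PARQUET)
MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;
```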
SQL Server to Snowflake¶
Data migration from SQL Server uses optimized data transfer methods to move your table data efficiently into the corresponding Snowflake tables.
Prerequisites¶
For Amazon Redshift Sources¶
Before executing data migration from Redshift, you need the following prerequisites:
S3 Bucket Requirements¶
An S3 bucket in AWS in the same region as your Redshift cluster
An empty bucket path (the process fails if any files already exist at the specified path)
Warning
To migrate data from Redshift tables, your S3 bucket must be in the same region as your Redshift cluster. Data migration with S3 buckets in different regions will be supported in future releases.
IAM Role for Redshift¶
Create an IAM role associated with your Redshift cluster that can unload data to your S3 bucket:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:GetBucketLocation",
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::<your_bucket_name>/*",
        "arn:aws:s3:::<your_bucket_name>"
      ]
    }
  ]
}
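The permissions policy above must be attached to a role that Redshift is allowed to assume. A minimal trust policy for that role, shown here as a standard AWS pattern rather than a SnowConvert-specific requirement, looks like this:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "redshift.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
```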
IAM User for S3 Access¶
Create an IAM user with permissions to read, write, and delete objects in your S3 bucket:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:GetObjectVersion",
        "s3:DeleteObject",
        "s3:DeleteObjectVersion",
        "s3:ListBucket",
        "s3:GetBucketLocation"
      ],
      "Resource": [
        "arn:aws:s3:::<your_bucket_name>/*",
        "arn:aws:s3:::<your_bucket_name>"
      ]
    }
  ]
}
Warning
If you don’t provide the s3:DeleteObject and s3:DeleteObjectVersion permissions, the data migration will still succeed, but the temporary data files will not be deleted from the S3 bucket.
For SQL Server Sources¶
Valid connection to your SQL Server source database
Appropriate permissions to read data from source tables
Network connectivity between the migration tool and both source and target systems
General Requirements¶
Completed deployment process with database structure in Snowflake
Active connections to both source database and Snowflake account
Sufficient permissions for data operations on both source and target systems
How to use this feature¶
This feature completes the end-to-end migration process by transferring your actual data to the deployed database structure in Snowflake.
For Amazon Redshift Data Migration¶
Usage¶
Configure S3 bucket settings by clicking “Set S3 Bucket Settings” and providing:
S3 Bucket URL (must end with “/”)
IAM Role ARN for unloading data to S3
Access Key ID for the IAM user with S3 permissions
Secret Access Key for the IAM user
Select tables for migration by choosing the tables whose data you want to migrate to Snowflake.
Initiate data migration by clicking “Migrate Data”. This starts the process of unloading data to S3 and copying it to Snowflake tables.
Monitor migration progress as the system updates the data migration status for each table, indicating success or failure.
Review migration results on the results page, which validates the number of rows transferred and provides detailed information about each migrated table.
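For reference, a filled-in S3 bucket configuration might look like the following; every value here is a hypothetical placeholder, not a working credential:

```
S3 Bucket URL:     s3://my-migration-bucket/unload/
IAM Role ARN:      arn:aws:iam::123456789012:role/RedshiftUnloadRole
Access Key ID:     AKIAXXXXXXXXXXXXXXXX
Secret Access Key: ****************
```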
For SQL Server Data Migration¶
Usage¶
Verify source connection to ensure connectivity to your SQL Server database.
Select tables for migration by choosing the tables whose data you want to transfer to Snowflake.
Execute data migration to transfer data from SQL Server tables to corresponding Snowflake tables.
Validate migration results by reviewing row counts and data integrity reports.
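In addition to the built-in reports, a simple way to spot-check a migrated table is to compare row counts on both sides. A sketch with hypothetical table names:

```sql
-- On the source system (SQL Server)
SELECT COUNT(*) AS source_rows FROM dbo.orders;

-- On Snowflake, against the migrated table
SELECT COUNT(*) AS target_rows FROM public.orders;
```

The two counts should match for every migrated table; a mismatch indicates rows that failed to transfer and warrants a closer look at that table's migration status.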