With SQL Azure Managed Instance entering General Availability some months ago – https://azure.microsoft.com/en-us/blog/azure-sql-database-managed-instance-general-purpose-tier-general-availability/ – we are seeing a lot of interest in this offering. I already blogged about it in the post Forecast Cloudy – Set Up SQL Server Managed Instance via Azure PowerShell, so you can find basic details on the offering there. But if you are running SQL Server in AWS Relational Database Service (RDS), how can you migrate a database to SQL Azure MI? That is what I would like to show you today.
For my test I created a SQL Server Express instance in AWS RDS, as you can see below:
Once it's created, you will need to enable backup/restore functionality for this instance from AWS S3 storage. To do so, I will create a custom option group, add the special SQLSERVER_BACKUP_RESTORE option to it, and assign that group to my instance.
After I am done, one can see SQLSERVER_BACKUP_RESTORE in the Options tab in AWS RDS:
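If you prefer scripting over the console, the same option-group setup can be sketched with the AWS CLI. This is a minimal sketch, not the exact commands I ran: the group name, instance identifier, and IAM role ARN are placeholders, and SQLSERVER_BACKUP_RESTORE requires an IAM role that grants RDS access to your S3 bucket.

```shell
# Create a custom option group for SQL Server Express 2017 (placeholder names)
aws rds create-option-group \
    --option-group-name sql-backup-restore \
    --engine-name sqlserver-ex \
    --major-engine-version 14.00 \
    --option-group-description "Native backup/restore to S3"

# Add the SQLSERVER_BACKUP_RESTORE option, pointing at an IAM role
# that allows RDS to read and write the S3 bucket
aws rds add-option-to-option-group \
    --option-group-name sql-backup-restore \
    --apply-immediately \
    --options "OptionName=SQLSERVER_BACKUP_RESTORE,OptionSettings=[{Name=IAM_ROLE_ARN,Value=arn:aws:iam::123456789012:role/rds-s3-backup-role}]"

# Attach the option group to the instance
aws rds modify-db-instance \
    --db-instance-identifier my-sql-express \
    --option-group-name sql-backup-restore \
    --apply-immediately
```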
Now I will create an S3 bucket to be able to back up my source Adventure Works database to S3. You can use the directions here to create an AWS S3 storage bucket – https://docs.aws.amazon.com/AmazonS3/latest/gsg/CreatingABucket.html
After creation it should look something like this. Note I also added a folder named Backup under the bucket.
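The bucket and folder can also be created from the AWS CLI; a quick sketch, using the bucket name from this post (bucket names are globally unique, so yours will differ):

```shell
# Create the bucket in us-west-2
aws s3 mb s3://gennadykbackup --region us-west-2

# S3 has no real folders; an empty "Backup/" key acts as one in the console
aws s3api put-object --bucket gennadykbackup --key Backup/

# Verify the bucket contents
aws s3 ls s3://gennadykbackup/
```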
Now I can back up my Adventure Works database to the S3 bucket. I will use the built-in rds_backup_database stored procedure in the msdb database:
exec msdb.dbo.rds_backup_database
    @source_db_name = 'database_name',
    @s3_arn_to_backup_to = 'arn:aws:s3:::bucket_name/file_name_and_extension',
    @overwrite_S3_backup_file = 1;
Here I am running the backup with my example below.
Once you start a backup or restore process, you can track its progress or cancel it with two more built-in procedures:
- rds_task_status: to track the progress of the backup or restore task
- rds_cancel_task: to cancel a running backup or restore task
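For completeness, both procedures look like this; a quick sketch, where the task id is whatever rds_backup_database returned:

```sql
-- Cancel a running backup or restore task by its task id
exec msdb.dbo.rds_cancel_task @task_id = 5;

-- You can also check status by database name instead of task id
exec msdb.dbo.rds_task_status @db_name = 'AdventureWorks2014';
```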
I will grab the task id and use rds_task_status to query the status of the backup:
exec msdb.dbo.rds_task_status @task_id = 5;
We can keep running this command to check the status of the task. The “lifecycle” column of the output will change to reflect the status, which can be any of these: CREATED, IN_PROGRESS, SUCCESS, ERROR, CANCEL_REQUESTED, or CANCELLED.
The “% complete” column shows how far the operation has progressed. When finished successfully, the output will show a “% complete” value of 100 and a “lifecycle” value of SUCCESS.
Now we can see that Adventure Works is fully backed up to S3
Next I will have to move that backup into Azure Blob Storage. To do so I will use AzureCopy, an awesome utility written by Ken Faulkner. I cannot believe no one else thought of anything like it; for example AzCopy, the standard tool from Microsoft, cannot move files from an AWS S3 bucket to Azure Blob Storage. You can get the utility here – https://kpfaulkner.wordpress.com/azurecopyrelease/ . Once it is downloaded and installed, I will open a command line and navigate to the folder where it is installed. Then I will set all of the secret keys necessary to connect to both the S3 bucket and Azure Blob Storage:
set AzureAccountKey=MyAzureStorageAccountKey
set AWSAccessKeyID=MyS3AccessId
set AWSSecretAccessKeyID=MyS3SecretKey
set AWSRegion=us-west-2
Obviously I am not listing my real secret keys here, but hopefully you get the idea. Now that my huge secret keys are stored in environment variables, I can go ahead and try listing the files in my S3 bucket:
azurecopy -list https://s3-us-west-2.amazonaws.com/gennadykbackup/Backup/ -azurekey %AzureAccountKey% -s3k %AWSAccessKeyID% -s3sk %AWSSecretAccessKeyID%
I get a response like this:
Next I list the files in my Azure Blob Storage container, which of course will be empty as I have no files there yet:
azurecopy -list https://gennadyktarget.blob.core.windows.net/backups -azurekey %AzureAccountKey% -s3k %AWSAccessKeyID% -s3sk %AWSSecretAccessKeyID%
Response looks like this:
Now let's actually move the backup file:
azurecopy -i https://s3-us-west-2.amazonaws.com/gennadykbackup/Backup/ -o https://gennadyktarget.blob.core.windows.net/backups -azurekey %AzureAccountKey% -s3k %AWSAccessKeyID% -s3sk %AWSSecretAccessKeyID% -blobcopy -destblobtype block
And the result after a bit is here:
Now I see backup in my Azure Blob Storage:
Now I need to restore it in SQL Azure Managed Instance. I have already created one; to do so, one can follow the tutorials here – https://docs.microsoft.com/en-us/azure/sql-database/sql-database-managed-instance-get-started , https://www.brentozar.com/archive/2018/03/how-to-deploy-your-first-azure-sql-db-managed-instance-preview/ and https://blogs.msdn.microsoft.com/sqlserverstorageengine/2018/11/14/create-azure-sql-managed-instance-using-azure-cli/.
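If you would rather script the Managed Instance creation than click through the portal, the Azure CLI route from the last link boils down to something like the sketch below. All names are placeholders, the subnet must already exist and be configured for Managed Instance as described in the tutorials, and provisioning can take several hours:

```shell
az sql mi create \
    --name gennadyk-mi \
    --resource-group my-rg \
    --location westus2 \
    --admin-user myadmin \
    --admin-password 'Str0ng!Passw0rd123' \
    --subnet /subscriptions/<subscription-id>/resourceGroups/my-rg/providers/Microsoft.Network/virtualNetworks/my-vnet/subnets/mi-subnet
```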
I will log into my SQL Azure MI via SSMS and use my Azure Blob Storage account and SAS key to create a credential in my Managed Instance. Note that the credential name must match the container URL, and the SAS token must not include the leading question mark.
CREATE CREDENTIAL [https://gennadyktarget.blob.core.windows.net/backups]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'My Secret SAS Key';
After that, to check my credential, I will run the following script, which uses the container URL to get the backup file list:
RESTORE FILELISTONLY FROM URL = 'https://gennadyktarget.blob.core.windows.net/backups/AdventureWorks2014.bak'
Now that I can see the file list, I can proceed with the restore:
RESTORE DATABASE [AdventureWorks2014] FROM URL = 'https://gennadyktarget.blob.core.windows.net/backups/AdventureWorks2014.bak'
And finally, after a bit, the result is in SSMS:
So we are done. My Adventure Works database is fully migrated from AWS RDS to SQL Azure Managed Instance. Hope this helps you or you find it interesting.