Azure SQL Import/Export

Chetan Prakash Saraswat 26 Reputation points
2024-12-21T14:24:28.0433333+00:00

We have multiple databases, each well above 100 GB, hosted on a single Azure server. We normally restore the production copy to our non-prod environments using the .bacpac export/import method in SSMS, but with larger data sets we have encountered repeated network failures, and non-clustered indexes are not migrated at all, which forces us to recreate them. We also tried Azure SQL's portal 'Export' and 'Import' options, but the operation stays at 1% and never completes (even for the sample AdventureWorks database, which is odd). What are the best solutions available for this? Can we do it via ADF, Power Automate, or any kind of CLI script? Any feasible and targeted solution would be helpful. Please suggest. Thanks in advance.

Azure SQL Database
SQL Server

2 answers

  1. Azar 24,685 Reputation points MVP
    2024-12-21T20:29:56.7633333+00:00

    Hi there Chetan Prakash Saraswat

    Thanks for using the Q&A platform.

    Try storing the .bacpac in Blob Storage for the import/export operations, which gives better control and reliability. ADF is another good option, enabling schema and data transfer through pipelines with support for parallelism, which is ideal for large datasets. For consistent and efficient migrations, transactional replication can replicate data between environments. Also, partitioning the data with PowerShell can avoid failures by handling it in smaller chunks.
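
    As a rough sketch of the Blob-staging approach (assuming SqlPackage and AzCopy are installed locally; the server, database, and storage names below are placeholders, not values from your environment):

    # Export the .bacpac locally with SqlPackage, authenticating with SQL credentials.
    SqlPackage.exe /Action:Export /SourceServerName:<server>.database.windows.net /SourceDatabaseName:<db> /SourceUser:<username> /SourcePassword:<password> /TargetFile:<db>.bacpac

    # Stage the file in a Blob container via a SAS URL so the import side can pull it from storage.
    azcopy copy "<db>.bacpac" "https://<storageaccount>.blob.core.windows.net/<container>/<db>.bacpac?<sas-token>"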

    If this helps, kindly accept the answer. Thanks very much.


  2. Sina Salam 14,626 Reputation points
    2024-12-21T20:54:07.0166667+00:00

    Hello Chetan Prakash Saraswat,

    Welcome to the Microsoft Q&A and thank you for posting your questions here.

    I understand that you are having challenges importing and exporting large Azure SQL databases.

    To restore large databases reliably, avoid network failures, and preserve indexes, the combination of Azure Data Factory, SqlPackage.exe, and manual index scripting provides a robust solution. The steps below, together with the links provided, should get you there with little overhead:

    1. To avoid single points of failure and ensure data consistency, implement a reliable data transfer using Azure Data Factory (https://zcusa.951200.xyz/en-us/azure/data-factory/quickstart-hello-world-copy-data-tool). Azure Data Factory can handle large dataset transfers effectively and is resilient to transient network issues:

    Create a pipeline in ADF to copy data directly between databases.

    • Use the Copy Data Tool in ADF to set up source (production database) and sink (non-prod database).
    • Configure retry policies to handle intermittent failures.
    • Schedule incremental or full data copies based on your requirements (a CLI sketch for triggering and monitoring such a pipeline follows this list).
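
    For example, once a copy pipeline has been authored in ADF, a run can be triggered and monitored from the CLI. This is only a sketch: it assumes the datafactory CLI extension is installed (az extension add --name datafactory) and that the pipeline is named CopyProdToNonProd; all names are placeholders.

    # Kick off a run of the copy pipeline; the command returns a run ID.
    az datafactory pipeline create-run --resource-group <resource-group> --factory-name <data-factory-name> --name CopyProdToNonProd

    # Check the status of that run using the returned run ID.
    az datafactory pipeline-run show --resource-group <resource-group> --factory-name <data-factory-name> --run-id <run-id>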
    2. Preserve non-clustered indexes with a dedicated script, because non-clustered indexes sometimes fail to come across in BACPAC exports. Run the following SQL query in SSMS to generate the creation scripts for all non-unique, non-clustered indexes:
    -- Generate CREATE INDEX statements for every non-unique, non-clustered index,
    -- including key columns (with sort order) and INCLUDE columns.
    SELECT 
        'CREATE NONCLUSTERED INDEX [' + i.name + '] ON [' + s.name + '].[' + t.name + '] (' +
        STUFF((SELECT ', [' + c.name + ']' + CASE ic.is_descending_key WHEN 1 THEN ' DESC' ELSE '' END
               FROM sys.index_columns ic
               INNER JOIN sys.columns c ON ic.object_id = c.object_id AND ic.column_id = c.column_id
               WHERE ic.object_id = i.object_id AND ic.index_id = i.index_id AND ic.is_included_column = 0
               ORDER BY ic.key_ordinal
               FOR XML PATH('')), 1, 2, '') +
        ')' + ISNULL(' INCLUDE (' + STUFF((SELECT ', [' + c.name + ']'
                                           FROM sys.index_columns ic
                                           INNER JOIN sys.columns c ON ic.object_id = c.object_id AND ic.column_id = c.column_id
                                           WHERE ic.object_id = i.object_id AND ic.index_id = i.index_id AND ic.is_included_column = 1
                                           FOR XML PATH('')), 1, 2, '') + ')', '') AS IndexScript
    FROM sys.indexes i
    INNER JOIN sys.tables t ON i.object_id = t.object_id
    INNER JOIN sys.schemas s ON t.schema_id = s.schema_id
    WHERE i.is_primary_key = 0 AND i.is_unique = 0 AND i.type = 2   -- type 2 = non-clustered
    ORDER BY s.name, t.name, i.name;
    

    Save the output and run it on your non-prod database after importing the data.

    • Apply these scripts to the target database after restoring the BACPAC (a sqlcmd sketch follows below).
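
    For instance, if the generated statements are saved to a file named indexes.sql (a hypothetical name), they can be applied from the command line; the connection details are placeholders:

    # Run the saved index script against the non-prod database after the import completes.
    # -b aborts on the first error so a failed index does not go unnoticed.
    sqlcmd -S <nonprod-server>.database.windows.net -d <nonprod-db> -U <username> -P <password> -i indexes.sql -b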
    3. Should there be any issue with the Export/Import operations themselves, here are troubleshooting steps for the "stuck at 1%" issue:
    • Diagnosis:
      • Check the storage account performance tier (use Premium, or Standard with sufficient IOPS).
      • Check the Azure SQL DB logs for throttling or connectivity issues.
      • Use smaller datasets to isolate the problem (as you tried with AdventureWorks but encountered similar issues).
    • Solution: Use SqlPackage.exe for a reliable export/import (a fuller sketch with authentication parameters follows this list):
      • Export from the command line: SqlPackage.exe /Action:Export /SourceServerName:<server> /SourceDatabaseName:<db> /TargetFile:<path.bacpac>
      • Import from the command line: SqlPackage.exe /Action:Import /TargetServerName:<server> /TargetDatabaseName:<db> /SourceFile:<path.bacpac>
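
    A more complete sketch, assuming SQL authentication and a Standard-tier target; the paths, credentials, and the S3 service objective are placeholder assumptions, not requirements:

    # Export the production database to a local .bacpac, writing diagnostics to a log file.
    SqlPackage.exe /Action:Export /SourceServerName:<server>.database.windows.net /SourceDatabaseName:<db> /SourceUser:<username> /SourcePassword:<password> /TargetFile:C:\temp\<db>.bacpac /DiagnosticsFile:C:\temp\export.log

    # Import into the non-prod server, specifying the edition and service objective
    # so the new database is not created on an undersized SKU.
    SqlPackage.exe /Action:Import /TargetServerName:<nonprod-server>.database.windows.net /TargetDatabaseName:<db> /TargetUser:<username> /TargetPassword:<password> /SourceFile:C:\temp\<db>.bacpac /p:DatabaseEdition=Standard /p:DatabaseServiceObjective=S3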
    4. If you would like to script the process with the Azure CLI or PowerShell (note that az sql db export also requires the storage key flags):
    az sql db export --admin-password <password> --admin-user <username> --name <db> --resource-group <resource-group> --server <server-name> --storage-key <storage-account-key> --storage-key-type StorageAccessKey --storage-uri <uri>
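
    The matching import on the non-prod server is symmetrical; all names below are placeholders:

    # Import the staged .bacpac into the non-prod server; the same storage key flags are required.
    az sql db import --admin-password <password> --admin-user <username> --name <nonprod-db> --resource-group <resource-group> --server <nonprod-server> --storage-key <storage-account-key> --storage-key-type StorageAccessKey --storage-uri <uri>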
    
    5. Optimize the network configuration:
    • Use ExpressRoute or a VPN Gateway if frequent network failures persist.
    • Optimize the storage account location to reduce latency.

    https://zcusa.951200.xyz/en-us/azure/expressroute and https://zcusa.951200.xyz/en-us/azure/vpn-gateway/vpn-gateway-about-vpngateways

    6. Future-proof solution

    Consider active geo-replication (https://zcusa.951200.xyz/en-us/azure/azure-sql/database/active-geo-replication-overview) or the Database Copy feature (https://zcusa.951200.xyz/en-us/azure/azure-sql/database/database-copy?view=azuresql) for large datasets. These methods are built for high reliability and preserve indexes.
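
    As a quick sketch of the database-copy route (all names are placeholders), a copy can be created directly onto the non-prod server from the CLI; the copy is transactionally consistent and carries all indexes with it:

    # Create a copy of the production database on the non-prod server.
    az sql db copy --name <prod-db> --resource-group <prod-rg> --server <prod-server> --dest-name <nonprod-db> --dest-resource-group <nonprod-rg> --dest-server <nonprod-server>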

    I hope this is helpful! Do not hesitate to let me know if you have any other questions.


    Please don't forget to close the thread by upvoting and accepting this as an answer if it was helpful.

