Getting error creating database in SQL Elastic Pool, but can do it through the portal

Joe DeNicola 0 Reputation points
2024-12-26T21:50:02.2966667+00:00

I have a SQL Elastic Pool, MyPool, configured with 200 eDTUs and 800 GB storage

It contains a database, DBPrimary, with data space used = 127 GB, max storage = 300 GB

I have another database outside of the pool, DBProd, with data space used = 117 GB, Max Storage = 320 GB

I want to create a copy of DBProd in MyPool using a script.

I run these commands:

DROP DATABASE IF EXISTS DBCopy
CREATE DATABASE DBCopy AS COPY OF DBProd (SERVICE_OBJECTIVE = ELASTIC_POOL (name = MyPool))

I get this error:

The edition 'Standard' does not support the database data max size '343597383680'

However, I ran the following command to create the copy outside of the pool:

CREATE DATABASE DBCopy AS COPY OF DBProd

Then in the Azure Portal, I added DBCopy to the SQL Elastic Pool manually, and it did it without any errors.

So what is the correct command so that I can create the copy of the database directly in the pool?

I tried adding the MAXSIZE parameter to the CREATE DATABASE command but it gives me a syntax error.

I tried adding the EDITION = 'GeneralPurpose' and it didn't help.

Azure SQL Database

2 answers

  1. Mahesh Kurva 1,700 Reputation points Microsoft Vendor
    2024-12-27T23:59:57.1533333+00:00

    Hi @Joe DeNicola,

    Thanks for the response.

    You mentioned: "I am seeing an error code 45350 in the activity log when I try to move it into the pool, or if I try to change the edition/service tier/maxsize."

    Error code 45350 typically indicates that the database's max size exceeds the maximum allowed for the specified service tier or elastic pool. Adjust the service tier and max size to fit within the pool's limits:

    ALTER DATABASE DBCopy MODIFY (SERVICE_OBJECTIVE = 'S3', MAXSIZE = 250 GB);
    

    After adjusting the MAXSIZE, you can then move the database into the elastic pool.
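
    Putting the two steps together, a minimal sketch might look like this (the 250 GB value is only an example; pick any MAXSIZE at or below your pool's per-database limit):

    ```sql
    -- 1. Reduce the copy's max size so it fits the Standard pool's per-database limit
    ALTER DATABASE DBCopy MODIFY (MAXSIZE = 250 GB);

    -- 2. Move the database into the elastic pool
    ALTER DATABASE DBCopy MODIFY (SERVICE_OBJECTIVE = ELASTIC_POOL (name = MyPool));
    ```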

    For more information, please refer to these documents:

    https://zcusa.951200.xyz/en-us/sql/relational-databases/errors-events/database-engine-events-and-errors-41400-to-49999?view=sql-server-ver16

    https://zcusa.951200.xyz/en-us/azure/azure-sql/database/resource-limits-dtu-single-databases?view=azuresql#standard-service-tier

    Hope this helps. Do let us know if you have any further queries.


    If this answers your query, please click "Accept Answer" and select "Yes" for "Was this answer helpful?".


  2. Sina Salam 15,011 Reputation points
    2025-01-13T15:13:58.2433333+00:00

    Hello Joe DeNicola,

    Welcome to the Microsoft Q&A and thank you for posting your questions here.

    To simplify the issue and your use case, I understand that you have a SQL Elastic Pool, MyPool, with 200 eDTUs and 800 GB storage. It contains a database, DBPrimary, with 127 GB used and 300 GB max storage. You want to create a copy of another database, DBProd (117 GB used, 320 GB max storage), directly into MyPool using a script.

    Here is the full script; please read the comments before implementing the process:

    /* Check the original database's storage usage to ensure that DBProd's used space (117 GB)
       and max size fit within the target pool's per-database limits.
       Note: sys.master_files is not available in Azure SQL Database;
       run this query while connected to DBProd and use sys.database_files instead. */
    
    -- Check original database storage usage (run while connected to DBProd)
    SELECT 
        name AS FileName, 
        type_desc AS FileType, 
        size * 8 / 1024 AS AllocatedSizeMB, 
        max_size * 8 / 1024 AS MaxSizeMB 
    FROM sys.database_files;
    
    -- Create the database copy outside the pool first, to avoid the pool's per-database size constraint.
    
    CREATE DATABASE DBCopy AS COPY OF DBProd;
    
    -- Monitor the copy operation with the following queries (run in master on the target server).
    
    -- Check database status
    SELECT name, state_desc 
    FROM sys.databases 
    WHERE name = 'DBCopy';
    -- Monitor copy completion
    SELECT database_id, start_date, modify_date, percent_complete 
    FROM sys.dm_database_copies 
    WHERE database_id = DB_ID('DBCopy');
    
    -- Once the copy is complete, reduce its max size to fit the pool's per-database
    -- limit (otherwise the move can fail with the same 45350 error), then move it in.
    -- (250 GB is an example value; choose one within your pool's limit.)
    
    ALTER DATABASE DBCopy MODIFY (MAXSIZE = 250 GB);
    ALTER DATABASE DBCopy MODIFY (SERVICE_OBJECTIVE = ELASTIC_POOL (name = MyPool));
    
    -- Optimize the elastic pool configuration (if needed): if storage or eDTU
    -- constraints are causing issues, temporarily increase the pool's limits.
    -- Note: there is no T-SQL command to resize an elastic pool; use the Azure
    -- portal, PowerShell (Set-AzSqlElasticPool), or Azure CLI (az sql elastic-pool update).
    
    -- Alternatively, use a vCore-based pool for greater flexibility during the operation.
    
    -- To automate the process, poll the copy status and trigger the move once it completes.
    
    
    -- Run in master: the row in sys.dm_database_copies is removed once the copy
    -- completes, so the loop exits when the copy finishes.
    WHILE EXISTS (
        SELECT 1 
        FROM sys.dm_database_copies 
        WHERE database_id = DB_ID('DBCopy') AND percent_complete < 100
    )
    BEGIN
        WAITFOR DELAY '00:01:00'; -- Wait for 1 minute between checks
    END;
    ALTER DATABASE DBCopy MODIFY (SERVICE_OBJECTIVE = ELASTIC_POOL (name = MyPool));
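    
    Once the move completes, you can confirm the database actually landed in the pool (a quick check using the sys.database_service_objectives view, run in master):
    
    ```sql
    -- Run in master: confirm DBCopy is now in MyPool
    SELECT d.name, o.elastic_pool_name, o.service_objective
    FROM sys.databases AS d
    JOIN sys.database_service_objectives AS o
        ON d.database_id = o.database_id
    WHERE d.name = 'DBCopy';
    ```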
    

    NOTE:

    1. Make sure the database's allocated storage is reduced to fit within the Elastic Pool's constraints before attempting to move it.
    2. Avoid frequent use of DBCC SHRINKDATABASE as it can cause fragmentation and impact performance. Consider it a last resort.
    3. If storage or DTU constraints persist, consider increasing the pool's capacity temporarily to accommodate the database.
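
    If you do need to reclaim allocated space before lowering MAXSIZE (per notes 1 and 2 above, shrinking is a last resort), a sketch (the 250 GB target is an example value):

    ```sql
    -- Last resort: release unused allocated space, then lower the max size
    DBCC SHRINKDATABASE (DBCopy);
    ALTER DATABASE DBCopy MODIFY (MAXSIZE = 250 GB);
    ```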

    The above should provide a clear path to creating a copy of DBProd directly in MyPool.

    I hope this is helpful! Do not hesitate to let me know if you have any other questions.


    Please don't forget to close up the thread here by upvoting and accept it as an answer if it is helpful.

