Azure: Next data max size automatically

André Lindqvist 20 Reputation points
2024-12-03T10:02:46.2033333+00:00

Hi, so basically what I am trying to do is run a runbook.

I want that runbook to tell me the next level up in data max size.
Later I want to add parameters such as: if the database is 90% full, check again in 5 minutes to verify it, and if it still is, increase to the next level in GB size.

If needed, it should change the tier level too.

Something to downsize again later would also be good.

But I can't figure it out, and I wonder if I need to hardcode the data max size information instead?

Azure Automation

Accepted answer
  1. Vinodh247 26,371 Reputation points MVP
    2024-12-04T04:54:35.98+00:00

    Hi,

    Thanks for reaching out to Microsoft Q&A.

    To implement a solution where a runbook monitors and automatically adjusts the max data size of a resource (such as a database or storage account) based on usage thresholds, you can follow these steps. You can use Azure Automation runbooks together with the Azure Resource Manager (ARM) APIs or SDKs.

    1. Plan the Workflow
    • Monitor Resource Size: Query the current size and usage percentage.
    • Check Threshold: If the resource is 90% full, wait for 5 minutes and re-check to confirm.
    • Adjust Size: If usage is still above 90%, increase the max size to the next available tier/size.
    • Optional Downgrade: Include logic to check for downsizing when usage consistently drops below a lower threshold.

    2. Determine Max Size Levels
    • If the resource has predefined size levels (Azure SQL Database tiers or storage limits), you can hardcode these levels into a configuration file or use an Azure API to fetch them dynamically.

    Example: For Azure SQL Database, size levels might include 5 GB, 10 GB, 20 GB, etc. For Azure Storage, tiers and limits vary by account type.
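
    If you go the hardcoded route, a small lookup helper is usually enough. A minimal sketch follows; the size steps listed are placeholders, not the authoritative ladder for any tier, so replace them with the values your service objective actually supports.

    # Illustrative size ladder only; replace with the sizes your tier supports.
    SIZE_LEVELS_GB = [5, 10, 20, 50, 100, 250]

    def next_size_level(current_gb):
        """Return the next configured size above current_gb, or None if already at the top."""
        for level in SIZE_LEVELS_GB:
            if level > current_gb:
                return level
        return None

    The runbook in the next step could then call next_size_level(max_gb) instead of adding a fixed increment.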


    3. Set Up an Azure Automation Runbook

    Use a PowerShell or Python runbook for implementation. The example below uses Python with the azure-mgmt-sql SDK.

    # Python example using the azure-mgmt-sql SDK (track 2); replace the
    # placeholder names below with your own values.
    import time

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.sql import SqlManagementClient
    from azure.mgmt.sql.models import DatabaseUpdate

    RESOURCE_GROUP = "YourResourceGroupName"
    SERVER_NAME = "YourServerName"
    DATABASE_NAME = "YourDatabaseName"
    SUBSCRIPTION_ID = "YourSubscriptionId"
    THRESHOLD_PERCENTAGE = 90
    CHECK_INTERVAL_SECONDS = 300          # 5 minutes
    INCREMENT_GB = 10
    GB = 1024 ** 3                        # Azure SQL max sizes are binary gigabytes

    def get_resource_usage(client):
        """Return (used_gb, max_gb) for the database."""
        # The 'database_size' usage entry reports used space and the size limit in bytes.
        for usage in client.database_usages.list_by_database(
                RESOURCE_GROUP, SERVER_NAME, DATABASE_NAME):
            if usage.name == "database_size":
                return usage.current_value / GB, usage.limit / GB
        raise RuntimeError("database_size usage not found")

    def update_max_size(client, new_size_gb):
        # begin_update starts a long-running operation; .result() waits for it.
        # The new value must be one of the max sizes supported by the database's
        # service tier/objective (see step 2).
        poller = client.databases.begin_update(
            RESOURCE_GROUP,
            SERVER_NAME,
            DATABASE_NAME,
            DatabaseUpdate(max_size_bytes=int(new_size_gb * GB)),
        )
        poller.result()

    def main():
        # In Azure Automation, DefaultAzureCredential can use the account's managed identity.
        credential = DefaultAzureCredential()
        client = SqlManagementClient(credential, SUBSCRIPTION_ID)
        while True:
            used_gb, max_gb = get_resource_usage(client)
            usage_percentage = (used_gb / max_gb) * 100
            if usage_percentage >= THRESHOLD_PERCENTAGE:
                print(f"Threshold reached: {usage_percentage:.1f}%")
                time.sleep(CHECK_INTERVAL_SECONDS)

                # Re-check after the wait to rule out a temporary spike
                used_gb, max_gb = get_resource_usage(client)
                usage_percentage = (used_gb / max_gb) * 100
                if usage_percentage >= THRESHOLD_PERCENTAGE:
                    print("Confirmed threshold. Increasing size.")
                    new_size = max_gb + INCREMENT_GB
                    update_max_size(client, new_size)
                    print(f"Max size updated to {new_size} GB")
            else:
                print(f"Usage within limits: {usage_percentage:.1f}%")

            time.sleep(CHECK_INTERVAL_SECONDS)

    if __name__ == "__main__":
        main()
    
    
    
    4. Deploy and Test
    • Deploy the runbook in Azure Automation.
    • Add parameters to allow flexibility (e.g., ThresholdPercentage, CheckIntervalMinutes, IncrementSize); a parameter-parsing sketch follows this list.
    • Test thoroughly in a non-production environment to avoid unexpected scaling or costs.
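
    For a Python runbook, parameters arrive as positional command-line arguments (sys.argv); the parameter order and defaults below are just one possible convention, not a fixed API.

    import sys

    # Azure Automation passes Python runbook parameters as positional arguments.
    # Order assumed here: threshold %, check interval (minutes), increment (GB).
    threshold_percentage = float(sys.argv[1]) if len(sys.argv) > 1 else 90.0
    check_interval_minutes = int(sys.argv[2]) if len(sys.argv) > 2 else 5
    increment_gb = int(sys.argv[3]) if len(sys.argv) > 3 else 10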

    5. Optional: Auto-Scaling for Downgrade
    • Include logic to monitor when usage drops below a lower threshold (e.g., 30%) for a sustained period (e.g., 24 hours).
    • Automate tier downgrades based on this logic (a rough sketch follows this list).
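
    A rough sketch of the sustained-low-usage check is below; the thresholds and helper name are illustrative. Because a single runbook run will not span 24 hours, in practice you would persist the timestamp (for example in an Automation variable) between scheduled runs.

    import time

    LOW_THRESHOLD_PERCENTAGE = 30
    SUSTAINED_SECONDS = 24 * 60 * 60      # 24 hours
    low_usage_since = None                # when usage first dropped below the threshold

    def should_downgrade(usage_percentage):
        """Return True once usage has stayed below the low threshold for the full window."""
        global low_usage_since
        if usage_percentage < LOW_THRESHOLD_PERCENTAGE:
            if low_usage_since is None:
                low_usage_since = time.time()
            return time.time() - low_usage_since >= SUSTAINED_SECONDS
        low_usage_since = None            # usage recovered; reset the window
        return False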

    Notes

    • APIs: Use the relevant Azure API for your resource (e.g., Azure SQL, Blob Storage) to retrieve usage and adjust limits.
    • Costs: Be aware of potential cost implications with automatic scaling.
    • Hardcoding: Hardcoding size levels is acceptable if there’s no API to fetch them dynamically, but ensure it’s well-documented for future changes.

    Please feel free to click the 'Upvote' (Thumbs-up) button and 'Accept as Answer'. This helps the community by allowing others with similar queries to easily find the solution.

    1 person found this answer helpful.

1 additional answer

  1. Rahul Podila 895 Reputation points Microsoft Vendor
    2024-12-04T06:04:39.7166667+00:00

    Hi @André Lindqvist,

    Welcome to the Microsoft Q&A Platform! Thank you for asking your question here. 

    Create an Azure Automation runbook to monitor the data size of a resource using Azure Monitor metrics and take actions (like scaling or alerting) based on the size thresholds.

    Here are the high-level steps you can follow: 

    1. Create an Azure Automation account and import the necessary modules for your runbook.

    2. Create a runbook that retrieves the data size of the resource you want to monitor. You can use PowerShell cmdlets or Azure CLI commands to achieve this (a minimal metric-reading sketch follows this list).

    3. Use Azure Monitor to create an alert rule that triggers when the data size reaches a certain threshold. You can configure the alert rule to run your runbook when triggered.

    4. In your runbook, add logic to check the data size and take action based on the size. For example, if the size is 90% full, check again in 5 minutes and then update to the next level in GB size. You can also add logic to change the tier level if needed.

    5. Schedule your runbook to run at regular intervals to monitor the data size.
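
    For step 2, a minimal Python sketch of reading the size from Azure Monitor metrics could look like the following. It assumes the database exposes the storage_percent metric and that the placeholder names in the resource ID are filled in with your own values.

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.monitor import MonitorManagementClient

    SUBSCRIPTION_ID = "YourSubscriptionId"
    # Full resource ID of the database to monitor (placeholders to replace).
    RESOURCE_ID = (
        "/subscriptions/YourSubscriptionId/resourceGroups/YourResourceGroupName"
        "/providers/Microsoft.Sql/servers/YourServerName/databases/YourDatabaseName"
    )

    client = MonitorManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

    # 'storage_percent' reports how full the database is relative to its max size.
    result = client.metrics.list(
        RESOURCE_ID,
        metricnames="storage_percent",
        aggregation="Maximum",
    )
    points = result.value[0].timeseries[0].data
    latest = next((p.maximum for p in reversed(points) if p.maximum is not None), None)
    if latest is not None:
        print(f"Data space used: {latest:.1f}%")

    The same percentage can then drive the threshold logic described in step 4.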

    Regarding your question about hardcoding the data max sizes: it's not necessary. You can retrieve the data size dynamically using PowerShell or Azure CLI commands in your runbook.

    If you have any concerns, please refer to these links:

    https://zcusa.951200.xyz/en-us/azure/automation/runbook-input-parameters

    https://zcusa.951200.xyz/en-us/azure/automation/manage-runbooks

    If you have any further queries, do let us know.


    If the answer is helpful, please click "Accept Answer" and upvote it.

    1 person found this answer helpful.
