Hello Najam ul Saqib,
Based on your description, the architecture your external application will use to retrieve the storage account objects looks quite good; however, there are a few points of improvement:
- Enforce Least Privilege Principle: Limit SAS tokens to the minimum permissions and scope required for each operation. Best Practices for SAS.
- Shorten SAS Token Expiry: Set short-lived expiry times for SAS tokens to reduce the risk of misuse. Regenerate tokens dynamically if needed. SAS Token Expiry Recommendations.
- Use User-Specific Claims: Embed user-specific claims (e.g., roles or IDs) in JWTs to enable fine-grained access control. Secure Azure Storage with Identity-Based Access.
- Introduce Server-Side Mapping: Maintain a mapping of external users to Azure Storage permissions for centralized and dynamic access control (a mapping example and a code sketch follow below). Custom Authorization Techniques.
Mapping Example:
A database table might look like this:
| User ID | Allowed Containers |
|---|---|
| JohnDoe | container1, container2 |
| JaneDoe | container3 |
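To make the points above concrete, here is a minimal Python sketch that ties them together: it validates the caller's JWT, checks the user-to-container mapping, and issues a short-lived, read/list-only SAS. It assumes the azure-storage-blob and PyJWT packages, and every name in it (ACCOUNT_NAME, ACCOUNT_KEY, SIGNING_KEY, USER_CONTAINER_MAP, issue_sas) is an illustrative placeholder, not part of your existing code:

```python
# Minimal sketch: issue a short-lived, least-privilege container SAS based on
# the caller's JWT claims and a server-side user -> container mapping.
from datetime import datetime, timedelta, timezone

import jwt  # PyJWT
from azure.storage.blob import ContainerSasPermissions, generate_container_sas

ACCOUNT_NAME = "mystorageaccount"   # placeholder storage account name
ACCOUNT_KEY = "<account-key>"       # keep in Key Vault, never in source code
SIGNING_KEY = "<jwt-signing-key>"   # key your identity provider signs with

# Stand-in for the database table shown above (user -> allowed containers).
USER_CONTAINER_MAP = {
    "JohnDoe": {"container1", "container2"},
    "JaneDoe": {"container3"},
}

def issue_sas(token: str, container: str) -> str:
    """Validate the JWT, check the mapping, and return a 15-minute SAS."""
    claims = jwt.decode(token, SIGNING_KEY, algorithms=["HS256"])
    user_id = claims["sub"]  # user-specific claim embedded in the JWT

    # Server-side mapping enforces which containers this user may touch.
    if container not in USER_CONTAINER_MAP.get(user_id, set()):
        raise PermissionError(f"{user_id} may not access {container}")

    # Least privilege (read/list only) and a short expiry limit misuse.
    return generate_container_sas(
        account_name=ACCOUNT_NAME,
        container_name=container,
        account_key=ACCOUNT_KEY,
        permission=ContainerSasPermissions(read=True, list=True),
        expiry=datetime.now(timezone.utc) + timedelta(minutes=15),
    )
```

In production the in-memory dictionary would be the database table above, and the account key would come from Key Vault rather than source code.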
- Strengthen Backend Security:
  - Use certificates instead of client secrets for service principal authentication. Azure AD Certificates and Secrets.
  - Restrict backend access via IP allowlists or VPNs.
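As a brief illustration of the certificate-based option, this sketch uses azure-identity's CertificateCredential; the tenant/client IDs, account URL, and certificate path are placeholders:

```python
# Sketch: certificate-based service principal auth instead of a client secret.
from azure.identity import CertificateCredential
from azure.storage.blob import BlobServiceClient

credential = CertificateCredential(
    tenant_id="<tenant-id>",
    client_id="<app-client-id>",
    certificate_path="/secure/path/sp-cert.pem",  # cert with private key
)

# The backend then talks to storage with the certificate-backed identity.
service = BlobServiceClient(
    account_url="https://mystorageaccount.blob.core.windows.net",
    credential=credential,
)
```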
- Monitor SAS Usage: Enable Azure Storage logging to track SAS token activity and detect anomalies. Azure Storage Analytics Logging.
- Encrypt Data in Transit and at Rest: Enforce HTTPS (TLS 1.2+) for all communication and use Azure Storage Customer-Managed Keys for data encryption. Azure Storage Encryption.
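If you manage the account from code, the HTTPS-only and minimum-TLS settings can be enforced with the azure-mgmt-storage package, roughly as below (the subscription, resource group, and account names are placeholders):

```python
# Sketch: enforce HTTPS-only traffic and a TLS 1.2 minimum on the account.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import StorageAccountUpdateParameters

client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")
client.storage_accounts.update(
    "<resource-group>",
    "<storage-account>",
    StorageAccountUpdateParameters(
        enable_https_traffic_only=True,  # reject plain HTTP
        minimum_tls_version="TLS1_2",    # refuse TLS below 1.2
    ),
)
```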
- Rate-Limit Backend SAS Requests: Implement rate limiting and caching for SAS generation to prevent abuse and improve performance. Designing Resilient Applications.
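A simple in-process version of that rate limiting and caching could look like this pure-Python sketch; issue_sas is the hypothetical helper from the first sketch, and the limits are arbitrary examples:

```python
# Sketch: per-user sliding-window rate limit plus short-lived SAS caching.
import time
from collections import defaultdict, deque

MAX_REQUESTS = 10      # requests allowed per user per window
WINDOW_SECONDS = 60    # sliding window length
CACHE_TTL = 60         # reuse an issued SAS for up to a minute

_requests: dict[str, deque] = defaultdict(deque)
_cache: dict[tuple[str, str], tuple[float, str]] = {}

def rate_limited_sas(user_id: str, token: str, container: str) -> str:
    now = time.monotonic()

    # Drop timestamps older than the window, then enforce the limit.
    window = _requests[user_id]
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    if len(window) >= MAX_REQUESTS:
        raise RuntimeError("SAS request rate limit exceeded")
    window.append(now)

    # Serve a recently issued SAS from cache instead of re-signing.
    key = (user_id, container)
    cached = _cache.get(key)
    if cached and now - cached[0] < CACHE_TTL:
        return cached[1]

    sas = issue_sas(token, container)  # helper from the earlier sketch
    _cache[key] = (now, sas)
    return sas
```

For a backend running on multiple instances, you would back this with a shared store such as Redis instead of process-local dictionaries.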
If the information helped address your question, please Accept the answer.

Luis