How to update the Apache Spark pool version from 3.3 to 3.4 if there are no Spark pools under the Synapse workspace

ANKIT GHENGE 40 Reputation points
2024-10-23T09:51:20.06+00:00

I don't have any Spark pools configured under my Synapse Analytics workspace, so how can I update Synapse Spark pools to version 3.4? Is there any dependency validation required for this?

Azure Synapse Analytics

Accepted answer
  1. Smaran Thoomu 18,790 Reputation points Microsoft Vendor
    2024-10-23T18:35:00.55+00:00

    Hi @ANKIT GHENGE

    Thank you for posting query on Microsoft Q&A Platform.

    If you don’t have any Spark pools currently configured in your Synapse Analytics workspace, you won’t be able to "update" a Spark pool directly since there are no existing pools to update. However, you can create a new Spark pool with Apache Spark 3.4 as the version. Here’s how you can proceed:

    Steps to Create a New Spark Pool with Apache Spark 3.4:

    1. Navigate to Synapse Studio:
      • Open your Synapse workspace and go to the Manage hub.
    2. Create a New Apache Spark Pool:
      • Under the Apache Spark pools section, click on + New.
      • In the Spark version dropdown, select Apache Spark 3.4.
      • Configure other settings (node size, autoscale, etc.) as per your requirements.
      • Click Create to provision the new Spark pool.
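    If you prefer scripting over the Studio UI, the same pool can be provisioned with the Azure CLI. This is a minimal sketch, assuming you are logged in with `az login` and that the resource group, workspace, and pool names below are placeholders for your own:

    ```shell
    # Create a new Apache Spark pool on runtime 3.4 (placeholder names).
    az synapse spark pool create \
      --name sparkpool34 \
      --workspace-name my-synapse-ws \
      --resource-group my-rg \
      --spark-version 3.4 \
      --node-count 3 \
      --node-size Small
    ```

    Node size and count here are illustrative; pick values matching your workload, and add autoscale options if needed.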


    Is there any dependency validation required for this?

    • No direct dependency validation is required during the creation of the Spark pool itself.
    • If you plan to use existing notebooks, pipelines, or jobs that were written for Spark 3.3, you may need to validate and test those with Spark 3.4 to ensure compatibility. Spark version updates can sometimes introduce changes to libraries or behaviors, so reviewing the release notes of Apache Spark 3.4 and testing your workloads is recommended.
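    As a quick sanity check before migrating existing workloads, you can confirm the runtime version from a notebook attached to the new pool. The sketch below shows the version-comparison logic as plain Python so it runs anywhere; in a Synapse notebook the version string would come from the injected `spark.version` (the `spark` SparkSession is provided by Synapse):

    ```python
    def version_at_least(version: str, minimum: str = "3.4") -> bool:
        """Compare dotted version strings numerically, e.g. '3.4.1' >= '3.4'."""
        parse = lambda v: [int(p) for p in v.split(".")]
        got, want = parse(version), parse(minimum)
        # Pad the shorter list with zeros so '3.4' compares equal to '3.4.0'.
        n = max(len(got), len(want))
        got += [0] * (n - len(got))
        want += [0] * (n - len(want))
        return got >= want

    # In a Synapse notebook you would call: version_at_least(spark.version)
    assert version_at_least("3.4.1")
    assert not version_at_least("3.3.3")
    ```

    If the guard fails, the notebook is still attached to an older pool and you should re-check which Spark pool it targets before trusting the compatibility test results.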

    Once the pool is created, you can start running workloads on Apache Spark 3.4.

    For reference: Apache Spark pools in Azure Synapse use runtimes

    Hope this helps. Do let us know if you have any further queries.


    If this answers your query, do click Accept Answer and Yes for "was this answer helpful".

