SparkJobEntry Class

Entry for the Spark job.

Inheritance
azure.ai.ml.entities._mixins.RestTranslatableMixin
SparkJobEntry

Constructor

SparkJobEntry(*, entry: str, type: str = 'SparkJobPythonEntry')

Keyword-Only Parameters

Name     Description

entry    str
         The file or class entry point.

type     str
         The entry type. Accepted values are SparkJobEntryType.SPARK_JOB_FILE_ENTRY or SparkJobEntryType.SPARK_JOB_CLASS_ENTRY. Defaults to SparkJobEntryType.SPARK_JOB_FILE_ENTRY.
         Default value: SparkJobPythonEntry
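
A minimal sketch of constructing each kind of entry directly, assuming SparkJobEntryType is importable from azure.ai.ml.entities alongside SparkJobEntry (the script and class names below are illustrative):

   from azure.ai.ml.entities import SparkJobEntry, SparkJobEntryType

   # File entry (the default type, stored as 'SparkJobPythonEntry'):
   # point the job at a Python script.
   file_entry = SparkJobEntry(entry="main.py")

   # Class entry: point the job at a class instead of a script.
   class_entry = SparkJobEntry(
       entry="com.contoso.SampleApp",  # hypothetical fully qualified class name
       type=SparkJobEntryType.SPARK_JOB_CLASS_ENTRY,
   )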

Examples

Creating a SparkComponent.


   from azure.ai.ml.entities import SparkComponent

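   # Spark component that adds a greeting column to a file input; the entry
   # dict below points the job at add_greeting_column.py under ./src.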
   component = SparkComponent(
       name="add_greeting_column_spark_component",
       display_name="Aml Spark add greeting column test module",
       description="Aml Spark add greeting column test module",
       version="1",
       inputs={
           "file_input": {"type": "uri_file", "mode": "direct"},
       },
       driver_cores=2,
       driver_memory="1g",
       executor_cores=1,
       executor_memory="1g",
       executor_instances=1,
       code="./src",
       entry={"file": "add_greeting_column.py"},
       py_files=["utils.zip"],
       files=["my_files.txt"],
       args="--file_input ${{inputs.file_input}}",
       base_path="./sdk/ml/azure-ai-ml/tests/test_configs/dsl_pipeline/spark_job_in_pipeline",
   )
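
If SparkComponent's entry parameter also accepts a SparkJobEntry instance (in addition to the dict form used above), as the SDK's type hints suggest, the entry could equivalently be written as:

   from azure.ai.ml.entities import SparkJobEntry

   # Equivalent to entry={"file": "add_greeting_column.py"}: the default type
   # ('SparkJobPythonEntry') marks this as a file entry point.
   entry = SparkJobEntry(entry="add_greeting_column.py")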