Pipeline Class
Base class for a pipeline node, used for pipeline component version consumption. You should not instantiate this class directly; instead, use the @pipeline decorator to create a pipeline node.
- Inheritance
  - azure.ai.ml.entities._builders.base_node.BaseNode
  - Pipeline
Constructor
Pipeline(*, component: Component | str, inputs: Dict[str, Input | str | bool | int | float | Enum] | None = None, outputs: Dict[str, str | Output] | None = None, settings: PipelineJobSettings | None = None, **kwargs: Any)
Parameters

Name | Description |
---|---|
component (Required) | Id or instance of the pipeline component/job to be run for the step. |
inputs (Required) | Inputs of the pipeline node. Type: Optional[Dict[str, Union[Input, str, bool, int, float, Enum]]] |
outputs (Required) | Outputs of the pipeline node. |
settings (Required) | Settings of the pipeline node; only takes effect for the root pipeline job. |
Keyword-Only Parameters

Name | Description |
---|---|
component (Required) | Id or instance of the pipeline component/job to be run for the step. |
inputs (Required) | Inputs of the pipeline node. |
outputs (Required) | Outputs of the pipeline node. |
settings (Required) | Settings of the pipeline node; only takes effect for the root pipeline job. |
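Since this class is not instantiated directly, the following is a minimal sketch of how a Pipeline node is typically created: a pipeline component is loaded from YAML and called inside a @pipeline-decorated function. The component path and the port names (input_data, learning_rate, model_output, trained_model) are hypothetical, not part of this API.

```python
from azure.ai.ml import Input, load_component
from azure.ai.ml.dsl import pipeline

# Hypothetical pipeline component definition stored as YAML.
sub_pipeline_func = load_component(source="./components/sub_pipeline.yml")

@pipeline()
def parent_pipeline(training_data: Input):
    # Calling the loaded pipeline component creates a Pipeline node in the graph;
    # inputs accept Input objects/bindings as well as literal str/bool/int/float values.
    sub_node = sub_pipeline_func(input_data=training_data, learning_rate=0.01)
    # Expose the nested pipeline's output as an output of the parent pipeline.
    return {"trained_model": sub_node.outputs.model_output}

# Build the root pipeline job; the local data path is a placeholder.
pipeline_job = parent_pipeline(training_data=Input(type="uri_folder", path="./data"))
```

Each call to the loaded pipeline component inside the decorated function yields one Pipeline node, whose inputs, outputs, and settings correspond to the constructor parameters above.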
Methods

Name | Description |
---|---|
clear | Remove all items from the dictionary. |
copy | Return a shallow copy of the dictionary. |
dump | Dumps the job content into a file in YAML format. |
fromkeys | Create a new dictionary with keys from iterable and values set to value. |
get | Return the value for key if key is in the dictionary, else default. |
items | Return a set-like object providing a view on the dictionary's items. |
keys | Return a set-like object providing a view on the dictionary's keys. |
pop | Remove the specified key and return the corresponding value. If the key is not found, return the default if given; otherwise, raise a KeyError. |
popitem | Remove and return a (key, value) pair as a 2-tuple. Pairs are returned in LIFO (last-in, first-out) order. Raises KeyError if the dict is empty. |
setdefault | Insert key with a value of default if key is not in the dictionary. Return the value for key if key is in the dictionary, else default. |
update | Update the dictionary from dict/iterable E and F: if E has a .keys() method, for k in E: D[k] = E[k]; otherwise, for k, v in E: D[k] = v. In either case, this is followed by: for k in F: D[k] = F[k]. |
values | Return an object providing a view on the dictionary's values. |
clear
clear() -> None. Remove all items from D.
copy
copy() -> a shallow copy of D
dump
Dumps the job content into a file in YAML format.
dump(dest: str | PathLike | IO, **kwargs: Any) -> None
Parameters

Name | Description |
---|---|
dest (Required) | The local path or file stream to write the YAML content to. If dest is a file path, a new file will be created. If dest is an open file, the file will be written to directly. |

Exceptions

Type | Description |
---|---|
| | Raised if dest is a file path and the file already exists. |
| | Raised if dest is an open file and the file is not writable. |
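A small sketch of dump, shown here on the root pipeline job built in the earlier example (which exposes the same signature); the file names are illustrative.

```python
# pipeline_job is the PipelineJob built in the earlier sketch.

# Write to a new file; this raises an error if "pipeline_job.yml" already exists.
pipeline_job.dump("pipeline_job.yml")

# Or write directly to an already-open, writable stream.
with open("pipeline_job_copy.yml", "w", encoding="utf-8") as f:
    pipeline_job.dump(f)
```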
fromkeys
Create a new dictionary with keys from iterable and values set to value.
fromkeys(iterable, value=None, /)
Positional-Only Parameters

Name | Description |
---|---|
iterable (Required) | |
value | Default value: None |
get
Return the value for key if key is in the dictionary, else default.
get(key, default=None, /)
Positional-Only Parameters

Name | Description |
---|---|
key (Required) | |
default | Default value: None |
items
items() -> a set-like object providing a view on D's items
keys
keys() -> a set-like object providing a view on D's keys
pop
If the key is not found, return the default if given; otherwise, raise a KeyError.
pop(k, [d]) -> v, remove specified key and return the corresponding value.
popitem
Remove and return a (key, value) pair as a 2-tuple.
Pairs are returned in LIFO (last-in, first-out) order. Raises KeyError if the dict is empty.
popitem()
setdefault
Insert key with a value of default if key is not in the dictionary.
Return the value for key if key is in the dictionary, else default.
setdefault(key, default=None, /)
Positional-Only Parameters

Name | Description |
---|---|
key (Required) | |
default | Default value: None |
update
If E is present and has a .keys() method, then does: for k in E: D[k] = E[k]
If E is present and lacks a .keys() method, then does: for k, v in E: D[k] = v
In either case, this is followed by: for k in F: D[k] = F[k]
update([E], **F) -> None. Update D from dict/iterable E and F.
values
values() -> an object providing a view on D's values
Attributes
base_path
component
Id or instance of the pipeline component/job to be run for the step.
Returns

Type | Description |
---|---|
| | Id or instance of the pipeline component/job. |
creation_context
The creation context of the resource.
Returns

Type | Description |
---|---|
| | The creation metadata for the resource. |
id
The resource ID.
Returns

Type | Description |
---|---|
| | The global ID of the resource, an Azure Resource Manager (ARM) ID. |
inputs
Get the inputs for the object.
Returns

Type | Description |
---|---|
| | A dictionary containing the inputs for the object. |
log_files
Job output files.
Returns

Type | Description |
---|---|
| | The dictionary of log names and URLs. |
name
outputs
Get the outputs of the object.
Returns

Type | Description |
---|---|
| | A dictionary containing the outputs for the object. |
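As a sketch that continues the hypothetical example above (sub_node and its port names are illustrative), both the inputs and outputs properties behave like dictionaries keyed by port name and can be inspected inside the pipeline definition.

```python
# Inside parent_pipeline, after sub_node has been created:
print(list(sub_node.inputs.keys()))     # e.g. ["input_data", "learning_rate"]
print(list(sub_node.outputs.keys()))    # e.g. ["model_output"]

# Individual ports can be read by name.
model_out = sub_node.outputs["model_output"]
```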
settings
Settings of the pipeline.
Note: settings is only available when the node is created as a job, i.e. via ml_client.jobs.create_or_update(node).

Returns

Type | Description |
---|---|
| | Settings of the pipeline. |
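A sketch of configuring settings on the root pipeline job before submission, assuming the pipeline_job built in the earlier example; the compute, datastore, and workspace identifiers are placeholders.

```python
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential

# Placeholder workspace details.
ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace-name>",
)

# Settings only take effect on the root pipeline job, not on nested Pipeline nodes.
pipeline_job.settings.default_compute = "cpu-cluster"
pipeline_job.settings.default_datastore = "workspaceblobstore"
pipeline_job.settings.continue_on_step_failure = True

submitted = ml_client.jobs.create_or_update(pipeline_job)
```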
status
The status of the job.
Common values returned include "Running", "Completed", and "Failed". All possible values are:

- NotStarted - This is a temporary state that client-side Run objects are in before cloud submission.
- Starting - The Run has started being processed in the cloud. The caller has a run ID at this point.
- Provisioning - On-demand compute is being created for a given job submission.
- Preparing - The run environment is being prepared and is in one of two stages:
  - Docker image build
  - conda environment setup
- Queued - The job is queued on the compute target. For example, in BatchAI, the job is in a queued state while waiting for all the requested nodes to be ready.
- Running - The job has started to run on the compute target.
- Finalizing - User code execution has completed, and the run is in post-processing stages.
- CancelRequested - Cancellation has been requested for the job.
- Completed - The run has completed successfully. This includes both the user code execution and run post-processing stages.
- Failed - The run failed. Usually the Error property on a run will provide details as to why.
- Canceled - Follows a cancellation request and indicates that the run is now successfully cancelled.
- NotResponding - For runs that have Heartbeats enabled, no heartbeat has been recently sent.
Returns

Type | Description |
---|---|
| | Status of the job. |
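As a sketch (ml_client and submitted continue the hypothetical submission above), the status can be polled until it reaches a terminal value:

```python
import time

job = ml_client.jobs.get(submitted.name)
while job.status not in ("Completed", "Failed", "Canceled"):
    time.sleep(30)  # poll every 30 seconds
    job = ml_client.jobs.get(submitted.name)
print(f"Pipeline finished with status: {job.status}")
```

ml_client.jobs.stream(submitted.name) can also be used to block until the job finishes while streaming its logs.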
studio_url
type