DML_ACTIVATION_CELU_OPERATOR_DESC structure (directml.h)
Performs the continuously differentiable exponential linear unit (CELU) activation function on every element in InputTensor, placing the result into the corresponding element of OutputTensor.
f(x) = max(0, x) + min(0, Alpha * (exp(x / Alpha) - 1))
Where:
- exp(x) is the natural exponentiation function
- max(a,b) returns the larger of the two values a,b
- min(a,b) returns the smaller of the two values a,b
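To make the formula concrete, here is an illustrative scalar C++ version of the per-element math; DirectML itself evaluates this on the device for every element of InputTensor.

```cpp
#include <algorithm>
#include <cmath>

// Illustrative scalar form of the CELU formula above. DirectML applies
// this element-wise across InputTensor; this sketch exists only to show
// the math.
float Celu(float x, float alpha)
{
    return std::max(0.0f, x) +
           std::min(0.0f, alpha * (std::exp(x / alpha) - 1.0f));
}
```

For example, with Alpha = 1.0, an input of -1.0 produces approximately -0.632, while any non-negative input passes through unchanged (the min term is zero whenever x >= 0).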
This operator supports in-place execution, meaning that the output tensor is permitted to alias InputTensor during binding.
Syntax
struct DML_ACTIVATION_CELU_OPERATOR_DESC {
const DML_TENSOR_DESC *InputTensor;
const DML_TENSOR_DESC *OutputTensor;
FLOAT Alpha;
};
Members
InputTensor
Type: const DML_TENSOR_DESC*
The input tensor to read from.
OutputTensor
Type: const DML_TENSOR_DESC*
The output tensor to write the results to.
Alpha
Type: FLOAT
The alpha coefficient. A typical default for this value is 1.0.
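As a minimal sketch of using this structure, the following fills in the members and creates the operator. The `dmlDevice`, `inputTensorDesc`, and `outputTensorDesc` variables are assumed to exist already (a sketch of matching tensor descriptions appears under Tensor constraints below).

```cpp
#include <directml.h>
#include <wrl/client.h>

// Describe the CELU activation. `inputTensorDesc` and `outputTensorDesc`
// are assumed to be valid DML_TENSOR_DESC values built elsewhere.
DML_ACTIVATION_CELU_OPERATOR_DESC celuDesc = {};
celuDesc.InputTensor = &inputTensorDesc;
celuDesc.OutputTensor = &outputTensorDesc;
celuDesc.Alpha = 1.0f; // the typical default noted above

// Wrap the operator-specific description in the generic operator description.
DML_OPERATOR_DESC opDesc = { DML_OPERATOR_ACTIVATION_CELU, &celuDesc };

// Create the operator on an existing IDMLDevice (assumed: `dmlDevice`).
Microsoft::WRL::ComPtr<IDMLOperator> celuOp;
HRESULT hr = dmlDevice->CreateOperator(&opDesc, IID_PPV_ARGS(&celuOp));
```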
Availability
This operator was introduced in DML_FEATURE_LEVEL_3_0.
Tensor constraints
InputTensor and OutputTensor must have the same DataType, DimensionCount, and Sizes.
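As an illustration of these constraints, the sketch below (the sizes are hypothetical, and a packed four-dimensional FLOAT32 layout is assumed) builds an input description and copies it for the output, so the two descriptions match exactly.

```cpp
#include <directml.h>

// Hypothetical sizes for a packed 4-dimensional FLOAT32 tensor.
UINT sizes[4] = { 1, 1, 2, 3 };

DML_BUFFER_TENSOR_DESC inputBufferDesc = {};
inputBufferDesc.DataType = DML_TENSOR_DATA_TYPE_FLOAT32;
inputBufferDesc.Flags = DML_TENSOR_FLAG_NONE;
inputBufferDesc.DimensionCount = 4;
inputBufferDesc.Sizes = sizes;
inputBufferDesc.Strides = nullptr; // packed layout
inputBufferDesc.TotalTensorSizeInBytes = 1 * 1 * 2 * 3 * sizeof(float);

// The output description mirrors the input: same DataType,
// DimensionCount, and Sizes, as the constraint above requires.
DML_BUFFER_TENSOR_DESC outputBufferDesc = inputBufferDesc;

DML_TENSOR_DESC inputTensorDesc  = { DML_TENSOR_TYPE_BUFFER, &inputBufferDesc };
DML_TENSOR_DESC outputTensorDesc = { DML_TENSOR_TYPE_BUFFER, &outputBufferDesc };
```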
Tensor support
| Tensor | Kind | Supported Dimension Counts | Supported Data Types |
| --- | --- | --- | --- |
| InputTensor | Input | 1 to 8 | FLOAT32, FLOAT16 |
| OutputTensor | Output | 1 to 8 | FLOAT32, FLOAT16 |
Requirements
| Requirement | Value |
| --- | --- |
| Minimum supported client | Windows 10 Build 20348 |
| Minimum supported server | Windows 10 Build 20348 |
| Header | directml.h |