ClassificationModels Struct
Definition
Important
Some information relates to prerelease product that may be substantially modified before it’s released. Microsoft makes no warranties, express or implied, with respect to the information provided here.
Enum for all classification models supported by AutoML.
C#
[System.ComponentModel.TypeConverter(typeof(Microsoft.Azure.PowerShell.Cmdlets.MachineLearningServices.Support.ClassificationModelsTypeConverter))]
public struct ClassificationModels : IEquatable<Microsoft.Azure.PowerShell.Cmdlets.MachineLearningServices.Support.ClassificationModels>, System.Management.Automation.IArgumentCompleter
F#
[<System.ComponentModel.TypeConverter(typeof(Microsoft.Azure.PowerShell.Cmdlets.MachineLearningServices.Support.ClassificationModelsTypeConverter))>]
type ClassificationModels = struct
    interface IArgumentCompleter
Visual Basic
Public Structure ClassificationModels
Implements IArgumentCompleter, IEquatable(Of ClassificationModels)
- Inheritance: ClassificationModels
- Attributes: TypeConverterAttribute
- Implements: IEquatable<ClassificationModels>, IArgumentCompleter
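A minimal C# sketch of basic usage, assuming the Microsoft.Azure.PowerShell.Cmdlets.MachineLearningServices assembly is referenced: each documented field behaves like a named enum value, and ToString (together with the implicit string conversion listed under Operators) yields its string form.

using System;
using Microsoft.Azure.PowerShell.Cmdlets.MachineLearningServices.Support;

class BasicUsageSketch
{
    static void Main()
    {
        // Pick one of the documented fields; it behaves like a named enum value.
        ClassificationModels model = ClassificationModels.LightGbm;

        // The struct converts implicitly to its string form (see Operators below);
        // ToString returns the same representation.
        string name = model;
        Console.WriteLine(name);
        Console.WriteLine(model.ToString());
    }
}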
Fields
BernoulliNaiveBayes | Naive Bayes classifier for multivariate Bernoulli models.
DecisionTree | Decision trees are a non-parametric supervised learning method used for both classification and regression tasks. The goal is to create a model that predicts the value of a target variable by learning simple decision rules inferred from the data features.
ExtremeRandomTrees | Extreme trees is an ensemble machine learning algorithm that combines the predictions from many decision trees. It is related to the widely used random forest algorithm.
GradientBoosting | The technique of combining weak learners into a strong learner is called boosting. The gradient boosting algorithm works on this principle.
Knn | The K-nearest neighbors (KNN) algorithm uses feature similarity to predict the values of new data points: a new data point is assigned a value based on how closely it matches the points in the training set.
LightGbm | LightGBM is a gradient boosting framework that uses tree-based learning algorithms.
LinearSvm | A support vector machine (SVM) is a supervised machine learning model that uses classification algorithms for two-group classification problems. After an SVM model is given sets of labeled training data for each category, it can categorize new examples. Linear SVM performs best when the data is linearly separable, i.e., when the classes can be divided by a straight line on a plotted graph.
LogisticRegression | Logistic regression is a fundamental classification technique. It belongs to the group of linear classifiers and is somewhat similar to polynomial and linear regression. Logistic regression is fast, relatively uncomplicated, and its results are easy to interpret. Although it is essentially a method for binary classification, it can also be applied to multiclass problems.
MultinomialNaiveBayes | The multinomial Naive Bayes classifier is suitable for classification with discrete features (e.g., word counts for text classification). The multinomial distribution normally requires integer feature counts; however, in practice, fractional counts such as tf-idf may also work.
RandomForest | Random forest is a supervised learning algorithm. The "forest" it builds is an ensemble of decision trees, usually trained with the "bagging" method. The general idea of bagging is that a combination of learning models improves the overall result.
Sgd | Stochastic gradient descent (SGD) is an optimization algorithm often used in machine learning to find the model parameters that correspond to the best fit between predicted and actual outputs.
Svm | A support vector machine (SVM) is a supervised machine learning model that uses classification algorithms for two-group classification problems. After an SVM model is given sets of labeled training data for each category, it can categorize new examples.
XgBoostClassifier | XGBoost (extreme gradient boosting). This algorithm is used for structured data where the target column values can be divided into distinct classes.
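The field values above are the model names AutoML accepts for classification. The hedged sketch below collects a few of them as strings via the implicit conversion; it does not show any particular cmdlet or job-settings parameter, which would need to be checked against the module's documentation.

using System;
using Microsoft.Azure.PowerShell.Cmdlets.MachineLearningServices.Support;

class AllowedModelsSketch
{
    static void Main()
    {
        // Each field converts implicitly to its string name, which is the form
        // typically passed around when constraining an AutoML classification run.
        string[] allowedModels =
        {
            ClassificationModels.LightGbm,
            ClassificationModels.XgBoostClassifier,
            ClassificationModels.LogisticRegression
        };

        Console.WriteLine(string.Join(", ", allowedModels));
    }
}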
Methods
CompleteArgument(String, String, String, CommandAst, IDictionary) | Implementations of this function are called by PowerShell to complete arguments.
Equals(ClassificationModels) | Compares values of enum type ClassificationModels.
Equals(Object) | Compares values of enum type ClassificationModels (override for Object).
GetHashCode() | Returns the hash code for enum ClassificationModels.
ToString() | Returns the string representation of ClassificationModels.
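Because Equals and GetHashCode are defined, the struct can be used in hash-based collections. A small sketch, assuming equality is value-based on the underlying model name as the method descriptions suggest:

using System;
using System.Collections.Generic;
using Microsoft.Azure.PowerShell.Cmdlets.MachineLearningServices.Support;

class EqualitySketch
{
    static void Main()
    {
        // Duplicate entries collapse to one because Equals/GetHashCode
        // compare the enum values rather than object references.
        var selected = new HashSet<ClassificationModels>
        {
            ClassificationModels.RandomForest,
            ClassificationModels.RandomForest,
            ClassificationModels.Svm
        };

        Console.WriteLine(selected.Count); // expected: 2
    }
}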
Operators
Equality(ClassificationModels, ClassificationModels) | Overrides the == operator for enum ClassificationModels.
Implicit(ClassificationModels to String) | Implicit operator to convert ClassificationModels to a string.
Implicit(String to ClassificationModels) | Implicit operator to convert a string to ClassificationModels.
Inequality(ClassificationModels, ClassificationModels) | Overrides the != operator for enum ClassificationModels.
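A short sketch of the conversion and comparison operators above. The string assigned below assumes the field name and its serialized string form match; the exact casing accepted by the implicit string conversion is an assumption.

using System;
using Microsoft.Azure.PowerShell.Cmdlets.MachineLearningServices.Support;

class OperatorSketch
{
    static void Main()
    {
        // string -> ClassificationModels via the implicit operator.
        // "RandomForest" is assumed to match the field's string form.
        ClassificationModels fromString = "RandomForest";

        // == and != are defined directly on the struct.
        Console.WriteLine(fromString == ClassificationModels.RandomForest); // expected: True
        Console.WriteLine(fromString != ClassificationModels.DecisionTree); // expected: True
    }
}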