KnownForecastingModels enum
Known values of ForecastingModels that the service accepts.
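These values are exposed as string-enum members, so they can be passed anywhere the service expects a ForecastingModels string. The snippet below is a minimal sketch, not an official sample: the package name @azure/arm-machinelearning and the idea of collecting values into an allowedTrainingAlgorithms list are assumptions, not stated on this page.

```typescript
// Minimal sketch: assumes the enum is exported from @azure/arm-machinelearning
// and that the values will be passed as an allowedTrainingAlgorithms string list.
import { KnownForecastingModels } from "@azure/arm-machinelearning";

// Restrict an AutoML forecasting run to a subset of the known models.
const allowedTrainingAlgorithms: string[] = [
  KnownForecastingModels.Prophet,
  KnownForecastingModels.TCNForecaster,
  KnownForecastingModels.ExponentialSmoothing,
];

console.log(allowedTrainingAlgorithms);
// e.g. [ 'Prophet', 'TCNForecaster', 'ExponentialSmoothing' ]
```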
Fields
Arimax | An Autoregressive Integrated Moving Average with Explanatory Variable (ARIMAX) model can be viewed as a multiple regression model with one or more autoregressive (AR) terms and/or one or more moving average (MA) terms. This method is suitable for forecasting data that is stationary or non-stationary and multivariate, with any type of data pattern, i.e., level, trend, seasonality, or cyclicity. |
AutoArima | The Auto-ARIMA (automatically selected Autoregressive Integrated Moving Average) model uses time-series data and statistical analysis to interpret the data and make future predictions. It explains a series in terms of its own past values and uses linear regression to make predictions. |
Average | The Average forecasting model makes predictions by carrying forward the average of the target values for each time-series in the training data (see the sketch after this list for how the simple baseline models behave). |
DecisionTree | Decision Trees are a non-parametric supervised learning method used for both classification and regression tasks. The goal is to create a model that predicts the value of a target variable by learning simple decision rules inferred from the data features. |
ElasticNet | Elastic net is a popular type of regularized linear regression that combines two popular penalties, specifically the L1 and L2 penalty functions. |
ExponentialSmoothing | Exponential smoothing is a time series forecasting method for univariate data that can be extended to support data with a systematic trend or seasonal component. |
ExtremeRandomTrees | Extremely Randomized Trees (Extra-Trees) is an ensemble machine learning algorithm that combines the predictions from many decision trees. It is related to the widely used random forest algorithm. |
GradientBoosting | The technique of combining weak learners into a strong learner is called boosting. The gradient boosting algorithm works on this principle. |
KNN | The K-nearest neighbors (KNN) algorithm uses 'feature similarity' to predict the values of new data points: a new data point is assigned a value based on how closely it matches the points in the training set. |
LassoLars | A Lasso model fit with Least Angle Regression, a.k.a. LARS. It is a linear model trained with an L1 prior as the regularizer. |
LightGBM | LightGBM is a gradient boosting framework that uses tree based learning algorithms. |
Naive | The Naive forecasting model makes predictions by carrying forward the latest target value for each time-series in the training data. |
Prophet | Prophet is a procedure for forecasting time series data based on an additive model where non-linear trends are fit with yearly, weekly, and daily seasonality, plus holiday effects. It works best with time series that have strong seasonal effects and several seasons of historical data. Prophet is robust to missing data and shifts in the trend, and typically handles outliers well. |
RandomForest | Random forest is a supervised learning algorithm. The "forest" it builds is an ensemble of decision trees, usually trained with the "bagging" method. The general idea of the bagging method is that a combination of learning models increases the overall result. |
SeasonalAverage | The Seasonal Average forecasting model makes predictions by carrying forward the average value of the latest season of data for each time-series in the training data. |
SeasonalNaive | The Seasonal Naive forecasting model makes predictions by carrying forward the latest season of target values for each time-series in the training data. |
SGD | SGD: Stochastic gradient descent is an optimization algorithm often used in machine learning applications to find the model parameters that correspond to the best fit between predicted and actual outputs. It's an inexact but powerful technique. |
TCNForecaster | TCNForecaster: Temporal Convolutional Networks Forecaster. A deep neural network model that applies convolutions across the time dimension of a series to learn long-range temporal patterns for forecasting. |
XGBoostRegressor | XGBoostRegressor: Extreme Gradient Boosting Regressor is a supervised machine learning model that uses an ensemble of base learners. |
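The simple baseline models above (Average, Naive, SeasonalAverage, SeasonalNaive) are defined precisely enough to illustrate in code. The sketch below is illustrative only, not the service implementation; it assumes a single time-series represented as a plain number array and a caller-supplied seasonLength (periods per season).

```typescript
/** Naive: repeat the latest observed value for every future period. */
function naiveForecast(history: number[], horizon: number): number[] {
  const last = history[history.length - 1];
  return new Array<number>(horizon).fill(last);
}

/** Average: repeat the mean of all observed values. */
function averageForecast(history: number[], horizon: number): number[] {
  const mean = history.reduce((a, b) => a + b, 0) / history.length;
  return new Array<number>(horizon).fill(mean);
}

/** SeasonalNaive: repeat the latest full season of observed values. */
function seasonalNaiveForecast(history: number[], horizon: number, seasonLength: number): number[] {
  const lastSeason = history.slice(-seasonLength);
  return Array.from({ length: horizon }, (_, i) => lastSeason[i % seasonLength]);
}

/** SeasonalAverage: repeat the mean of the latest season of observed values. */
function seasonalAverageForecast(history: number[], horizon: number, seasonLength: number): number[] {
  const lastSeason = history.slice(-seasonLength);
  const mean = lastSeason.reduce((a, b) => a + b, 0) / lastSeason.length;
  return new Array<number>(horizon).fill(mean);
}

// Example: six periods of history with a season length of 3.
const history = [10, 12, 14, 11, 13, 15];
console.log(naiveForecast(history, 3));              // [15, 15, 15]
console.log(averageForecast(history, 2));            // [12.5, 12.5]
console.log(seasonalNaiveForecast(history, 4, 3));   // [11, 13, 15, 11]
console.log(seasonalAverageForecast(history, 2, 3)); // [13, 13]
```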