Neural Network-based Forecasting Algorithm

A Neural Network-based Forecasting Algorithm is a forecasting algorithm that is a neural network-based algorithm.



References

2018

  • https://towardsdatascience.com/using-lstms-to-forecast-time-series-4ab688386b1f
    • QUOTE: There are several time-series forecasting techniques, such as autoregression (AR) models, moving average (MA) models, Holt-Winters, and ARIMA, to name a few. So what is the need for yet another model like the LSTM-RNN to forecast time series? This is quite a valid question to begin with, and here are the reasons I could come up with:
      • RNNs (LSTMs) are quite good at extracting patterns in the input feature space when the input data spans long sequences. Given the gated architecture of LSTMs, which gives them the ability to manipulate their memory state, they are well suited to such problems.
      • LSTMs can almost seamlessly model problems with multiple input variables. All we need is a 3D input array fed into the LSTM's input shape. As long as we figure out a way to represent all our input variables in 3D form, we are good to use the LSTM. This is a great benefit in time-series forecasting, where classical linear methods can be difficult to adapt to multivariate or multiple-input forecasting problems. (A side note on multivariate forecasting: keep in mind that when we use multivariate data for forecasting, we also need "future" multivariate data to predict the future outcome.)
      • In general, while using LSTMs, I found that they offer a lot of flexibility in modelling the problem, meaning we have good control over several parameters of the time series. In particular, we can:
      • Use several combinations of seq2seq LSTM models to forecast time series: the many-to-one model (useful when we want to predict at the current time step given all the previous inputs), the many-to-many model (useful when we want to predict multiple future time steps at once given all the previous inputs), and several other variations on these. We can customize several things, for example: the size of the look-back window used to predict at the current step, the number of time steps we want to predict into the future, and feeding the current prediction back into the window to make the prediction at the next time step (a technique also known as a moving-forward window).
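
The 3D-input point in the quote can be made concrete with a small windowing routine. The following is a minimal sketch (not taken from the quoted article), assuming only NumPy; the make_windows helper, the random toy series, and the choice of the first feature as the prediction target are all illustrative assumptions.

```python
import numpy as np

def make_windows(series, look_back):
    """Slice a (T, n_features) multivariate series into overlapping windows
    shaped (samples, look_back, n_features), pairing each window with the
    next value of the first feature as the prediction target."""
    X, y = [], []
    for t in range(len(series) - look_back):
        X.append(series[t:t + look_back])   # look_back rows, all features
        y.append(series[t + look_back, 0])  # next step of feature 0
    return np.array(X), np.array(y)

# Hypothetical multivariate series: 100 time steps, 3 input variables.
series = np.random.rand(100, 3)
X, y = make_windows(series, look_back=10)
print(X.shape, y.shape)  # (90, 10, 3) (90,)
```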
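
The many-to-one and moving-forward-window ideas in the last bullet can be sketched in the same spirit. This is a hedged illustration rather than the article's own code: it assumes the Keras API shipped with TensorFlow 2 (Sequential, LSTM, Dense), a made-up univariate sine series, and arbitrary hyperparameters (a look-back window of 10, 32 LSTM units, 20 training epochs).

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

look_back, n_features = 10, 1

# Hypothetical univariate series, windowed as in the sketch above.
series = np.sin(np.linspace(0.0, 20.0, 200)).reshape(-1, 1)
X = np.array([series[t:t + look_back] for t in range(len(series) - look_back)])
y = series[look_back:, 0]

# Many-to-one model: look_back past steps in, one prediction out.
model = Sequential([
    LSTM(32, input_shape=(look_back, n_features)),
    Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=20, verbose=0)

# Moving-forward window: feed each prediction back into the window
# to step further into the future.
window = series[-look_back:].copy()
forecast = []
for _ in range(5):
    next_val = float(model.predict(window[np.newaxis, :, :], verbose=0)[0, 0])
    forecast.append(next_val)
    window = np.vstack([window[1:], [[next_val]]])
print(forecast)
```

Feeding each prediction back in lets a single-step (many-to-one) model produce a multi-step forecast, at the cost of compounding its own prediction error; a many-to-many model instead predicts several future steps in one shot.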