Vtome.ru - Electronic Library

Recent Advances in Time Series Forecasting

  • Added by: literator
  • Date: 1-08-2021, 14:47
  • Comments: 0
Title: Recent Advances in Time Series Forecasting
Author: Dinesh C.S. Bisht, Mangey Ram
Publisher: CRC Press
Year: 2022
Pages: 239
Language: English
Format: pdf (true)
Size: 20.1 MB

A time series is a chronological collection of data: a sequence of data points ordered in time. The sequential analysis of data and information gathered from the past to the present is called time series analysis. Time series data are high-dimensional, large, and continuously updated, and a series is shaped by components such as trend, seasonality, cycles and irregular variation. Time series forecasting is a significant area of Machine Learning (ML), since many prediction problems are time dependent and can be handled through time series analysis. This book aims to cover recent advances in the field of time series analysis, addressing both theory and recent applications; a small decomposition sketch follows below.
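As a minimal sketch of the trend/seasonality/irregular components mentioned above (not an example from the book), the snippet below decomposes a synthetic monthly series with statsmodels. The data, period, and model choice are illustrative assumptions.

# Illustrative only: decompose a synthetic monthly series into
# trend, seasonal and irregular (residual) components.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# Assumed synthetic data: linear trend + yearly seasonality + noise.
idx = pd.date_range("2015-01-01", periods=96, freq="MS")
values = 0.5 * np.arange(96) + 10 * np.sin(2 * np.pi * np.arange(96) / 12) + np.random.randn(96)
series = pd.Series(values, index=idx)

result = seasonal_decompose(series, model="additive", period=12)
print(result.trend.dropna().head())     # smoothed trend component
print(result.seasonal.head())           # repeating yearly pattern
print(result.resid.dropna().head())     # irregular remainder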

Future predictions are always a topic of interest, and precise estimates are crucial in many activities, as forecasting errors can lead to large financial losses. The sequential analysis of data and information gathered from the past to the present is called time series analysis. This book covers recent advances in time series forecasting, including both theory and recent applications. It focuses on recent techniques, combines methodology with applications, presents traditional and advanced tools as well as new applications, and identifies gaps in knowledge in engineering applications. The book is aimed at scientists, researchers, postgraduate students and engineers in the areas of supply chain management, production, inventory planning, and statistical quality control.

Neural-network (learning-based) methods do not require any assumptions or prior knowledge about the process. They learn parameters known as weights, which are tuned by backpropagating the output error. A feed-forward neural network (FNN) can learn representations in a given dataset but falls short at learning sequential representations. The recurrent neural network (RNN) architecture overcomes this limitation: it shares weights across time steps and introduces a hidden state that distills the information in the series into a fixed-size vector. RNNs can learn time-stamped data, but they suffer from the vanishing-gradient problem, which prevents them from learning long sequences. Long short-term memory (LSTM) alleviates this problem by modifying the vanilla RNN; a sketch of an LSTM forecaster is shown below.
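The following is a minimal sketch (not the book's implementation) of the hidden-state idea described above, using an LSTM in PyTorch to make one-step-ahead forecasts from a sliding window. Model size, window length, and training details are illustrative assumptions.

import torch
import torch.nn as nn

class LSTMForecaster(nn.Module):
    def __init__(self, input_size=1, hidden_size=32, num_layers=1):
        super().__init__()
        # The LSTM distills the input window into a fixed-size hidden state.
        self.lstm = nn.LSTM(input_size, hidden_size, num_layers, batch_first=True)
        # A linear head maps the final hidden state to a one-step-ahead forecast.
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):                 # x: (batch, window, input_size)
        out, _ = self.lstm(x)             # out: (batch, window, hidden_size)
        return self.head(out[:, -1, :])   # forecast from the last time step

# Toy task (assumed data): predict the next value of a sine wave
# from the previous 20 observations.
series = torch.sin(torch.linspace(0, 20, 500))
window = 20
X = torch.stack([series[i:i + window] for i in range(len(series) - window)]).unsqueeze(-1)
y = series[window:].unsqueeze(-1)

model = LSTMForecaster()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)          # weights tuned by backpropagating the output error
    loss.backward()
    optimizer.step()

The same loop structure applies to real series once they are split into overlapping input windows and aligned targets; only the data preparation changes.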

Download Recent Advances in Time Series Forecasting