N-BEATS (ICLR 2020)
2023-12-9
N-BEATS: Neural basis expansion analysis for interpretable time series forecasting
Nixtla: Deep Learning for Time Series Forecasting
Time series forecasting has a wide range of applications: finance, retail, healthcare, IoT, etc. Recently, deep learning models such as ESRNN or N-BEATS have achieved state-of-the-art performance on these tasks. Nixtlats is a Python library developed to make these state-of-the-art models accessible to data scientists and developers so that they can be used in production environments. Written in PyTorch, its design focuses on usability and reproducibility of experiments. To this end, nixtlats has several modules:
  • Data: datasets from various time series competitions.
  • Models: state-of-the-art models.
  • Evaluation: various loss functions and evaluation metrics.
Objectives:
  • Introduce attendees to the challenges of time series forecasting with deep learning.
  • Commercial applications of time series forecasting.
  • Describe nixtlats, its components, and best practices for training and deploying state-of-the-art models in production.
  • Reproduce state-of-the-art results with nixtlats using the winning model of the M4 time series competition (ESRNN).
Project repository: https://github.com/Nixtla/nixtlats
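The Evaluation module mentioned above covers loss functions and metrics such as sMAPE, the headline accuracy measure in the M4 competition. Below is a minimal, illustrative PyTorch sketch of an sMAPE loss; it is not the nixtlats implementation, and the function name and epsilon handling are my own assumptions:

```python
import torch

def smape_loss(y: torch.Tensor, y_hat: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Symmetric MAPE in [0, 200]; hypothetical helper, not the nixtlats API."""
    # Elementwise sMAPE: 200 * |y - y_hat| / (|y| + |y_hat|)
    denom = y.abs() + y_hat.abs() + eps  # eps avoids division by zero
    return (200.0 * (y - y_hat).abs() / denom).mean()

# Usage: y and y_hat are (batch, horizon) tensors of targets and forecasts.
y = torch.tensor([[10.0, 12.0], [8.0, 9.0]])
y_hat = torch.tensor([[11.0, 11.0], [7.5, 9.5]])
print(smape_loss(y, y_hat))  # scalar loss, usable directly for backprop
```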
Advantages of DL
Large-dataset environments have seen deep learning grow in popularity, becoming a valuable and general-purpose technique. Main advantages:
  • Forecasting accuracy.
  • Simplification of the forecasting pipeline.
  • Scalability of forecasting systems.
DL performed well in competitions
  1. M4 International Forecasting Competition: point and probabilistic forecasting tasks for 100K series at different frequencies. Hosted by Makridakis et al., 248 participants.
  2. M5 International Forecasting Competition: hierarchical forecasting task for retail goods, 30K series across 10 stores; 88K submissions by 5,558 participating teams (Kaggle).
NBEATSx: Motivation and Contributions
The NBEATSx contributions include:
  1. A novel time series decomposition method.
  2. Convolutional encoders for the exogenous variables.
  3. State-of-the-art performance on electricity price data.
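To make the basis-expansion idea behind N-BEATS concrete, here is a minimal PyTorch sketch of a generic (identity-basis) block with doubly residual stacking. It is an illustrative re-implementation under my own naming and hyperparameter choices, not code from the paper or from nixtlats:

```python
import torch
import torch.nn as nn

class NBeatsBlock(nn.Module):
    """Generic N-BEATS block: FC stack -> backcast + forecast heads."""
    def __init__(self, input_size: int, horizon: int, hidden: int = 512, layers: int = 4):
        super().__init__()
        dims = [input_size] + [hidden] * layers
        self.fc = nn.Sequential(*[
            nn.Sequential(nn.Linear(i, o), nn.ReLU()) for i, o in zip(dims[:-1], dims[1:])
        ])
        # In the generic block the basis is learned: the heads map the hidden
        # state directly to a backcast (length input_size) and a forecast (length horizon).
        self.backcast_head = nn.Linear(hidden, input_size)
        self.forecast_head = nn.Linear(hidden, horizon)

    def forward(self, x):
        h = self.fc(x)
        return self.backcast_head(h), self.forecast_head(h)

class NBeats(nn.Module):
    """Stack of blocks with doubly residual connections."""
    def __init__(self, input_size: int, horizon: int, n_blocks: int = 3):
        super().__init__()
        self.blocks = nn.ModuleList(
            [NBeatsBlock(input_size, horizon) for _ in range(n_blocks)]
        )

    def forward(self, x):
        residual, forecast = x, 0.0
        for block in self.blocks:
            backcast, block_forecast = block(residual)
            residual = residual - backcast        # remove what this block explained
            forecast = forecast + block_forecast  # accumulate partial forecasts
        return forecast

# Usage: forecast 12 steps ahead from a lookback window of 36 points.
model = NBeats(input_size=36, horizon=12)
y_hat = model(torch.randn(8, 36))  # -> shape (8, 12)
```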
  • M4 dataset:
    • Pure ML: the submission by B. Trotta, the best of the six pure machine learning models.
    • Statistical: the model by N.Z. Legaki and K. Koutsouri, the best pure statistical model.
    • ML/TS Combination: the model by P. Montero-Manso, T. Talagala, R.J. Hyndman and G. Athanasopoulos, a combination of gradient boosted trees and several statistical time series models.
    • ProLogistica: the third-place entry, a weighted ensemble of statistical methods.
    • DL/TS Hybrid: the model by Smyl (2020), winner of the M4 competition, a hybrid of deep learning and time series forecasting.
  • M3 dataset:
    • Theta Method: proposed by Assimakopoulos & Nikolopoulos (2000), winner of the M3 competition.
    • DOTA: the dynamically optimized Theta model proposed by Fiorucci et al. (2016).
    • EXP: a recent statistical method proposed by Spiliotis et al. (2019), previously the best-performing model on M3.
    • ForecastPro: an off-the-shelf forecasting software based on selecting among exponential smoothing, ARIMA and moving average models.
  • TOURISM dataset:
    • Statistical Benchmarks: ETS (exponential smoothing with cross-validated additive/multiplicative models), the Theta method, and ForePro (same as in M3).
    • TOURISM Kaggle Competition Entries: Stratometrics (an unknown technique) and LeeCBaker (Baker & Howard, 2011), the latter a weighted combination of the Naïve forecast, a linear trend model, and an exponentially weighted least squares regression trend.

Meta-learning, also known as "learning to learn", is a branch of machine learning that aims to design models that can learn quickly and adapt to new tasks from only a small amount of data. Its importance lies in providing a mechanism that lets a model use past learning experience to accelerate and improve learning on new tasks.

Core concepts of meta-learning:

  1. Inner learning process: the model's learning process on a specific task. In traditional machine learning, this corresponds to training a model on a given dataset.
  2. Outer learning process: the process by which the model learns, across many tasks, how to learn better. The outer loop adjusts the model's learning strategy so that it adapts faster when it encounters a new task.

Applications of meta-learning:

  1. Few-shot learning: meta-learning enables a model to learn a new task from very little data, which is particularly valuable in data-scarce settings.
  2. Transfer learning: meta-learning strengthens a model's ability to transfer, allowing knowledge learned on one task to be applied effectively to similar tasks.
  3. Adaptive systems: meta-learning makes it possible to build adaptive systems that adjust themselves as environments and task requirements change.

Meta-learning methods:

  1. Model-Agnostic Meta-Learning (MAML): a popular meta-learning method that trains a model across many tasks to optimize its initial parameters, so that a few gradient updates are enough to adapt to a new task (see the sketch after this list).
  2. Learning optimizers: the model learns how to optimize another model's learning process, for example by training a neural network to produce effective gradient-descent update rules.
  3. Memory-augmented learning: this approach uses an external memory mechanism (such as a Neural Turing Machine) to store information from previous tasks to assist learning on new ones.
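As a concrete illustration of the inner/outer loop structure, below is a minimal first-order MAML-style sketch in PyTorch on toy sinusoid regression tasks. The task sampler, step sizes, and network are hypothetical choices made for illustration, not a reference implementation of the MAML paper:

```python
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

def sample_task(n_support: int = 10, n_query: int = 10):
    """Toy regression task: y = a*sin(x + b) with task-specific a, b."""
    a, b = torch.rand(1).item() * 4 + 1, torch.rand(1).item() * 3
    def draw(n):
        x = torch.rand(n, 1) * 10 - 5
        return x, a * torch.sin(x + b)
    return draw(n_support), draw(n_query)

model = nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Linear(64, 1))
meta_opt = torch.optim.Adam(model.parameters(), lr=1e-3)  # outer-loop optimizer
inner_lr, inner_steps, tasks_per_batch = 1e-2, 5, 4

for meta_step in range(1000):
    meta_opt.zero_grad()
    for _ in range(tasks_per_batch):
        (xs, ys), (xq, yq) = sample_task()
        # Inner loop: adapt a copy of the shared initialization on the support set.
        learner = copy.deepcopy(model)
        inner_opt = torch.optim.SGD(learner.parameters(), lr=inner_lr)
        for _ in range(inner_steps):
            inner_opt.zero_grad()
            F.mse_loss(learner(xs), ys).backward()
            inner_opt.step()
        # Outer loop (first-order approximation): query-set gradients of the
        # adapted learner are applied back to the shared initialization.
        query_loss = F.mse_loss(learner(xq), yq)
        grads = torch.autograd.grad(query_loss, learner.parameters())
        for p, g in zip(model.parameters(), grads):
            p.grad = g / tasks_per_batch if p.grad is None else p.grad + g / tasks_per_batch
    meta_opt.step()
```

The first-order variant skips differentiating through the inner SGD steps, which keeps the sketch short; full MAML would backpropagate through the adaptation itself.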
Meta-learning is becoming increasingly important, especially in scenarios where data is limited or a model must adapt quickly to a new environment. Through meta-learning, a model learns not only the solution to a specific task but also how to learn, so that it can handle new tasks more effectively and efficiently.
