[Paper Review] Self-Supervised Contrastive Pre-Training for Time Series via Time-Frequency Consistency

Author: DSBA Lab, Department of Industrial Engineering, Seoul National University (서울대학교 산업공학과 DSBA 연구실)

Uploaded: 2022-11-16

Views: 2274

Description:

Presenter: 강형원 (Master's student)

1. Paper title:
Self-Supervised Contrastive Pre-Training for Time Series via Time-Frequency Consistency (Xiang Zhang, Ziyuan Zhao, Theodoros Tsiligkaridis, Marinka Zitnik, NeurIPS 2022)
Link: https://arxiv.org/abs/2206.08496

2. Paper Overview
Positive pairs are generated through augmentations applied separately in the time domain and the frequency domain, and contrastive learning is performed on each.
As frequency-domain augmentations, the method either randomly removes Fourier components or perturbs their amplitudes.
The model is trained so that the time-domain representation and the frequency-domain representation of the same sample are consistent with each other (a minimal sketch of these ideas follows below).
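To make the overview concrete, here is a minimal PyTorch sketch of the two frequency-domain augmentations and a toy time-frequency consistency term. This is not the authors' implementation; the function names (frequency_augment, tfc_consistency_loss) and the default hyperparameters are illustrative assumptions.

```python
# Minimal sketch (not the paper's official code) of the ideas in the overview:
# (a) frequency-domain augmentation by removing random Fourier components or
#     perturbing amplitudes, and (b) a consistency term that pulls the time and
#     frequency embeddings of the same sample together.
import torch
import torch.nn.functional as F

def frequency_augment(x, remove_ratio=0.1, amp_scale=0.1):
    """x: (batch, length) real-valued series -> augmented complex spectrum."""
    spec = torch.fft.rfft(x, dim=-1)                      # complex spectrum of each series
    if torch.rand(()) < 0.5:
        # randomly zero out a small fraction of frequency components
        mask = torch.rand_like(spec.real) < remove_ratio
        return torch.where(mask, torch.zeros_like(spec), spec)
    # otherwise jitter the amplitudes while keeping the phases
    amp, phase = spec.abs(), spec.angle()
    amp = amp * (1.0 + amp_scale * torch.randn_like(amp))
    return torch.polar(amp, phase)

def tfc_consistency_loss(z_t, z_f, temperature=0.2):
    """NT-Xent-style term: the time embedding z_t and the frequency embedding z_f
    of the same sample (same row index) form a positive pair; all other rows in
    the batch act as negatives. z_t, z_f: (batch, dim)."""
    z_t, z_f = F.normalize(z_t, dim=-1), F.normalize(z_f, dim=-1)
    logits = z_t @ z_f.T / temperature                    # (batch, batch) cosine similarities
    labels = torch.arange(z_t.size(0), device=z_t.device) # positives sit on the diagonal
    return F.cross_entropy(logits, labels)
```

In the paper, separate encoders produce the time-domain and frequency-domain embeddings, and a consistency term of this kind is combined with per-domain contrastive losses; the sketch above only illustrates the pairing logic, not the full training objective.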

3. Reference videos
[Paper Review] Unsupervised Representation Learning Approaches for Multivariate Time Series (최희정, PhD student)
Link:    • [Paper Review] Unsupervised Representation...  

[Paper Review] Unsupervised Representation Learning Approaches for Multivariate Time Series (2) (최희정, PhD student)
Link:    • [Paper Review] Unsupervised Representation...  

[Paper Review] CoST: Contrastive Learning of Disentangled Seasonal-Trend Representations for Time Series Forecasting (김지나, Master's graduate)
Link:    • CoST:Contrastive Learning of Disentangled ...  

[Paper Review] TS2Vec: Towards Universal Representation of Time Series (김수빈, Master's student)
Link:    • [Paper Review] TS2Vec: Towards Universal R...  

[Paper Review] Unsupervised Time-Series Representation Learning with Iterative Bilinear Temporal-Spectral Fusion (최희정, PhD student)
Link:    • [Paper Review] Time-Series Representation ...  

[Paper Review] Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting (강형원, Master's student)
Link:    • [Paper Review] Autoformer  

[Paper Review] FEDformer: Frequency Enhanced Decomposed Transformer for Long-term Series Forecasting (강형원, Master's student)
Link:    • [Paper Review] FEDformer: Frequency Enhanc...  

4. Keywords: Time series, Representation learning, Contrastive learning, Time embedding, Frequency embedding, Embedding space, Consistency, Time-Frequency Consistency, TF-C, Classification, Forecasting, Anomaly Detection, Clustering, Transfer learning
