- Vanderbilt University
- Nashville, TN
- https://rgbayrak.github.io/
- https://orcid.org/0000-0002-7197-1248
- @redgreenblues
Stars
- This repository contains code to apply vision transformers on surface data.
- EEG2BIDS Wizard: a tool for converting raw EEG and iEEG data into the BIDS standard data structure, prepared for LORIS (Longitudinal Online Research and Imaging System).
- A set of BIDS-compatible datasets with empty raw data files that can be used for writing lightweight software tests.
- A unified framework for machine learning with time series.
- A Library for Advanced Deep Time Series Models.
- Code for the paper "A benchmark of individual auto-regressive models in a massive fMRI dataset".
- Lightning ⚡️ fast forecasting with statistical and econometric models.
- Lag-Llama: Towards Foundation Models for Probabilistic Time Series Forecasting.
- Lu, Q., Hasson, U., & Norman, K. A. (2022). A neural network model of when to retrieve and encode episodic memories. eLife.
- Refine high-quality datasets and visual AI models.
- Self-supervised learning techniques for neuroimaging data inspired by prominent learning frameworks in natural language processing + one of the broadest neuroimaging datasets used for pre-training …
- This repository contains the implementation of Dynamask, a method to identify the features that are salient for a model to issue its prediction when the data is represented in terms of time series.…
- [AAAI-23 Oral] Official implementation of the paper "Are Transformers Effective for Time Series Forecasting?"
- Resources for Machine Learning Explainability.
- A professionally curated list of awesome resources (papers, code, data, etc.) on transformers in time series.
- Code for predicting individual differences in behavioral variables (e.g., intelligence, personality) from resting-state fMRI functional connectivity, using data from the Young Adult Human Connectom…
- Code release for "Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting" (NeurIPS 2021), https://arxiv.org/abs/2106.13008
- Scaleformer: Iterative Multi-scale Refining Transformers for Time Series Forecasting.
- PyTorch implementation of "Seeing the forest and the tree: Building representations of both individual and collective dynamics with transformers" (NeurIPS 2022).
- NeuroKit2: The Python Toolbox for Neurophysiological Signal Processing.
- An implementation of several well-known dynamic functional connectivity assessment methods.
- A curated list of reproducible research case studies, projects, tutorials, and media.
- Tips for releasing research code in machine learning (with official NeurIPS 2020 recommendations).
- Pretrain, fine-tune ANY AI model of ANY size on multiple GPUs and TPUs with zero code changes.
- 🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX (see the sketch below this list).
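As a quick illustration of the last item, a minimal sketch of the 🤗 Transformers `pipeline` API, assuming the `transformers` package is installed; with no model specified, the library falls back to a small default sentiment model, and the example text is arbitrary:

```python
# Minimal sketch: sentiment classification via the 🤗 Transformers pipeline API.
from transformers import pipeline

# With no model argument, the library downloads a small default
# sentiment model on first use.
classifier = pipeline("sentiment-analysis")

result = classifier("Starred repositories make a handy reading list.")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```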