
Commit

Update to 2023-06-30.
SupeRuier committed Jul 3, 2023
1 parent 82f6935 commit 1da14a0
Showing 8 changed files with 28 additions and 5 deletions.
2 changes: 2 additions & 0 deletions README.md
@@ -145,6 +145,8 @@ There have been several reviews / surveys / benchmarks for this topic.
### **Benchmarks - 基线**:
- [A Comparative Survey: Benchmarking for Pool-based Active Learning [2021, IJCAI]](https://www.ijcai.org/proceedings/2021/0634.pdf)
- [A Framework and Benchmark for Deep Batch Active Learning for Regression [2022]](https://arxiv.org/pdf/2203.09410.pdf)
- [Re-Benchmarking Pool-Based Active Learning for Binary Classification [2023]](https://arxiv.org/pdf/2306.08954.pdf)
- [LabelBench: A Comprehensive Framework for Benchmarking Label-Efficient Learning [2023]](https://arxiv.org/pdf/2306.09910.pdf)

### **Tutorials - 教程**

2 changes: 2 additions & 0 deletions conference/ICML.md
@@ -1,3 +1,5 @@
- Discover-Then-Rank Unlabeled Support Vectors in the Dual Space for Multi-Class Active Learning [2023, ICML]
- Towards Controlled Data Augmentations for Active Learning [2023, ICML]
- N-Penetrate: Active Learning of Neural Collision Handler for Complex 3D Mesh Deformations [2022, ICML]
- Active Multi-Task Representation Learning [2022, ICML]
- ActiveHedge: Hedge meets Active Learning [2022, ICML]
1 change: 1 addition & 0 deletions conference/NeuraIPS.md
@@ -1,5 +1,6 @@

- Batch Active Learning from the Perspective of Sparse Approximation [2022, NeurIPS]
- LOG: Active Model Adaptation for Label-Efficient OOD Generalization [2022, NeurIPS]
- Few-Shot Continual Active Learning by a Robot [2022, NeurIPS]
- [Corruption Robust Active Learning [2021, NeurIPS]](https://arxiv.org/pdf/2106.11220.pdf):
- [BatchBALD: Efficient and Diverse Batch Acquisition for Deep Bayesian Active Learning [2019, NeurIPS]](http://papers.nips.cc/paper/8925-batchbald-efficient-and-diverse-batch-acquisition-for-deep-bayesian-active-learning):
13 changes: 12 additions & 1 deletion contents/AL_combinations.md
@@ -172,6 +172,7 @@ Video Action Recognition:

3D Semantic Segmentation:
- LiDAL: Inter-frame Uncertainty Based Active Learning for 3D LiDAR Semantic Segmentation [2022, ECCV]
- Efficient 3D Scene Semantic Segmentation via Active Learning on Rendered 2D Images [2023, TIP]

Visual Tracking:
- Active Learning for Deep Visual Tracking [2021]
@@ -200,9 +201,11 @@ Micro-Expression Recognition:
- Tackling Micro-Expression Data Shortage via Dataset Alignment and Active Learning [2022, IEEE Trans Multimedia]

Collision Prediction & Handling:

- N-Penetrate: Active Learning of Neural Collision Handler for Complex 3D Mesh Deformations [2022, ICML]

Super-resolution:
- USIM-DAL: Uncertainty-aware Statistical Image Modeling-based Dense Active Learning for Super-resolution [2023, UAI]

## Natural Language Processing (NLP)

NLP is also quite a broad field.
@@ -299,6 +302,9 @@ Abstractive Text Summarization:
Natural Language Explanations:
- Beyond Labels: Empowering Human with Natural Language Explanations through a Novel Active-Learning Architecture [2023]

N-ary Relation Extraction:
- Active Learning for Cross-Sentence n-ary Relation Extraction [2023, Information Sciences]

## Domain adaptation/Transfer learning

Normally, when we use AL in domain adaptation, we can obtain a few true labels for the unlabeled instances in the source/target domain.
@@ -345,6 +351,8 @@ Transfer learning:
- Active learning with cross-class similarity transfer [2017, AAAI]
- Rapid Performance Gain through Active Model Reuse [IJCAI, 2019]

Domain Generalization:
- LOG: Active Model Adaptation for Label-Efficient OOD Generalization [2022, NeurIPS]

## One/Few/Zero-shot learning or Meta-Learning

@@ -371,6 +379,7 @@ For one/few-shot learning:
- MEAL: Stable and Active Learning for Few-Shot Prompting [2022]
- Few-shot initializing of Active Learner via Meta-Learning [2022, EMNLP]
- Active Learning for Efficient Few-Shot Classification [2023, ICASSP]
- Improved prototypical network for active few-shot learning [2023, PRL]

There are also works about zero-shot learning:
- Graph active learning for GCN-based zero-shot classification [2021, Neurocomputing]
@@ -533,6 +542,7 @@ For example, the ImageNet was crawled from image databases without considering s

- [Root Cause Analysis for Self-organizing Cellular Network: an Active Learning Approach](https://link.springer.com/article/10.1007/s11036-020-01589-1)
- [Active Invariant Causal Prediction: Experiment Selection through Stability](https://arxiv.org/pdf/2006.05690.pdf)
- ACE: Active Learning for Causal Inference with Expensive Experiments [2023]

## Choice Model

@@ -621,6 +631,7 @@ Normally there is only model selection without training.
- [Deep Multi-Fidelity Active Learning of High-Dimensional Outputs [2020]](https://arxiv.org/pdf/2012.00901.pdf)
- [Batch Multi-Fidelity Active Learning with Budget Constraints [2022, NeurIPS]](https://arxiv.org/pdf/2210.12704.pdf)
- Disentangled Multi-Fidelity Deep Bayesian Active Learning [2023]
- Multi-Fidelity Active Learning with GFlowNets [2023]

## Online Learning System

4 changes: 2 additions & 2 deletions contents/AL_technique_models.md
@@ -5,8 +5,8 @@ In this chapter, we care about how to apply AL on specific models.
# Models

## SVM/LR
These are the most common models, so we won't dwell on them here.
Most classic strategies are based on these models.

- Discover-Then-Rank Unlabeled Support Vectors in the Dual Space for Multi-Class Active Learning [2023, ICML]
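The classic strategies mentioned above are easy to sketch. Below is a minimal, hypothetical illustration of pool-based uncertainty sampling with a from-scratch binary logistic regression (the helper names and toy data are our own assumptions, not from any listed paper):

```python
import numpy as np

def train_logreg(X, y, lr=0.1, epochs=500):
    """Fit a binary logistic regression (weights + bias) by gradient descent."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted P(y=1 | x)
        w -= lr * (X.T @ (p - y)) / len(y)
        b -= lr * float(np.mean(p - y))
    return w, b

def least_confident_query(X_pool, w, b):
    """Pick the pool instance whose predicted probability is closest to 0.5,
    i.e. the one the current model is least confident about."""
    p = 1.0 / (1.0 + np.exp(-(X_pool @ w + b)))
    return int(np.argmin(np.abs(p - 0.5)))

# Toy 1-D data: two labeled points, three unlabeled candidates.
X_lab = np.array([[-2.0], [2.0]])
y_lab = np.array([0.0, 1.0])
w, b = train_logreg(X_lab, y_lab)
query_idx = least_confident_query(np.array([[-1.5], [0.1], [1.0]]), w, b)
# The candidate nearest the decision boundary (x = 0.1) is queried first.
```

In a full AL loop, the queried instance would be labeled by an oracle, moved into the labeled set, and the model retrained; margin- or entropy-based variants only change the scoring inside `least_confident_query`.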

## Bayesian/Probabilistic/Gaussian Process
- Employing EM and Pool-Based Active Learning for Text Classification [[1998. ICML]](http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.50.10&rep=rep1&type=pdf):
3 changes: 2 additions & 1 deletion contents/MVAL.md
@@ -6,4 +6,5 @@ AL selects instances by considering the relations among views.
Works:
- Active + Semi-Supervised Learning = Robust Multi-View Learning [2002, ICML]
- Active learning with multiple views [2006, JAIR]
- Multi-view active learning in the non-realizable case [2010, NeurIPS]
- Towards Balanced Active Learning for Multimodal Classification [2023]
4 changes: 3 additions & 1 deletion contents/deep_AL.md
@@ -214,6 +214,7 @@ Data Augmentation:
LADA takes the influence of the augmented data into account to construct acquisition function
- Collaborative Intelligence Orchestration: Inconsistency-Based Fusion of Semi-Supervised Learning and Active Learning [KDD, 2022]
- When Active Learning Meets Implicit Semantic Data Augmentation [2022, ECCV]
- Towards Controlled Data Augmentations for Active Learning [2023, ICML]

Labeled-unlabeled data indistinguishable:
- [Deep Active Learning: Unified and Principled Method for Query and Training [2020, ICAIS]](https://arxiv.org/abs/1911.09162): WAAL.
@@ -244,4 +245,5 @@ Pretrain:

Both self and semi-supervised imparted:
- Self-supervised Semi-supervised Learning for Data Labeling and Quality Evaluation [2021, Arxiv]:
Use random selection with BYOL and Label Propagation.
- Representation-Based Time Series Label Propagation for Active Learning [2023, CSCWD]
4 changes: 4 additions & 0 deletions on_build/AL_theory.md
@@ -88,6 +88,10 @@ The covariate X might be the result of a complex process that the learner can in

- Nonparametric Indirect Active Learning [2022, ICML workshop]

### Formalism for AL

- A Markovian Formalism for Active Querying [2023]

# To be classified

- Active Learning with Label Comparisons [2022]
