Repository for the Advanced Topics in Neural Networks laboratory, "Alexandru Ioan Cuza" University, Faculty of Computer Science, Master's degree.
Google Colab: PyTorch, Pandas, NumPy, TensorBoard, and Matplotlib are already available. wandb can be installed with `pip install wandb`.
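To confirm a Colab runtime has everything, a minimal import check is enough (a sketch; the versions printed depend on the runtime image):

```python
# Sanity-check the libraries preinstalled on a Colab runtime.
import torch, pandas, numpy, tensorboard, matplotlib

for module in (torch, pandas, numpy, tensorboard, matplotlib):
    print(module.__name__, module.__version__)
```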
Local installation:
- Create a Python environment (using conda or venv). We recommend installing conda from Miniforge.
# Create the environment
conda create -n 313 -c conda-forge python=3.13
# Activate the environment
conda activate 313
# Run this to use conda-forge as your highest priority channel (not needed if you installed conda from Miniforge)
conda config --add channels conda-forge
- Install PyTorch 2.6.0+ from pytorch.org using pip (a verification snippet follows this list).
- Example CPU:
pip install torch torchvision torchaudio
- Example CUDA:
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu126
- Install TensorBoard and W&B.
conda install -c conda-forge tensorboard wandb
- Install Matplotlib.
conda install -c conda-forge matplotlib
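After the steps above, the following check confirms the install, including whether the CUDA build is active (a minimal sketch; exact versions will differ):

```python
# Verify the local install: PyTorch build, CUDA availability,
# and that the logging libraries import cleanly.
import torch
import torchvision
import tensorboard
import wandb
import matplotlib

print("PyTorch:", torch.__version__)                 # expect 2.6.0 or newer
print("torchvision:", torchvision.__version__)
print("CUDA available:", torch.cuda.is_available())  # True only on a CUDA build
print("TensorBoard:", tensorboard.__version__)
print("wandb:", wandb.__version__)
print("Matplotlib:", matplotlib.__version__)
```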
- Linear algebra:
- Essence of linear algebra (linear transformations; matrix multiplication)
- Essence of calculus (derivatives; chain rule)
- Backpropagation:
- Neural Networks (chapter 1 - chapter 4) (animated introduction to neural networks and backpropagation)
- Convolutions:
- But what is a convolution? (convolution example; convolutions in image processing; convolutions and polynomial multiplication; FFT; a short NumPy sketch follows this list)
- Transformers:
- Neural Networks (chapter 5 - chapter 7) (GPT; visual explanation of attention; LLMs)
- Also see Resources.md.
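The convolution-polynomial link from the resource above fits in a few lines of NumPy: multiplying two polynomials convolves their coefficient vectors, and the FFT computes the same product faster (a small illustrative sketch, not part of the course material):

```python
import numpy as np

# Coefficients are listed lowest degree first:
# (1 + 2x) * (3 + 4x) = 3 + 10x + 8x^2
a = np.array([1, 2])
b = np.array([3, 4])
print(np.convolve(a, b))  # -> [ 3 10  8]

# The FFT gives the same linear convolution in O(n log n):
n = len(a) + len(b) - 1
via_fft = np.fft.irfft(np.fft.rfft(a, n) * np.fft.rfft(b, n), n)
print(np.round(via_fft).astype(int))  # -> [ 3 10  8]
```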
- Lab01: Tensor Operations (Homework 1: Multi Layer Perceptron + Backpropagation)
- Lab02: Convolutions, DataLoaders, Datasets, Data Augmentation techniques (Homework 2: Kaggle competition on CIFAR-100 with VGG-16)
- Lab03: ResNets (Homework 3: Implement a complete training pipeline with PyTorch)
- Lab04: Training pipeline implementation
- Lab05: R-CNN, Fast R-CNN, Faster R-CNN, YOLO, U-Net, 3D U-Net, Ensemble methods, Model Soup
- Lab07: Self-Supervised Learning, Autoencoders, VAE, GAN, Diffusion
- Lab09: Sequence-to-sequence models, RNN, LSTM, Attention Is All You Need
- Lab10: Multi-Headed Attention, Transformers, BERT, GPT, ViT
- Lab11: Generalization, Batch Sizes, SAM, Benchmarks