In Part 1, you will use PyTorch to train a model on a self-supervised task, fine-tune a subset of the model’s weights, and train a model in a fully supervised setting with different weight initializations. You will be using the CIFAR10 dataset, which is a dataset of small (32x32) images belonging to 10 different object classes. For self-supervised training, you will ignore the provided labels; however, you will use the class labels for fine-tuning and fully supervised training.
The model architecture you will use is ResNet18. We will use the PyTorch ResNet18 implementation, so you do not need to create it from scratch.
The self-supervised training task is image rotation prediction, as proposed by Gidaris et al. in 2018. For this task, all training images are randomly rotated by 0, 90, 180, or 270 degrees. The network is then trained to classify the rotation of each input image using cross-entropy loss by treating each of the 4 possible rotations as a class. This task can be treated as pre-training, and the pre-trained weights can then be fine-tuned on the supervised CIFAR10 classification task.
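The rotation pretext task can be implemented as a dataset wrapper that discards the class label and substitutes a random rotation index. A hedged sketch (`RotationDataset` is a hypothetical helper name, not part of the starter code):

```python
import torch
from torch.utils.data import Dataset

class RotationDataset(Dataset):
    """Wraps an image dataset so labels become the applied rotation.

    Class indices 0, 1, 2, 3 correspond to rotations of
    0, 90, 180, and 270 degrees, respectively.
    """
    def __init__(self, base_dataset):
        self.base = base_dataset

    def __len__(self):
        return len(self.base)

    def __getitem__(self, idx):
        img, _ = self.base[idx]                # discard the class label
        k = torch.randint(0, 4, (1,)).item()   # pick a random rotation
        # Rotate the CHW tensor by k * 90 degrees in the spatial plane
        img = torch.rot90(img, k, dims=(1, 2))
        return img, k
```

The wrapper expects the base dataset to yield (tensor, label) pairs, e.g. torchvision's CIFAR10 with a ToTensor transform, and works directly with a standard DataLoader.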
The top-level notebook (a3_part1_rotation.ipynb) will guide you through all the steps of training a ResNet for the rotation task and fine-tuning it on the classification task. You will implement the data loader and the training and fine-tuning steps in PyTorch based on the starter code. In detail, you will complete the following tasks:
Part 1 Extra Credit
In Part 2, you will implement a YOLO-like object detector on the PASCAL VOC 2007 dataset, producing results like those shown in the image above.
The top-level notebook (MP3_P2.ipynb) will guide you through all the steps. You will mainly focus on implementing the YOLO loss function in the yolo_loss.py file. You will be provided with a pre-trained network structure for the model. The network structure is inspired by DetNet, but you are not required to understand it. In principle, it could be replaced by a different architecture and trained from scratch, but to achieve good accuracy with minimal computational expense and tuning, you should stick to the provided one.
As you start this part, you will notice that training is more computationally intensive than what you are used to. To get an idea of whether your implementation works without waiting a long time for training to converge, here are some typical mAP values to expect:
| Epoch | mAP |
| --- | --- |
| 5 | 0.2013 |
| 10 | 0.2545 |
| 15 | 0.2749 |
| 20 | 0.2898 |
| 25 | 0.3069 |
| 30 | 0.3355 |
| 35 | 0.3402 |
| 40 | 0.3347 |
| 45 | 0.2588 |
| 50 | 0.3836 |
The instructions in the yolo_loss.py file should be sufficient to guide you through the assignment, but it will be very helpful to understand the big picture of how YOLO works and how its loss function is defined.
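A core building block of the YOLO loss is intersection-over-union (IoU) between predicted and ground-truth boxes. A hedged sketch, assuming corner-form (x1, y1, x2, y2) boxes (the starter code's yolo_loss.py may use a different box format or signature):

```python
import torch

def iou(boxes1, boxes2):
    """Pairwise IoU between two sets of boxes in (x1, y1, x2, y2) form.

    boxes1: (N, 4) tensor; boxes2: (M, 4) tensor; returns an (N, M) tensor.
    """
    # Corners of the intersection rectangle for every pair
    lt = torch.max(boxes1[:, None, :2], boxes2[None, :, :2])  # (N, M, 2)
    rb = torch.min(boxes1[:, None, 2:], boxes2[None, :, 2:])  # (N, M, 2)
    wh = (rb - lt).clamp(min=0)            # zero out non-overlapping pairs
    inter = wh[..., 0] * wh[..., 1]        # intersection areas, (N, M)

    area1 = (boxes1[:, 2] - boxes1[:, 0]) * (boxes1[:, 3] - boxes1[:, 1])
    area2 = (boxes2[:, 2] - boxes2[:, 0]) * (boxes2[:, 3] - boxes2[:, 1])
    union = area1[:, None] + area2[None, :] - inter
    return inter / union.clamp(min=1e-6)   # avoid division by zero
```

During loss computation, IoU is typically used both to select which predicted box is "responsible" for each object and as the confidence target.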
The following resources are useful for understanding YOLO in detail:
Part 2 Extra Credit
Download the starting code here.
To complete this assignment in a reasonable amount of time, you'll need to use a GPU. This can either be your personal GPU, Google Colab or Colab Pro with GPU enabled, or Google Cloud Platform (we will be distributing coupon codes).
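Before starting a long training run, it is worth confirming that PyTorch can actually see a GPU:

```python
import torch

# Select the GPU if one is visible to PyTorch, otherwise fall back to CPU
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(f"Training on: {device}")
```

In Colab, remember to enable the GPU runtime (Runtime > Change runtime type) before running this check.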
Environment Setup
If you will be working on the assignment on a local machine, you will need a Python environment set up with the appropriate packages. We suggest using Conda (https://www.anaconda.com/download) to manage Python package dependencies; this guide explains the basics: https://conda.io/docs/user-guide/getting-started.html. Unless you have a machine with a GPU, running this assignment locally will be very slow and is not recommended; please use Google Colab or Google Cloud Platform instead. Instructions on setting up VM instances can be found here. Running Part 1 in Google Colab is fine, but fully training a model for Part 2 can take up to 7-8 hours.

Ensure that IPython is installed (https://ipython.org/install.html). You may then navigate to the assignment directory in a terminal and start a local IPython server using the jupyter notebook command.
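For a local setup, the steps above might look like the following (the environment name and Python version are illustrative; pick the PyTorch build matching your CUDA version from pytorch.org):

```shell
# Create and activate a fresh environment (name is illustrative)
conda create -n assignment3 python=3.9 -y
conda activate assignment3

# Install PyTorch and torchvision (choose the build matching your GPU/CUDA)
conda install pytorch torchvision -c pytorch -y
pip install jupyter

# Launch the notebook server from the assignment directory
jupyter notebook
```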
Be careful when using Google Cloud Platform: do not use up all of your credits!
Data Setup
Once you have downloaded the zip file, go to the assignment3_part2 directory and execute the download_data script provided:
sh download_data.sh
The assignment deliverables are as follows. If you are working in a pair, only one designated student should make the submission.
Please refer to course policies on collaborations, late submission, etc.