
Commit

fix overview
edenlum authored Jan 1, 2024
1 parent b029d32 commit 08d46f1
Showing 1 changed file with 23 additions and 13 deletions.
36 changes: 23 additions & 13 deletions tutorials/notebooks/example_quick_start_torchvision.ipynb
@@ -10,19 +10,24 @@
"source": [
"# Quick Start\n",
"\n",
"[Run this tutorial in Google Colab](https://colab.research.google.com/github/sony/model_optimization/blob/quickstart-table/tutorials/notebooks/example_quick_start_torchvision.ipynb)\n",
"\n",
"Overview\n",
"[Run this tutorial in Google Colab](https://colab.research.google.com/github/sony/model_optimization/blob/quickstart-table/tutorials/notebooks/example_quick_start_torchvision.ipynb)"
]
},
{
"cell_type": "markdown",
"id": "9bc664bf",
"metadata": {},
"source": [
"## Overview\n",
"This tutorial shows how to use \"quick-start\" with a pre-trained model from the torchvision library.\n",
"\n",
"The following steps will be covered:\n",
"\n",
"Steps:\n",
"* **Setup the environment**: install MCT and add tutorials to PYTHONPATH\n",
"* **Download and organize the imagenet dataset**\n",
"* **Run quick_start on your model**\n",
"\n",
"**Note**: The following code should be run on a GPU."
"**Note**: The following code will run faster on a GPU."
]
},
{
@@ -46,7 +51,8 @@
"id": "eda6ab0d8f0b6b56"
},
"source": [
"In order to convert the PyTorch model, you'll need to use the conversion code in the [MCT tutorials folder](https://github.com/sony/model_optimization/tree/main/tutorials), so we'll clone the MCT repository to a local folder and only use that code. The installed MCT package will be used for quantization.\n",
"In order to use quick-start you'll need the [MCT tutorial folder](https://github.com/sony/model_optimization/tree/main/tutorials), so we'll clone the MCT repository to a local folder and use that code.\n",
"\n",
" **It's important to note that we use the most up-to-date MCT code available.**"
]
},
@@ -64,20 +70,20 @@
"outputs": [],
"source": [
"!git clone https://github.com/sony/model_optimization.git local_mct\n",
"!pip install -r /content/local_mct/requirements.txt"
"!pip install -r ./local_mct/requirements.txt"
]
},
{
"cell_type": "code",
"execution_count": 2,
"execution_count": null,
"id": "PfJ3_AyieBL0",
"metadata": {
"id": "PfJ3_AyieBL0"
},
"outputs": [],
"source": [
"import os\n",
"os.environ['PYTHONPATH'] = '/content/local_mct/'"
"os.environ['PYTHONPATH'] = './local_mct/'"
]
},
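Note that setting PYTHONPATH through os.environ does not change sys.path of the running kernel; it is inherited by child processes started from the notebook, such as the !python .../quick_start/main.py call further down, which is what makes the tutorials code importable there. A minimal sanity check (not part of the original notebook, and assuming the clone landed in ./local_mct) could be:

import os

# Hypothetical check: confirm the clone exists and that PYTHONPATH will be
# inherited by the quick_start subprocess launched later with `!python`.
assert os.path.isdir('./local_mct/tutorials'), "MCT clone not found - rerun the git clone cell"
print("PYTHONPATH =", os.environ.get('PYTHONPATH'))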
{
@@ -92,7 +98,7 @@
"\n",
"Use torchvision.datasets.ImageNet to create the dataset in the correct structure.\n",
"\n",
"**Note**: We use validation for time convinience since the training split is too big. To measure accurate validation results, the validation samples should only be used for testing."
"**Note**: We use validation for time convinience since the training split is too big. Typically, the validation samples should only be used for testing."
]
},
{
@@ -125,7 +131,7 @@
"outputs": [],
"source": [
"import torchvision\n",
"ds = torchvision.datasets.ImageNet(root='/content/imagenet', split='val')"
"ds = torchvision.datasets.ImageNet(root='./imagenet', split='val')"
]
},
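Before moving on, it can help to confirm that the dataset was created in the expected structure. The snippet below is a sketch (not part of the original notebook) that assumes the ImageNet validation archives were extracted under ./imagenet; the validation split should contain 50,000 images.

import torchvision

# Hypothetical sanity check: reload the validation split and inspect one
# sample to confirm the folder structure built above is readable.
ds = torchvision.datasets.ImageNet(root='./imagenet', split='val')
print("number of validation images:", len(ds))

image, label = ds[0]  # a PIL image and its integer class label
print("first sample size:", image.size, "label id:", label)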
{
@@ -135,7 +141,11 @@
"id": "gY2UsnzOfjtk"
},
"source": [
"## Run Quick Start script on our model of choice"
"## Run Quick Start script on our model of choice\n",
"\n",
"Here we set the model name, model library, validation dataset path, and representative datasets path.\n",
" \n",
"The remaining arguments are left with their default settings. Please verify that the dataset paths are configured correctly."
]
},
{
@@ -151,7 +161,7 @@
},
"outputs": [],
"source": [
"!python /content/local_mct/tutorials/quick_start/main.py --model_name mobilenet_v2 --model_library torchvision --validation_dataset_folder /content/imagenet/val --representative_dataset_folder /content/imagenet/val"
"!python ./local_mct/tutorials/quick_start/main.py --model_name mobilenet_v2 --model_library torchvision --validation_dataset_folder ./imagenet/val --representative_dataset_folder ./imagenet/val"
]
},
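The same script can be pointed at other torchvision classification models by changing the --model_name argument; everything else stays as above. For example, a hypothetical run with resnet18 (assuming quick_start resolves the name through torchvision's model zoo, as it does for mobilenet_v2) would be:

!python ./local_mct/tutorials/quick_start/main.py --model_name resnet18 --model_library torchvision --validation_dataset_folder ./imagenet/val --representative_dataset_folder ./imagenet/val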
{
