doc: update README.rst
xuyxu committed Jun 4, 2021
1 parent 4341216 commit 9574c0c
Showing 1 changed file with 14 additions and 28 deletions.
42 changes: 14 additions & 28 deletions README.rst
@@ -20,10 +20,10 @@
Ensemble PyTorch
================

A unified ensemble framework for pytorch_ to easily improve the performance and robustness of your deep learning model.
A unified ensemble framework for pytorch_ to easily improve the performance and robustness of your deep learning model. Ensemble-PyTorch is part of the `pytorch ecosystem <https://pytorch.org/ecosystem/>`__, which requires projects to be well maintained.

* `Document <https://ensemble-pytorch.readthedocs.io/>`__
* `Source Code <https://github.com/xuyxu/Ensemble-Pytorch>`__
* `Source Code <https://github.com/TorchEnsemble-Community/Ensemble-Pytorch>`__
* `Experiment <https://ensemble-pytorch.readthedocs.io/en/stable/experiment.html>`__

Installation
@@ -39,7 +39,7 @@ Latest version (under development):

.. code:: bash
pip install git+https://github.com/xuyxu/Ensemble-Pytorch
pip install git+https://github.com/TorchEnsemble-Community/Ensemble-Pytorch.git
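After either install, one quick way to confirm the package is visible to Python is to look it up with the standard library (this check is a hypothetical addition, not part of the README):

```python
import importlib.util

# find_spec looks the package up on sys.path without importing it,
# and returns None when torchensemble is not installed.
spec = importlib.util.find_spec("torchensemble")
print("found" if spec is not None else "missing")
```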
Example
-------
@@ -52,43 +52,33 @@ Example
train_loader = DataLoader(...)
test_loader = DataLoader(...)
'''
[Step-1] Define the ensemble
'''
model = VotingClassifier(
# Define the ensemble
ensemble = VotingClassifier(
estimator=base_estimator, # here is your deep learning model
n_estimators=10, # number of base estimators
)
'''
[Step-2] Set the parameter optimizer
'''
model.set_optimizer(
# Set the optimizer
ensemble.set_optimizer(
"Adam", # type of parameter optimizer
lr=learning_rate, # learning rate of parameter optimizer
weight_decay=weight_decay, # weight decay of parameter optimizer
)
'''
[Step-3] Set the learning rate scheduler
'''
model.set_scheduler(
# Set the learning rate scheduler
ensemble.set_scheduler(
"CosineAnnealingLR", # type of learning rate scheduler
T_max=epochs, # additional arguments on the scheduler
)
'''
[Step-4] Train the ensemble
'''
model.fit(
# Train the ensemble
ensemble.fit(
train_loader,
epochs=epochs, # number of training epochs
)
'''
[Step-5] Evaluate the ensemble
'''
acc = model.predict(test_loader) # testing accuracy
# Evaluate the ensemble
acc = ensemble.predict(test_loader) # testing accuracy
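A ``VotingClassifier`` combines its base estimators by averaging their predicted class probabilities and taking the argmax (soft voting). The dependency-free sketch below illustrates that combination rule; the ``soft_vote`` helper and the probability rows are made-up stand-ins, not outputs of the library:

```python
# Soft voting: average the per-class probabilities produced by the base
# estimators, then predict the class with the highest mean probability.
def soft_vote(estimator_probs):
    n_estimators = len(estimator_probs)
    n_classes = len(estimator_probs[0])
    mean_probs = [
        sum(probs[c] for probs in estimator_probs) / n_estimators
        for c in range(n_classes)
    ]
    return max(range(n_classes), key=lambda c: mean_probs[c])

# Illustrative outputs of three base estimators on one sample (3 classes).
probs = [
    [0.7, 0.2, 0.1],
    [0.2, 0.6, 0.2],
    [0.5, 0.3, 0.2],
]
print(soft_vote(probs))  # prints 0: class 0 has the highest mean probability
```

Because averaging smooths out the mistakes of individual estimators, the ensemble's prediction is typically more stable than any single base estimator's.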
Supported Ensemble
------------------
@@ -104,8 +94,6 @@ Supported Ensemble
+------------------------------+------------+---------------------------+
| Gradient Boosting [3]_ | Sequential | gradient_boosting.py |
+------------------------------+------------+---------------------------+
| Soft Gradient Boosting [7]_ | Parallel | soft_gradient_boosting.py |
+------------------------------+------------+---------------------------+
| Snapshot Ensemble [4]_ | Sequential | snapshot_ensemble.py |
+------------------------------+------------+---------------------------+
| Adversarial Training [5]_ | Parallel | adversarial_training.py |
@@ -135,8 +123,6 @@ Reference
.. [6] Garipov, Timur, et al. Loss Surfaces, Mode Connectivity, and Fast Ensembling of DNNs. NeurIPS, 2018.
.. [7] Feng, Ji, et al. Soft Gradient Boosting Machine. arXiv, 2020.
.. _pytorch: https://pytorch.org/

.. _pypi: https://pypi.org/project/torchensemble/
