Replaced links examples/contrib by examples
vfdev-5 authored Aug 19, 2023
1 parent df95465 commit e36c20b
Showing 2 changed files with 5 additions and 5 deletions.
6 changes: 3 additions & 3 deletions src/blog/2020-09-10-pytorch-ignite.md
@@ -990,7 +990,7 @@ with idist.Parallel(backend=backend, **dist_configs) as parallel:
Please note that these `auto_*` methods are optional; a user is free to use some of them and manually set up certain parts of the code if required. The advantage of this approach is that there is no inevitable under-the-hood patching and overriding of objects.
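A minimal sketch of how these pieces could fit together (the layer sizes, learning rate, backend name and process count below are illustrative placeholders, not values from the post):

```python
import torch.nn as nn
import torch.optim as optim
import ignite.distributed as idist


def training(local_rank, config):
    # Each auto_* helper adapts the wrapped object to the current
    # distributed configuration and is effectively a no-op on a single device.
    model = idist.auto_model(nn.Linear(10, 2))
    optimizer = idist.auto_optim(optim.SGD(model.parameters(), lr=config["lr"]))
    # ... build the trainer from model and optimizer and run it here ...


# Spawns the worker processes and initializes the chosen backend.
with idist.Parallel(backend="gloo", nproc_per_node=2) as parallel:
    parallel.run(training, config={"lr": 0.01})
```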

More details about distributed helpers provided by PyTorch-Ignite can be found in [the documentation](https://pytorch.org/ignite/distributed.html).
-A complete example of training on CIFAR10 can be found [here](https://github.com/pytorch/ignite/tree/master/examples/contrib/cifar10).
+A complete example of training on CIFAR10 can be found [here](https://github.com/pytorch/ignite/tree/master/examples/cifar10).

A detailed tutorial with distributed helpers is published [here](https://pytorch-ignite.ai/posts/distributed-made-easy-with-ignite/).

@@ -1012,11 +1012,11 @@ In addition, PyTorch-Ignite also provides several tutorials:
- [Basic example of LR finder on MNIST](https://github.com/pytorch/ignite/blob/master/examples/notebooks/FastaiLRFinder_MNIST.ipynb)
- [Benchmark mixed precision training on Cifar100: torch.cuda.amp vs nvidia/apex](https://github.com/pytorch/ignite/blob/master/examples/notebooks/Cifar100_bench_amp.ipynb)
- [MNIST training on a single TPU](https://github.com/pytorch/ignite/blob/master/examples/notebooks/MNIST_on_TPU.ipynb)
-- [CIFAR10 Training on multiple TPUs](https://github.com/pytorch/ignite/tree/master/examples/contrib/cifar10)
+- [CIFAR10 Training on multiple TPUs](https://github.com/pytorch/ignite/tree/master/examples/cifar10)

and examples:

-- [cifar10](https://github.com/pytorch/ignite/tree/master/examples/contrib/cifar10) (single/multi-GPU, DDP, AMP, TPUs)
+- [cifar10](https://github.com/pytorch/ignite/tree/master/examples/cifar10) (single/multi-GPU, DDP, AMP, TPUs)
- [basic RL](https://github.com/pytorch/ignite/tree/master/examples/reinforcement_learning)
- [reproducible baselines for vision tasks:](https://github.com/pytorch/ignite/tree/master/examples/references)
- classification on ImageNet (single/multi-GPU, DDP, AMP)
4 changes: 2 additions & 2 deletions src/blog/2021-06-28-pytorch-ignite-distributed.md
@@ -145,7 +145,7 @@ The code snippets below highlight the API's specificities of each of the distrib

PyTorch-Ignite's unified code snippet can be run with the standard PyTorch backends like `gloo` and `nccl`, and also with Horovod and XLA for TPU devices. Note that the code is less verbose; however, the user still retains full control of the training loop.
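As a rough illustration of that portability (a sketch only; the backend name and process count are arbitrary placeholders), switching backends amounts to changing the `idist.Parallel` arguments while the training function itself stays the same:

```python
import ignite.distributed as idist


def training(local_rank, config):
    # The same body runs unchanged whichever backend is selected;
    # idist exposes rank, world size and device in a backend-agnostic way.
    print(f"rank {idist.get_rank()}/{idist.get_world_size()} on {idist.device()}")
    # the usual training loop goes here


# One-line switch between "gloo", "nccl", "horovod" and "xla-tpu".
with idist.Parallel(backend="gloo", nproc_per_node=2) as parallel:
    parallel.run(training, config={})
```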

-The following examples are introductory. For a more robust, production-grade example that uses PyTorch-Ignite, refer [here](https://github.com/pytorch/ignite/tree/master/examples/contrib/cifar10).
+The following examples are introductory. For a more robust, production-grade example that uses PyTorch-Ignite, refer [here](https://github.com/pytorch/ignite/tree/master/examples/cifar10).

The complete source code of these experiments can be found [here](https://github.com/pytorch-ignite/idist-snippets).

@@ -285,7 +285,7 @@ while maintaining control and simplicity.
with distributed data parallel: native pytorch, pytorch-ignite,
slurm.

-- [CIFAR10 example](https://github.com/pytorch/ignite/tree/master/examples/contrib/cifar10)
+- [CIFAR10 example](https://github.com/pytorch/ignite/tree/master/examples/cifar10)
of distributed training on CIFAR10 with multiple configurations: 1 or
multiple GPUs, multiple nodes and GPUs, TPUs.
