distributed.rst (17 additions, 0 deletions)

@@ -16,6 +16,7 @@ PyTorch with each method having their advantages in certain use cases:
 * `Tensor Parallel (TP) <#learn-tp>`__
 * `Device Mesh <#device-mesh>`__
 * `Remote Procedure Call (RPC) distributed training <#learn-rpc>`__
+* `Monarch Framework <#learn-monarch>`__
 * `Custom Extensions <#custom-extensions>`__
 
 Read more about these options in `Distributed Overview <https://docs.pytorch.org/tutorials/beginner/dist_overview.html?utm_source=distr_landing>`__.
@@ -159,6 +160,22 @@ Learn RPC
       +++
       :octicon:`code;1em` Code
 
+.. _learn-monarch:
+
+Learn Monarch
+-------------
+
+.. grid:: 3
+
+   .. grid-item-card:: :octicon:`file-code;1em`
+      Interactive Distributed Applications with Monarch
+      :link: https://docs.pytorch.org/tutorials/intermediate/monarch_distributed_tutorial.html
+      :link-type: url
+
+      Learn how to use Monarch's actor framework with TorchTitan to simplify large-scale distributed training across SLURM clusters.
+      +++
+      :octicon:`code;1em` Code
+
 .. _custom-extensions:
 
 Custom Extensions

index.rst (9 additions, 0 deletions)

@@ -7,6 +7,7 @@ Welcome to PyTorch Tutorials
 * `Supporting Custom C++ Classes in torch.compile/torch.export <https://docs.pytorch.org/tutorials/advanced/custom_class_pt2.html>`__
 * `Accelerating torch.save and torch.load with GPUDirect Storage <https://docs.pytorch.org/tutorials/unstable/gpu_direct_storage.html>`__
 * `Getting Started with Fully Sharded Data Parallel (FSDP2) <https://docs.pytorch.org/tutorials/intermediate/FSDP_tutorial.html>`__
+* `Interactive Distributed Applications with Monarch <https://docs.pytorch.org/tutorials/intermediate/monarch_distributed_tutorial.html>`__
 
 .. raw:: html
 
@@ -688,6 +689,14 @@ Welcome to PyTorch Tutorials
    :link: intermediate/monarch_distributed_tutorial.html
    :tags: Parallel-and-Distributed-Training
 
+
+.. customcarditem::
+   :header: Interactive Distributed Applications with Monarch
+   :card_description: Learn how to use Monarch's actor framework with TorchTitan to simplify large-scale distributed training across SLURM clusters.
+   :image: _static/img/thumbnails/cropped/generic-pytorch-logo.png
+   :link: intermediate/monarch_distributed_tutorial.html
+   :tags: Parallel-and-Distributed-Training
+
 .. Edge
 
 .. customcarditem::
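
For context on what the linked tutorial covers, below is a minimal sketch of Monarch's actor model. It assumes the `monarch.actor` API (`Actor`, `endpoint`, `this_host`) shown in the project's getting-started examples; exact names and signatures may differ across releases and from what the tutorial itself uses.

# A minimal sketch of Monarch's actor model. Assumes the monarch.actor
# API (Actor, endpoint, this_host) from the project's getting-started
# examples; names and signatures may differ across releases.
from monarch.actor import Actor, endpoint, this_host


class Counter(Actor):
    """A tiny actor holding per-process state."""

    def __init__(self, initial: int) -> None:
        self.value = initial

    @endpoint
    def increment(self) -> None:
        # Runs inside whichever process hosts this actor instance.
        self.value += 1

    @endpoint
    def read(self) -> int:
        return self.value


if __name__ == "__main__":
    # Spawn a mesh of local processes, then one Counter actor per process.
    procs = this_host().spawn_procs(per_host={"procs": 4})
    counters = procs.spawn("counters", Counter, 0)

    # Broadcast increment to every actor in the mesh, then gather results
    # by blocking on the returned future.
    counters.increment.call().get()
    print(counters.read.call().get())  # one value per process

The tutorial pairs this actor model with TorchTitan so that a training job spread across SLURM-allocated hosts can be driven interactively from a single script, rather than through detached batch jobs.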