Commit 9c8ac6a

Author: Frederik Mellbye
Commit message: Update docs to reflect distributed module

Reviewers: #tensorflow, #framework_ip_review_-_any_oss_or_third-party_code_use_has_been_approved, #documentation, jamiep, christiana
Reviewed By: #tensorflow, #framework_ip_review_-_any_oss_or_third-party_code_use_has_been_approved, #documentation, jamiep, christiana
Subscribers: christiana
Maniphest Tasks: T68097
Differential Revision: https://phabricator.sourcevertex.net/D73894

Parent: e4e2d32

File tree

3 files changed: 11 additions, 6 deletions


tensorflow/compiler/plugin/poplar/docs/api.rst

Lines changed: 8 additions & 3 deletions
@@ -92,17 +92,22 @@ For example, this will not work:
   :members: IPUMultiWorkerStrategy
   :special-members: __init__
 
-.. automodule:: tensorflow.python.ipu.horovod
+.. automodule:: tensorflow.python.ipu.distributed
   :members:
 
-.. automodule:: tensorflow.python.ipu.horovod.ipu_horovod_strategy
+.. automodule:: tensorflow.python.ipu.distributed.ipu_horovod_strategy
   :members: IPUHorovodStrategy
   :special-members: __init__
 
-.. automodule:: tensorflow.python.ipu.horovod.popdist_strategy
+.. automodule:: tensorflow.python.ipu.distributed.popdist_strategy
   :members: PopDistStrategy
   :special-members: __init__
 
+.. note::
+  Both :py:class:`tensorflow.python.ipu.distributed.popdist_strategy.PopDistStrategy`
+  and :py:class:`tensorflow.python.ipu.distributed.ipu_horovod_strategy.IPUHorovodStrategy`
+  are still available through the deprecated module `tensorflow.python.ipu.horovod`.
+
 .. Serving utilities
 
 .. automodule:: tensorflow.python.ipu.serving
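The note added in this hunk says the old ``tensorflow.python.ipu.horovod`` module keeps working as a deprecated alias of ``tensorflow.python.ipu.distributed``. A minimal sketch of that back-compatibility pattern in plain Python; ``allreduce``, ``_deprecated``, and ``legacy_allreduce`` below are illustrative stand-ins, not Graphcore code:

```python
import warnings

# Stand-in for a name that lives in the new module
# (the real commit moves it from ...ipu.horovod to ...ipu.distributed).
def allreduce(values):
    # Toy collective: reduce a list of per-worker values to their sum.
    return sum(values)

def _deprecated(name, obj, old="horovod", new="distributed"):
    # Warn, but hand back the very same object, so old import
    # sites keep working against the new implementation.
    warnings.warn(
        f"'{old}.{name}' is deprecated; import '{name}' from '{new}' instead.",
        DeprecationWarning,
        stacklevel=2,
    )
    return obj

# Simulates `from ...horovod import allreduce` resolving to the new module:
legacy_allreduce = _deprecated("allreduce", allreduce)

print(legacy_allreduce([1, 2, 3]))    # -> 6
print(legacy_allreduce is allreduce)  # -> True: one implementation, two names
```

The same effect at module granularity is usually achieved with a PEP 562 module-level ``__getattr__`` in the deprecated package, which is one common way such shims are wired up.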

tensorflow/compiler/plugin/poplar/docs/distributed_training.rst

Lines changed: 2 additions & 2 deletions
@@ -10,7 +10,7 @@ communication over the host network, for example for broadcasting the
 initial values of variables from the first instance to the others.
 
 To perform distributed training on Pod systems, use
-:class:`~tensorflow.python.ipu.horovod.popdist_strategy.PopDistStrategy`,
+:class:`~tensorflow.python.ipu.distributed.popdist_strategy.PopDistStrategy`,
 which performs data-parallel synchronous training using multiple host processes.
 In this sense it is similar to
 `MultiWorkerMirroredStrategy <https://www.tensorflow.org/api_docs/python/tf/distribute/MultiWorkerMirroredStrategy>`_

@@ -20,7 +20,7 @@ network using Horovod.
 Collective operations (explicitly through a member function like ``reduce()`` or
 implicitly by using an optimizer under the strategy scope) will be performed
 directly on the IPU by using compiled communications with the GCL library
-over the IPU links and GW-Links. The
+over the IPU-Links and GW-Links. The
 ``PopDistStrategy`` is designed for use with PopDist and PopRun.
 Refer to the `PopDist and PopRun User Guide
 <https://docs.graphcore.ai/projects/poprun-user-guide/>`_ for more details.

tensorflow/python/ipu/distributed/ipu_horovod_strategy.py

Lines changed: 1 addition & 1 deletion
@@ -16,7 +16,7 @@
 from tensorflow.python.distribute import reduce_util
 from tensorflow.python.distribute.cluster_resolver import cluster_resolver as cluster_resolver_lib
 from tensorflow.python.ipu import ipu_multi_worker_strategy
-from tensorflow.python.ipu.horovod import Sum, Average, size, allreduce, broadcast
+from tensorflow.python.ipu.distributed import Sum, Average, size, allreduce, broadcast
 from tensorflow.python.training import server_lib
 from tensorflow.python.util import deprecation
