# -*- coding: utf-8 -*-
"""
`Introduction to ONNX <intro_onnx.html>`_ ||
**Export a PyTorch model to ONNX** ||
`Extending the ONNX exporter operator support <onnx_registry_tutorial.html>`_ ||
`Export a model with control flow to ONNX <export_control_flow_model_to_onnx_tutorial.html>`_

Export a PyTorch model to ONNX
==============================

**Author**: `Ti-Tai Wang <https://github.com/titaiwangms>`_, `Justin Chu <justinchu@microsoft.com>`_, `Thiago Crepaldi <https://github.com/thiagocrepaldi>`_.

.. note::
    Starting with PyTorch 2.5, there are two ONNX exporter options available.

    * ``torch.onnx.export(..., dynamo=True)`` is the recommended exporter that leverages ``torch.export`` and Torch FX for graph capture.
    * ``torch.onnx.export`` is the legacy approach that relies on the deprecated TorchScript and is no longer recommended for use.

"""

###############################################################################
# In the `60 Minute Blitz <https://tutorials.pytorch.kr/beginner/deep_learning_60min_blitz.html>`_,
# we had the opportunity to learn about PyTorch at a high level and train a small neural network to classify images.
# In this tutorial, we are going to expand this to describe how to convert a model defined in PyTorch into the
# ONNX format using the ``torch.onnx.export(..., dynamo=True)`` ONNX exporter.
#
# While PyTorch is great for iterating on the development of models, a model can be deployed to production
# in different formats, including `ONNX <https://onnx.ai/>`_ (Open Neural Network Exchange)!
#
# ONNX is a flexible open standard format for representing machine learning models. Its standardized
# representations allow models to be executed across a gamut of hardware platforms and runtime environments,
# from large-scale cloud-based supercomputers to resource-constrained edge devices such as your web browser and phone.
#
# In this tutorial, we'll learn how to:
#
# 1. Install the required dependencies.
# 2. Author a simple image classifier model.
# 3. Export the model to ONNX format.
# 4. Save the ONNX model in a file.
# 5. Visualize the ONNX model graph using `Netron <https://github.com/lutzroeder/netron>`_.
# 6. Execute the ONNX model with `ONNX Runtime`.
# 7. Compare the PyTorch results with the ones from ONNX Runtime.
#
# 1. Install the required dependencies
# ------------------------------------
# Because the ONNX exporter uses ``onnx`` and ``onnxscript`` to translate PyTorch operators into ONNX operators,
# we will need to install them.
#
# .. code-block:: bash
#
#    pip install --upgrade onnx onnxscript
#
# 2. Author a simple image classifier model
# -----------------------------------------
#
# Once your environment is set up, let's start modeling our image classifier with PyTorch,
# exactly like we did in the `60 Minute Blitz <https://tutorials.pytorch.kr/beginner/deep_learning_60min_blitz.html>`_.
#

import torch
# ... (the ImageClassifierModel class definition is elided in this excerpt) ...
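The ``ImageClassifierModel`` class used below is elided from this excerpt; only the ``forward(self, x: torch.Tensor)`` signature survives in the surrounding diff context. As a reference point, here is a hypothetical reconstruction, assuming the LeNet-style convolutional classifier from the 60 Minute Blitz (the layer sizes are our assumption, not taken from this excerpt):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ImageClassifierModel(nn.Module):
    """Hypothetical LeNet-style classifier for 1x32x32 inputs (assumed layout)."""

    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 6, 5)   # 1x32x32 -> 6x28x28
        self.conv2 = nn.Conv2d(6, 16, 5)  # 6x14x14 -> 16x10x10
        self.fc1 = nn.Linear(16 * 5 * 5, 120)
        self.fc2 = nn.Linear(120, 84)
        self.fc3 = nn.Linear(84, 10)

    def forward(self, x: torch.Tensor):
        x = F.max_pool2d(F.relu(self.conv1(x)), 2)  # -> 6x14x14
        x = F.max_pool2d(F.relu(self.conv2(x)), 2)  # -> 16x5x5
        x = torch.flatten(x, 1)                     # -> 400 features
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        return self.fc3(x)
```

With a definition along these lines, the export call below captures the graph without any change to the model code.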


######################################################################
# 3. Export the model to ONNX format
# ----------------------------------
#
# Now that we have our model defined, we need to instantiate it and create a random 32x32 input.
# Next, we can export the model to ONNX format.

torch_model = ImageClassifierModel()
# Create example inputs for exporting the model. The inputs should be a tuple of tensors.
example_inputs = (torch.randn(1, 1, 32, 32),)
onnx_program = torch.onnx.export(torch_model, example_inputs, dynamo=True)

######################################################################
# As we can see, we didn't need any code change to the model.
# The resulting ONNX model is stored within ``torch.onnx.ONNXProgram`` as a binary protobuf file.
#
# 4. Save the ONNX model in a file
# --------------------------------
#
# Although having the exported model loaded in memory is useful in many applications,
# we can save it to disk with the following code:

onnx_program.save("image_classifier_model.onnx")

######################################################################
# You can load the ONNX file back into memory and check if it is well formed with the following code:

import onnx

onnx_model = onnx.load("image_classifier_model.onnx")
onnx.checker.check_model(onnx_model)

######################################################################
# 5. Visualize the ONNX model graph using Netron
# ----------------------------------------------
#
# Now that we have our model saved in a file, we can visualize it with `Netron <https://github.com/lutzroeder/netron>`_.
# Netron can either be installed on macOS, Linux or Windows computers, or run directly from the browser.
# Let's try the web version by opening the following link: https://netron.app/.
#
# .. image:: ../../_static/img/onnx/netron_web_ui.png
#    :width: 70%
#    :align: center
#
#
# Once Netron is open, we can drag and drop our ``image_classifier_model.onnx`` file into the browser or select it after
# clicking the **Open model** button.
#
# .. image:: ../../_static/img/onnx/image_classifier_onnx_model_on_netron_web_ui.png
#    :width: 50%
#
#
# And that is it! We have successfully exported our PyTorch model to ONNX format and visualized it with Netron.
#
# 6. Execute the ONNX model with ONNX Runtime
# -------------------------------------------
#
# The last step is executing the ONNX model with `ONNX Runtime`, but before we do that, let's install ONNX Runtime.
#
# .. code-block:: bash
#
#    pip install onnxruntime
#
# The ONNX standard does not support all the data structures and types that PyTorch does,
# so we need to adapt the PyTorch inputs to ONNX format before feeding them to ONNX Runtime.
# In our example, the inputs happen to be the same, but more complex models may have more inputs
# than the original PyTorch model.
#
# ONNX Runtime requires an additional step: converting all PyTorch tensors to Numpy (on CPU)
# and wrapping them in a dictionary whose keys are the input names (as strings) and whose values are the Numpy tensors.
#
# Now we can create an *ONNX Runtime Inference Session*, execute the ONNX model with the processed input
# and get the output. In this tutorial, ONNX Runtime is executed on CPU, but it could be executed on GPU as well.

import onnxruntime

# ... (the code that builds ``onnx_inputs`` and opens ``ort_session`` is elided in this excerpt) ...
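The lines that build ``onnx_inputs`` and ``ort_session`` are elided from this excerpt. A minimal sketch of what that setup plausibly looks like, assuming the inputs are converted to CPU NumPy arrays and the session is opened over the file saved earlier (the ``providers`` argument is our assumption):

```python
import torch

# Recreate the example inputs and convert each tensor to a CPU NumPy array,
# since ONNX Runtime consumes NumPy arrays rather than torch.Tensor.
example_inputs = (torch.randn(1, 1, 32, 32),)
onnx_inputs = [tensor.numpy(force=True) for tensor in example_inputs]

# Opening the inference session would plausibly look like this; it is left
# commented out so the sketch runs without the saved model file present:
# import onnxruntime
# ort_session = onnxruntime.InferenceSession(
#     "image_classifier_model.onnx", providers=["CPUExecutionProvider"]
# )
```

The name-to-array dictionary built in the next line is exactly what ``ort_session.run`` expects as its input feed.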

onnxruntime_input = {input_arg.name: input_value for input_arg, input_value in zip(ort_session.get_inputs(), onnx_inputs)}

# ONNX Runtime returns a list of outputs
onnxruntime_outputs = ort_session.run(None, onnxruntime_input)[0]

####################################################################
# 7. Compare the PyTorch results with the ones from the ONNX Runtime
# ------------------------------------------------------------------
#
# The best way to determine whether the exported model is looking good is through numerical evaluation
# against PyTorch, which is our source of truth.
#
# For that, we need to execute the PyTorch model with the same input and compare the results with ONNX Runtime's.
# Before comparing the results, we need to convert PyTorch's output to match ONNX's format.

torch_outputs = torch_model(*example_inputs)

# ... (the numerical comparison between ``torch_outputs`` and ``onnxruntime_outputs`` is elided in this excerpt) ...
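The comparison code itself is elided from this excerpt. One plausible sketch of the numerical check, using ``torch.testing.assert_close`` (the helper name ``compare_outputs`` and the toy values are ours, not from the tutorial):

```python
import torch


def compare_outputs(torch_output: torch.Tensor, onnxruntime_output) -> None:
    # Convert the ONNX Runtime NumPy result back into a tensor and check that
    # both outputs agree within floating-point tolerance; assert_close raises
    # an AssertionError if they do not.
    torch.testing.assert_close(torch_output, torch.tensor(onnxruntime_output))


# Toy demonstration with values that match exactly:
reference = torch.randn(1, 10)
compare_outputs(reference, reference.numpy().copy())
```

Using a tolerance-based check rather than exact equality matters here, since the two runtimes may apply floating-point operations in slightly different orders.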
print(f"Sample output: {onnxruntime_outputs}")

######################################################################
# Conclusion
# ----------
#
# That is about it! We have successfully exported our PyTorch model to ONNX format,
# saved the model to disk, viewed it using Netron, executed it with ONNX Runtime
# and finally compared its numerical results with PyTorch's.
#
# Further reading
# ---------------
#
# The list below refers to tutorials that range from basic examples to advanced scenarios,
# not necessarily in the order they are listed.
# Feel free to jump directly to specific topics of your interest or
# sit tight and have fun going through all of them to learn all there is about the ONNX exporter.
#
# .. include:: /beginner_source/onnx/onnx_toc.txt
#