Add initial version of 08-openvino #38
base: main
Conversation
bebaf86 to fa4cfd3
fa4cfd3 to 32fe71b
\begin{frame}{What is OpenVINO?}
\begin{columns}[T,totalwidth=\textwidth]
\begin{column}{0.7\textwidth}
OpenVINO (Open Visual Inference and Neural Network Optimization)
This is a bit inaccurate; please also cover the other supported architectures.
\begin{itemize}
\item \textbf{Purpose:} Optimize and deploy AI inference across Intel CPUs, GPUs, NPUs, and other accelerators
\item \textbf{Core components:} Model Optimizer, Runtime (Inference Engine), Post-Training Optimization Tool, Benchmark tools, Notebooks
\item \textbf{Model formats (Frontends):} IR (\texttt{.xml/.bin}), ONNX (\texttt{.onnx}), TensorFlow (SavedModel/MetaGraph/frozen \texttt{.pb/.pbtxt}), TensorFlow Lite (\texttt{.tflite}), PaddlePaddle (\texttt{.pdmodel}), PyTorch (TorchScript/FX)
Please also add the file extension for PyTorch.
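To make the PyTorch frontend concrete: a minimal sketch, assuming OpenVINO 2023.1+ and torchvision are available. The model choice and file names are placeholders, not part of the slides; note that the PyTorch path takes an in-memory module rather than a file extension.

```python
import torch
import torchvision
import openvino as ov

# Hypothetical example model; any eval-mode torch.nn.Module works the same way.
model = torchvision.models.resnet18(weights=None).eval()

# The PyTorch frontend consumes the module plus an example input (no file on disk needed).
ov_model = ov.convert_model(model, example_input=torch.randn(1, 3, 224, 224))

# Serialize to the IR frontend's .xml/.bin pair.
ov.save_model(ov_model, "resnet18.xml")
```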
\begin{frame}{OpenVINO at a Glance}
\begin{itemize}
\item \textbf{Purpose:} Optimize and deploy AI inference across Intel CPUs, GPUs, NPUs, and other accelerators
I would add ARM and RISC-V as additional architectures, and every item should reflect those architectures as well.
\begin{frame}{OpenVINO at a Glance}
\begin{itemize}
\item \textbf{Purpose:} Optimize and deploy AI inference across Intel CPUs, GPUs, NPUs, and other accelerators
\item \textbf{Core components:} Model Optimizer, Runtime (Inference Engine), Post-Training Optimization Tool, Benchmark tools, Notebooks
Model Optimizer is legacy; it has been replaced by OVC (OpenVINO Model Converter, https://docs.openvino.ai/2024/notebooks/convert-to-openvino-with-output.html).
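A sketch of the replacement workflow the comment points to (the model path is a placeholder): conversion now goes through `ov.convert_model` / the `ovc` CLI instead of the legacy `mo` tool.

```python
import openvino as ov

# New-style conversion API (rough CLI equivalent: `ovc model.onnx`);
# the legacy Model Optimizer (`mo`) is deprecated in recent releases.
ov_model = ov.convert_model("model.onnx")        # placeholder ONNX file
ov.save_model(ov_model, "model.xml",             # writes model.xml + model.bin
              compress_to_fp16=True)             # FP16 weight compression
```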
\item \textbf{Purpose:} Optimize and deploy AI inference across Intel CPUs, GPUs, NPUs, and other accelerators
\item \textbf{Core components:} Model Optimizer, Runtime (Inference Engine), Post-Training Optimization Tool, Benchmark tools, Notebooks
\item \textbf{Model formats (Frontends):} IR (\texttt{.xml/.bin}), ONNX (\texttt{.onnx}), TensorFlow (SavedModel/MetaGraph/frozen \texttt{.pb/.pbtxt}), TensorFlow Lite (\texttt{.tflite}), PaddlePaddle (\texttt{.pdmodel}), PyTorch (TorchScript/FX)
\item \textbf{Targets:} CPU, iGPU, dGPU (e.g., Intel Arc), NPU, and more via plugins
iGPU and dGPU are covered by a single unified GPU plugin; you could describe either plugins or devices (ARM CPU, Intel CPU, RISC-V CPU).
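A small sketch of how an application selects devices through the plugin layer, assuming an IR file named `model.xml` exists (placeholder); the device names listed depend entirely on what the host exposes.

```python
import openvino as ov

core = ov.Core()
print(core.available_devices)            # e.g. ['CPU', 'GPU', 'NPU'] on an Intel machine

model = core.read_model("model.xml")     # placeholder IR file

# One "GPU" plugin covers both integrated and discrete GPUs; individual cards
# show up as GPU.0, GPU.1, ... The same compile call works for any plugin.
on_cpu = core.compile_model(model, "CPU")
on_auto = core.compile_model(model, "AUTO")   # let OpenVINO pick a device
```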
\begin{frame}{Device Plugins Architecture}
\centering
\ovbox{gray!15}{\textbf{Application} (C++/Python)}\\[0.6em]
I don't see the frontends in this diagram.
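For context on where the frontends sit in that diagram, a minimal end-to-end sketch (file name and input shape are placeholders): a frontend parses the model file inside `read_model`, the Core dispatches to a device plugin at compile time, and the compiled model runs the inference request.

```python
import numpy as np
import openvino as ov

core = ov.Core()

# Frontend layer: read_model auto-detects the format (IR, ONNX, TFLite, ...).
model = core.read_model("model.onnx")          # placeholder path

# Plugin layer: compile for a specific device behind the plugin interface.
compiled = core.compile_model(model, "CPU")

# Runtime: run inference; the input shape here is a placeholder.
dummy = np.random.rand(1, 3, 224, 224).astype(np.float32)
result = compiled(dummy)                       # dict-like mapping of output tensors
print(list(result.values())[0].shape)
```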
I would like to see:
No description provided.