---

project: ML4EP - TMVA SOFIE
title: Enhancing Keras Parser and JAX/FLAX Integration
author: Prasanna Kasar
photo: blog_authors/PrasannaKasar.jpeg
date: 07.09.2025
year: 2025
layout: blog_post
logo: "TMVA - SOFIE"
intro: |
  Developed a parser within SOFIE to parse Machine Learning models trained with Keras. Rewrote the existing C++ parser in Python, added support for parsing missing layers, such as Pooling and LayerNormalization, and wrote unit tests for the parser.

---
# Final Evaluation Report for GSoC 2025

<img width="1434" height="413" alt="image" src="https://gist.github.com/user-attachments/assets/6b8528de-aeb7-465b-9720-0b8d9d94d9a4" />

## Details

| | |
| --- | --- |
| Name | [Prasanna Kasar](https://github.com/prasannakasar) |
| Organisation | [CERN HSF (ROOT Project)](https://github.com/root-project/root) |
| Mentors | [Sanjiban Sengupta](https://github.com/sanjibansg), [Dr. Lorenzo Moneta](https://github.com/lmoneta) |
| Project | [TMVA SOFIE - Enhancing Keras Parser and JAX/FLAX Integration](https://summerofcode.withgoogle.com/programs/2025/projects/uAjGYhgX) |

## Project Description

The SOFIE (System for Optimized Fast Inference Code Emit) project is an initiative within the TMVA (Toolkit for Multivariate Data Analysis) framework in ROOT, which aims to enhance the efficiency and speed of inference for machine learning models. SOFIE converts ML models from different frameworks and formats, such as ONNX, PyTorch, and TensorFlow, into an Intermediate Representation (IR). From this IR, SOFIE generates optimized C++ functions for fast and effective neural-network inference and emits them as C++ header files, which can be used in a plug-and-play style for inference.
## SOFIE's workflow

To reduce the overhead of using multiple frameworks for inference, SOFIE generates unified inference code for models trained with different frameworks.

<img width="512" height="235" alt="image" src="https://gist.github.com/user-attachments/assets/bf1f9c4d-28e4-46c0-bdff-ab5653960512" />

SOFIE has two main components: a parser and an inference code generator.

<img width="867" height="274" alt="image" src="https://gist.github.com/user-attachments/assets/096f2af8-72d4-4551-8fd5-4208f3ed1894" />

SOFIE currently supports parsing mechanisms for ML models built with frameworks and formats such as ONNX, PyTorch, and TensorFlow.

## About SOFIE's Keras Parser

SOFIE's existing Keras parser is written in C++ and is quite old; moreover, although the entry point is C++, the actual parsing logic is implemented in Python. It also lacks support for parsing layers such as Pooling and LayerNormalization.
## Project Objectives

- Rewrite the Keras model parser in Python, replacing the earlier C++ logic, to improve modular design and flexibility and to simplify future extensions
- Extend the parser to support Pooling and LayerNormalization layers
- Enable support for Keras 3 while preserving support for Keras 2.x models, ensuring full backward compatibility
- Add support for models built with both Keras APIs, i.e., Functional as well as Sequential
- Design comprehensive unit tests for the parser to guarantee robustness and correctness

## Work Accomplished

Since SOFIE's operators are written entirely in C++, we had to leverage ROOT's `Pythonization` functionality, which allows us to bind SOFIE's C++ objects to a Pythonic interface. The overall structure of the parser is very similar to the previous one. The sequence of operations is as follows:
### 1. Load the Keras model
### 2. Instantiate the `RModel` class
### 3. Iterate over the individual layers and extract the required information

To create the `RModel` object, we had to extract layer-specific information such as the layer's name, its type (Dense, Convolution, etc.), and the names of its input and output layers. With Keras 2.x and models built using the Functional API, the output name of the current layer is the same as the input name of the next layer. In Keras 3, particularly with models built using the Sequential API, this changes, and the input and output names are no longer consistent ([issue link](https://github.com/keras-team/keras/issues/21599)). So we used a custom iterator that walks over the layers and rewrites the suffixes of the input and output names so that they are consistent.
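The renaming pass can be sketched in plain Python (a minimal illustration, not the actual SOFIE code; the dict-based layer representation and the naming scheme are hypothetical):

```python
def normalize_layer_names(layers):
    """Make layer input/output names consistent along the model.

    `layers` is a list of dicts with "name", "inputs", and "outputs" keys,
    standing in for the Keras layer objects the real parser iterates over.
    Each layer's output gets a deterministic suffix, and the next layer's
    input is rewritten to match it, so the chain is always consistent.
    """
    prev_output = None
    for idx, layer in enumerate(layers):
        if prev_output is not None:
            # Force this layer's input to match the previous layer's output
            layer["inputs"] = [prev_output]
        # Deterministic output name based on layer name and position
        layer["outputs"] = [f'{layer["name"]}_output_{idx}']
        prev_output = layer["outputs"][0]
    return layers
```

After this pass, `layers[i]["outputs"] == layers[i + 1]["inputs"]` holds for every consecutive pair, regardless of what names Keras originally assigned.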
Then we had to extract the weight names and some more operator-specific information. For example, in the case of Convolutional and Pooling layers, ONNX only supports the `channels_first` data format, whereas Keras supports both `channels_first` and `channels_last`. After extracting the information for a particular layer, we add it to the `rmodel` object.
### 4. Adding a layer to the `rmodel` object

For most operators, adding the layer operator to the `rmodel` object is straightforward. For Convolutional and Pooling layers, however, it is a bit different: if the data format is `channels_last`, we have to perform a transpose before and after adding the layer to the `rmodel` object.
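The axis permutations involved can be illustrated as follows (a sketch only; the real parser inserts SOFIE transpose operators into the generated code rather than permuting data in Python):

```python
# For 4D tensors: NHWC (channels_last) -> NCHW (channels_first) and back.
TO_CHANNELS_FIRST = (0, 3, 1, 2)
TO_CHANNELS_LAST = (0, 2, 3, 1)  # the inverse permutation

def permute_shape(shape, perm):
    """Apply an axis permutation to a shape tuple."""
    return tuple(shape[axis] for axis in perm)
```

For example, a `(1, 28, 28, 3)` channels_last input becomes `(1, 3, 28, 28)` before the convolution runs, and the output is permuted back afterwards so the rest of the model sees the layout it expects.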
### 5. Operator-specific functions

To add the layer operators, we create each layer operator in a layer-specific function and return it. For this, we make use of the layer information extracted in step [3](#3-iterate-over-the-individual-layers-and-extract-the-required-information).
### 6. Extract the model's weights
### 7. Adding the input and output names of the Keras model to the `rmodel` object

While adding the input and output names of the Keras model, we need to make sure that we use the new layer iterator; otherwise, the layer names would be inconsistent again.
## How did we ensure backward compatibility with Keras 2.x?

Along with parsing support for models trained with Keras 3, we also needed backward compatibility with Keras 2.x. Since Keras 3 introduced significant changes in attribute names, layer names, and storage formats, we researched the updated versions. For example, weight names in Keras 3 are no longer unique. Assume a model has two dense layers. With Keras 2.x, the layer weight names would have been:
```
dense/kernel:0
dense/bias:0
dense_1/kernel:0
dense_1/bias:0
```

But with Keras 3, the layer weight names look like this:
```
kernel
bias
kernel
bias
```

To remove the ambiguity, we used weight paths instead of weight names.
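The disambiguation idea can be sketched as follows (illustrative only; in Keras 3 each variable carries a path such as `dense_1/kernel`, which stays unique even when the bare weight names repeat):

```python
def unique_weight_ids(model_layers):
    """Build unique weight identifiers from layer name + weight name.

    `model_layers` is a list of (layer_name, weight_names) pairs, where the
    bare weight names may repeat across layers. Prefixing each weight name
    with its layer name restores uniqueness, which is what using weight
    paths instead of weight names achieves.
    """
    ids = []
    for layer_name, weight_names in model_layers:
        for weight_name in weight_names:
            ids.append(f"{layer_name}/{weight_name}")
    return ids
```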
After these steps, the parser was in good shape and can now be used to parse these layers:

- Add
- AveragePool2D channels first
- AveragePool2D channels last
- BatchNormalization
- Concat
- Conv2D channels first
- Conv2D channels last
- Conv2D padding same
- Conv2D padding valid
- Dense
- Elu
- Flatten
- GlobalAveragePool2D channels first
- GlobalAveragePool2D channels last
- LayerNormalization
- LeakyReLU
- MaxPool2D channels first
- MaxPool2D channels last
- Multiply
- Permute
- Relu
- Reshape
- Selu
- Sigmoid
- Softmax
- Subtract
- Swish
- Tanh

Along the way, we also fixed a few minor bugs in SOFIE's ROperator header files.
## About the JAX/FLAX parser

Initially, we aimed at JAX/FLAX integration within SOFIE by researching models built using its `nnx` and `linen` APIs. After a careful discussion with the project mentors, however, we decided to focus on the Keras parser itself: adding support for parsing more layers and writing unit tests for them.
## Writing Unit Tests

Along with verifying that the parser can parse all the supported layers, we also needed to verify the correctness of the generated code. For this, we created two functions:

### 1. To generate and test the inference code

This function takes the file path of a model built with Keras and passes it to the parser. After the parser returns the `rmodel` object, we generate the inference code. To verify the correctness of the generated header file, we pass a sample input to both the generated header file and the Keras model. To avoid hardcoding the input shape for each and every model, we extract the input shapes from the Keras model and use them to create the sample input. We then pass the sample input and obtain the resultant output tensor. Since SOFIE always flattens the output tensor before returning it, we also check the output tensor shape from both Keras and SOFIE.
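The shape check against the flattened SOFIE output boils down to a product of dimensions (a minimal sketch, independent of Keras and SOFIE):

```python
from functools import reduce
from operator import mul

def flattened_length(shape):
    """Expected length of the flattened output for a given output shape.

    SOFIE returns a flattened output tensor, so its length must equal the
    product of the Keras output shape's dimensions.
    """
    return reduce(mul, shape, 1)
```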
### 2. Validate the accuracy of the result

To validate the inference result from SOFIE, we compare the output tensors of Keras and SOFIE element-wise and make sure that the difference between the results is within a specified tolerance.
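A pure-Python sketch of the comparison (the tolerance value here is illustrative, not the one used in the actual tests):

```python
def outputs_match(expected, actual, atol=1e-5):
    """Element-wise comparison of two flattened output tensors.

    Returns True when both sequences have the same length and every pair
    of elements differs by at most `atol` (absolute tolerance).
    """
    if len(expected) != len(actual):
        return False
    return all(abs(e - a) <= atol for e, a in zip(expected, actual))
```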
## Unit tests

To write the unit tests, we used Python's `unittest` module, as it allows parametrizing tests with minimal code repetition. There are two different sets of tests:

1. For models built using Keras' Functional API
2. For models built using Keras' Sequential API

Within these, there are operator-specific tests that are invoked whenever a sub-test is called. While running the unit tests for both types of models, i.e., Functional and Sequential, temporary directories are created and torn down as soon as both have finished running.
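The overall layout can be sketched with `unittest` sub-tests and a shared temporary directory (a skeleton only; the class and operator names are hypothetical, and the real tests parse a model and compare Keras vs. SOFIE outputs in place of the placeholder assertion):

```python
import tempfile
import unittest

class KerasParserTests(unittest.TestCase):
    """Skeleton of the parametrized test layout."""

    @classmethod
    def setUpClass(cls):
        # Shared temporary directory for saved models / generated headers
        cls.tmpdir = tempfile.TemporaryDirectory()

    @classmethod
    def tearDownClass(cls):
        # Torn down once all sub-tests have finished running
        cls.tmpdir.cleanup()

    def test_operators(self):
        for op in ("Dense", "Relu", "Conv2D channels last"):
            with self.subTest(operator=op):
                # Placeholder: the real test builds a model with this
                # operator, parses it, generates the inference code, and
                # compares the outputs within a tolerance
                self.assertTrue(op)
```

With `subTest`, a failure for one operator is reported individually without aborting the remaining operator checks.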
## Pull request status

| Pull Request | PR Number | Status |
|--------------|-----------|--------|
| New Keras Parser | [#19692](https://github.com/root-project/root/pull/19692) | <img src="https://img.shields.io/badge/PR-Yet_To_Be_Merged-orange?style=for-the-badge&logo=appveyor"> |
## Challenges faced and Learning Outcomes

- Faced difficulty while setting up the ROOT project with SOFIE enabled, due to missing dependencies and incompatible package versions
- Navigated SOFIE's complex codebase
- Got hands-on experience with Keras, its Functional and Sequential APIs, and the overall structure of its models
- Improved skills in reading documentation and solving bugs independently
- Learned how to write concise and modular unit tests
## Future work

I would love to continue contributing to the SOFIE codebase beyond the GSoC period. My current focus is on adding support for parsing the `Conv2DTranspose`, `Dropout`, and Recurrent layers.

## Conclusion

I am thankful to my project mentors, Sanjiban Sengupta and Dr. Lorenzo Moneta, for their kind guidance, which made my learning experience enriching and rewarding. They guided me whenever I faced any difficulty; I could ask them the silliest doubt, and they would still answer it happily. I am fortunate to have been part of such a wonderful project and to have contributed to CERN-HSF this summer. I look forward to contributing to CERN-HSF beyond my GSoC project.

Lastly, I would like to thank my seniors from Project-X, the open-source community at my university, for introducing me to GSoC and helping me in the pre-GSoC period.

#### Thanks and Regards
#### Prasanna Kasar