Commit 7b7276d
Inductor freezing bfloat16 conv folding needs high tolerance (#145623)
Summary:
Issue:
pytorch/pytorch#144888
Torchbench of the timm lcnet_050 model fails the accuracy check with `--freezing` `--inference` `--bfloat16`:
- bfloat16 with conv folding: `res_error==0.12`
- bfloat16 with inductor convolution constant folding turned off: `res_error==0.016`
- float16 with conv folding: error ~ 0.00669
- float16 without conv folding: error ~ 0.0018

Convolution folding increases the error by almost an order of magnitude.
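As a minimal sketch of the comparison (not the torchbench harness; the model, shapes, and error metric are illustrative assumptions), the gap can be observed by measuring a frozen bfloat16 compile against a float32 eager reference:

```python
import torch

# Small conv + batch-norm model; with freezing enabled, inductor
# constant-folds the batch norm into the conv weights at compile time.
class ConvBN(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = torch.nn.Conv2d(3, 8, 3, padding=1)
        self.bn = torch.nn.BatchNorm2d(8)

    def forward(self, x):
        return self.bn(self.conv(x))

model = ConvBN().eval()
x = torch.randn(1, 3, 32, 32)

with torch.no_grad():
    ref = model(x)  # float32 eager reference

    torch._inductor.config.freezing = True
    compiled = torch.compile(model.to(torch.bfloat16))
    res = compiled(x.to(torch.bfloat16)).float()

    # Relative error of the frozen bfloat16 result vs. the fp32 reference.
    rel_err = ((res - ref).norm() / ref.norm()).item()
    print(f"relative error: {rel_err:.4f}")
```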
I think we should revisit conv folding and try to improve its accuracy, for example by doing the conv folding at compilation time in float64.
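A hedged sketch of that idea for the conv + batch-norm case (`fold_conv_bn_fp64` is a hypothetical helper, not inductor's actual folding pass):

```python
import torch

def fold_conv_bn_fp64(conv_w, conv_b, bn_mean, bn_var, bn_w, bn_b, eps=1e-5):
    # Upcast every constant before folding so the intermediate arithmetic
    # (rsqrt, multiply, subtract) does not round in bfloat16.
    w64 = conv_w.double()
    b64 = (conv_b if conv_b is not None else torch.zeros(conv_w.shape[0])).double()
    scale = bn_w.double() * torch.rsqrt(bn_var.double() + eps)
    # Standard conv/BN folding: scale weights per output channel, shift bias.
    folded_w = w64 * scale.reshape(-1, 1, 1, 1)
    folded_b = (b64 - bn_mean.double()) * scale + bn_b.double()
    # Only the final cast back to bfloat16 rounds the folded constants.
    return folded_w.to(torch.bfloat16), folded_b.to(torch.bfloat16)
```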
For now, I am adding counters to identify whether convolution folding happened and, when bfloat16 is combined with conv folding, raising the tolerance multiplier to the maximum level (10) so the accuracy test passes.
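In outline, the workaround looks roughly like the sketch below (the counter key `"binary_folding"` and the helper name are illustrative assumptions, not the exact names in this change):

```python
import torch
from torch._dynamo.utils import counters

def tolerance_multiplier(dtype, base_multiplier=1.0):
    # The folding pass bumps a counter when it fires, so the benchmark
    # runner can tell whether convolution folding actually happened.
    conv_folded = counters["inductor"].get("binary_folding", 0) > 0
    if dtype is torch.bfloat16 and conv_folded:
        return 10.0  # widen tolerance to the max level to pass accuracy
    return base_multiplier
```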
X-link: pytorch/pytorch#145623
Approved by: https://github.com/eellison
Reviewed By: ZainRizvi
Differential Revision: D68897700
fbshipit-source-id: f407528b4b37eb45273a8c66f791c44e86c6632e
1 parent 373ffb1
2 files changed: +49 −26 lines changed