Commit 805f4aa
fuzz: add size limit to regex building
The fuzzer sometimes runs into situations where it builds regexes that
can take a while to execute, such as `\B{10000}`. They fit within the
default size limit, but the search times aren't great. That isn't a bug,
though, so decrease the size limit a bit to try to prevent timeouts.
We might consider trying to optimize cases like `\B{10000}`. A naive
optimization would be to remove any redundant conditional epsilon
transitions within a single epsilon closure, but that can be tricky to
do a priori. The case of `\B{10000}` is probably easy to detect, but
such cases can be arbitrarily complex.
Another way to attack this would be to modify, say, the PikeVM to only
compute whether a conditional epsilon transition should be followed once
per haystack position. Right now, I think it is re-computing them even
though it doesn't have to.

1 parent 01a8cf4
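One way to picture the once-per-position idea: memoize the word-boundary determination keyed on the current haystack position, so that every conditional epsilon transition visited at that position reuses the answer instead of recomputing it. This is an ASCII-only sketch of the technique, not the PikeVM's actual machinery; `is_word_byte` and `BoundaryCache` are invented names.

```rust
// True for bytes in `\w`, i.e. ASCII alphanumerics and underscore.
fn is_word_byte(b: u8) -> bool {
    b.is_ascii_alphanumeric() || b == b'_'
}

/// Caches the word-boundary answer for a single haystack position.
struct BoundaryCache {
    pos: Option<usize>,
    answer: bool,
}

impl BoundaryCache {
    fn new() -> BoundaryCache {
        BoundaryCache { pos: None, answer: false }
    }

    /// Reports whether `pos` is a word boundary in `haystack`, computing
    /// the answer only on the first query for a given position. Repeated
    /// `\b`/`\B` transitions at the same position hit the cache.
    fn is_boundary(&mut self, haystack: &[u8], pos: usize) -> bool {
        if self.pos != Some(pos) {
            let before = pos
                .checked_sub(1)
                .map_or(false, |i| is_word_byte(haystack[i]));
            let after = haystack.get(pos).map_or(false, |&b| is_word_byte(b));
            self.pos = Some(pos);
            self.answer = before != after;
        }
        self.answer
    }
}

fn main() {
    let hay = b"ab cd";
    let mut cache = BoundaryCache::new();
    assert!(cache.is_boundary(hay, 0)); // start of haystack, before 'a'
    assert!(!cache.is_boundary(hay, 1)); // between 'a' and 'b'
    assert!(cache.is_boundary(hay, 2)); // between 'b' and ' '
}
```

With a scheme like this, the cost of a pathological pattern such as `\B{10000}` shifts from ten thousand boundary computations per position to one computation plus ten thousand cache hits.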
File tree
fuzz/fuzz_targets — 2 files changed, +2 −0 lines
(diff bodies not captured: one line added around line 58 of the first file and line 62 of the second)