Weideman \parencite{Weideman2017Static} came up with non-linear polynomial worst-case time estimates for regexes, attack strings that exploit the worst-case scenario, and ``attack automata'' that generate such attack strings.
|
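To make the notion of an attack string concrete, the following sketch (an illustration of ours, not Weideman's tooling) feeds strings of the form $a^n$ from Scala into Java's backtracking matcher; on the vulnerable regex \texttt{(a+)+b} the matching time roughly doubles with every additional character:
\begin{verbatim}
// attack strings a^n against the vulnerable regex (a+)+b;
// java.util.regex is a backtracking engine and needs
// exponential time to conclude that no match exists
object AttackDemo {
  def main(args: Array[String]): Unit = {
    val evil = "(a+)+b".r.pattern
    for (n <- 20 to 28 by 2) {
      val attack = "a" * n            // no final 'b': forces full backtracking
      val t0 = System.nanoTime()
      evil.matcher(attack).matches()  // always false, but slower and slower
      val ms = (System.nanoTime() - t0) / 1e6
      println(f"n = $n%2d   time = $ms%8.1f ms")
    }
  }
}
\end{verbatim}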
|
Thanks to our theorem-prover-friendly approach, we believe that this finiteness bound can be improved to a bound linear in the size of the input and cubic in the size of the regular expression, using a technique by Antimirov \parencite{Antimirov95}. Once formalised, this would guarantee the absence of all super-linear behaviour. We are working out the details.
|
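The key ingredient is Antimirov's notion of partial derivatives, sketched below in Scala (the datatype and function names are ours; the definitions follow the cited paper). Because \texttt{pder} splits alternatives into a set rather than stacking them into \texttt{ALT}-trees, the set of all partial derivatives reachable from a regex $r$ is finite and linear in the size of $r$:
\begin{verbatim}
// standard Antimirov partial derivatives (our datatype names)
sealed trait Rexp
case object ZERO extends Rexp
case object ONE extends Rexp
case class CHAR(c: Char) extends Rexp
case class ALT(r1: Rexp, r2: Rexp) extends Rexp
case class SEQ(r1: Rexp, r2: Rexp) extends Rexp
case class STAR(r: Rexp) extends Rexp

def nullable(r: Rexp): Boolean = r match {
  case ZERO        => false
  case ONE         => true
  case CHAR(_)     => false
  case ALT(r1, r2) => nullable(r1) || nullable(r2)
  case SEQ(r1, r2) => nullable(r1) && nullable(r2)
  case STAR(_)     => true
}

// returns a *set* of regexes; alternatives are split instead of
// stacked, which keeps the reachable set of derivatives finite
def pder(c: Char, r: Rexp): Set[Rexp] = r match {
  case ZERO | ONE  => Set()
  case CHAR(d)     => if (c == d) Set(ONE) else Set()
  case ALT(r1, r2) => pder(c, r1) ++ pder(c, r2)
  case SEQ(r1, r2) =>
    val left = pder(c, r1).map(SEQ(_, r2))
    if (nullable(r1)) left ++ pder(c, r2) else left
  case STAR(r1)    => pder(c, r1).map(SEQ(_, STAR(r1)))
}
\end{verbatim}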
|
To the best of our knowledge, no lexing libraries using Brzozowski derivatives have similar complexity-related bounds, and claims about running time are usually speculative and backed only by empirical evidence on a few test cases. If a matching or lexing algorithm does not come with certain basic complexity-related guarantees (for example, that the size of its internal data structures does not grow indefinitely), then it cannot claim with confidence to have solved the problem of catastrophic backtracking.
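
To see why such a guarantee matters, consider the textbook Brzozowski derivative in the sketch below (reusing the \texttt{Rexp} datatype and \texttt{nullable} from above): iterated without simplification, the derivatives of even the harmless regex $(a + a)^*$ grow without bound.
\begin{verbatim}
// textbook Brzozowski derivative, with no simplification step
def der(c: Char, r: Rexp): Rexp = r match {
  case ZERO        => ZERO
  case ONE         => ZERO
  case CHAR(d)     => if (c == d) ONE else ZERO
  case ALT(r1, r2) => ALT(der(c, r1), der(c, r2))
  case SEQ(r1, r2) =>
    if (nullable(r1)) ALT(SEQ(der(c, r1), r2), der(c, r2))
    else SEQ(der(c, r1), r2)
  case STAR(r1)    => SEQ(der(c, r1), STAR(r1))
}

def size(r: Rexp): Int = r match {
  case ZERO | ONE | CHAR(_) => 1
  case ALT(r1, r2) => 1 + size(r1) + size(r2)
  case SEQ(r1, r2) => 1 + size(r1) + size(r2)
  case STAR(r1)    => 1 + size(r1)
}

// sizes of the derivatives of (a + a)* w.r.t. a, aa, aaa, ...
// 4, 8, 17, 26, 35, 44, ... -- unbounded growth without simplification
val reg: Rexp = STAR(ALT(CHAR('a'), CHAR('a')))
val sizes = (1 to 10).scanLeft(reg)((s, _) => der('a', s)).map(x => size(x))
\end{verbatim}
Keeping such growth in check, by simplifying after every derivative step, is precisely the kind of guarantee the finiteness bound above provides.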