--- a/hws/hw06.tex Tue Nov 15 11:34:33 2022 +0000
+++ b/hws/hw06.tex Thu Nov 24 19:05:34 2022 +0000
@@ -68,6 +68,11 @@
advantages to first lex a string and then feed a
sequence of tokens as input to the parser?
+% Reason 1: the lexer can filter out whitespace and comments, which keeps the grammar rules simpler. If the parser had to ensure that, say, a whitespace follows a variable, the grammar rule for variables would become more complicated; likewise, comments contribute nothing to the parse tree.
+% Reason 2: the lexer can already classify tokens, for example as numbers, keywords or identifiers. This makes the grammar rules more deterministic and, as a result, faster to parse.
+% The point is that parser combinators can process raw strings directly, but in the case of compilers, where whitespace and comments need to be filtered out, a separate lexing phase is quite useful.
+
+
\item The injection function for sequence regular expressions is defined
by three clauses: