--- a/CookBook/FirstSteps.thy Thu Feb 05 22:44:11 2009 +0000
+++ b/CookBook/FirstSteps.thy Fri Feb 06 06:19:52 2009 +0000
@@ -519,30 +519,30 @@
section {* Printing Terms, CTerms and Theorems\label{sec:printing} *}
text {*
- It will often be necesseary to inspect terms, cterms or theorems. Isabelle
- contains elaborate pretty-printing functions for that, but for quick-and-dirty
- solutions they are too unwieldy. A term can be transformed into a string using
- the function @{ML Syntax.string_of_term}
+ You will occasionally feel the need to inspect terms, cterms or theorems during
+ development. Isabelle contains elaborate pretty-printing functions for that, but
+ for quick-and-dirty solutions they are way too unwieldy. A simple way to transform
+ a term into a string is to use the function @{ML Syntax.string_of_term}.
@{ML_response_fake [display,gray]
"Syntax.string_of_term @{context} @{term \"1::nat\"}"
"\"\\^E\\^Fterm\\^E\\^E\\^Fconst\\^Fname=HOL.one_class.one\\^E1\\^E\\^F\\^E\\^E\\^F\\^E\""}
- This produces a string, though with printing directions encoded in it. This
- can be properly printed, when enclosed in a @{ML warning}
+ This produces a string, though with printing directions encoded in it. The string
+ can be properly printed when enclosed in a @{ML warning}.
@{ML_response_fake [display,gray]
"warning (Syntax.string_of_term @{context} @{term \"1::nat\"})"
"\"1\""}
- A @{ML_type cterm} can be transformed into a string by the function:
+ A @{ML_type cterm} can be transformed into a string by the following function.
*}
ML{*fun str_of_cterm ctxt t =
Syntax.string_of_term ctxt (term_of t)*}
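+
+text {*
+  A minimal usage sketch (the @{ML_type cterm} below is chosen only for
+  illustration):
+*}
+
+ML{*warning (str_of_cterm @{context} @{cterm "1::nat"})*}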
text {*
- If there are more than one @{ML_type cterm} to be printed, we can use the function
+ If there is more than one @{ML_type cterm} to be printed, you can use the function
@{ML commas} to conveniently separate them.
*}
@@ -550,7 +550,7 @@
commas (map (str_of_cterm ctxt) ts)*}
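+
+text {*
+  As a small sketch (the two @{ML_type cterm}s are just examples), the
+  @{ML commas}-combination above might be used as follows:
+*}
+
+ML{*warning (commas (map (str_of_cterm @{context})
+               [@{cterm "1::nat"}, @{cterm "True"}]))*}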
text {*
- The easiest way to get the string of a theorem, is to transform it
+ The easiest way to get the string of a theorem is to transform it
into a @{ML_type cterm} using the function @{ML crep_thm}.
*}
@@ -562,7 +562,7 @@
end*}
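+
+text {*
+  A minimal usage sketch, assuming the function defined above is called
+  @{text str_of_thm} (the theorem @{thm conjI} serves only as an example):
+*}
+
+ML{*(* str_of_thm is assumed to be the name of the function defined above *)
+warning (str_of_thm @{context} @{thm conjI})*}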
text {*
- For more than one theorem, the following function adds commas in between them.
+ Again, the function @{ML commas} helps with printing more than one theorem.
*}
ML{*fun str_of_thms ctxt thms =
@@ -604,7 +604,7 @@
\begin{readmore}
The most frequently used combinators are defined in the files @{ML_file "Pure/library.ML"}
and @{ML_file "Pure/General/basics.ML"}. Also \isccite{sec:ML-linear-trans}
- contains further information about combinators.
+ contains further information about them.
\end{readmore}
The simplest combinator is @{ML I}, which is just the identity function.
@@ -618,10 +618,10 @@
text {*
@{ML K} ``wraps'' a function around the argument @{text "x"}. However, this
- function ignores its argument. As a result @{ML K} defines a constant function
+ function ignores its argument. As a result, @{ML K} defines a constant function
returning @{text x}.
- The next combinator is reverse application, @{ML "|>"}, defined as
+ The next combinator is reverse application, @{ML "|>"}, defined as:
*}
ML{*fun x |> f = f x*}
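+
+text {*
+  As a small sketch, a value can be piped through a chain of functions, which
+  are then applied from left to right:
+*}
+
+ML{*3 |> (fn x => x + 1) |> (fn x => x + 3)*}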
@@ -642,8 +642,8 @@
the first component of the pair (Line 4) and finally incrementing the first
component by 4 (Line 5). This kind of cascading manipulations of values is quite
common when dealing with theories (for example by adding a definition, followed by
- lemmas and so on). The reverse application allows you to read what happens inside
- the function in a top-down manner. This kind of codeing should also be familiar,
+ lemmas and so on). The reverse application allows you to read what happens in
+ a top-down manner. This kind of coding should also be familiar
if you have used Haskell's do-notation. Writing the function @{ML inc_by_five} using
the reverse application is much clearer than writing
*}
--- a/CookBook/Intro.thy Thu Feb 05 22:44:11 2009 +0000
+++ b/CookBook/Intro.thy Fri Feb 06 06:19:52 2009 +0000
@@ -93,9 +93,10 @@
@{ML_response [display,gray] "3 + 4" "7"}
- The usual Isabelle commands are written in bold, for example \isacommand{lemma},
- \isacommand{foobar} and so on. We use @{text "$"} to indicate that
- a command needs to be run in the Unix-shell, for example
+ The usual user-level commands of Isabelle are written in bold, for
+ example \isacommand{lemma}, \isacommand{foobar} and so on.
+ We use @{text "$"} to indicate that a command needs to be run
+ in the Unix-shell, for example
@{text [display] "$ ls -la"}
--- a/CookBook/Parsing.thy Thu Feb 05 22:44:11 2009 +0000
+++ b/CookBook/Parsing.thy Fri Feb 06 06:19:52 2009 +0000
@@ -515,38 +515,104 @@
*}
-ML{*
- OuterParse.position
-*}
-
-
section {* Parsing Inner Syntax *}
-ML{*
-let
+ML{*let
val input = OuterSyntax.scan Position.none "0"
in
OuterParse.prop input
-end
+end*}
+
+text {* (FIXME funny output for a proposition) *}
+
+section {* Parsing Specifications *}
+
+text {*
+ There are a number of special-purpose parsers that help with parsing specifications.
+ For example, the function @{ML OuterParse.target} reads a target in order to indicate
+ a locale.
+
+@{ML_response [display,gray]
+"let
+ val input = filtered_input \"(in test)\"
+in
+ parse OuterParse.target input
+end" "(\"test\",[])"}
+
+ The function @{ML OuterParse.opt_target} makes this parser ``optional''.
+
+ The function @{ML OuterParse.fixes} reads a list of constants
+ which can include a type annotation and a syntax translation.
+ The constants in the list are separated by \isacommand{and}.
*}
-ML{*
- OuterParse.opt_target
+text {*
+
+@{ML_response [display,gray]
+"let
+ val input = filtered_input
+ \"foo::\\\"int \<Rightarrow> bool\\\" (\\\"FOO\\\" [100] 100) and bar::nat and blonk\"
+in
+ parse OuterParse.fixes input
+end"
+"([(foo, SOME \<dots>, Mixfix (\"FOO\",[100],100)),
+ (bar, SOME \<dots>, NoSyn),
+ (blonk, NONE, NoSyn)],[])"}
+
+ The types of the constants are stored in the @{ML SOME}s.
+ We need to write @{text "\\\"int \<Rightarrow> bool\\\""} to properly
+ escape the type.
+
+ @{ML OuterParse.for_fixes} is an ``optional'' parser that prefixes
+ @{ML OuterParse.fixes} with the command \isacommand{for}.
+*}
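+
+text {*
+  A small sketch of how @{ML OuterParse.for_fixes} might be tried out (the
+  input string is just an example; the result is not reproduced here):
+*}
+
+ML{*let
+  val input = filtered_input "for foo and bar"
+in
+  parse OuterParse.for_fixes input
+end*}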
+
+ML{*let
+ val input = filtered_input "test_lemma[intro,foo,elim,dest!,bar]:"
+in
+ parse (SpecParse.thm_name ":") input
+ |> fst |> snd |> (Attrib.pretty_attribs @{context}) |> (map Pretty.string_of)
+end*}
+
+text {*
+ (FIXME: why are intro, elim and dest treated differently from bar?)
*}
-text {* (FIXME funny output for a proposition) *}
+text {*
+@{ML_response [display,gray]
+"let
+ val input = filtered_input
+ (\"even and odd \" ^
+ \"where \" ^
+ \" even0[intro]: \\\"even 0\\\" \" ^
+ \"| evenS[intro]: \\\"odd n \<Longrightarrow> even (Suc n)\\\" \" ^
+ \"| oddS[intro]: \\\"even n \<Longrightarrow> odd (Suc n)\\\"\")
-ML{*
- OuterParse.opt_target --
- OuterParse.fixes --
- OuterParse.for_fixes --
- Scan.optional (OuterParse.$$$ "where" |--
- OuterParse.!!! (OuterParse.enum1 "|" (SpecParse.opt_thm_name ":" -- OuterParse.prop))) []
-
+ val parser =
+ OuterParse.opt_target --
+ OuterParse.fixes --
+ OuterParse.for_fixes --
+ Scan.optional
+ (OuterParse.$$$ \"where\" |--
+ OuterParse.!!!
+ (OuterParse.enum1 \"|\"
+ (SpecParse.opt_thm_name \":\" -- OuterParse.prop))) []
+in
+ parse parser input
+end"
+"((((NONE, [(even, NONE, NoSyn), (odd, NONE, NoSyn)]), []),
+ [((even0,\<dots>), \"\\^E\\^Ftoken\\^Eeven 0\\^E\\^F\\^E\"),
+ ((evenS,\<dots>), \"\\^E\\^Ftoken\\^Eodd n \<Longrightarrow> even (Suc n)\\^E\\^F\\^E\"),
+ ((oddS,\<dots>), \"\\^E\\^Ftoken\\^Eeven n \<Longrightarrow> odd (Suc n)\\^E\\^F\\^E\")]), [])"}
*}
-ML{* OuterSyntax.command *}
+text {*
+ The (outer?) parser for the package returns: optionally a locale;
+ a list of predicate constants, each with an optional type-annotation and
+ an optional syntax-annotation; a list of for-fixes (fixed parameters);
+ and a list of rules, where each rule optionally has a name and an attribute.
+*}
section {* New Commands and Keyword Files *}
@@ -575,7 +641,7 @@
parser producing a top-level transition function (its purpose will also be explained
later).
- While this is everything we have to do on the ML-level, we need a keyword
+ While this is everything you have to do on the ML-level, you need a keyword
file that can be loaded by ProofGeneral. This is to enable ProofGeneral to
recognise \isacommand{foobar} as a command. Such a keyword file can be
generated with the command-line:
@@ -586,7 +652,7 @@
will be assigned. In the case above the file will be named @{text
"isar-keywords-foobar.el"}. This command requires log files to be
present (in order to extract the keywords from them). To generate these log
- files, we first package the code above into a separate theory file named
+ files, you first need to package the code above into a separate theory file named
@{text "Command.thy"}, say---see Figure~\ref{fig:commandtheory} for the
complete code.
@@ -633,7 +699,7 @@
@{text [display] "$ ./build -m \"Pure-ProofGeneral\" \"Pure\""}
- For the theory @{text "Command.thy"}, we first create a ``managed'' subdirectory
+ For the theory @{text "Command.thy"}, you first need to create a ``managed'' subdirectory
with
@{text [display] "$ isabelle mkdir FoobarCommand"}
@@ -647,17 +713,17 @@
./FoobarCommand/document/root.tex"}
- We need to copy the file @{text "Command.thy"} into the directory @{text "FoobarCommand"}
+ You need to copy the file @{text "Command.thy"} into the directory @{text "FoobarCommand"}
and add the line
@{text [display] "use_thy \"Command\";"}
- to the file @{text "./FoobarCommand/ROOT.ML"}. We can now compile the theory by just typing
+ to the file @{text "./FoobarCommand/ROOT.ML"}. You can now compile the theory by just typing
@{text [display] "$ isabelle make"}
- We created finally all the necessary log files. They are stored
- in the directory
+ If the compilation succeeds, you have finally created all the necessary log files.
+ They are stored in the directory
@{text [display] "~/.isabelle/heaps/Isabelle2008/polyml-5.2.1_x86-linux/log"}
@@ -674,7 +740,7 @@
Pure-ProofGeneral.gz
HOL-FoobarCommand.gz"}
- From them we create the keyword files. Assuming the name
+ From them you can create the keyword files. Assuming the name
of the directory is in @{text "$ISABELLE_LOGS"},
then the Unix command for creating the keyword file is:
@@ -687,14 +753,14 @@
that @{text "grep foobar"} on this file returns something
non-empty.} This keyword file needs to
be copied into the directory @{text "~/.isabelle/etc"}. To make Isabelle
- aware of this keyword file, we have to start Isabelle with the option @{text
+ aware of this keyword file, you have to start Isabelle with the option @{text
"-k foobar"}, i.e.
@{text [display] "$ isabelle -k foobar a_theory_file"}
- If we now build a theory on top of @{text "Command.thy"},
- then we can make use of the command \isacommand{foobar}.
+ If you now build a theory on top of @{text "Command.thy"},
+ then the command \isacommand{foobar} can be used.
Similarly with any other new command.
@@ -710,7 +776,6 @@
parser @{ML OuterParse.prop}), then prints out the tracing
information (using a new top-level function @{text trace_top_lvl}) and
finally does nothing. For this we can write
-
*}
ML{*let
@@ -797,7 +862,11 @@
Similarly for the other function composition combinators.
- (FIXME What does @{text "Toplevel.theory"} @{text "Toplevel.print"}?)
+ (FIXME What do @{ML "Toplevel.theory"},
+ @{ML "Toplevel.print"} and
+ @{ML Toplevel.local_theory} do?)
+
+
(FIXME read a name and show how to store theorems)
--- a/CookBook/document/root.tex Thu Feb 05 22:44:11 2009 +0000
+++ b/CookBook/document/root.tex Fri Feb 06 06:19:52 2009 +0000
@@ -61,7 +61,7 @@
\newenvironment{readmore}%
-{\begin{leftrightbar}\it{\textbf{Read More}}\\}{\end{leftrightbar}}
+{\begin{leftrightbar}\small\it{\textbf{Read More}}\\}{\end{leftrightbar}}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% this is to draw a gray box around code
Binary file cookbook.pdf has changed