diff -r aec7073d4645 -r daf404920ab9 ProgTutorial/First_Steps.thy
--- a/ProgTutorial/First_Steps.thy	Fri Jun 03 15:15:17 2016 +0100
+++ b/ProgTutorial/First_Steps.thy	Tue May 14 11:10:53 2019 +0200
@@ -4,6 +4,8 @@
 
 chapter {* First Steps\label{chp:firststeps} *}
 
+
+
 text {*
   \begin{flushright}
   {\em ``The most effective debugging tool is still careful thought,\\
@@ -12,7 +14,7 @@
   \end{flushright}
 
   \medskip
-  Isabelle programming is done in ML. Just like lemmas and proofs, ML-code for
+  Isabelle programming is done in ML @{cite "isa-imp"}. Just like lemmas and proofs, ML-code for
   Isabelle must be part of a theory. If you want to follow the code given in
   this chapter, we assume you are working inside the theory starting with
@@ -179,7 +181,7 @@
 *}
 
 ML %grayML{*fun pretty_cterm ctxt ctrm =
-  pretty_term ctxt (term_of ctrm)*}
+  pretty_term ctxt (Thm.term_of ctrm)*}
 
 text {*
   Here the function @{ML_ind term_of in Thm} extracts the @{ML_type
@@ -196,7 +198,7 @@
 *}
 
 ML %grayML{*fun pretty_thm ctxt thm =
-  pretty_term ctxt (prop_of thm)*}
+  pretty_term ctxt (Thm.prop_of thm)*}
 
 text {*
   Theorems include schematic variables, such as @{text "?P"},
@@ -217,7 +219,7 @@
 let
   val ctxt' = Config.put show_question_marks false ctxt
 in
-  pretty_term ctxt' (prop_of thm)
+  pretty_term ctxt' (Thm.prop_of thm)
 end*}
 
 text {*
@@ -249,7 +251,7 @@
   respectively @{ML_type ctyp}
 *}
 
-ML %grayML{*fun pretty_ctyp ctxt cty = pretty_typ ctxt (typ_of cty)
+ML %grayML{*fun pretty_ctyp ctxt cty = pretty_typ ctxt (Thm.typ_of cty)
 
 fun pretty_ctyps ctxt ctys =
   Pretty.block (Pretty.commas (map (pretty_ctyp ctxt) ctys))*}
@@ -626,24 +628,24 @@
   A more realistic example for this combinator is the following code
 *}
 
-ML %grayML{*val (((one_def, two_def), three_def), ctxt') =
+ML %grayML{*val ((([one_def], [two_def]), [three_def]), ctxt') =
   @{context}
-  |> Local_Defs.add_def ((@{binding "One"}, NoSyn), @{term "1::nat"})
-  ||>> Local_Defs.add_def ((@{binding "Two"}, NoSyn), @{term "2::nat"})
-  ||>> Local_Defs.add_def ((@{binding "Three"}, NoSyn), @{term "3::nat"})*}
+  |> Local_Defs.define [((@{binding "One"}, NoSyn), (Binding.empty_atts, @{term "1::nat"}))]
+  ||>> Local_Defs.define [((@{binding "Two"}, NoSyn), (Binding.empty_atts,@{term "2::nat"}))]
+  ||>> Local_Defs.define [((@{binding "Three"}, NoSyn), (Binding.empty_atts,@{term "3::nat"}))]*}
 
 text {*
   where we make three definitions, namely @{term "One \<equiv> 1::nat"}, @{term "Two \<equiv> 2::nat"}
   and @{term "Three \<equiv> 3::nat"}. The point of this code is that we augment the initial
   context with the definitions. The result we are interested in is the
   augmented context, that is @{ML_text "ctxt'"}, but also the side-results containing
-  information about the definitions---the function @{ML_ind add_def in Local_Defs} returns
+  information about the definitions---the function @{ML_ind define in Local_Defs} returns
   both as pairs. We can use this information for example to print out the definiens and
   the theorem corresponding to the definitions. For example for the first definition:
 
   @{ML_response_fake [display, gray]
   "let
-  val (one_trm, one_thm) = one_def
+  val (one_trm, (_, one_thm)) = one_def
 in
   pwriteln (pretty_term ctxt' one_trm);
   pwriteln (pretty_thm ctxt' one_thm)
@@ -775,7 +777,7 @@
   defined abbreviations.
   For example
 
   @{ML_response_fake [display, gray]
-  "Proof_Context.print_abbrevs @{context}"
+  "Proof_Context.print_abbrevs false @{context}"
 "\<dots>
 INTER \<equiv> INFI
 Inter \<equiv> Inf
@@ -864,7 +866,7 @@
 
 ML %linenosgray{*val term_pat_setup =
 let
-  val parser = Args.context -- Scan.lift Args.name_inner_syntax
+  val parser = Args.context -- Scan.lift Args.embedded_inner_syntax
 
   fun term_pat (ctxt, str) =
      str |> Proof_Context.read_term_pattern ctxt
@@ -897,7 +899,7 @@
 
 ML %grayML{*val type_pat_setup =
 let
-  val parser = Args.context -- Scan.lift Args.name_inner_syntax
+  val parser = Args.context -- Scan.lift Args.embedded_inner_syntax
 
   fun typ_pat (ctxt, str) =
     let
@@ -921,7 +923,7 @@
   However, a word of warning is in order: new antiquotations should be introduced only after
   careful deliberations. They can potentially make your code harder, rather than easier, to read.
 
-  \begin{readmore}
+  \begin{readmore} @{url "Norbert/ML/ml_antiquotation.ML"}
   The files @{ML_file "Pure/ML/ml_antiquotation.ML"} and @{ML_file "Pure/ML/ml_antiquotations.ML"}
   contain the infrastructure and definitions for most antiquotations. Most of the basic operations
   on ML-syntax are implemented in @{ML_file "Pure/ML/ml_syntax.ML"}.
@@ -947,9 +949,8 @@
   in a type-safe manner...though run-time checks are needed for that.
 
   \begin{readmore}
-  In Isabelle the universal type is implemented as the type @{ML_type
-  Universal.universal} in the file
-  @{ML_file "Pure/ML-Systems/universal.ML"}.
+  In ML the universal type is implemented as the type @{ML_type
+  Universal.universal}.
   \end{readmore}
 
   We will show the usage of the universal type by storing an integer and
@@ -1344,6 +1345,7 @@
   in the simplifier. This can be used for example in the following proof
 *}
 
+
 lemma
   shows "(False \<and> True) \<and> True"
 proof (rule conjI)
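
Reviewer note, appended after the hunks rather than inside them: the switch above from Local_Defs.add_def to Local_Defs.define changes the shape of the result (a list of term/fact pairs instead of a single pair). The following is a minimal sketch of the new usage for a single definition, assuming the pretty_term, pretty_thm and pwriteln helpers defined earlier in First_Steps.thy; it is not part of the patch.

ML %grayML{*
  (* Local_Defs.define takes a list of definitions; for a single definition
     the result is a one-element list that can be matched directly *)
  val ([(one_trm, (_, one_thm))], ctxt') =
    Local_Defs.define
      [((@{binding "One"}, NoSyn), (Binding.empty_atts, @{term "1::nat"}))]
      @{context}

  (* print the definiens and the theorem stating the definition *)
  val _ = pwriteln (pretty_term ctxt' one_trm)
  val _ = pwriteln (pretty_thm ctxt' one_thm)
*}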