# HG changeset patch
# User Christian Urban
# Date 1329317266 0
# Node ID 386375b7a22abbc3b20429aa93625e34484e8d2c
# Parent  500d1abbc98c87b9a1062d1ccf35ffdad2f21c6b
# Parent  dcefee89bf62bd69f4fc0763efbf8b790ea59a18
merged

diff -r 500d1abbc98c -r 386375b7a22a ProgTutorial/Essential.thy
--- a/ProgTutorial/Essential.thy Wed Feb 15 14:33:29 2012 +0000
+++ b/ProgTutorial/Essential.thy Wed Feb 15 14:47:46 2012 +0000
@@ -87,17 +87,23 @@
   with beta-redexes
 
   @{ML_response_fake [display, gray]
-"@{term \"(\<lambda>x y. x) 1 2\"}
+"@{term \"(\<lambda>(x::int) (y::int). x)\"} $ @{term \"1::int\"} $ @{term \"2::int\"}
   |> pretty_term @{context}
   |> pwriteln"
   "1"}
 
-  There is a configuration value to switch off the tacit eta-contraction (see
-  \ref{sec:printing}), but none for beta-contraction. So in certain cases you
-  might have to inspect the internal representation of a term, instead of
-  pretty printing it. Because of the alluded puzzlement that might arise from
-  this feature of Isabelle, it is certainly an acrane fact that you should
-  keep in mind about pretty-printing terms.
+  There is a dedicated configuration value for switching off the tacit
+  eta-contraction, namely @{ML_ind eta_contract in Syntax} (see Section
+  \ref{sec:printing}), but none for beta-contractions. However, you can avoid
+  the beta-contractions by switching off abbreviations using the configuration
+  value @{ML_ind show_abbrevs in Syntax}. For example
+
+
+  @{ML_response_fake [display, gray]
+  "@{term \"(\<lambda>(x::int) (y::int). x)\"} $ @{term \"1::int\"} $ @{term \"2::int\"}
+  |> pretty_term (Config.put show_abbrevs false @{context})
+  |> pwriteln"
+  "(\<lambda>x y. x) 1 2"}
 
   Isabelle makes a distinction between \emph{free} variables (term-constructor
   @{ML Free} and written on the user level in blue colour) and

diff -r 500d1abbc98c -r 386375b7a22a ProgTutorial/Tactical.thy
--- a/ProgTutorial/Tactical.thy Wed Feb 15 14:33:29 2012 +0000
+++ b/ProgTutorial/Tactical.thy Wed Feb 15 14:47:46 2012 +0000
@@ -2114,7 +2114,6 @@
 
   If we get hold of the ``raw'' representation of the produced theorem, we
   obtain the expected result.
-
   @{ML_response [display,gray]
   "let
   val add = @{cterm \"\<lambda>x y. x + (y::nat)\"}
@@ -2128,6 +2127,24 @@
   (Abs (\"x\",\<dots>,Abs (\"y\",\<dots>,\<dots>)) $\<dots>$\<dots>) $
   (Const (\"Groups.plus_class.plus\",\<dots>) $ \<dots> $ \<dots>)"}
 
+  or in the pretty-printed form
+
+  @{ML_response_fake [display,gray]
+"let
+  val add = @{cterm \"\<lambda>x y. x + (y::nat)\"}
+  val two = @{cterm \"2::nat\"}
+  val ten = @{cterm \"10::nat\"}
+  val ctrm = Thm.capply (Thm.capply add two) ten
+  val ctxt = @{context}
+    |> Config.put eta_contract false
+    |> Config.put show_abbrevs false
+in
+  Thm.prop_of (Thm.beta_conversion true ctrm)
+  |> pretty_term ctxt
+  |> pwriteln
+end"
+"(\<lambda>x y. x + y) 2 10 \<equiv> 2 + 10"}
+
   The argument @{ML true} in @{ML beta_conversion in Thm} indicates that
   the right-hand side should be fully beta-normalised. If instead
   @{ML false} is given, then only a single beta-reduction is performed

diff -r 500d1abbc98c -r 386375b7a22a progtutorial.pdf
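
To replay the new Essential.thy example interactively, the following is a minimal
sketch in Isabelle/ML. It assumes the tutorial's own helper functions pretty_term
and pwriteln (defined in the ProgTutorial sources, not part of plain Isabelle/ML)
are in scope; the term and the Config.put call mirror the added lines above.

  (* Sketch: build the beta-redex explicitly with $ so the parser does not
     contract it, then print it with abbreviations switched off.  Relies on
     the ProgTutorial helpers pretty_term and pwriteln being in scope. *)
  let
    val redex =
      @{term "(\<lambda>(x::int) (y::int). x)"} $ @{term "1::int"} $ @{term "2::int"}
    val ctxt = Config.put show_abbrevs false @{context}
  in
    redex |> pretty_term ctxt |> pwriteln
  end
  (* According to the patched text this prints "(\<lambda>x y. x) 1 2" rather than "1". *)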
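
The Tactical.thy context above also notes that giving @{ML false} to
Thm.beta_conversion performs only a single beta-reduction. Here is a hedged
sketch of that variant, again assuming the tutorial's pretty_term/pwriteln
helpers and the 2012-era name Thm.capply used in the patch; since the
single-step form needs the outermost term to be a redex, only one argument
is applied.

  (* Sketch: single-step beta-reduction with Thm.beta_conversion false.
     The outermost term must itself be a redex, hence only one application. *)
  let
    val add  = @{cterm "\<lambda>x y. x + (y::nat)"}
    val two  = @{cterm "2::nat"}
    val ctrm = Thm.capply add two              (* (\<lambda>x y. x + y) 2 *)
    val ctxt = @{context}
      |> Config.put eta_contract false
      |> Config.put show_abbrevs false
  in
    Thm.prop_of (Thm.beta_conversion false ctrm)
    |> pretty_term ctxt
    |> pwriteln
  end
  (* Expected to print roughly: (\<lambda>x y. x + y) 2 \<equiv> \<lambda>y. 2 + y *)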