# HG changeset patch
# User Christian Urban
# Date 1330485172 0
# Node ID d3d5225f4f241678e27ff4de87fb65bc6d5bb4e8
# Parent 860df8e1262f9d6d621a1d23f9a7ba0c48d679d2
implemented all comments from the reviewer

diff -r 860df8e1262f -r d3d5225f4f24 LMCS-Paper/Paper.thy
--- a/LMCS-Paper/Paper.thy Wed Feb 22 12:10:17 2012 +0000
+++ b/LMCS-Paper/Paper.thy Wed Feb 29 03:12:52 2012 +0000
@@ -113,11 +113,39 @@
   \noindent
   and the @{text "\<forall>"}-quantification binds a finite (possibly empty) set of
   type-variables. While it is possible to implement this kind of more general
-  binders by iterating single binders, this leads to a rather clumsy
-  formalisation of W.
+  binders by iterating single binders, like @{text "\<forall>x\<^isub>1.\<forall>x\<^isub>2...\<forall>x\<^isub>n.T"}, this leads to a rather clumsy
+  formalisation of W. For example, the usual definition of when a
+  type is an instance of a type-scheme requires the following auxiliary \emph{unbinding relation}
+
+  \[
+  \infer{@{text T} \hookrightarrow ([], @{text T})}{}\qquad
+  \infer{\forall @{text x.S} \hookrightarrow (@{text x}\!::\!@{text xs}, @{text T})}
+        {@{text S} \hookrightarrow (@{text xs}, @{text T})}
+  \]\smallskip

-  {\bf add example about W}
+  \noindent
+  which relates a type-scheme with a list of variables and a type. The point of this
+  definition is to get access to the bound variables and the type-part of a type-scheme
+  @{text S}, though in general
+  we will only get access to some variables and some type @{text T} that are
+  ``alpha-equivalent'' to @{text S}---it is an unbinding \emph{relation}. Still, with this
+  definition in place we can formally define when a type is an instance of a type-scheme as
+
+  \[
+  @{text "T \<prec> S \<equiv> \<exists>xs T' \<sigma>. S \<hookrightarrow> (xs, T') \<and> dom \<sigma> = set xs \<and> \<sigma>(T') = T"}
+  \]\smallskip
+
+  \noindent
+  meaning there exists a list of variables @{text xs} and a type @{text T'} to which
+  the type-scheme @{text S} unbinds, and there exists a substitution whose domain is
+  @{text xs} (seen as a set) such that @{text "\<sigma>(T') = T"}.
+  The drawback of this definition is that we cannot follow the usual proofs,
+  which proceed by induction on the type-part of the type-scheme (since it is under
+  an existential quantifier). The representation of type-schemes by iterating single binders
+  prevents us from directly ``unbinding'' the bound variables and the type-part of
+  a type-scheme. The purpose of this paper is to introduce general binders, which
+  allow us to represent type-schemes in a way that closely follows informal practice
+  and, as a result, solves this problem.

   The need of iterating single binders is also one reason why Nominal Isabelle
   and similar theorem provers that only provide mechanisms for binding single
   variables have not fared extremely well with
@@ -393,7 +421,7 @@
   \noindent
   {\bf Contributions:} We provide three new definitions for when terms
   involving general binders are alpha-equivalent. These definitions are
-  inspired by earlier work of Pitts \cite{Pitts04}. By means of automatic
+  inspired by earlier work of Pitts \cite{Pitts04}. By means of automatically-generated
   proofs, we establish a reasoning infrastructure for alpha-equated
   terms, including properties about support, freshness and equality
   conditions for alpha-equated terms. We are also able to automatically derive strong
@@ -578,24 +606,26 @@
   two properties.
   \begin{prop}\label{supportsprop}
-  Given a set @{text "as"} of atoms.\\
-  {\it (i)} If @{thm (prem 1) supp_is_subset[where S="as", no_vars]}
-  and @{thm (prem 2) supp_is_subset[where S="as", no_vars]} then
-  @{thm (concl) supp_is_subset[where S="as", no_vars]}.\\
+  Given a set @{text "bs"} of atoms.\\
+  {\it (i)} If @{thm (prem 1) supp_is_subset[where S="bs", no_vars]}
+  and @{thm (prem 2) supp_is_subset[where S="bs", no_vars]} then
+  @{thm (concl) supp_is_subset[where S="bs", no_vars]}.\\
   {\it (ii)} @{thm supp_supports[no_vars]}.
   \end{prop}

   Another important notion in the nominal logic work is \emph{equivariance}.
-  For a function @{text f}, say of type @{text "\<alpha> \<Rightarrow> \<beta>"}, to be equivariant
+  For a function @{text f} to be equivariant
   it is required that every permutation leaves @{text f} unchanged, that is

   \begin{equation}\label{equivariancedef}
   @{term "\<forall>\<pi>. \<pi> \<bullet> f = f"}
   \end{equation}\smallskip

-  \noindent or equivalently that a permutation applied to the application
-  @{text "f x"} can be moved to the argument @{text x}. That means for equivariant
-  functions @{text f}, we have for all permutations @{text "\<pi>"}:
+  \noindent
+  If a function is of type @{text "\<alpha> \<Rightarrow> \<beta>"}, this definition is equivalent to
+  the fact that a permutation applied to the application
+  @{text "f x"} can be moved to the argument @{text x}. That means for
+  functions @{text f} of type @{text "\<alpha> \<Rightarrow> \<beta>"}, we have for all permutations @{text "\<pi>"}:

   \begin{equation}\label{equivariance}
   @{text "\<pi> \<bullet> f = f"} \;\;\;\;\textit{if and only if}\;\;\;\;
@@ -603,21 +633,26 @@
   \end{equation}\smallskip

   \noindent
-  From property \eqref{equivariancedef} and the definition of @{text supp}, we
-  can easily deduce that equivariant functions have empty support. There is
-  also a similar notion for equivariant relations, say @{text R}, namely the property
-  that
+  There is
+  also a similar property for relations, which in HOL are functions of type @{text "\<alpha> \<Rightarrow> \<alpha> \<Rightarrow> bool"}.
+  For a relation @{text R}, it states that for all permutations @{text "\<pi>"}:

   \[
-  @{text "x R y"} \;\;\textit{implies}\;\; @{text "(\<pi> \<bullet> x) R (\<pi> \<bullet> y)"}
+  @{text "\<pi> \<bullet> R = R"} \;\;\;\;\textit{if and only if}\;\;\;\;
+  @{text "\<forall>x y."}~~@{text "x R y"} \;\textit{implies}\; @{text "(\<pi> \<bullet> x) R (\<pi> \<bullet> y)"}
   \]\smallskip
-
+
+  \noindent
+  Note that from property \eqref{equivariancedef} and the definition of @{text supp}, we
+  can easily deduce that equivariant functions have empty support.
+
+  Using freshness, the nominal logic work provides us with general means for renaming binders.

   \noindent
   While in the older version of Nominal Isabelle, we used extensively
-  Property~\ref{swapfreshfresh} to rename single binders, this property
+  Proposition~\ref{swapfreshfresh} to rename single binders, this property
   proved too unwieldy for dealing with multiple binders. For such binders the
   following generalisations turned out to be easier to use.
@@ -677,9 +712,9 @@
   the notion of alpha-equivalence that is \emph{not} preserved by adding
   vacuous binders.)
To answer this question, we identify four conditions: {\it (i)} given a
  free-atom function @{text "fa"} of type \mbox{@{text "\<beta> \<Rightarrow> atom
-  set"}}, then @{text x} and @{text y} need to have the same set of free
+  set"}}, then @{text "(as, x)"} and @{text "(bs, y)"} need to have the same set of free
   atoms; moreover there must be a permutation @{text "\<pi>"} such that {\it
-  (ii)} @{text "\<pi>"} leaves the free atoms of @{text x} and @{text y} unchanged, but
+  (ii)} @{text "\<pi>"} leaves the free atoms of @{text "(as, x)"} and @{text "(bs, y)"} unchanged, but
   {\it (iii)} `moves' their bound names so that we obtain modulo a relation,
   say \mbox{@{text "_ R _"}}, two equivalent terms. We also require that {\it (iv)}
   @{text "\<pi>"} makes the sets of abstracted atoms @{text as} and @{text bs} equal. The
@@ -755,7 +790,7 @@

   \noindent
   Now recall the examples shown in \eqref{ex1} and \eqref{ex3}. It can
   be easily checked that @{text "({x, y}, x \<rightarrow> y)"} and
-  @{text "({y, x}, y \<rightarrow> x)"} are alpha-equivalent according to
+  @{text "({x, y}, y \<rightarrow> x)"} are alpha-equivalent according to
   $\approx_{\,\textit{set}}$ and $\approx_{\,\textit{set+}}$ by taking @{text "\<pi>"} to
   be the swapping @{term "(x \<rightleftharpoons> y)"}. In case of @{text "x \<noteq> y"}, then
   @{text "([x, y], x \<rightarrow> y)"} $\not\approx_{\,\textit{list}}$ @{text "([y, x], x \<rightarrow> y)"}
@@ -891,7 +926,7 @@
   \end{proof}

   \noindent
-  Assuming that @{text "x"} has finite support, this lemma together
+  This lemma together
   with \eqref{absperm} allows us to show

   \begin{equation}\label{halfone}
@@ -900,8 +935,8 @@

   \noindent
   which by Property~\ref{supportsprop} gives us `one half' of
-  Theorem~\ref{suppabs}. The `other half' is a bit more involved. To establish
-  it, we use a trick from \cite{Pitts04} and first define an auxiliary
+  Theorem~\ref{suppabs}. To establish the `other half', we
+  use a trick from \cite{Pitts04} and first define an auxiliary
   function @{text aux}, taking an abstraction as argument

   \[
@@ -1007,7 +1042,9 @@
   \emph{recursive argument} of @{text "C\<^sup>\<alpha>"}. The types of such recursive
   arguments need to satisfy a `positivity'
   restriction, which ensures that the type has a set-theoretic semantics (see
-  \cite{Berghofer99}). The labels annotated on the types are optional. Their
+  \cite{Berghofer99}). If the types are polymorphic, we require that the
+  type variables are of finitely supported type.
+  The labels annotated on the types are optional. Their
   purpose is to be used in the (possibly empty) list of \emph{binding
   clauses}, which indicate the binders and their scope
   in a term-constructor. They come in three \emph{modes}:
@@ -2468,7 +2505,8 @@
   Isabelle. First, it is far too complicated to be a basis for automated proofs
   implemented on the ML-level of Isabelle/HOL. Second, it covers cases of binders
   depending on other binders, which just do not make sense for our
-  alpha-equated terms. Third, it allows empty types that have no meaning in a
+  alpha-equated terms (the corresponding @{text "fa"}-functions would not lift).
+  Third, it allows empty types that have no meaning in a
   HOL-based theorem prover. We also had to generalise slightly Ott's binding clauses.
   In Ott one specifies binding clauses with a single body; we allow more than one.
We have to do this, because this makes a difference for our diff -r 860df8e1262f -r d3d5225f4f24 LMCS-Paper/document/root.bib --- a/LMCS-Paper/document/root.bib Wed Feb 22 12:10:17 2012 +0000 +++ b/LMCS-Paper/document/root.bib Wed Feb 29 03:12:52 2012 +0000 @@ -1,3 +1,12 @@ + +@Unpublished{Traytel12, + author = {D.~Traytel and A.~Popescu and J.~C.~Blanchette}, + title = {{F}oundational, {C}ompositional ({C}o)datatypes for {H}igher-{O}rder + {L}ogic: {C}ategory {T}heory {A}pplied to {T}heorem {P}roving}, + note = {Submitted for publication.}, + year = {2012} +} + @inproceedings{pfenningsystem, author = "F.~Pfenning and C.~Sch{\"u}rmann", title = "{S}ystem {D}escription: {T}welf - {A} {M}eta-{L}ogical diff -r 860df8e1262f -r d3d5225f4f24 LMCS-Paper/document/root.tex --- a/LMCS-Paper/document/root.tex Wed Feb 22 12:10:17 2012 +0000 +++ b/LMCS-Paper/document/root.tex Wed Feb 29 03:12:52 2012 +0000 @@ -72,8 +72,8 @@ \email{christian.urban@kcl.ac.uk} \author{Cezary Kaliszyk} -\address{University of Tsukuba, Japan} -\email{kaliszyk@cs.tsukuba.ac.jp} +\address{University of Innsbruck, Austria} +\email{cezary.kaliszyk@uibk.ac.at} \thanks{$^\star$~This is a revised and expanded version of~\cite{UrbanKaliszyk11}} \keywords{Nominal Isabelle, variable convention, theorem provers, formal reasoning, lambda-calculus} diff -r 860df8e1262f -r d3d5225f4f24 LMCS-Review --- a/LMCS-Review Wed Feb 22 12:10:17 2012 +0000 +++ b/LMCS-Review Wed Feb 29 03:12:52 2012 +0000 @@ -15,31 +15,7 @@ > The work reported is very good, but the presentation of the paper can > be improved. > -> This paper continues a line of work called "Nominal Isabelle" carried -> out by the first author and his colleagues for many years. The goal -> of this work is to support formal (machine checked) reasoning about -> languages with binding. With the theoretical foundation of "nominal -> logic" developed by Pitts and colleagues, these authors and their -> co-workers have developed a package to support such reasoning in the -> Isabelle proof tool for Higher Order Logic. This toolkit has been -> widely used, and although the technology sometimes shows through -> (e.g. explicit name swapping required in arguments) it is a very good -> package. -> -> Up to now, this package has supported single binders such as \lambda. -> Multiple simultaneous binding (e.g. letrec) had to be coded using -> iterated single binders. Not only is this coding hard to reason -> about, it often isn't a correct representation of the intended -> language. This paper describes a new version of the Isabelle package, -> "Nominal2", supporting binding of sets and lists of names in the -> Isabelle/HOL system. -> -> The amount of work involved is immense, and the first author -> especially has shown real commitment to continuing development of both -> theory and working tools. Everything provided in this package is -> claimed to be a definitional extension of HOL: no assumptions or -> built-in changes to the logic. For all of these reasons, this is very -> good work. +> [...] > > However, I recommend improvement of the presentation of the paper > before it is accepted by LMCS. While the motivation for the work of @@ -51,7 +27,8 @@ > it would be done in the new system reported in this paper showing why > the new approach works better in practice. -Added +We have expanded on the point why single binders lead to clumsy +formalisations. 
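+
+To make this point concrete: with a general binder, type-schemes can be
+specified directly along the following lines (a rough sketch in the
+specification syntax of Section 4 of the paper; the exact concrete syntax
+and the names used here are only illustrative):
+
+  atom_decl name
+
+  nominal_datatype ty =
+    Var "name"
+  | Fun "ty" "ty"
+  and tys =
+    All xs::"name fset" T::"ty"  binds(set+) xs in T
+
+With iterated single binders one instead needs an auxiliary unbinding
+relation, as explained in the expanded text, before a type can even be
+compared with a type-scheme.
+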
> Although this example is > one of the main motivations given for the work, there is apparently no @@ -64,9 +41,8 @@ > advanced tasks in the POPLmark challenge [2], because also there one > would like to bind multiple variables at once."). -No time to provide full examples yet. They will be provided -once Nominal2 becomes more mature and people are using it -and help to provide theories. +We proved the main properties about type-schemes in algorithm W. +Though the complete formalisation is not yet in place. > The new Isabelle package "Nominal2", described in this paper, is not > ready for users without a lot of hand-holding from the Nominal2 @@ -74,50 +50,36 @@ > could try the tool without so much difficulty. The plan is to have Nominal Isabelle be part of the next stable -release of Isabelle, which should be out before the summer 2012. +release of Isabelle, which should be out in the summer 2012. At the moment it can be downloaded as a bundle and is ready -to be used (we have confirmation from two groups for this). +to be used (there are two groups that already use Nominal2 and +only occasionally ask questions on the mailing list). > A few more specific points: > > Bottom of p.7: I don't understand the paragraph containing equations > (2.4) and (2.5). +We reworded this paragraph. > Bottom of p.9: The parameters R and fa of the alpha equivalence > relation are dropped in the examples, so the examples are not clear. + +We have two minds with this: Keeping R and fa is more faithful to +the definition, but it would not provide much intuition into +the definition. We feel explaining the definition in special +cases is most beneficial to the reader. + > I think there is a typo in the first example: "It can be easily > checked that ({x,y},x->y) and ({y,x},y->x) are alpha-equivalent [...]" > Did you mean "({x,y},x->y) and ({y,x},x->y) are alpha-equivalent"? - +Both are equivalent since sets {x, y} and {y, x} are equal. But we made +it clearer as you suggest. > Referee no 2: > -> * The paper can be accepted for Logical Methods in Computer Science -> after minor -> revisions -> -> General comments -> -> This paper describes a new implementation of the nominal_datatype package -> within the Isabelle/HL theorem prover. This implementation is more modular -> than previous versions, because it relies on (I think) three non-trivial -> independent packages, namely the datatype package, the function package, and -> the quotient package. This implementation is also more powerful than previous -> versions, because it deals with abstractions that bind multiple names -> at once, -> and because it offers two variants of these abstractions (baptised "set" and -> "set+") where certain structural equivalence laws, namely the exchange of two -> binders and the elimination/introduction of a vacuous binder, are built -> directly into the alpha-equivalence relation. -> -> Overall, I like the paper because it describes a useful piece of software, -> because the architecture of this software is quite non-trivial and well -> designed, and because the paper is written in a very understandable style. -> For these reasons, I believe the paper should be accepted. I do have a series -> of questions and suggestions for potential improvements and would be happy to -> review a revised version of the paper if the editor sees fit. +> [...] 
>
> My main criticisms of the paper are:
>
>
@@ -125,11 +87,19 @@
> The general format at the beginning of section 4 is very clear, but is in
> fact too general: not everything that can be written in this format makes
> sense. The authors then walk the reader through a series of
-> examples of what
+> examples of what
> is *forbidden* (with informal explanations why these examples are
> forbidden), but in the end, a positive definition of what is *permitted*
> seems to be missing.
->
+
+Yes, we very much agree: a positive definition would be very desirable.
+Unfortunately, this is not as easy as one would hope. Already a positive
+definition for a *datatype* in HOL is rather tricky, and we would have to
+rely on this for nominal datatypes. The only real defence we have for not
+giving a positive definition is that we give a sensible "upper bound". To
+appreciate this point, you need to consider that systems like Ott go beyond
+this upper bound and give definitions which are not sensible as (named)
+alpha-equated structures.
+
> * The authors have isolated an important building block, the notion of
> (multiple-name) abstraction (in Section 3). (Actually, there are three
> variants of it.) This is good: it makes the whole construction modular
@@ -148,36 +118,76 @@
> explain/motivate the design of the surface specification language in
> terms of these elementary notions, the paper might become all the more
> compelling.
->
+
+We are not sure whether we can make progress with this. There is such a
+"combinator approach" described by Francois Pottier for C\alphaML. His
+approach only needs an inner and an outer combinator. However, this has led
+to quite non-intuitive representations of "nominal datatypes". We attempted
+to be as close as possible to what is used in the "wild". This led
+us to isolate the notions of shallow, deep and recursive binders.
+
> In the present state of the paper, I think the *implementation* of the
> nominal package is very useful for the end user, but the *theory* that is
> presented in this paper is still a bit cumbersome: the definitions of free
> atoms, alpha-equivalence, etc. presented on pages 16-20 are understandable
> but not compelling by their simplicity.
->
+
+We agree - simpler would be much better. We refer the referee to "competing"
+definitions that are substantially more complicated; consider, for example,
+the notion of alpha-equivalence for Ott (which is published in the
+literature). So we really are of the opinion that our definitions are a
+substantial improvement, as witnessed by the fact that we were actually able
+to code them up.
+
> * I do not quite understand the treatment of the finiteness restriction.
> I understand that things must have finite support so as to allow picking
> atoms outside of their support. But finiteness side conditions seem to
> appear pretty early and in unexpected places; e.g. I would expect the
> support of a set of atoms "as" to be equal to "as", regardless of whether
> "as" is finite or infinite. This could be clarified.
->
+
+This is a peculiarity of support which has been explained elsewhere in the
+nominal literature. Assume all_atoms is the set of all atoms; then
+
+  supp all_atoms = {}
+
+For sets of atoms the support behaves as follows:
+
+  supp as = as               if as is finite
+  supp as = all_atoms        if as and all_atoms - as are infinite
+  supp as = all_atoms - as   if as is co-finite
+
+As said, such properties have been described in the literature and it would be
+overkill to include them again.
Since in programming language research nearly +always only finitely supported structures are of interest, we often restrict +to these cases only. Knowing that an object of interest has only finite support +is needed when fresh atoms need to be chosen. + > * The choice of abstraction "style" is limited to three built-in forms (list, > set, and set+). Perhaps one could make this user-extensible. After -> all, very -> few properties seem to be required of the basic abstraction forms, -> so why not -> let the user define new ones? -> +> all, very few properties seem to be required of the basic abstraction forms, +> so why not let the user define new ones? + +This would be nice, but we do make use of the equality properties +of abstractions (which are different in the set, set+ and list case). +So we have not found a unifying framework yet. + > * One may argue that the set-abstractions are an attempt to kill two birds > with one stone. On the one hand, we take the quotient raw terms modulo a > standard notion of alpha-equivalence; on the other hand, at the same time, > we take the quotient modulo a notion of structural equivalence (permutation > of binders, removal or introduction of vacuous binders). One could argue -> that dealing with structural equivalence should be left to the -> user, because +> that dealing with structural equivalence should be left to the user, because > in general the structural equivalence axioms that the user needs can be -> arbitrarily complex and application-specific. There are object languages, +> arbitrarily complex and application-specific. + +Yes, we can actually only deal with one non-trivial structural equivalence, +namely set+ (introduction of vacuous binders). For us, it seems an application +that occurs more often than not. Type-schemes are one example; the Psi-calculus +from the group around Joachim Parrow are another. So we like to give automated +support for set+. Going beyond this is actually quite painful for the user +if no automatic support is provided. + +> There are object languages, > for instance, where abstractions commute with pairs: binding a name in a > pair is the same as binding a name within each of the pair components. > (This is the case in first-order logic where forall distributes over @@ -188,7 +198,11 @@ > of structural equivalence anyway. I don't think that the paper provides > convincing evidence that set and set+ abstractions are useful. (That said, > they don't cost much, so why not include them? Sure.) -> + +From our experience, once a feature is included, applications will +be found. Case in point, the set+ binder arises naturally in the +psi-calculus. + > * Here is little challenge related to set-abstractions. Could you explain how > to define the syntax of an object language with a construct like this: > @@ -204,24 +218,9 @@ > this example, otherwise set-abstractions will not be very useful in > practice. ->> datatype trm = ->> Var string ->> | App trm trm ->> | Lam string trm ->> | Let "(string * trm) fset" trm ->> Not a problem. Both finite sets and bags should be possible as ->> constructors within the new package. ->> Best regards and a happy new year! ->> Andrei Popescu +We can. We added this as an example in the conclusion section. -> -> Detailed comments -> -> [Written while I was reading, so sometimes I ask a question whose -> answer comes -> a bit later in the paper.] -> > p.2, "this leads to a rather clumsy formalisation of W". Could you explain > why? 
> Although I can understand why in some circumstances it is desirable to
> have a notion of alpha-equivalence that includes re-ordering binders,
@@ -237,46 +236,88 @@
> with one another. For this reason, it is not clear that a notion of
> alpha-equivalence for type schemes is required at all, let alone that it must
> allow re-ordering binders and/or disregarding vacuous binders.
->
+
+In the correctness proof of W, the notion of capture-avoiding
+substitution plays a crucial role. The notion of capture-avoiding
+substitution without the notion of alpha-equivalence does not make
+any sense. The situation is different for *implementations* of W: they
+might get away with concrete representations of type-schemes.
+
> p.3, "let the user chose" -> "choose"
->
+
+Fixed.
+
> p.5, I am not sure what you mean by "automatic proofs". Do you mean
> automatically-generated proof scripts, or proofs performed automatically by a
> decision procedure, or ... ?
->
+
+Fixed.
+
> p.5, "adaption"
->
+
+Fixed.
+
> p.5, it seems strange to use the symbol "+" for composition, a
> non-commutative
> operation.
->
+
+Yes. This is a "flaw" in the type-class setup. If we want to reuse libraries
+from Isabelle, we have to bite into this "sour" apple.
+
> Equation (2.2) is unfamiliar to me. I am used to seeing "supp x" defined as
> the least set L such that for every permutation pi, if pi fixes L, then pi
> fixes x. I assume that the two definitions are equivalent? Is there a reason
> why you prefer this one?
->
+
+Our definition is equivalent to yours for finitely supported x.
+The advantage of our definition is that it can be easily stated
+in a theorem prover, since it fits into the scheme
+
+  lhs = rhs
+
+Your definition as the least set...does not fit into this simple
+scheme. Our definition has been described extensively elsewhere in the
+literature, to which we refer the reader.
+
> Proposition 2.3, item (i) is not very easy to read, because text and math
-> are mixed and "as" happens to be an English word. More importantly, could
+> are mixed and "as" happens to be an English word.
+
+Replaced by bs.
+
+> More importantly, could
> you explain why the hypothesis "finite as" is needed? The proposition seems
> intuitively true if we remove this hypothesis: it states exactly that
> "supp x"
> is the least set that supports x (this is actually the definition of "supp"
> that I expected, as mentioned above).
->
+
+It does not work for non-finitely supported sets of atoms. Imagine
+bs is the set of "even" atoms (a set that is neither finite nor co-finite).
+Then
+
+  bs supports bs
+
+but
+
+  supp bs = all_atoms
+
+which is not a subset of bs.
+
> p.8, "equivariant functions have empty support". I suppose the converse is
> true, i.e. "functions that have empty support are equivariant". If this is
> correct, please say so.
->
-> p.8, "we used extensively Property 2.1". You mean "Proposition 2.1". Perhaps
-> it would be good to choose distinct numbers for inline equations and for
-> propositions.
->
+
+We have reworded this paragraph.
+
+> p.8, "we used extensively Property 2.1". You mean "Proposition 2.1".
+
+Fixed.
+
> p.8, "we identify four conditions: (i) [...] x and y need to have the same
> set of free atoms". You seem to be saying that fa(x) and fa(y) should be
> equal. But this is too strong; I suppose you mean fa(x) \ as = fa(y) \ bs.
> Please clarify. (Definition 3.1 indeed clarifies this, but I believe that
> the text that precedes it is a bit confusing.)
->
+
+Fixed.
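+
+For the record, clause (i) of the revised text now refers to the free atoms
+of the pairs (as, x) and (bs, y); in the notation of Definition 3.1 this
+amounts to the condition suggested above, namely
+
+  fa(x) - as = fa(y) - bs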
+
> p.9, it seems to me that alpha-equivalence for Set+ bindings (Definition 3.3)
> is in a sense the most general of the three notions presented here. Indeed,
> alpha-equivalence for Set bindings can be defined in terms of it, as follows:
@@ -300,34 +341,73 @@
> more radical
> suggestion, could one decide to expose only Set+ equality to the programmer,
> and let him/her explicitly encode Set/List equality where desired?
->
+
+The second property holds outright. The first holds provided as and bs
+are finite sets of atoms. However, we do not seem to gain much code reduction
+from these properties, except for defining the modes Set and List in terms
+of Set+. But since they are three different modes that behave differently
+as alpha-equivalences, the encoding would, in our opinion, blur this difference.
+We noted these facts in the conclusion.
+
> p.10, "in these relation"
->
+
+Fixed.
+
> p.10, isn't equation (3.3) a *definition* of the action of permutations
> on the newly defined quotient type "beta abs_{set}"?
->
+
+Yes, in principle this could be given as the definition. Unfortunately, this is not
+as straightforward in Isabelle - it would need the function package
+and a messy proof that the function is "proper". It is easier to
+define permutation on abstractions as the lifted version of the permutation
+on pairs, and then derive (with the support of the quotient package) that
+(3.3) holds.
+
> p.11, why do you need to "assume that x has finite support" in order to
> obtain property 3.4? It seems to me that this fact should also hold for
-> an x with infinite support. Same remark in a couple of places further
+> an x with infinite support.
+
+Fixed.
+
+> Same remark in a couple of places further
> down on this page. You note that "supp bs = bs" holds "for every finite
> set of atoms bs". Is it *not* the case that this also holds for infinite
-> sets? If so, what *is* the support of an infinite set of atoms? Why not
+> sets? If so, what *is* the support of an infinite set of atoms?
+
+This property only holds for finite sets bs. See above.
+
+> Why not
> adopt a definition of support that validates "supp bs = bs" for *every*
> set of atoms bs? Is there a difficulty due to the fact that what you
> call a "permutation" is in a fact "a permutation with finite support"?
> I think it would be good to motivate your technical choices and clarify
> exactly where/why a finite support assumption is required.
->
+
+This cannot be done, since it is very natural to have the property
+that the set of all atoms has empty support: every permutation fixes
+that set. We cannot see how this can be made to work so that
+
+  supp bs = bs
+
+for all sets.
+
> p.11, "The other half is a bit more involved." I would suggest removing
> this rather scary sentence. The proof actually appears very simple and
> elegant to me.
->
+
+Fixed.
+
> p.12, "mutual recursive" -> "mutually recursive"
->
+
+Fixed.
+
> p.12, does the tool support parameterized data type definitions? If so,
> please mention it, otherwise explain whether there is a difficulty (e.g.
> the parameters would need to come with a notion of permutation).
->
+
+Yes. Note added.
+
> p.12, "Interestingly, [...] will make a difference [...]". At this
> point, upon
> first reading, this is not "interesting" but rather frustrating, because it
@@ -336,30 +416,41 @@
> u". Because
> a forward pointer is missing, I cannot find immediately where this is
> explained, and this problem hinders my reading of the beginning of section 5.
->
-> p.13, the type of sets now seems to be "fset" whereas it was "set"
-> previously.
->
+
+We have given a forward pointer.
+
> p.13, the type of atoms now seems to be "name", whereas it was previously
> "atom". The remark on the last line of page 13 leads me to understand that
> "name" refers to one specific sort of atoms, whereas "atom" refers to an
> atom of any sort (right?). The function "atom" converts one to the other;
> but what is its type (is it overloaded?).
->
+
+Yes, atom is overloaded.
+
> p.13, you distinguish shallow binders (binds x in ...) and deep binders
> (binds bn(x) in ...). I would hope that a shallow binder is just syntactic
> sugar for a deep binder where "bn" is the "singleton list" or "singleton
> set" function. Is this the case? If not, why not? If yes, perhaps you could
> remove all mentions to shallow binders in section 5.
->
+
+Not quite: a deep binder can be recursive, which is a notion
+that does not make sense for shallow binders.
+
> p.14, "we cannot have more than one binding function for a deep binder". You
> exclude "binds bn_1(p) bn_2(p) in t". Couldn't this be accepted and
> interpreted as "binds bn_1(p) \cup bn_2(p) in t"? (I guess it does not matter
> much either way.)
->
+
+That would make the definition of the fa-function more complicated.
+
> p.14, you also exclude "binds bn1(p) in t1, binds bn2(p) in t2". Two
> questions. First, a clarification: if bn1 and bn2 are the same function, is
-> this allowed or excluded? Second, I don't understand why you need this
+> this allowed or excluded?
+
+For practical purposes, this is excluded. But the theory would
+support this case.
+
+> Second, I don't understand why you need this
> restriction, that is, why you are trying to prevent an atom to be "bound and
> free at the same time" (bound in one sub-term and free in another). I
> mean, in
> the case of "binds x in t1 t2", an occurrence of x in t2 is bound
> (at least, you have not stated that you disallow this). There, occurrences of
> x in t1 are considered bound, whereas occurrences of x in t2 are considered
> free; is this correct? If so, why not allow "binds bn1(p) in t1, binds bn2(p)
-> in t2", which seems to be of a similar nature? Is this a somewhat ad hoc
-> restriction that simplifies your implementation work, or is there really a
-> deep reason why accepting this clause would not make sense?
->
+> in t2", which seems to be of a similar nature?
+
+The problem is that bn1 and bn2 can pick out different atoms to
+be free and bound in p. This forces us to exclude this case.
+
> p.14, example 4.4, the restriction that you impose here seems to rule out
> an interesting and potentially useful pattern, namely telescopes. A telescope
> is a list of binders, where each binder scopes over the rest of the
@@ -413,7 +505,9 @@
> issue. Is it conceivable that an extension of your system could deal with
> telescopes? Other researchers have proposed approaches that can deal with
> them (I am thinking e.g. of ``Binders Unbound'' by Weirich et al.).
->
+
+Indeed, we cannot deal with telescopes. We added a comment.
+
> Here is another general question. How would you declare a nominal data type
> for ML patterns? Informally, the syntax of patterns is:
>
@@ -429,17 +523,23 @@
> same sets of names. How would you deal with this? I imagine that one could
> just omit these two side conditions in the definition of the nominal data
> type, and deal with them separately by defining a well-formedness predicate.
+
+Yes, that is how we would exclude non-linear patterns.
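+For instance, a cut-down version of such a specification could look roughly
+as follows (a sketch in the notation of Section 4; constructor and function
+names are only illustrative and the pattern grammar is simplified):
+
+  nominal_datatype trm =
+    Var "name"
+  | App "trm" "trm"
+  | Let p::"pat" s::"trm" t::"trm"  binds bn(p) in t
+  and pat =
+    PVar "name"
+  | PPair "pat" "pat"
+  binder bn
+  where
+    "bn(PVar x) = [atom x]"
+  | "bn(PPair p1 p2) = bn(p1) @ bn(p2)"
+
+Linearity is then not enforced by the binding specification itself, but by a
+separate well-formedness predicate stating that bn(p) contains no duplicates.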
+ > One question: in the definition of the "term" data type, at the point where > one writes "binds bn(p) in t", which variant of the "binds" keyword would one > use: "binds", "binds(set)", or "binds(set+)"? Does it make any difference, > considering that a pattern can have multiple occurrences of a name in binding > position? It would be interesting if you could explain how you would handle > this example. -> -> Another interesting (perhaps even more tricky) example is the syntax of the -> join-calculus. In terms of binding, it is really quite subtle and worth a -> look. -> + +You can write + + binds bn(p) in t, + binds(set) bn(p) in t, + binds(set+) bn(p) in t + + > p.15, just before section 5, I note that the completion process does *not* > produce any clause of the form "binds ... in x" (in the Lam case). One could > have expected it to produce "binds x in x", for instance. One could imagine @@ -463,23 +563,9 @@ > side of a "binds" clause). As far as I can see, however, you have not > explicitly forbidden this situation. So, is it forbidden or allowed? Please > clarify. -> -> If there is indeed a partition between "terms" and "binders", please justify -> why things must be so. I can think of a more general and more symmetric -> approach, where instead of writing "binds bn(p) in t" and considering that "p -> is a binder" and "t is a term", one would write "binds bn(p) in p t" and -> consider that p and t play a priori symmetric roles: the only difference -> between them stems from the fact that we collect the bound names -> inside p, but -> not inside t. (I am not suggesting that the user should write this, but that -> the user syntax could be desugared down to something like this if this makes -> the theory simpler.) Ah, but I guess that if one were to follow this path, -> then one would need a way of distinguishing recursive versus non-recursive -> binders. I guess I see why your design makes sense, but perhaps you should -> better explain that it is a compromise between several other possible designs -> (``alphacaml'', ``binders unbound'', etc. are examples of other designs) and -> how you reached this particular point in the design space. -> + +It is disallowed. We clarified the text. + > OK, now I see that, since you allow ``recursive binders'', there is not a > partition between ``terms'' and ``binders''. A recursive binder appears both > on the left- and right-hand sides of a binds clause. Do you require that it @@ -490,12 +576,7 @@ > second clause. This would be kind of weird, and (I imagine) will not lead > to a reasonable notion of alpha-equivalence. I am hoping to find out later > in the paper. -> -> p.17, "we have to add in (5.3) the set [...]". It is not very clear whether -> you are suggesting that equation 5.3 is incomplete and something should be -> added to it, or equation 5.3 is fine and you are referring to B' which is -> there already. I suppose the latter. -> + > p.17, "for each of the arguments we calculate the free atoms as > follows": this > definition relies on the fact that "rhs" must be of a specific *syntactic* @@ -508,45 +589,40 @@ > you are not restricting just *the values* that these functions can > return, but > the *syntactic form* of these functions. -> + +Fixed. + > p.23, "We call these conditions as": not really grammatical. -> + +Fixed. + > p.23, "cases lemmas": I suppose this means an elimination principle? -> + +Yes. + > p.23, "Note that for the term constructors" -> "constructor" -> + +Fixed. 
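+
+Concerning the recursive-binder question above: with a recursive deep binder
+the binding function appears on both sides of the binds clause, roughly as in
+a recursive let (again only a sketch; the names are illustrative):
+
+  | Let_rec as::"assn" t::"trm"  binds bn(as) in as t
+
+that is, the atoms specified by bn(as) are bound in as itself as well as in t.
+Since a single binding function bn determines which atoms of as are bound,
+the situation where one occurrence of a name is bound and another free does
+not arise here.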
+ > p.26, "avoid, or being fresh for" -> "avoid, or are fresh for" -> + +Fixed. + > p.30, "Second, it covers cases of binders depending on other binders, > which just do no not make sense [...]". I am curious why the designers > of Ott thought that these cases make sense and you don't. Perhaps this > point would deserve an example and a deeper discussion? -> -> p.30, at last, here is the discussion of "binds ... in s t" versus -> "binds ... in s, binds ... in t". I see that the difference in the -> two interpretations boils down to an abstraction whose body is a pair, -> versus a pair of abstractions. It is indeed interesting to note that -> these notions coincide for single-name abstractions, and for list -> abstractions, but not for set and set+ abstractions. -> + +We have added more comments why such binders would not make sense +for alpha-equated terms (fa-functions would not lift). We do not +know why Ott allows them - probably because they do not establish +a reasoning infrastructure for alpha-equated terms. + > p.32, "It remains to be seen whether properties like [...] allow us > to support more interesting binding functions." Could you clarify > what you mean? Do you mean (perhaps) that fa_bn(x) could be defined > as fa_ty(x) \ bn(x), regardless of the definition of bn(x), instead > of by induction over x? Do you mean something else? -> -> The example in Figures 1 and 2 do not seem very interesting to me. It -> involves single binders and flat lists of binders. Not much subtlety going on -> here. I think this example could be reduced in size without losing -> anything in -> terms of content. And perhaps a trickier example could be added (I have two -> suggestions, which I mentioned above already: ML with conjunction and -> disjunction patterns; and the join-calculus). -> -> -> -> -> -> -> +Yes. +
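+That is, the property in question would allow one to take
+
+  fa_bn(x) = fa_ty(x) - bn(x)
+
+as the definition, independently of how bn(x) is defined, rather than
+defining fa_bn by recursion over x.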