Logic & Programming

Danko Ilik's research blog

Wim Veldman taught me classical model theory and classical descriptive set theory during the 2006-2007 masterclass in logic. I continued to learn from him after the masterclass, as he had important results about Kripke models, the fan theorems, bar induction, and open induction, topics that I was and still am interested in. He very kindly agreed to be a member of the jury for my PhD thesis defense and came to Paris for the occasion. I also remember receiving a review of a paper – a very objective one – that was expressly signed by him, although it was supposed to be an anonymous review.

Wim was in fact an intuitionistic mathematician, in the original sense of the word, a student of J. J. de Iongh, who had learned about intuitionism from L. E. J. Brouwer. This is one of the aspects that made Wim a very interesting teacher of model theory and descriptive set theory. During the classes, he would warn us students of an upcoming powerful use of the axiom of choice in the proof of a theorem, so that we would be aware of the danger, or reassure us when the choice principle was only used to deal with a countable language. When the law of excluded middle was necessary, he would look upwards and invoke the Book of God, mockingly (although he was deeply religious, I think), to point out the gravity of the act of invoking LEM.

Model theory became more constructive for me thanks to these clarifications. Model theory is constructive, as a matter of fact, at least in the sense that it contains a wealth of counter-models that a constructive mathematician could use, as a guide, as a complement, while looking for constructive proofs or definitions.

Another aspect that made Wim an excellent teacher was his teaching technique. He had a very developed sense of how much information to give out, wanting to draw our full attention toward completing the picture he meant to transmit. Sometimes he would put his hands behind his back, look at us and pose a question, and I had the impression that he was hiding the answer in his closed fist, and that we only needed to guess what it was. At the exams, the questions did not ask us to prove or disprove a particular statement; they asked us to make up our own minds about the truth of the statement, and then either prove it or prove its negation.

I last saw Wim a couple of years ago in Nijmegen. We met up for dinner in the city center and walked across the Waal River and back. He mentioned in passing his recent long stay at a hospital, but it had not stopped him from taking his bike to come to meet me and from having a very long and enthusiastic discussion on logic, life, music and literature. We exchanged a few emails after that, one on the intuitionistic arithmetical hierarchy. I had been following his latest papers as a reviewer or on arXiv. The last one I read was a beautiful and elementary account of the basic ideas around intuitionism, Intuitionism: an inspiration?, intuitionism in the original sense of the word.

He passed away on November 30, 2024. My condolences to his family and friends. I will continue to gladly remember Wim.

Comment here: @danko@mamot.fr Follow using the Fediverse ID: @danko@blog.iaddg.net

by Makoto Fujiwara, Hajime Ishihara, Takako Nemoto, Nobu-Yuki Suzuki and Keita Yokoyama

The authors introduce a new notion of model, called extended frame, that consists of two Kripke frames and a monotone map between them. They then use it to show the intuitionistic non-provability of certain omniscience principles, variants of the excluded middle, of double-negation elimination, of de Morgan laws, and their first-order counterparts MP, LPO, WLPO, LLPO, $\Delta_1$-PEM.

It is a new, simple and uniform method for separating these semi-classical principles; previously, the separation results had to be obtained by building dedicated realizability models. For the moment, the approach seems to be limited to principles obtained from propositional formulas by replacing a proposition with a $\Sigma_1$-predicate, which does not cover essentially first-order principles like the Double Negation Shift.
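The extended frames build on ordinary Kripke semantics. As a reminder of how already a single Kripke model refutes such principles at the propositional level, here is a minimal, hypothetical sketch (illustrative code, not from the paper): the two-world model below refutes both excluded middle and double-negation elimination.

```python
# Minimal Kripke-model checker for intuitionistic propositional logic.
# Formulas are tuples: ('var', p), ('and', A, B), ('or', A, B), ('imp', A, B), ('bot',).

def forces(worlds, le, val, w, f):
    """Decide whether world w forces formula f; le(w, v) means v extends w."""
    tag = f[0]
    if tag == 'bot':
        return False
    if tag == 'var':
        return f[1] in val[w]
    if tag == 'and':
        return forces(worlds, le, val, w, f[1]) and forces(worlds, le, val, w, f[2])
    if tag == 'or':
        return forces(worlds, le, val, w, f[1]) or forces(worlds, le, val, w, f[2])
    if tag == 'imp':  # implication quantifies over all future worlds
        return all(forces(worlds, le, val, v, f[2])
                   for v in worlds
                   if le(w, v) and forces(worlds, le, val, v, f[1]))
    raise ValueError(tag)

def neg(a):
    return ('imp', a, ('bot',))

# Two worlds 0 <= 1, with p true only at the later world (a monotone valuation).
worlds = [0, 1]
le = lambda w, v: w <= v
val = {0: set(), 1: {'p'}}
p = ('var', 'p')

print(forces(worlds, le, val, 0, ('or', p, neg(p))))        # False: excluded middle fails at 0
print(forces(worlds, le, val, 0, ('imp', neg(neg(p)), p)))  # False: double-negation elimination fails at 0
```

The contribution of the paper is, of course, past this point: an extended frame combines two such frames with a monotone map between them, in order to handle the $\Sigma_1$-predicates occurring in the first-order principles.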


A new version of this draft paper has been posted here: https://arxiv.org/abs/2401.13304

Hugo Herbelin is the principal author of this paper.

For some earlier preliminary work, see my PhD thesis and the formalization (this old formalization will be updated soon).


At the invitation of the Philosophical Society of Macedonia and the Institute of Philosophy at UKIM, on 13 January 2024 I gave a lecture over Zoom with the following abstract:

In this lecture we will explain the phenomena of completeness and incompleteness of formal logical systems, from their historical genesis, through the theorems of Gödel and of Kirby-Paris, all the way to some of the perspectives that these results have opened, or left open, to this day.

Link to the slides: https://iaddg.net/danko/den-na-logikata-2024.pdf

The video online: https://blog.iaddg.net/dash/sdl2024/

The video for download via torrent: https://iaddg.net/danko/sdl2024.torrent


by Kawano, Tomoaki; Matsuda, Naosuke; Takagi, Kento

The paper studies the equivalence of intuitionistic and classical validity in propositional logic, for a language that contains as its only logical connectives the ones that represent monotone Boolean formulas.

It is shown that, for this class of formulas, intuitionistic and classical validity coincide. The notion of classical validity is the usual one (sometimes called Tarski semantics), while, for the notion of intuitionistic validity, the authors use what they call the extended Kripke interpretation: for every connective, a universal quantification $v\ge w$ (meaning “in a possible world $v$ extending the world $w$”) is prefixed before the classical interpretation of the connective.
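Concretely, if I read the definition correctly, the clause for disjunction, for example, becomes

$$ w \Vdash A \vee B \iff \forall v\ge w\,(v \Vdash A \text{ or } v \Vdash B), $$

so that the classical, truth-functional reading of each connective is relativized to all possible worlds extending the current one.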

The result is interesting, because usually intuitionistic and classical validity are shown to coincide for classes of formulas such as the so-called “negative” formulas (which exclude the authentically intuitionistic connectives of disjunction and existential quantification). The class of monotone Boolean formulas thus in a way generalizes the class of negative ones.

The result shown is said to answer an open question posed by van der Giessen about the proof theory of intuitionistic and classical natural deduction calculi. However, the paper does not spell out in sufficient detail the link between the model theory it develops and the proof theory of the open question (is there a soundness and completeness theorem for the notion of validity used, with respect to the calculi?).


My PhD student Wendlasida Ouedraogo defended his PhD thesis on “Source code optimization for safety critical software” at the École Polytechnique in Palaiseau.

He showed how to do provably correct source-code transformations for a fragment of the Ada language, used in real-world critical software, inside the Coq proof assistant. Using such transformations, one can optimize programs at the entry of a compiler, without the burden of proving simulation theorems between the various compiler intermediate languages, as is usually done (e.g. in CompCert).

He also developed CoqLex, a tool-chain for writing provably correct lexers; lexing was previously a compilation phase that was not treated formally.

Congratulations and the best of luck to Wendlasida!


This book shows how to formalize some very basic mathematical notions from Logic, Abstract Algebra, Analysis, Linear Algebra, Differential Equations and Probability Theory inside the functional programming language Haskell. Haskell is appropriate for expressing such notions: it encourages the use of higher-order functions (resulting in concise, category-theory-like formulations); its variables are immutable (just as in Mathematics, eliminating the computational side-effects related to variable mutation in imperative programming languages); it has algebraic data types (allowing inductive/recursive formulations and reasoning); and it has type classes (useful for treating mathematical theories in a compositional way). These same features are shared by proof assistants such as Coq, which have been used extensively for the formalization of Mathematics. What is different in the approach of this book, with respect to formalization efforts in settings with richer type systems, is that it allows one to build up an abstract mathematical concept more rapidly and experiment with it computationally, making it an interesting setting for doing experimental mathematics.

The book could be useful (indeed, it started as a coursebook for undergraduate students) for Mathematics students studying functional programming or Computer Science students studying Mathematics, but it could also be useful for researchers learning how to efficiently formalize their favorite mathematical theory inside a proof assistant. The formulations of the exponential function, the Laplace transform and the “axiomatization” of probability are elegant (concise!). One could perhaps regret the absence of a definition of the real numbers in the framework (real numbers are treated abstractly).


The paper studies the algebraic notion of Jacobson radical as a set-theoretic notion. For a logician, it seems easiest to understand the Jacobson radical logically: the radical $\text{Jac}(T)$ of a theory $T$ is the set of all propositions $a$ such that an inconsistency of $a$ with a proposition $b$ translates to an inconsistency of $T$ with the proposition $b$: $$ \text{Jac}(T) = \{ a ~|~ \forall b(a,b\vdash\bot \to T,b\vdash\bot)\}. $$ It is a concept used in logic, for instance, in the construction of maximally consistent extensions in the proof of the completeness theorem.

The paper studies a generalized set-theoretic formulation of the Jacobson radical and shows how it is applicable both in algebra and in logic. The generalization is obtained by replacing the algebraic statement $1\in\langle a,b\rangle$ and the logical $a,b\vdash\bot$ by a predicate $R(a,b)$ on subsets $a,b$ of the universe, called an “inconsistency predicate”.
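For comparison, replaying the logical formula above with the algebraic inconsistency statement, the radical of an ideal $\mathfrak{a}$ presumably reads

$$ \text{Jac}(\mathfrak{a}) = \{ a ~|~ \forall b(1\in\langle a,b\rangle \to 1\in\langle \mathfrak{a},b\rangle)\}, $$

with $\langle\cdot\rangle$ denoting the generated ideal (the notation here is mine, inferred from the two instances of the inconsistency predicate mentioned above).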

It is shown (Theorem 1) that the membership to the Jacobson radical is an inductively generated relation, and that this holds constructively, in CZF.

In Theorem 2, it is shown, via Theorem 1, Raoult's Open Induction principle and classical logic, that $\text{Jac}(U)$ is equal to the intersection of all possible maximally consistent extensions of $U$; this holds in ZFC.

The paper then gives applications to algebra and logic, among them that the universal closure of Theorem 2 is equivalent to the axiom of choice (Proposition 3), and a derivation of Glivenko's theorem for propositional logic.


The paper precisely formalizes Martin-Löf constructive type theory with the W-type (inductive definitions) and one universe, in its intensional and extensional variants, and in its variants “à la Tarski” and “à la Russell”, as well as a version/extension of classical Kripke-Platek set theory. It is then shown how to interpret the extensional type theory into Kripke-Platek set theory, from which an upper bound on the proof-theoretic strength is obtained. Together with previous results on a lower bound for intensional type theory (and an embedding into the extensional one), one obtains the ordinal $\psi_{\Omega_1}\Omega_{I+\omega}$ as a precise characterisation of the proof-theoretic strength of all the variants of the type theory.

This is actually the most precise and succinct formulation of type theory that I have seen, and it should be helpful for relative interpretations of/inside type theory, as well as for implementing proof tools.


Paper by Yoshiki Nakamura and Naosuke Matsuda

Let IL and CL denote the fragments of intuitionistic and classical propositional logic having implication as the sole logical connective. Call a formula (a theorem) α of CL “minimal in CL” if α cannot be obtained from another formula β ∈ CL by substituting some formula γ for a propositional variable p. In a way, α being minimal in CL means that α is apt to be an axiom (schema) of an intermediate logic, since its form is general enough that it cannot be obtained as an instantiation of another, more general axiom (schema). IL and CL themselves can be completely axiomatized by such formulas, and Komori and Kashima posed the question whether one can axiomatize a proper intermediate logic using such a formula.

The authors show that it can be done, by displaying an explicit formula G′, a variant of (and equivalent to) the formula G := ((a → b) → c) → ((b → a) → c) → c, which, when added to IL, produces the intermediate Gödel-Dummett logic. One could not simply take G itself as the required formula, because it is not minimal in CL.
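For context, a standard fact not from the paper: over the full propositional language, Gödel-Dummett logic is axiomatized over intuitionistic logic by the linearity axiom, and G is recognizably its implication-only counterpart, with the disjunction encoded by means of the fresh variable $c$:

$$ (a\to b)\lor(b\to a) \quad\rightsquigarrow\quad ((a\to b)\to c)\to((b\to a)\to c)\to c. $$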

The authors also discuss three related open problems.
