Yes, that is a valid alternative encoding: https://en.wikipedia.org/wiki/Church_encoding#Represent_the_...

In general, what you describe is representing an inductive type by its _eliminator_: https://www.quora.com/In-type-theory-what-is-an-eliminator-a...

There are many different ways to encode lists. You are using the Church encoding, where a value is a fold,

```
nil = λn.λc.n
cons = λx.λxs.λn.λc.c x (xs n c)
fold = λf.λz.λxs.xs z f
```
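To make the "a list is its own fold" idea concrete, here is a hedged sketch of the same Church terms as Python one-argument lambdas (names `NIL`, `CONS`, `FOLD` are just my transliterations of the λ-terms above):

```python
# Church-encoded lists: a list IS its own fold.
# nil  = λn.λc.n
# cons = λx.λxs.λn.λc.c x (xs n c)
NIL  = lambda n: lambda c: n
CONS = lambda x: lambda xs: lambda n: lambda c: c(x)(xs(n)(c))

# fold = λf.λz.λxs.xs z f  -- just hands the list its two arguments
FOLD = lambda f: lambda z: lambda xs: xs(z)(f)

# Example: sum the list [1, 2, 3] by folding with +
xs = CONS(1)(CONS(2)(CONS(3)(NIL)))
total = FOLD(lambda h: lambda acc: h + acc)(0)(xs)  # 1 + (2 + (3 + 0))
```

Note that `FOLD` does no work of its own: the list itself performs the fold, which is exactly the point of the encoding.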

Another option is the Scott encoding, where a value is a case distinction:

```
nil = λn.λc.n
cons = λx.λxs.λn.λc.c x xs
fold = Y (λfold.λf.λz.λxs.xs z (λhead.λtail.f head (fold f z tail)))
```
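The same sketch for the Scott terms, again in Python one-argument lambdas (my names; note Python is strict, so the fixed point must be the Z variant rather than the textbook Y):

```python
# Scott-encoded lists: a value is only a case split; recursion has to come
# from an explicit fixed-point combinator.
NIL  = lambda n: lambda c: n
CONS = lambda x: lambda xs: lambda n: lambda c: c(x)(xs)

# Z combinator: the strict-language fixed point (plain Y would loop forever
# under eager evaluation).
Z = lambda f: (lambda a: f(lambda v: a(a)(v)))(lambda a: f(lambda v: a(a)(v)))

# fold = Y (λfold.λf.λz.λxs. xs z (λhead.λtail. f head (fold f z tail)))
FOLD = Z(lambda fold: lambda f: lambda z: lambda xs:
         xs(z)(lambda head: lambda tail: f(head)(fold(f)(z)(tail))))

# Example: sum the list [1, 2, 3]
xs = CONS(1)(CONS(2)(CONS(3)(NIL)))
total = FOLD(lambda h: lambda acc: h + acc)(0)(xs)
```

Unlike the Church version, here the list only knows how to distinguish nil from cons; the fold has to recurse explicitly, which is why the definition needs the fixed-point combinator.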

And then there is the 3-tuple thing proposed in the article.

I'm confused by their "3 part linked list". I always thought the elegance of lambda calculus was that each data structure was equivalent to the method by which you use it. A boolean *is* a function that does a conditional statement. A number *is* a function that does a loop. So logically, a list *is* a function that does a fold, like "LIST :: (a -> b -> a) -> a -> a".

It's things like this that make me wonder what the world would be like if we had embraced the Church-style Lambda Calculus model instead of the Turing system. I don't know if it would be better or worse, but I think it's an interesting thought experiment.

I've been wondering why a lot of things are the way they are in functional programming, not being very lambda inclined myself, I didn't seek it out, but this post summarized a lot that has been on my mind lately.

In particular, and with apologies for being so clueless – it never occurred to me why currying was necessary – I didn't realize lambda calculus required functions to take only one argument. With that limitation, currying is genius.
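A hedged illustration of the commenter's point: any two-argument function can be rewritten as a chain of one-argument functions, which is all the lambda calculus permits (the names `add` and `add_curried` are mine):

```python
# An ordinary two-argument function...
def add(x, y):
    return x + y

# ...curried into a chain of one-argument functions,
# as in the lambda calculus: add = λx.λy. x + y
add_curried = lambda x: lambda y: x + y

# Partial application falls out for free:
add_two = add_curried(2)
result = add_two(40)  # same as add(2, 40)
```

The bonus is that a curried function applied to fewer arguments than it "needs" is still a perfectly good value, which is where partial application comes from.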

This is a jump straight into the deep end. I can't recommend starting here. "How to Prove It" https://www.amazon.com/How-Prove-Structured-Approach-2nd/dp/... would be a much more appropriate start.

Naw, straight into the deep end is to start with reading the Homotopy Type Theory book... https://homotopytypetheory.org/book/

I also recommend Types and Programming Languages by Benjamin Pierce. It is a delightful textbook well suited to self study.

Tangential book recommendation: I have found the first chapters of "Type Theory and Formal Proof" a great in-depth introduction to the lambda calculus and its typed variants.

> Thou shalt not make use of French if thou doest not know the language.

The ironing is delicious.

The washing's even better.

Which is pronounced [doh] (homer simpson style) ;)

deaux!

Thou shalt not make use of French if thou doest not know the language. Deaux is a village in France (you can google it).

I also wrote one at

http://tromp.github.io/cl/Binary_lambda_calculus.html#Brainf...

to demonstrate the invariance theorem.

A while ago I wrote a Brainfuck interpreter in lambda calculus. I abused the Clojure macro system to make my job a little easier, and of course, I wrapped it in an interface that could do IO, but this does compile to a single pure lambda calculus function which takes a (church encoded) brainfuck program and input string and returns an output string.

https://github.com/MichaelBlume/lambdafuck/blob/master/src/l...

Turing discovered the universal machine. Church tried convincing Gödel that the lambda calculus is universal, but Gödel wasn't ready to accept it.

When Turing defined his machine model as a model for mechanical computation, Gödel was convinced.

In an appendix to that paper, Turing also showed that his model was computationally equivalent to the lambda calculus. This involved writing a "universal" lambda expression, i.e. a combinator. (Y combinator is the most famous one, I think that was discovered by Haskell B. Curry.)

It was when Turing proved this equivalence that Gödel accepted the lambda calculus as a model of intuitive computation.

In between, I have heard that Church's student, Kleene, did formulate fixed-point combinators when proving addition was computable in the lambda-calculus. Kleene was then convinced that this was a universal model of computation. I do not know why Church did not follow this up.

So Turing does have some primacy over Church.

*edit: Most of Turing's work on this was done as an undergraduate at Cambridge, /before/ he became Church's doctoral student. So the work was not done under Church's supervision.

If you want to read further, you can read Soare's work on this tangled history:

http://www.people.cs.uchicago.edu/~soare/History/compute.pdf

https://en.wikipedia.org/wiki/History_of_the_Church%E2%80%93...

Surely Church must also have discovered the universal machine? Church's major result was the undecidability of the halting problem, and to set up the diagonalization proof you need a notion of interpreting a program. Looking at his paper[0], he has

THEOREM XII. It is possible to associate simultaneously with every well-formed formula an enumeration of the formulas obtainable from it by conversion, in such a way that the function of two variables, whose value, when taken of a well-formed formula A and a positive integer n, is the n-th formula in the enumeration of the formulas obtainable from A by conversion, is recursive.

THEOREM XVI. Every recursive function of positive integers is λ-definable.

So approximately, Theorem XII gives an algorithm for evaluating a program A for n steps, and Theorem XVI says that this algorithm can be compiled into a λ-calculus program.

Church cites a paper by Kleene for this construction[1]. It's true that Kleene doesn't prove this using a general recursion combinator like the Y combinator; instead he first shows how to λ-encode primitive recursive functions, and gives a combinator for the μ operator [2].

When you talk about Church "not following up" the notion of universality, I'm not sure what more you would want him to do. Of course he did not prove that λ-calculus was equivalent to Turing machines, because those had not been invented yet. But he and Kleene *did* prove that the λ-definable functions coincide with the μ-recursive functions, which was the best known model of universal computation. And he argues ([0] section 7) that this captures the intuitive notions of "a function for which there exists an algorithm" and "a function which can be proven to have a given value".

[0] https://www.ics.uci.edu/~lopes/teaching/inf212W12/readings/c... [1] https://projecteuclid.org/download/pdf_1/euclid.dmj/10774894... [2] https://en.wikipedia.org/wiki/%CE%9C-recursive_function

To add to other good answers here, I think there is something of a practical-versus-academic bias to be seen here as well. It was always hinted, and has recently been verified as documents have been declassified, that much of Mr. Turing's effort after World War II went into exploring the consequences and capabilities of devices he had actually managed to build during the war. He was unable to continue working on them directly afterward, so he did what he could, skirting restrictions on which tools he could use and what he could tell people, based on what remained classified.

I think that makes Mr. Church and his American team's work all the more interesting because they hadn't built practical machines and presumably never saw practical machines and worked almost exclusively in theory.

But it's tougher for our culture sometimes to credit the theorists and academics than it is to credit practical genius. Especially with the declassified documents of the incredible practical computing work Mr. Turing accomplished during the war (and helping the war effort), it is easy to give him a lion's share of the credit and ignore some of the equally fascinating/important academic work of Mr. Turing's contemporaries. (Especially knowing in retrospect that the practical work fed the academic work and vice versa.)

Turing was Church's doctoral student, too. In fact, Church taught many of the greatest computer scientists of the last century. Perhaps he doesn't get as much credit as he should because his influence is so pervasive and impossible to separate from his peers' accomplishments.

Perhaps the manner of his last years and eventual death is the reason for that. The man has a statue on a bench in Manchester. It's in the gay village.

Turing gets more credit in popular culture, but academically Church is more influential. Just look at the list of Church's PhD students: http://www.genealogy.ams.org/id.php?id=8011

I have no clue why Turing became such a pop-culture icon, though...

Turing doesn't really have the status of a pop-culture icon. Outside of the phrase "Turing complete", and maybe historical accounts around breaking the Enigma, I doubt most people have even heard of him.

He's definitely more known than Alonzo Church, though.

What about "The Imitation Game"? I suppose that movie made Turing quite a well-known figure, especially vis-à-vis Church.

>I have no clue why Turing became such a pop-culture icon, though...

Turing had personal attributes that culture-makers in NY and Hollywood want to promote to people.

Idunno, they made two equivalent models of computation. I think that Alan's discrete computational model is more useful for programmable computers, Alonzo's is more useful for FPGAs and circuitry. Alan just turned out to have the model which was most amenable to direct mapping of a stored-program computer. If you want to run Alonzo's model on a stored program computer, you need a compiler. And to do it efficiently, you need a better compiler than has ever been devised.

I often feel that Alan gets adequate credit but Alonzo too little. There must be something I don't get.

I don't think I saw it as an attempt to be original or funny. It's more interesting and educational in that it shows how to build up a simple program from the very limited operations of lambda calculus.

See, the original post about whiteboarding in LISP was original and had a certain humor to it.

I can't say this was even close in terms of enjoyment or originality. Maybe I missed the point?

Factorials in pure lambda calculus:
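The snippet the commenter points at isn't reproduced here, but the idea can be sketched (a hedged illustration, not the original code): factorial built from nothing but one-argument lambdas, using Church numerals and the strict Z fixed-point combinator, in Python. All names are my own transliterations:

```python
# Church numerals: n = λf.λx. f applied n times to x
ZERO = lambda f: lambda x: x
SUCC = lambda n: lambda f: lambda x: f(n(f)(x))
MUL  = lambda m: lambda n: lambda f: m(n(f))

# Booleans and pairs, for predecessor and the zero test
TRUE  = lambda a: lambda b: a
FALSE = lambda a: lambda b: b
PAIR  = lambda a: lambda b: lambda s: s(a)(b)
FST   = lambda p: p(TRUE)
SND   = lambda p: p(FALSE)

# Predecessor via the classic pair trick: walk (i-1, i) up from (0, 0)
PRED = lambda n: FST(n(lambda p: PAIR(SND(p))(SUCC(SND(p))))(PAIR(ZERO)(ZERO)))

# IS_ZERO n = TRUE if n is zero, else FALSE
IS_ZERO = lambda n: n(lambda _: FALSE)(TRUE)

# Z combinator: strict fixed point (Y would diverge under eager evaluation)
Z = lambda f: (lambda a: f(lambda v: a(a)(v)))(lambda a: f(lambda v: a(a)(v)))

# FACT n = if n == 0 then 1 else n * FACT (n - 1)
# (both branches are thunked so the recursive call is only forced when taken)
ONE  = SUCC(ZERO)
FACT = Z(lambda rec: lambda n:
         IS_ZERO(n)(lambda _: ONE)(lambda _: MUL(n)(rec(PRED(n))))(None))

# Decode a Church numeral back to a Python int
to_int = lambda n: n(lambda k: k + 1)(0)

three = SUCC(SUCC(SUCC(ZERO)))
fact3 = to_int(FACT(three))  # 3! = 6
```

The only non-λ conveniences here are Python's native `+` inside `to_int` and the `None` used to force the chosen thunk; everything else is pure application of one-argument functions.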

Another tangential reading recommendation: I only started reading about Lambda Calculus a few days ago. The Imposter's Handbook does a good job covering it, but also praises this medium article that's super helpful (and also explains the y combinator): https://medium.com/@ayanonagon/the-y-combinator-no-not-that-...

Reminded me about good old https://codon.com/programming-with-nothing

Oh, it definitely can be simplified a bunch. There's all sorts of stuff to prune out. I didn't even attempt to try and code golf at all. There's library definitions included in that snippet that probably aren't needed.

> Anyway, here’s fizz buzz in pure lambda calculus:

Serious question: if the code is legit and can't be simplified any shorter (like you could in any C-style language), can we consider lambda calculus the string theory of programming? By that I mean lots of work for little gain, as it turned out to be nowadays (I know it has gains in other fields).

I can imagine it as an assembly-like compile target, which would explain the verbosity.

The same thing, but in Ruby stabby-proc notation (from 2012) by Tom Stuart:

https://www.youtube.com/watch?v=VUhlNx_-wYk

Well, perhaps not *exactly* the same thing, but only because Ruby requires the Z-combinator due to applicative order.
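The Ruby detail generalizes: in any strict (applicative-order) language the textbook Y combinator diverges, so you need the eta-expanded Z variant. A hedged sketch in Python:

```python
# Y = λf.(λx.f (x x))(λx.f (x x)) diverges under eager evaluation,
# because (x x) is evaluated before f is ever called.
# Z = λf.(λx.f (λv.(x x) v))(λx.f (λv.(x x) v)): the eta-expansion
# (λv. ... v) delays the self-application until it is actually needed.
Z = lambda f: (lambda a: f(lambda v: a(a)(v)))(lambda a: f(lambda v: a(a)(v)))

# Tying the knot for a recursive function without named recursion:
fib = Z(lambda rec: lambda n: n if n < 2 else rec(n - 1) + rec(n - 2))
```

Replacing Z with the plain Y definition here makes the self-application recurse immediately and overflow the stack before `fib` is ever applied, which is exactly why the Ruby version needs Z too.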

And a more severe route would be to allow just S and K.

Just use Lazy K: https://tromp.github.io/cl/lazy-k.html

Instead of pure lambda calculus, another less severe route would be to simply restrict yourself to Church-encoded or Scott-encoded formulations of data structures.

I love this so much.

There's an aside about how I've chosen precedence. The way I've defined operator precedence and binding, (\x.x) y is not a valid expression (parentheses *only* mean function application), and (λx.x y) returns y. You can argue I defined things wrong, but it is all in how you define things.

Nice article! But you have a typo.

(λx.x y)

does not return y, though

(\x.x) y

does.