The term "axiomatic system" is consequently misused in the article. Probably the author had a "formal system" in mind, an axiomatic system is one in which one can deduce theorems from axioms and I don't think lambda calculus is such a system. A lot of the statements in the article are so vague and imprecise, it is hard to assign any meaning to them. Just look at the fragment of the article:
"Common Lisp is one answer to this question. Its core is an axiomatic system, the lambda calculus, which John McCarthy extended just enough to write an interpreter for the resulting calculus in itself."
The lambda calculus is not at the core of Common Lisp, or even of Lisp; it was just an inspiration for Lisp's model of computation.
"Common Lisp augments the interpreter with other automata on the same substrate which make construction straightforwards. Though the choice of automata may vary, the essential idea of reifying the act of construction is universal."
Can someone explain what he is trying to say here? You can build languages with extremely different semantics while still keeping S-expressions and centering the language around function application and closures; for example, you can choose lazy evaluation over eager evaluation and end up with a very different language. In no way is the lambda calculus some immutable core of Lisp; Lisp's model of computation is much more complex and arbitrary.
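A rough sketch of that point (toy delay/force definitions of my own, not anything from the article): lazy evaluation can be layered on top of eager Common Lisp with nothing but a macro and closures, on the same s-expression substrate:

  (defmacro delay (expr)
    "Wrap EXPR in a thunk; it is not evaluated until FORCE is called, and then only once."
    `(let ((value nil) (forced nil))
       (lambda ()
         (unless forced
           (setf value ,expr
                 forced t))
         value)))

  (defun force (thunk)
    "Run a delayed expression, memoizing the result."
    (funcall thunk))

  ;; (force (delay (expensive-computation)))
  ;; EXPENSIVE-COMPUTATION is a placeholder; it runs only when forced.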
"Common Lisp and Haskell are built around variants of the lambda calculus. Each substrate makes certain approaches to problems natural, though neither is a natural vehicle for human thought. The brain’s adaptation to the lambda calculus is usually experienced as an epiphany."
Seldom, if ever, does programming in Lisp or Haskell have anything to do with the lambda calculus, unless of course you mean it in a way so vague the term loses its meaning. He then goes on to essentially describe how Haskell is referentially transparent and Common Lisp is not, but he makes it sound like some philosophical difference deeply rooted in the ideas behind Lisp or Haskell, while it's just a design decision where each option has specific tradeoffs. To me all of this is just "metaphysical" mumbo-jumbo; the metaphor comparing a programming language to an artist's medium, from which he starts, doesn't bring any practical insight into anything. I think some people like it only because the greatest painters are widely admired by the general public, while this is rarely so even for the greatest programmers, so someone might feel his job is a little bit cooler when reading this. I do not see what real insight someone could gain from this article.
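To make that tradeoff concrete, here is a tiny made-up Common Lisp illustration (the names are mine, not the article's): nothing in the language stops a function from leaning on mutable state, so not every call can be replaced by its value the way it can in Haskell:

  (defvar *counter* 0)

  (defun next-id ()
    "Not referentially transparent: the same call yields different results."
    (incf *counter*))

  (defun square (x)
    "Referentially transparent: (square 3) can always be replaced by 9."
    (* x x))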
Totally agree with you on how difficult this is to read. The author is speaking some form of high academic CS (and it isn't obviously being used correctly).
"Common Lisp augments the interpreter with other automata on the same substrate which make construction straightforwards. Though the choice of automata may vary, the essential idea of reifying the act of construction is universal."
Here he is basically saying that you've turned the elements of program construction into a data structure (the s-expression). I believe "automata" just refers to any program in this context. (Think computability and complexity; languages executable by a computer can be described in terms of different types of automata: NFA, DFA, PDA, etc.) So you are turning program construction into a data structure which can be used to manipulate the construction of programs, or which can be manipulated itself.
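A minimal made-up example of what that buys you: the "program" below is an ordinary list, so another program can inspect and rewrite it before it is ever evaluated:

  (defparameter *program* '(+ 1 2 3))   ; code held as plain data

  (defun double-literals (form)
    "Walk a form and double every numeric literal in it."
    (cond ((numberp form) (* 2 form))
          ((consp form) (mapcar #'double-literals form))
          (t form)))

  ;; (double-literals *program*)        => (+ 2 4 6)
  ;; (eval (double-literals *program*)) => 12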
In the PL context, "axiomatic system" has been used for decades, especially in the Lispy parts of the field, in a sort of metaphorical way to refer to languages that start from a small core of primitive built-ins that are taken as givens (the "axioms"), and then build up the rest of the language by composing those "axioms" via "theorems" within the language that define the higher-level functionality. John McCarthy introduced the axiom/theorem terminology in his 1960 Lisp paper. Paul Graham is possibly responsible for re-popularizing it, since he uses that terminology a lot when discussing the design of Arc.
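As a rough illustration of that reading (my own toy definitions, nothing taken from McCarthy or the article): given a handful of primitives such as car, cdr, cons, eq, atom, and cond, higher-level operations are just definitions layered on top:

  (defun my-second (lst)
    "Defined purely in terms of the primitives CAR and CDR."
    (car (cdr lst)))

  (defun my-append (a b)
    "List concatenation built from COND, NULL, CONS, CAR, and CDR."
    (cond ((null a) b)
          (t (cons (car a) (my-append (cdr a) b)))))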
Can you back up what you said with a link or quotation? I did not find any mention of the word "axiom" in McCarthy's 1960 paper ("Recursive functions of symbolic expressions and their computation by machine"), and googling "axiomatic system lisp" pops up my own comment above as the first result.
I doubt this is in any way established terminology; it sounds again like a very sloppy metaphor on the part of the author. Are complex pieces built from simple primitive pieces enough to call for "theorems" and "axioms"?
I don't disagree. The use of technical terms is sloppy, and some of the examples are weird (at the end, Prolog is said to be somehow an experiment in the same direction as Scheme...).
But I thought this was smart:
"C was shaped by poets. Kernighan, Ritchie, Pike, and Thompson built the language alongside the idiom. This had an effect equivalent to Petrarch writing in Italian. The language suddenly found itself in possession of existence and poetry at once. It has no theoretical basis. It has no overarching logic. [...] but in the hands of a master, it scintillates as its brethren do not."
It gets at the way the concise coding style in the K&R book guided people to a programming idiom that is pleasurable to write and read in a way that comparable languages, say Pascal or Basic, are not.
                     | λ calculus   | machine memory
---------------------+--------------+-----------------
language as given    | Haskell      | C
---------------------+--------------+-----------------
language as medium   | Common Lisp  | Forth
"Language as given" seems to mean that the language provides a level of abstraction that the programmer is not intended to break, and "language as medium" means that the language is a means of manipulating a lower-level structure, which the programmer is supposed to be aware of.
That's also completely incorrect for the way people use C nowadays. You use C because you want to control the machine, and people write their code based on machine word size, caches, etc.
Hm, I read "language as given" as that programmers use the
language to write their programs, whereas "language as medium" implies using the language to write a language (to write a language,...) to write their programs in.
Of course, one can argue that this is just a difference of degree. After all, any sufficiently advanced C program will contain half of Common Lisp (or use arrays of function pointers to get half a Forth).
Yeah, but sometimes programmers ignore that. For instance, C is defined in terms of chunks of memory that are all the same, but the way people actually write C is by thinking of cache line sizes and disk-read chunks, which means that not all chunks of memory are the same.
The alternative is to write your programs using the model provided and trust that they will be fast enough. I think that's how Haskell is written now, but I'm not sure.
In either case, though, this is a property of how the language is used rather than the language itself, at least as I understand what he means.
I have no idea about the Haskell/CL thing - I don't know much about how either language is used.
I think the given-vs-medium distinction relates to whether a language is image-based. Forth and Lisp are image-based, so you could say "you're sculpting the medium". Also, Lisp and Forth are friendlier towards hacking the language itself.
I've been learning Haskell for the past few months, and it is the most expressive language I've ever looked into. (More so even than lisp, I thought anyway, but I have only spent a couple of weeks in that direction).
I thought, why bother with python / ruby, save for existing libraries and tools already made? All I need is Haskell and C!
This article explains it. That's right, I don't! I should go read about Forth, though.
(I still use python, precisely because of existing libraries and tools.)
Haskell is, for now, the extreme. There are other languages hovering around this same fringe, such as Clean, and others, like Cayenne, which may well be beyond it.
Instead of Cayenne I think the action in languages with dependent types is greater in Agda and maybe Epigram at the moment. I see much more happening with Haskell than with Clean right now, but maybe I need to get out (or stay in) more.
Lisp is its own model of evaluation. It had a procedural-abstraction operator, LAMBDA, which was inspired by Church, but LAMBDA didn't become a "true" lambda until the seventies, with Scheme. (CL has always had lexical scoping and a true lambda, because it is younger than Scheme and its designers deliberately adopted these ideas.)
The essence of the article, if you can't get it from the cache, is this chart (sorry, can't seem to make it look right):
Language/core style  | Lambda calculus core | Machine memory core
---------------------+----------------------+---------------------
language as given    | Haskell              | C
language as medium   | Common Lisp          | Forth
"Common Lisp is one answer to this question. Its core is an axiomatic system, the lambda calculus, which John McCarthy extended just enough to write an interpreter for the resulting calculus in itself."
Lambda calculus is not at the core of Common Lisp or even Lisp, it was just an inspiration for Lisps model of computation.
"Common Lisp augments the interpreter with other automata on the same substrate which make construction straightforwards. Though the choice of automata may vary, the essential idea of reifying the act of construction is universal."
Can someone explain what is he trying to say here? You can build languages with extremely different semantics while still keeping the ideas of S-expressions, centering the language around function application and closures, for example you can choose lazy evaluation over eager evaluation and end up with a very different language. In no way is lambda calculus some immutable core of Lisp, its model of computation is much more complex and arbitrary.
"Common Lisp and Haskell are built around variants of the lambda calculus. Each substrate makes certain approaches to problems natural, though neither is a natural vehicle for human thought. The brain’s adaptation to the lambda calculus is usually experienced as an epiphany."
Seldom if ever has programming in Lisp or Haskell anything to do with lambda calculus, unless of course you mean in a way so vague the term loses its meaning... He then goes one to essentially describe how Haskell is referentially transparent and Common Lisp is not, but he makes it sound like it is some philosophical difference deeply rooted in the ideas behind Lisp or Haskell, while it's just a design decision where each option has specific tradeoffs. To me all this is just some "metaphysical" mumbo-jambo, the metaphor comparing programming language to an artists medium from which he starts doesn't bring any practical insight into anything - I think some people like it only because the greatest painters are widely admired by the general public, while this is rarely so even with the greatest programmers, so someone might feel his job is a little bit cooler when reading this. I do not see what real insight someone could gain from this article.