I'm not sure "functional programming language" means the same thing to any two people. Just like "object oriented programming", it once labeled what might have been a very precise idea, but has since been bundled and unbundled with any number of other concepts.
Unfortunately the term has become so muddled in common usage that at this point, if someone wants to call APL functional, they probably can.
The APL family actually has a fairly strong claim to the term given that APL was cited as a major influence in John Backus's lecture "Can Programming Be Liberated From the von Neumann Style? A Functional Style and its Algebra of Programs".
APL definitely influenced functional programming, but why would they even want to claim that word when they were already happy with array programming? What would Iverson say if he were still alive?
The term has only become muddled in the minds of those who don't want to learn, but have an interest in using it for marketing purposes — just like how people tried to redefine what Open Source means.
Functional Programming is programming with mathematical functions (aka pure functions).
And an FP language is one that has not only the features, but also a culture that encourages FP as the main paradigm for solving problems.
Examples of FP languages: Haskell, SML, OCaml, Scala and Clojure.
Interestingly, Common Lisp and Emacs Lisp are not. Python, Java and C# are not. JavaScript is not, although it has a cultural shift in progress with pretty good results. Go at this point is actually anti-FP, if such a thing were possible.
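To make that concrete in Go terms (the names below are invented for the example): a pure function's result depends only on its arguments, while an impure one reads or writes state outside itself.

// Pure: the same arguments always give the same result, and nothing
// outside the function is touched.
func Add(a, b int) int {
    return a + b
}

// Impure: the result depends on, and mutates, external state.
var counter int

func NextID() int {
    counter++
    return counter
}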
But if you look at the history of the term, the Lisp or APL families of languages have just as much of a right to claim "functional programming" as their own as the ML family. I say this as someone who really likes OCaml and has never gotten along very well with Lisp.
Also I don't see how Go can be considered actively "anti-FP" when it supports passing closures as arguments, which many, many languages do not. I don't understand why (some) FP and Go partisans feel such a need to see one another as enemies.
Closures are simply an implementation detail of first-class procedures. But functional programming, unlike object-oriented programming, isn't just about making procedures first-class. It's about using, to the greatest extent possible, procedures that evaluate functions.
Yes, I understand that having closures does not mean that a language supports functional programming. What I am taking issue with is the idea that Go is somehow especially "anti-FP" when an enormous number of languages have even fewer core functional programming features.
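For what it's worth, here is a minimal sketch of what "passing a closure as an argument" looks like in Go (all names invented for the example):

package main

import "fmt"

// apply calls whatever function value it is given.
func apply(f func(int) int, x int) int {
    return f(x)
}

func main() {
    offset := 10
    // addOffset is a closure: it captures offset from the enclosing scope.
    addOffset := func(n int) int { return n + offset }
    fmt.Println(apply(addOffset, 5)) // prints 15
}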
Lisp was based on the lambda calculus; not sure you can get more functional than that. Common Lisp was obviously more multi-paradigm, having one of the most powerful object systems ever. Scala is obviously also multi-paradigm, though many would rather ignore that side of the language.
Most languages are multi-paradigm: even Clojure is used with object-like entity component systems, and there are papers that push Haskell as the best language for imperative programming. The word is kind of meaningless when applied to languages but more meaningful when applied to code.
It doesn't matter if the language is multi-paradigm. The most influential programming style in Common Lisp is not FP, and if you take a look at any Common Lisp book, you'll be hard pressed to find any FP in it.
LISP might have been born of Lambda Calculus, but Common Lisp is no longer based on it.
Scala might be multi-paradigm, but it is the only language on top of the JVM where pure FP is made practical by a pretty good and well-maintained ecosystem of libraries; see for example: https://typelevel.org/
LISP borrowed the "lambda" word from lambda calculus (not the Greek symbol). It didn't implement the lexical scope of lambda calculus, and provided lots of semantics other than lambda-calculus-like function application right off the bat.
Let's see: lambda calculus has no representation of its own code; it is not code. In lambda calculus, there is no QUOTE. There is no CAR nor CDR to walk around the code, no EVAL.
In lambda calculus, there is no mutation: no SETQ, no RPLACA.
Pure lambda calculus has no terms other than functions (unless extended); LISP had symbols, numbers, conses right off the bat. Arrays and character strings came soon.
All in all, equating Lisp with lambda calculus is silly.
Sure, but that would be an object language embedded in the lambda calculus, not the lambda calculus itself.
btw, variants of the lambda calculus do exist where some values (so-called “reference cells”) are object identities. But a value cannot be a lambda abstraction and a reference cell at the same time.
How is Go anti-FP? It's anti generic programming, and I think that's a great idea.
As a side note, there are a LOT of considerations that are implicit within a language when you do functional programming. Let me explain. Let's consider a `map` function. There are at least 2 ways to write a non-generic map function:
func MapSafe(f func(int) int, l []int) []int {
    // Allocate a fresh slice and leave the input untouched.
    retVal := make([]int, len(l))
    for i := range l {
        retVal[i] = f(l[i])
    }
    return retVal
}

func MapClobber(f func(int) int, l []int) []int {
    // Overwrite the input slice in place; no new allocation.
    for i := range l {
        l[i] = f(l[i])
    }
    return l
}
Any good FP-er will tell you clobbering data is NOT a good idea - that functional programming is all about immutable data. The immutable-data version, however, allocates memory. So you either need:
a) impressive compiler-building skills, to reason about which applications of `map` can safely clobber and which cannot;
b) failing that, a very awesome GC that can handle memory living on a very short timescale (that is, within call frames).
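For reference, a small usage sketch of the two variants above (assuming they sit in the same package as this hypothetical main):

package main

import "fmt"

func main() {
    double := func(x int) int { return x * 2 }

    xs := []int{1, 2, 3}
    ys := MapSafe(double, xs)
    fmt.Println(xs, ys) // [1 2 3] [2 4 6]: xs untouched, ys freshly allocated

    MapClobber(double, xs)
    fmt.Println(xs) // [2 4 6]: xs overwritten in place, no new allocation
}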
Rust is nice in the sense that the compiler does a LOT of the heavy lifting, but it's far from user-friendly. Requiring users to manage and be clear about memory ownership is a Hard Sell (though I will concede, a Good Idea). Haskell is another language which relies heavily on the compiler, and GHC is an amazing compiler. However, performance for Haskell isn't that great. It's on the opposite end of the scale: super user-friendly, offloads a lot of work to the compiler, but suffers in performance.
The problem is I don't think we're there yet. Not with Go, but with functional languages. State wrangling is still a problem. Running away from it by hiding under layers of functional language works for a small number of problems (business rules, etc.) but in my view, not the majority of problems. Programmers don't spend a lot of time on high-level problems.
A long time ago, there was a compiler for Haskell called JHC. It was nice because you could use it to write C, which would expose the underlying state for the programmer to modify to her heart's content. The project's been dead for 5-6 years now I think. GHC has a weird --from-c option that I've never been able to use mainly because you needed to recompile GHC from scratch, which again is something I struggle with and rapidly give up.
> how is Go more anti FP than Java or other more classic imperative languages?
Than Java or any of a number of other popular languages, because it uses a static type system without generics. You can build functional control structures up from common imperative ones as well as from an in-language base of the simpler functional ones, but without either dynamic typing or generics, this is a painful exercise in copy-and-paste coding.
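To make the copy-and-paste point concrete, here's a sketch of what a reusable map helper looks like in Go without generics: the same logic has to be duplicated for every element type (helper names invented for the example).

// Each element type needs its own nearly identical function.
func mapInts(f func(int) int, xs []int) []int {
    out := make([]int, len(xs))
    for i, x := range xs {
        out[i] = f(x)
    }
    return out
}

func mapStrings(f func(string) string, xs []string) []string {
    out := make([]string, len(xs))
    for i, x := range xs {
        out[i] = f(x)
    }
    return out
}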
> Functional Programming is programming with mathematical functions (aka pure functions).
Exactly. To which the following must be added: Functions are mappings from values to values. It makes no sense to try to do functional programming in a language whose semantics does not provide a rich enough supply of values.
> And an FP language is one that has not only the features, but also a culture that encourages FP as the main paradigm for solving problems.
Culture matters very little. What really matters is what the language's semantics allows.
> Culture matters very little. What really matters is what the language's semantics allows.
I disagree. Java's semantics permit defining pure functions, immutable values and persistent data structures, but the "culture" is biased towards idiomatic Java, not FP. The language's syntax, defaults and core libraries can make it difficult to program in an FP style. The syntax, defaults and libraries will not change while the culture persists.
Most of my work involved JavaScript, and being enamoured of functional programming, I tried applying it as much as I could.
And yet it was only when I used a language that was designed for it that I realized how often I used various 'escape hatches', and when I learned the 'meat' of functional programming.
It's possible that I was unusually lazy about it all, but I doubt that. I really was a huge fan of functional programming. But if the language 1) gets in the way (this was pre-arrow functions) and 2) makes it easy to take non-functional shortcuts, those shortcuts are just too hard to resist.
So, as you also say, the 'culture' is biased towards non-FP programming. It's hard to go against that without at least some experience with (slightly) stricter FP languages.
> Java's semantics permit defining pure functions, immutable values
Only by social convention. Not enforced by the language's semantics in any meaningful way. In fact, Java does not even allow the programmer to define custom values. All values are primitives or object identities. Note that so-called “value objects” will not fix anything.
A fundamental property of functions is so-called “function extensionality”. Two functions “f, g : Foo -> Bar” are equal if and only if “f(x) = g(x)” for every “x ∈ Foo”. In particular, this means that there should be no equality testing operator for functions, because function equality is undecidable.
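As a concrete illustration of "no equality testing operator for functions", in Go since it's already the thread's running example: function values are only comparable to nil, so asking whether two function values are equal is rejected at compile time. A small sketch (invented names):

package main

func main() {
    f := func(x int) int { return x + 1 }
    g := func(x int) int { return 1 + x }

    // Uncommenting the next line is a compile error in Go:
    // "func can only be compared to nil".
    // _ = (f == g)

    _ = f(0) == g(0) // comparing results at particular inputs is all you get
}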
---
OCaml at least has bona fide values. It allows you to implement procedures that evaluate interesting functions (i.e. value-to-value mappings), even if it doesn't typefully distinguish a special class of procedures that are syntactically guaranteed to evaluate functions.
F# and Clojure are lost causes, though.
As for Haskell's IO and IORef (or ML's ref), there is absolutely nothing wrong with the ability to implement procedures that do other things besides evaluating functions. The problem is when you lack the ability to express interesting functions, and the root cause is the lack of a sufficiently rich universe of values.
---
Your mention of purity completely misses the point. The ability to do traditional imperative programming is a feature, not a bug! The problem here is the inability to do functional programming. The reasons for this are technical, not cultural.
> Only by social convention. Not enforced by the language's semantics
Many FP languages do not enforce purity (OCaml, F#, Clojure). Even in Haskell, one could use IO and IORefs everywhere; the semantics fully permit writing "Java code" in Haskell. We need a culture and an understanding of what it means to write idiomatic Haskell.
EDIT: responding to your edit:
> The problem here is the inability to do functional programming. The reasons for this are technical, not cultural.
Yes. But it's not just the semantics; it's also the syntax, defaults and available libraries. Culture feeds into all of this, especially as the language and ecosystem evolve. I'm arguing that it does matter.
I can't prove anything about programs other people will write, unless the language itself makes guarantees about the programs they can write. Culture doesn't help one iota.
> In fact, Java does not even allow the programmer to define custom values. All values are primitives or object identities.
Immutable objects do not have identity and are values. And the distinction between primitives and objects in Java, for our discussion here, is irrelevant.
That the language does not enforce purity in any way (besides `final` members and values) is irrelevant. A lot of things in programming happen by convention.
That Haskell can force purity via its laziness is a nice feature of the language, but it's not required for doing FP, just like static types aren't required for doing FP either — Haskell devs would like to promote the notion that Haskell and derivatives are the only true languages for FP, but they've been preceded by LISP and ML.
A lot of things in programming happen by convention. That's not a good argument, or an argument that can be used efficiently for language advocacy. OOP isn't supported explicitly by C either, yet people have built an entire GUI/window manager, along with apps, on top of GObject, which in many ways is more OOP than C++ ;-)
> And the distinction between primitives and objects in Java, for our discussion here, is irrelevant.
All it takes to disabuse you of this notion is a little bit of reflection.
> That Haskell can force purity via its laziness
This is not true. Purity is enforced via the type system.
> but not required for doing FP, just like how static types aren't required for doing FP either
Completely agree here.
> but they've been preceded by LISP and ML.
ML is a functional language. Scheme already forces you to squint your eyes a lot. Lisp is not a functional language by any stretch of the term's meaning.
Even if their encoding has identity, immutable values don't need it. Languages like C# allow you to throw off identity altogether with pass/store by value structs.
> Even if their encoding has identity, immutable values don't need it.
Values indeed don't have physical identities. The problem is that, when you use a language with pervasive object identities, the values you want only exist in your head, not in the semantics of the language you are using.
> Languages like C# allow you to throw off identity altogether with pass/store by value structs.
Please teach me how to define list or tree values in C#, sensei.