#### Subtitles:

- Polymorphism without curly braces.
- Lifting == next polymorphism.
- How to program your bike tube.
- Grandparent of Functional Programming.

I finished my posting last year by writing about polymorphism; I want to start this year on the same subject and do a better job. This is the longest post I have ever written, and I hope the longest I ever will. I hope you find it interesting, and thank you for reading!

**POST INTRO:**

- 'Doctor, each time I see code like this in Groovy

def listX = listY.collect{ ... }

I see a torus and want to ride my bike.'

- 'Oh no! You have a case of Polymorphism!'

This post is not about what polymorphism is. It is about what polymorphism can be.

**POST BODY:**

**Definition:** For an OO programmer, polymorphism means a more sophisticated version of a cookie cutter. Typically the term is associated with the idea of applying the same 'thing' across many different 'things'. I want to define it as:

*Apply concept A to a set of concepts X.* Most often A will be an operation/method (or a set of these) describing some type of behavior, so a developer can say that

*A can be polymorphically applied to set X*.

Classic OO imperative polymorphism suffers a bit from weak semantics, where 'A' does not really mean 'A'. To achieve a 'Better Polymorphism' we will need to work on stronger semantics. The most interesting stuff happens when X contains a bunch of different things and A provides a 'viewpoint' on all the things in X, while the essence of 'A' stays intact in some way. These are the cases that go beyond interesting and become an eye-opening intellectual experience.

**Classic Examples of OO Polymorphism:** As I said, OO polymorphism is typically semantically weak: apples become oranges so we can make ourselves a polymorphic screwdriver.

One set of such examples is when X=everything. If you can say something about everything, it is either obvious or incorrect. Still, many programming examples do exactly that:

**In obvious category:**

Java Object.toString(): A='describe yourself', X=anything

**In incorrect category:**

Java Object.equals(Object o): A='concept of being equal', X=anything

(Why do I think that is incorrect? Why would you think a meaningful equals comparison is always possible, computationally or even logically? But that is a longer story, well beyond this post ...)

In Java, 'equals' becomes something less than what the 'real equals' should be.

**More interesting example:** C++ or Groovy overloading of +: A='+ operation', X includes numbers, strings, collections, maybe a timestamp and duration, and whatever else the programmer decides makes sense with '+'.

This is all very imperative: the programmer 'implements' and decides what '+' means. So often '+' becomes just neat syntax, and any deeper 'essence' of '+' is lost.
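To make the 'programmer decides' point concrete, here is a minimal sketch of the same semantic weakness in Haskell. The `Plus` class and its instances are my own invention for illustration, not a standard library: note that nothing forces the instances to share any deeper meaning of '+'.

```haskell
-- A hypothetical home-grown 'Plus' class: each instance decides,
-- case by case, what '+' means; no law ties the meanings together.
class Plus a where
  plus :: a -> a -> a

instance Plus Int where
  plus = (+)   -- numeric addition

instance Plus [a] where
  plus = (++)  -- '+' reinterpreted as concatenation

instance Plus Bool where
  plus = (||)  -- an arbitrary choice: '+' as logical or
```

Each instance type-checks, yet `plus` on `Bool` has nothing essential in common with `plus` on `Int`; the name is all they share.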

**Polymorphism in Other Sciences?** Can software engineering learn something new about polymorphism from other sciences? My examples have a heavy bias toward math, because math was my strong interest in the past and I still remember a little bit of it. I challenge you to come up with your own examples.

Partial Differential Equations (PDEs) offer many examples fitting well into my definition of polymorphism.

**PDE Example 1:** Often the same family of equations is used to model a range of totally different concepts. It may be no surprise that the same set of equations models gas flow and traffic congestion. It is much harder to grasp that the famed Black–Scholes equation for pricing stock options (in finance) is really a time-reversed (or backward) heat equation (in physics)!

So: A=single PDE, X=parts of Physics, Finance and maybe other parts of science.

The essence of that PDE is described by the PDE itself; it is the same equation, yet it can be polymorphically applied to a large set of examples. This is 'semantically strong' and often eye opening!
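For the curious, here are the two equations side by side in a standard textbook form (V is the option price, S the stock price, σ the volatility, r the interest rate; a change of variables like x = log S plus a reversal of time turns one into the other):

```latex
% Heat equation (physics):
u_t = \kappa\, u_{xx}

% Black--Scholes equation (finance):
\frac{\partial V}{\partial t}
  + \tfrac{1}{2}\sigma^2 S^2 \frac{\partial^2 V}{\partial S^2}
  + r S \frac{\partial V}{\partial S} - rV = 0
```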

**PDE Example 2:** Once upon a time, I wrote a paper studying convergence of numerical solutions to a system of PDEs called the Broadwell Model. I studied these using ... fractals.

So A=fractals, X=a PDE model in gas dynamics.

To do that I had, of course, to do more than just start using fractal lingo when talking about PDEs (that would be nonsense).

**PDE Example 3:** Compensated Compactness Theory studies convergence of solutions to PDEs using ... measure theory.

Let us move on to something other than PDEs (or is it something else only because we have not found a polymorphic enough way to look at it yet? ;) ).

**Grandparent of Functional Programming:** There is this super cool part of math called Algebraic Topology. This branch of math is just one big polymorphism, and it is one of the most intellectually stimulating pieces of science I have ever seen.

Algebraic Topology transforms manifolds (geometric and topological concepts) into groups (algebraic concepts). I remember using phrases like 'thinking in CAPS' or 'functorial thinking' to describe the overall idea of such high-level transforming. I was not much into programming at that time, so the term polymorphism did not cross my mind.

Still, the definition of polymorphism is matched perfectly (A=algebraic groups, X=manifolds).

Many things that had been extremely hard to prove for an orthodox topologist became almost trivial to do in algebra. This polymorphic approach offered by Algebraic Topology was a game changer in the understanding of manifolds and led to the discovery of a lot of very cool math.

Algebraic Topology does its transforming in several different ways, including concepts like homology, cohomology, and homotopy. All of these are ways to map a manifold into an algebraic group, preserving the 'essence' of that manifold in some way.

**Side Note, to get some intuition about what Algebraic Topology is about:** The 2-D sphere is totally different from the 2-D torus (think of the surface of the tube in your bike). Both are hollow, but in very different ways!

On the sphere, all closed loops can be shrunk to a point, so H1 (first homology) is trivial. On the 2-D torus you can draw a loop around each of the 2 perimeters, and you can start adding these loops (groups are about adding things) by going around one perimeter n times and around the other k times. You can stretch and shrink these loops all you like; they will still wind around the torus perimeters the same number of times. So H1 of the 2-D torus is the same as the product of the integers (understood as a group under the + operation) with itself!
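In symbols (S^2 is the sphere, T^2 the torus, and the direct sum is the group-theoretic product mentioned above):

```latex
H_1(S^2) \cong 0, \qquad H_1(T^2) \cong \mathbb{Z} \oplus \mathbb{Z}
```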

There are many other examples where concepts in one branch of math are used to solve problems in another branch ... but I need to stop; this post is not about math.

**Another Side Note:** I hope I did not lie too much. This post brought back some very fond memories, but I have not done math, and Algebraic Topology in particular, for so long.

**Functor:** We need something which is good at 'transforming concepts while preserving the essence of things'. The name I am thinking of here is 'Functor', and all the transformations used in Algebraic Topology (homology, cohomology, ...) are, in fact, functors. (Not to be confused with the C++ concept of the same name.)

Functors have been studied in their own right by a branch of math called Category Theory (I never immersed myself much in it). Category Theory is arguably the parent of Functional Programming. If that is so, Algebraic Topology is the grandparent. And that is the coolest grandma any science can have!

Before we move on, I would like to assert one observation: science seems to have no use for semantically weak polymorphism. It cares that A is still A in some deep way, and that it is not just a name universally applied to different concepts.

**Functor Definition:** So what is a functor? Here is a Wikipedia link for you: http://en.wikipedia.org/wiki/Functor The most intuitive way to think about it: functors are for mapping things over. A functor must map identity into identity and preserve morphism composition (for now, think morphism == function and 'o' == function composition). The covariant functor rules are copied here:

F(id) = id

F(f o g) = F(f) o F(g)

**Functor in Programming!:** Can you think of a functor in programming? There are actually a few of them, but I will explain just one. Think of a transform which works at a very high level (remember, this is high-level thinking, thinking in CAPS). Think of transforming types, one type into another. One such transform is

T -> List

I like the [] notation so I will call this transform (or mapping):

[]: T -> [T]

but Java hardcores should read this as List: T -> List&lt;T&gt;

(which, with Java type erasure, means nothing, but well, it still makes some sense, right?).

There is a method in Groovy working on collections called 'collect'; its equivalent is typically called 'map' in other languages. Here is an example usage:

assert [1,4,9] == [1,2,3].collect {it*it}

I am sure you agree that:

Closure id = {it}

assert anyList.collect(id) == anyList

assert id(anyList) == anyList

and that is the first rule for being a functor. Let us test the other one using an example:

assert [1,5,3].collect{it - 2}.collect{it * it} ==

[1,5,3].collect{(it - 2) * (it - 2)}

Yes, Groovy's collect makes [] into a covariant functor!
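The same two checks can be sketched in Haskell, where 'collect' is spelled fmap (the sample list and functions mirror the Groovy asserts above):

```haskell
-- fmap is Haskell's 'collect'/'map'; check both functor laws on a sample list.
xs :: [Int]
xs = [1, 5, 3]

lawIdentity :: Bool
lawIdentity = fmap id xs == id xs          -- F(id) = id

lawComposition :: Bool
lawComposition =
  fmap (\x -> x * x) (fmap (\x -> x - 2) xs)
    == fmap (\x -> (x - 2) * (x - 2)) xs   -- F(f o g) = F(f) o F(g)
```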

**Better Polymorphism (using Functor):** Now, since the [] operation preserves the 'essence' of things, I should be able to use it. Suppose I want to lowercase all elements of a list of strings.

You could write this:

def lowerCaseList = ['HELLO', 'THERE'].collect {it.toLowerCase()}

but that is just missing the point altogether (and it uses evil curly braces). I want to be able to write something closer to the following (because functors preserve the essence of things):

def lowerCaseList = ['HELLO', 'THERE'].toLowerCase()

the idea is that the function is magically lifted (or mapped over) by the [] functor so it can operate on functor values. After all, the compiler should know that [] is a functor!

Maybe in the future... for now, at least in Haskell, you could do something like this (the character-level function is Data.Char's toLower, and it gets lifted twice: over each string and over the list)

lowerCaseList = fmap (fmap toLower) ["HELLO", "THERE"]

That is some polymorphism for you!!! Think of different functors ([] is just one example) and the ability to write cookie-cutter code across all of them. Examples are

*Maybe* (a concept replacing handling of null values) and

*Either* (a concept for handling error conditions without ugly try-catch). Your code might even end up working on the tube in your bike! (think of a changeMe function ;).
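Here is a small Haskell sketch of that cookie-cutter code: one lifted function, reused unchanged across [], Maybe, and Either (toLower comes from Data.Char; the demo names are mine):

```haskell
import Data.Char (toLower)

-- one definition, usable with ANY Functor f
lower :: Functor f => f String -> f String
lower = fmap (map toLower)

demoList :: [String]
demoList = lower ["HELLO", "THERE"]    -- ["hello","there"]

demoMaybe :: Maybe String
demoMaybe = lower (Just "HELLO")       -- Just "hello"

demoEither :: Either String String
demoEither = lower (Right "HELLO")     -- Right "hello"
```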

**Even Better than Functor! (== Better than Better Polymorphism):** Functor is not the end of the story for 'preserving the essence of things' as far as programming goes; it is the beginning.

There are 2 stronger super-concepts: Applicative Functors improve upon Functors and Monads improve upon Applicative Functors.

The idea is that I do NOT want to implement the '+' operation on lists! I want the language to just know what to do! If I know what 1+4 is and 1+5 is, etc., I should automatically know what [1,2,5] + [4,5] is! There is a little issue with plain functors: they do not understand interactions between elements. This is the problem Applicative Functors solve!

It turns out that there are two natural ways to make [] an Applicative Functor (and I wrote a bit about both in my previous post). Here is the recap. One way works so that:

[1,2,5] '+' [4,5] = [1+4, 1+5, 2+4, 2+5, 5+4, 5+5]

(this is called non-deterministic computing)

and the other works so that:

[1,2,5] '+' [4, 5] = [1+4, 2+5]

(these are zip lists). Haskell decided to make [] do non-deterministic computing, and created a separate type called ZipList to handle the other case.
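Both behaviors are a one-liner in Haskell via liftA2 from Control.Applicative (ZipList wraps a plain list to select the pairwise behavior):

```haskell
import Control.Applicative (ZipList (..), liftA2)

-- non-deterministic: '+' applied to every combination of elements
nonDet :: [Int]
nonDet = liftA2 (+) [1, 2, 5] [4, 5]
-- [5,6,6,7,9,10]

-- zip-like: '+' applied pairwise, truncated to the shorter list
zipped :: [Int]
zipped = getZipList (liftA2 (+) (ZipList [1, 2, 5]) (ZipList [4, 5]))
-- [5,7]
```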

Now we can 'lift' any operation of two or more arguments automatically! There is some extra typing involved in Haskell (you cannot just write [1,2,5] + [4,5]), but the work is really done for you and the extra typing is minimal.

I need to stop somewhere so I am not even going to try to define Applicative Functor.

We have come full circle, from the OO imperative '+' example to a '+' example preserving the 'essence' of '+'.

**POST SUMMARY:** So all this transforming can lead to a very interesting polymorphism. A list is polymorphic with the tube in your bike! So how is that useful? I think some points are worth reiterating:

(1) The fact that mathematical concepts like homology are polymorphic with several programming concepts (like homogeneous lists, handling of nulls *(Maybe)*, or handling of error conditions *(Either)*) is very much an eye opener. That means, in particular, that we could write code that works across all these concepts! In fact, Functor is a *type class* (kinda like an interface) in Haskell. OK, there is probably not that much that can be coded at such a high level of abstraction, but more coding is possible (and done) against Applicative Functor (also a type class in Haskell) and even more against Monad (again a type class). This is true high-level coding in CAPS!

(2) And these concepts are not apples-and-oranges polymorphic; they are 'better polymorphism' because they preserve the essence of things in some way. That means we can 'lift' functionality. If you wrote a library that does interesting things with type T, a lot of your work could be just 'lifted' to [T] (or any other Functor mapping T) ... and even more of your code will 'lift' for free to any Applicative Functor.

(3) Let me emphasize the 'lifting' point. 'Lifting' is the next level of polymorphism. This concept is equivalent to some of the smartest parts of math ever: Algebraic Topology lifts algebra into topology. Other math examples lifted Measure Theory or Fractals into PDEs.

But wait, the examples with PDEs did not use functors! What I am trying to say is that a functor is not the only way to lift. It is, however, the only way to lift that I know of in computing.

I hope you enjoyed this post. It has been written by a newbie Functional Programming enthusiast, but it covered some less than trivial stuff (maybe this is a risky combination). It is also opinionated, like my other posts, but I hope opinionated in a good way :). And I hope it offered a unique perspective, even to some Functional-savvy developers.

By the way, did you notice that my post is visually organized into 3 sections, making it polymorphic with other writings that use this important presentation guideline? You think that is cool? NO, it is not cool! That is just contract-based, imperative, old-style polymorphism. Gotcha! ;)