What are the key differences between the approaches taken by Scala and F# to unify OO and FP paradigms?
EDIT
What are the relative merits and demerits of each approach? If, in spite of its support for subtyping, F# can infer the types of function arguments, then why can't Scala?
I have looked at F#, doing low-level tutorials, so my knowledge of it is very limited. However, it was apparent to me that its style is essentially functional, with OO being more of an add-on -- much more of an ADT + module system than true OO. The feeling I get is best described as if all methods in it were static (as in Java static).
See, for instance, any code using the pipe operator (`|>`). Take this snippet from the Wikipedia entry on F#:
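Reconstructed from memory, it is along these lines (the exact form on Wikipedia may differ):

```fsharp
// Map fib over a list of numbers, feeding the list in with the pipe operator.
// (fib itself is defined further below.)
[1 .. 10]
|> List.map fib
```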
The function `map` is not a method of the list instance. Instead, it works like a static method on a `List` module which takes a list instance as one of its parameters.

Scala, on the other hand, is fully OO. Let's start, first, with the Scala equivalent of that code:
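Sketching it (the exact original may differ slightly):

```scala
// Build a List from the range 1 to 10 and call its map method with fib.
// (fib itself is defined further below.)
List(1 to 10: _*) map fib
```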
Here, `map` is a method on the instance of `List`. Static-like methods, such as `intWrapper` on `Predef` or `apply` on `List`, are much more uncommon. Then there are functions, such as `fib` above. Here, `fib` is not a method on `Int`, but neither is it a static method. Instead, it is an object -- the second main difference I see between F# and Scala.

Let's consider the F# implementation from the Wikipedia entry, and an equivalent Scala implementation:
The above Scala implementation is a method, but Scala converts that into a function to be able to pass it to `map`. I'll modify it below so that it becomes a method that returns a function instead, to show how functions work in Scala.
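One possible way to write that (a sketch, not the only encoding):

```scala
// A parameterless method that returns a function value of type Int => Int.
def fib: Int => Int = (n: Int) => n match {
  case 0 | 1 => n
  case _     => fib(n - 1) + fib(n - 2)
}

fib(4)        // actually fib.apply(4); the .apply can be omitted
fib.apply(4)  // the same call, spelled out
```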
So, in Scala, all functions are objects implementing the trait `FunctionX`, which defines a method called `apply`. As shown here and in the list creation above, `.apply` can be omitted, which makes function calls look just like method calls.

In the end, everything in Scala is an object -- an instance of a class -- and every such object belongs to a class, and all code belongs to a method, which gets executed somehow. Even `match` in the example above used to be a method, but was converted into a keyword quite a while ago to avoid some problems.

So, how about the functional part of it? F# belongs to one of the most traditional families of functional languages. While it doesn't have some features some people think are important for functional languages, the fact is that F# is functional by default, so to speak.
Scala, on the other hand, was created with the intent of unifying the functional and OO models, instead of just providing them as separate parts of the language. The extent to which it was successful depends on what you deem functional programming to be. Here are some of the things Martin Odersky focused on:
Functions are values. They are objects too -- because all values are objects in Scala -- but the concept that a function is a value that can be manipulated is an important one, with its roots all the way back to the original Lisp implementation.
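For instance, here is a quick sketch of functions being treated as values (the names are just illustrative):

```scala
// A function stored in a val, passed as an argument, and returned from another method.
val double: Int => Int = _ * 2
List(1, 2, 3) map double                         // List(2, 4, 6)

def twice(f: Int => Int): Int => Int = x => f(f(x))
twice(double)(3)                                 // 12
```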
Strong support for immutable data types. Functional programming has always been concerned with reducing the side effects in a program, so that functions can be analysed as true mathematical functions. So Scala made it easy to make things immutable, though there are a couple of things it did not do here, for which FP purists criticize it.
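Making things immutable is as simple as this (a trivial sketch):

```scala
val xs = List(1, 2, 3)   // vals cannot be reassigned, and the default List is immutable
val ys = 0 :: xs         // "adding" an element builds a new list; xs itself is untouched
```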
Support for Algebraic Data Types. Algebraic data types (called ADTs, which, confusingly, is also the abbreviation for Abstract Data Type, a different thing) are very common in functional programming, and are most useful in situations where one would commonly use the visitor pattern in OO languages.
As with everything else, ADTs in Scala are implemented as classes and methods, with some syntactic sugar to make them painless to use. However, Scala is much more verbose than F# (or other functional languages, for that matter) in supporting them. For example, instead of F#'s `|` for case statements, it uses `case`.
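A sketch of what that looks like (the names are purely illustrative):

```scala
// A sealed trait plus case classes play the role of an algebraic data type;
// pattern matching uses `case` clauses where F# would use `|`.
sealed trait Shape
case class Circle(radius: Double) extends Shape
case class Rectangle(width: Double, height: Double) extends Shape

def area(s: Shape): Double = s match {
  case Circle(r)       => math.Pi * r * r
  case Rectangle(w, h) => w * h
}
```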
Support for non-strictness. Non-strictness means only computing stuff on demand. It is an essential aspect of Haskell, where it is tightly integrated with the side effect system. In Scala, however, non-strictness support is quite timid and incipient. It is available and used, but in a restricted manner.
For instance, Scala's non-strict list, the `Stream`, does not support a truly non-strict `foldRight` the way Haskell's does. Furthermore, some benefits of non-strictness are only gained when it is the default in the language, instead of an option.
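Still, to give an idea of what is available (a small sketch using `lazy val` and the 2.x `Stream`):

```scala
lazy val answer = { println("computing"); 42 }   // the block runs only when answer is first used

val naturals = Stream.from(1)                    // an infinite, lazily evaluated sequence
(naturals map (_ * 2) take 5).toList             // List(2, 4, 6, 8, 10)
```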
Support for list comprehension. Actually, Scala calls it for-comprehension, as the way it is implemented is completely divorced from lists. In its simplest terms, a list comprehension can be thought of as the `map` function/method shown in the example, though nesting of map statements (supported with `flatMap` in Scala) as well as filtering (`filter` or `withFilter` in Scala, depending on strictness requirements) are usually expected.
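For instance, an illustrative sketch of a for-comprehension and, roughly, what it translates into:

```scala
val pairs = for {
  x <- List(1, 2, 3)
  if x % 2 == 1
  y <- List(10, 20)
} yield (x, y)

// Roughly equivalent to the explicit method calls:
val samePairs = List(1, 2, 3)
  .withFilter(x => x % 2 == 1)
  .flatMap(x => List(10, 20).map(y => (x, y)))
// both: List((1,10), (1,20), (3,10), (3,20))
```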
This is a very common operation in functional languages, and often light in syntax -- like Python's `in` operator. Again, Scala is somewhat more verbose than usual.

In my opinion, Scala is unparalleled in combining FP and OO. It comes from the OO side of the spectrum towards the FP side, which is unusual. Mostly, I see FP languages with OO tacked onto them -- and it feels tacked on to me. I guess FP in Scala probably feels the same way to functional language programmers.
EDIT
Reading some other answers I realized there was another important topic: type inference. Lisp was a dynamically typed language, and that pretty much set the expectations for functional languages. The modern statically typed functional languages all have strong type inference systems, most often the Hindley-Milner¹ algorithm, which makes type declarations essentially optional.
Scala can't use the Hindley-Milner algorithm because of Scala's support for inheritance². So Scala has to adopt a much less powerful type inference algorithm -- in fact, type inference in Scala is intentionally left undefined in the specification, and is the subject of ongoing improvements (improving it is one of the biggest features of the upcoming 2.8 version of Scala, for instance).
In the end, however, Scala requires all parameters to have their types declared when defining methods. In some situations, such as recursion, return types for methods also have to be declared.
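For example (hypothetical method names, just to illustrate the rule):

```scala
def add(a: Int, b: Int) = a + b      // parameter types are mandatory; the return type is inferred
def fact(n: Int): Int =              // a recursive method also needs its return type declared
  if (n <= 1) 1 else n * fact(n - 1)
```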
Functions in Scala can often have their types inferred instead of declared, though. For instance, no type declaration is necessary here: `List(1, 2, 3) reduceLeft (_ + _)`, where `_ + _` is actually an anonymous function of type `Function2[Int, Int, Int]`.

Likewise, type declarations for variables are often unnecessary, but inheritance may require them. For instance, `Some(2)` and `None` have a common superclass, `Option`, but they actually belong to different subclasses. So one would usually declare `var o: Option[Int] = None` to make sure the correct type is assigned.

This limited form of type inference is much better than what statically typed OO languages usually offer, which gives Scala a sense of lightness, and much worse than what statically typed FP languages usually offer, which gives Scala a sense of heaviness. :-)
Notes:
1. Actually, the algorithm originates with Damas and Milner, who called it "Algorithm W", according to Wikipedia.
2. Martin Odersky mentioned in a comment here why Scala does not use Hindley-Milner style type inference. He goes on to state that it may not actually be impossible, and that it came down to a trade-off. Please do go to that link for more information, and, if you do come up with a clearer statement or, better yet, some paper one way or another, I'd be grateful for the reference.

Let me thank Jon Harrop for looking this up, as I was assuming it was impossible. Well, maybe it is, and I just couldn't find a proper link. Note, however, that it is not inheritance alone causing the problem.