Scala versus F# question: how do they unify OO and FP?


What are the key differences between the approaches taken by Scala and F# to unify OO and FP paradigms?

EDIT

What are the relative merits and demerits of each approach? If, in spite of the support for subtyping, F# can infer the types of function arguments then why can't Scala?

7 Answers

Melony?

I have only looked at F# through some basic tutorials, so my knowledge of it is very limited. However, it was apparent to me that its style is essentially functional, with OO being more of an add-on -- much more an ADT + module system than true OO. The feeling I get is best described as if all methods in it were static (as in Java's static).

See, for instance, any code using the pipe operator (|>). Take this snippet from the Wikipedia entry on F#:

[1 .. 10]
|> List.map     fib

(* equivalent without the pipe operator *)
List.map fib [1 .. 10]

The function map is not a method of the list instance. Instead, it works like a static method in the List module that takes a list instance as one of its parameters.

Scala, on the other hand, is fully OO. Let's start with the Scala equivalent of that code:

List(1 to 10: _*) map fib

// Without operator notation or implicits:
List.apply(Predef.intWrapper(1).to(10): _*).map(fib)

Here, map is a method on the List instance. Static-like methods, such as intWrapper on Predef or apply on List, are much more uncommon. Then there are functions, such as fib above. Here, fib is not a method on Int, but neither is it a static method. Instead, it is an object -- the second main difference I see between F# and Scala.

Let's consider the F# implementation from the Wikipedia, and an equivalent Scala implementation:

// F#, from the wiki
let rec fib n =
    match n with
    | 0 | 1 -> n
    | _ -> fib (n - 1) + fib (n - 2)

// Scala equivalent
def fib(n: Int): Int = n match {
  case 0 | 1 => n
  case _ => fib(n - 1) + fib(n - 2)
}

The above Scala implementation is a method, but Scala converts that into a function to be able to pass it to map. I'll modify it below so that it becomes a method that returns a function instead, to show how functions work in Scala.

// F#, returning a lambda, as suggested in the comments
let rec fib = function 
    | 0 | 1 as n -> n 
    | n -> fib (n - 1) + fib (n - 2)

// Scala method returning a function
def fib: Int => Int = {
  case n @ (0 | 1) => n
  case n => fib(n - 1) + fib(n - 2)
}

// Same thing without syntactic sugar:
def fib: Function1[Int, Int] = new Function1[Int, Int] {
  def apply(param0: Int): Int = param0 match {
    case n @ (0 | 1) => n
    case n => fib.apply(n - 1) + fib.apply(n - 2)
  }
}

So, in Scala, all functions are objects implementing one of the FunctionN traits (Function1, Function2, and so on), each of which defines a method called apply. As shown here and in the list creation above, .apply can be omitted, which makes function calls look just like method calls.
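
To make the method/function distinction concrete, here is a minimal sketch (the val name f is just for illustration) of how a plain method becomes a function value, both explicitly and through the compiler's automatic eta-expansion:

def fib(n: Int): Int = if (n < 2) n else fib(n - 1) + fib(n - 2)

val f: Int => Int = fib _       // explicit eta-expansion: the method becomes a Function1 object
f(10)                           // sugar for f.apply(10)
(1 to 10).toList map fib        // here the compiler eta-expands fib automatically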

In the end, everything in Scala is an object -- an instance of a class -- and all code belongs to a method, which gets executed somehow. Even match in the example above used to be a method; it was turned into a keyword quite a while ago to avoid some problems.

So, how about the functional part of it? F# belongs to one of the most traditional families of functional languages. While it lacks some features that some people consider important for functional languages, the fact is that F# is functional by default, so to speak.

Scala, on the other hand, was created with the intent of unifying the functional and OO models, instead of just providing them as separate parts of the language. The extent to which it was successful depends on what you deem to be functional programming. Here are some of the things Martin Odersky focused on:

  • Functions are values. They are objects too -- because all values are objects in Scala -- but the concept that a function is a value that can be manipulated is an important one, with its roots all the way back to the original Lisp implementation.

  • Strong support for immutable data types. Functional programming has always been concerned with reducing side effects in a program, so that functions can be analysed as true mathematical functions. So Scala made it easy to make things immutable (a small sketch follows this list), but there are two things it did not do, for which FP purists criticize it:

    • It did not make mutability harder.
    • It does not provide an effect system, by which mutability can be statically tracked.
  • Support for Algebraic Data Types. Algebraic data types (called ADTs, an abbreviation which confusingly also stands for Abstract Data Type, a different thing) are very common in functional programming, and are most useful in situations where one would commonly use the visitor pattern in OO languages.

    As with everything else, ADTs in Scala are implemented as classes and methods, with some syntactic sugar to make them painless to use. However, Scala is much more verbose than F# (or other functional languages, for that matter) in supporting them. For example, instead of F#'s | for cases, it uses the case keyword (see the Scala sketch after this list).

  • Support for non-strictness. Non-strictness means only computing stuff on demand. It is an essential aspect of Haskell, where it is tightly integrated with the side effect system. In Scala, however, non-strictness support is quite timid and incipient. It is available and used, but in a restricted manner.

    For instance, Scala's non-strict list, the Stream, does not support a truly non-strict foldRight, as Haskell's does. Furthermore, some benefits of non-strictness are only gained when it is the default in the language, rather than an opt-in feature (a sketch of the opt-in mechanisms follows this list).

  • Support for list comprehension. Actually, Scala calls it for-comprehension, as the way it is implemented is completely divorced from lists. In its simplest terms, a list comprehension can be thought of as the map function/method shown in the example, though nesting of map statements (supported with flatMap in Scala) as well as filtering (filter or withFilter in Scala, depending on strictness requirements) are usually expected.

    This is a very common operation in functional languages, and often light in syntax -- like Python's in operator in list comprehensions. Again, Scala is somewhat more verbose than usual (a desugaring sketch follows this list).
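
On the immutability point, here is a minimal sketch of what "easy to make things immutable" means in practice (the names are arbitrary), and also of the mutability that Scala does not make harder:

val xs = List(1, 2, 3)        // immutable list; operations return new lists
val ys = 0 :: xs              // List(0, 1, 2, 3); xs is unchanged

val n = 42                    // val: cannot be reassigned
var m = 42                    // var: mutable, and no effect system tracks this
m += 1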
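
On the ADT point, here is the Scala side of the comparison; a sketch with made-up types (Tree, Leaf, Node):

// An ADT as a sealed trait plus case classes, consumed with pattern matching
sealed trait Tree
case class Leaf(value: Int) extends Tree
case class Node(left: Tree, right: Tree) extends Tree

def sum(t: Tree): Int = t match {
  case Leaf(v)           => v
  case Node(left, right) => sum(left) + sum(right)
}

The same type in F# is a single declaration along the lines of type Tree = Leaf of int | Node of Tree * Tree, which is where the verbosity difference comes from.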
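
On the non-strictness point, the opt-in mechanisms look roughly like this (a sketch; in recent Scala versions Stream has been superseded by LazyList, but the idea is the same):

lazy val expensive = { println("computed"); 42 }       // body runs only on first access

def orElse(cond: Boolean, alternative: => Int): Int =  // by-name parameter: evaluated only if used
  if (cond) 0 else alternative

val naturals: Stream[Int] = Stream.from(0)             // infinite, lazily evaluated sequence
naturals.map(_ * 2).take(5).toList                     // List(0, 2, 4, 6, 8)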
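
And on the comprehension point, here is roughly what the compiler does with a for-comprehension; the second expression is approximately what the first one is rewritten into (a sketch with arbitrary data):

val pairs = for {
  x <- List(1, 2, 3)
  if x % 2 == 1
  y <- List(10, 20)
} yield (x, y)

// Roughly the desugaring into withFilter, flatMap and map:
val samePairs =
  List(1, 2, 3).withFilter(x => x % 2 == 1).flatMap(x => List(10, 20).map(y => (x, y)))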

In my opinion, Scala is unparalleled in combining FP and OO. It comes from the OO side of the spectrum towards the FP side, which is unusual. Mostly, I see FP languages with OO tacked on -- and it feels tacked on to me. I guess FP in Scala probably feels the same way to functional programmers.

EDIT

Reading some other answers I realized there was another important topic: type inference. Lisp was a dynamically typed language, and that pretty much set the expectations for functional languages. The modern statically typed functional languages all have strong type inference systems, most often the Hindley-Milner[1] algorithm, which makes type declarations essentially optional.

Scala can't use the Hindley-Milner algorithm because of Scala's support for inheritance[2]. So Scala has to adopt a much less powerful type inference algorithm -- in fact, type inference in Scala is intentionally left undefined in the specification, and is subject to ongoing improvement (improved type inference is one of the biggest features of the upcoming 2.8 version of Scala, for instance).

In the end, however, Scala requires all parameters to have their types declared when defining methods. In some situations, such as recursion, return types for methods also have to be declared.
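
In code, that rule looks like this (a small sketch; the method names are arbitrary):

def add(x: Int, y: Int) = x + y          // parameter types are required; the result type Int is inferred

def fact(n: Int): Int =                  // recursive, so the result type must be written out
  if (n == 0) 1 else n * fact(n - 1)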

Functions in Scala can often have their types inferred instead of declared, though. For instance, no type declaration is necessary here: List(1, 2, 3) reduceLeft (_ + _), where _ + _ is actually an anonymous function of type Function2[Int, Int, Int].

Likewise, type declarations for variables are often unnecessary, but inheritance may require them. For instance, Some(2) and None have a common superclass Option, but they actually belong to different subclasses. So one would usually declare var o: Option[Int] = None to make sure the intended type is assigned.
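
A minimal sketch of that situation, with arbitrary variable names:

var a = Some(2)              // inferred as Some[Int], not Option[Int]
// a = None                  // would not compile: None is not a Some[Int]

var b: Option[Int] = None    // the annotation picks the intended common supertype
b = Some(2)                  // fine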

This limited form of type inference is much better than what statically typed OO languages usually offer, which gives Scala a sense of lightness, and much worse than what statically typed FP languages usually offer, which gives it a sense of heaviness. :-)

Notes:

  1. Actually, the algorithm originates from Damas and Milner, who called it "Algorithm W", according to Wikipedia.

  2. Martin Odersky mentioned in a comment here that:

    The reason Scala does not have Hindley/Milner type inference is that it is very difficult to combine with features such as overloading (the ad-hoc variant, not type classes), record selection, and subtyping

    He goes on to state that it may not actually be impossible, and that it came down to a trade-off. Please do go to that link for more information, and, if you come up with a clearer statement or, better yet, a paper one way or the other, I'd be grateful for the reference.

    Let me thank Jon Harrop for looking this up, as I was assuming it was impossible. Well, maybe it is, and I couldn't find a proper link. Note, however, that it is not inheritance alone causing the problem.
