Given the strong type system of Scala, I had an ambitious project which I'm about to abandon now because the effort to usefulness ratio seems to be too high.
Basically I have some graph elements (GEs), and they correspond to sound processes which are carried out at a given calculation rate. Graph elements are composed from other graph elements, forming their inputs. Now there are rather arbitrary constraints on the inputs' rates. In the source language (SuperCollider) the rates are checked at runtime, naturally, because it is a dynamically typed language. I wanted to see whether I could enforce the checks at compile time.
Some constraints are fairly simple and can be expressed in the form "the rate of arg1 must be at least as high as the rate of arg2". But others get intricate, e.g.
"if arg0's rate is 'demand', arg1's rate must be either 'demand' or 'scalar' or equal to the enclosing GE's rate".
The question is: Should I give up on this? Here is how it looks with runtime check:
sealed trait Rate
case object demand extends Rate
case object audio extends Rate
case object control extends Rate
case object scalar extends Rate
trait GE { def rate: Rate }
// an example GE:
case class Duty(rate: Rate, in0: GE, in1: GE) extends GE {
  def checkRates(): Unit =
    // if in0 runs at demand rate, in1 must run at demand rate,
    // scalar rate, or this GE's own rate
    require(in0.rate != demand || in1.rate == demand ||
            in1.rate == scalar || in1.rate == rate)
}
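For reference, here is a minimal self-contained sketch of the runtime-checked version actually rejecting a bad graph. The `Const` leaf element is made up here purely to have something to feed into `Duty`:

```scala
// Sketch of the runtime-checked version; `Const` is a hypothetical
// leaf element added only to construct test inputs.
sealed trait Rate
case object demand  extends Rate
case object audio   extends Rate
case object control extends Rate
case object scalar  extends Rate

trait GE { def rate: Rate }
case class Const(rate: Rate) extends GE

case class Duty(rate: Rate, in0: GE, in1: GE) extends GE {
  // if in0 runs at demand rate, in1 must run at demand, scalar, or this GE's rate
  require(in0.rate != demand || in1.rate == demand ||
          in1.rate == scalar || in1.rate == rate)
}

Duty(audio, Const(audio), Const(control))      // fine: in0 is not demand-rate
val rejected = try {
  Duty(audio, Const(demand), Const(control))   // in1 is neither demand, scalar, nor audio
  false
} catch { case _: IllegalArgumentException => true }
assert(rejected)
```

The obvious drawback is that the error only shows up when the graph is built, not when the code is compiled.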
And in contrast, here is how it could look with type parameters for the rates:
sealed trait Rate
trait audio extends Rate
trait demand extends Rate
trait control extends Rate
trait scalar extends Rate
trait GE[R <: Rate]
object Duty {
  trait LowPri {
    implicit def con1[R, T]: RateCons[R, audio  , T] = new ConImpl[R, audio  , T]
    implicit def con2[R, T]: RateCons[R, control, T] = new ConImpl[R, control, T]
    implicit def con3[R, T]: RateCons[R, scalar , T] = new ConImpl[R, scalar , T]
    implicit def con4[R]: RateCons[R, demand, demand] =
      new ConImpl[R, demand, demand]
    implicit def con5[R]: RateCons[R, demand, scalar] =
      new ConImpl[R, demand, scalar]
  }
  object RateCons extends LowPri {
    implicit def con6[R]: RateCons[R, demand, R] = new ConImpl[R, demand, R]
  }
  sealed trait RateCons[R, S, T]
  private class ConImpl[R, S, T] extends RateCons[R, S, T]

  def ar[S <: Rate, T <: Rate](in0: GE[S], in1: GE[T])(
    implicit cons: RateCons[audio, S, T]) = apply[audio, S, T](in0, in1)
  def kr[S <: Rate, T <: Rate](in0: GE[S], in1: GE[T])(
    implicit cons: RateCons[control, S, T]) = apply[control, S, T](in0, in1)
}
case class Duty[R <: Rate, S <: Rate, T <: Rate](in0: GE[S], in1: GE[T])(
implicit con: Duty.RateCons[R, S, T]) extends GE[R]
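The trick that lets con6 win over the catch-all instances is implicit prioritization through inheritance: implicits defined directly in the companion object take precedence over those inherited from a parent trait. A minimal sketch of just that pattern, with made-up `Evidence`/`pick` names:

```scala
// Low-priority-implicits pattern: the companion object's own implicit
// beats the generic one inherited from the parent trait.
trait Evidence[A] { def name: String }

trait LowPriority {
  implicit def generic[A]: Evidence[A] =
    new Evidence[A] { def name = "generic" }
}

object Evidence extends LowPriority {
  implicit val specific: Evidence[Int] =
    new Evidence[Int] { def name = "specific" }
}

def pick[A](implicit ev: Evidence[A]): String = ev.name

assert(pick[Int]    == "specific")  // companion-object implicit wins
assert(pick[String] == "generic")   // falls back to the low-priority trait
```

In the `Duty` code above, con6 plays the role of `specific` and con1 through con5 the role of `generic`, so `RateCons[R, demand, R]` is tried before the fallbacks.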
Tests:
def allowed(a: GE[demand], b: GE[audio], c: GE[control], d: GE[scalar]): Unit = {
  Duty.ar(b, c)
  Duty.kr(b, c)
  Duty.ar(b, a)
  Duty.ar(b, d)
  Duty.ar(a, b)
  Duty.kr(a, c)
}
def forbidden(a: GE[demand], b: GE[audio], c: GE[control], d: GE[scalar]): Unit = {
  Duty.kr(a, b)  // does not compile: no RateCons[control, demand, audio]
  Duty.ar(a, c)  // does not compile: no RateCons[audio, demand, control]
}
A path worth pursuing? Three more things that speak against it, apart from the code bloat:

- There are probably a couple of dozen GEs which would need custom constraints.
- Composing GEs becomes increasingly difficult: code might need to pass around dozens of type parameters.
- Transformations might become difficult, e.g. imagine a List[GE[_ <: Rate]].map(???). I mean, how would Duty.RateCons translate to TDuty.RateCons (where TDuty is a different GE)?
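To make that last point concrete, here is a sketch of how the static rate information collapses as soon as differently-rated elements share a collection. `Sine` is a made-up GE just for illustration:

```scala
sealed trait Rate
trait audio   extends Rate
trait control extends Rate

trait GE[R <: Rate]
case class Sine[R <: Rate]() extends GE[R]  // hypothetical element for illustration

// Elements of different rates can only share a list under an existential type:
val ges: List[GE[_ <: Rate]] = List(Sine[audio](), Sine[control]())

// At this point the concrete rates are gone from the types, so a constructor
// demanding evidence like RateCons[audio, S, T] can no longer infer S and T:
// ges.map(g => Duty.ar(g, g))   // would not compile

assert(ges.size == 2)
```

So every generic transformation over a heterogeneous graph would need to recover or re-prove the rate evidence somehow, which is exactly where the approach starts to hurt.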
I have invested quite a bit of time in this project already, which is why I'm reluctant to give up so easily. So... convince me that I'm doing something useful here, or tell me that I should go back to the dynamically checked version.