Quick refresher: self-types are commonly used when writing traits that need to be mixed into a particular kind of class. The cake pattern, for example, leverages them. In the example below, FooTrait specifies a self-type of FooTraitConfiguration to ensure that it is mixed into a class that provides the expected times val.
import actors.Actor
import actors.Actor._

trait FooTraitConfiguration { val times: Int }

trait FooTrait { self: FooTraitConfiguration =>
  case object Ping
  case object Pong
  val a = actor {
    loop {
      react {
        case Ping =>
          self ! Pong
        case Pong =>
          for (_ <- 1 to times) { print(".") }
          System.out.println("pong.")
      }
    }
  }
  def ping = a ! Ping
  def pong = a ! Pong
}

class Foo extends FooTrait with FooTraitConfiguration { override val times = 5 }
But, alas, this fails to compile:
error: value ! is not a member of FooTrait with FooTraitConfiguration
          self ! Pong
It seems that the self-type has broken the Actor API! And indeed, it has, because the alias in a self-type declaration should not be named self. The declaration should have been:
this: FooTraitConfiguration =>
The self-type means that within FooTrait the type of this is considered to be FooTrait with FooTraitConfiguration. Using a word other than this additionally sets up an alias for the enclosing instance, useful within nested classes, for example, where this refers to the nested instance instead. And there's the rub: Actors depend on an imported method named self, which is shadowed when the alias is named self.
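To see why the alias exists at all, here is a minimal sketch (the nested Worker class and its report method are hypothetical, not part of the original example). Inside Worker, this refers to the Worker instance, so the alias is the convenient way to reach the enclosing one:

trait FooTraitConfiguration { val times: Int }

trait FooTrait { outer: FooTraitConfiguration =>
  class Worker {
    def report() = {
      // Inside Worker, `this` is the Worker instance; `outer` still
      // refers to the enclosing FooTrait with FooTraitConfiguration.
      println("times = " + outer.times)
    }
  }
}

And because the alias here is named outer rather than self, the imported Actor.self would remain visible inside any actor bodies.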
Note to self: Don’t use self when specifying self types!