Of course - for example:
```scala
import scala.language.experimental.macros
import scala.reflect.macros.Context

object Demo {
  def at(xs: Any*)(i: Int) = macro at_impl

  def at_impl(c: Context)(xs: c.Expr[Any]*)(i: c.Expr[Int]) = {
    import c.universe._

    // Print the statically known type of each argument tree.
    println(xs.toList.map(_.tree.tpe))

    // Require a literal index and return that argument's expression.
    val Literal(Constant(index: Int)) = i.tree
    xs(index)
  }
}
```
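A note on building: a macro implementation must be compiled before (or in a separate compilation run from) its call sites, and it needs scala-reflect on the classpath. A minimal sbt fragment, assuming Scala 2.10 or later (a sketch, not a complete build definition):

```scala
// build.sbt sketch: scala-reflect provides scala.reflect.macros.Context
libraryDependencies += "org.scala-lang" % "scala-reflect" % scalaVersion.value
```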
And then:
```
scala> Demo.at(1, 'b, "c", 'd')(1)
List(Int(1), Symbol, String("c"), Char('d'))
res0: Symbol = 'b

scala> Demo.at(1, 'b, "c", 'd')(2)
List(Int(1), Symbol, String("c"), Char('d'))
res1: String = c
```
Note that the inferred result types are exact and correct: the macro returns the selected argument's expression, so the result type is that argument's static type, not `Any`.
Note also that this will not work if the argument is a sequence passed with `: _*`, of course, and you will need to write something like the following if you want to catch that case and provide a useful error message:
```scala
def at_impl(c: Context)(xs: c.Expr[Any]*)(i: c.Expr[Int]) = {
  import c.universe._

  xs.toList.map(_.tree) match {
    case Typed(_, Ident(tpnme.WILDCARD_STAR)) :: Nil =>
      c.abort(c.enclosingPosition, "Needs real varargs!")
    case _ => i.tree match {
      case Literal(Constant(index: Int)) =>
        xs.lift(index).getOrElse(
          c.abort(c.enclosingPosition, "Invalid index!")
        )
      case _ =>
        c.abort(c.enclosingPosition, "Need a literal index!")
    }
  }
}
```
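For context, `: _*` is the ascription that passes an already-built sequence to a varargs parameter, which is why the guard above matches a single `Typed(_, Ident(tpnme.WILDCARD_STAR))` tree instead of one tree per argument. A plain, non-macro sketch of the two call shapes (`sum` is a hypothetical helper, not part of the answer above):

```scala
// Hypothetical varargs method to illustrate the two call shapes.
def sum(xs: Int*): Int = xs.sum

// Real varargs: the compiler sees three separate argument trees.
val direct = sum(1, 2, 3)

// Sequence expansion: the compiler sees one tree ascribed with `: _*`.
val spread = sum(Seq(1, 2, 3): _*)

println(direct) // prints 6
println(spread) // prints 6
```

Both calls produce the same runtime value, but at compile time they look completely different to the macro, which is why the guarded version above rejects the expanded-sequence form.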
See my question here and the bug report here for a more detailed discussion.