Swift type inference is smarter than you think. The key is to look at the signatures of the nil-coalescing operator `??`:
```swift
func ??<T>(optional: T?, defaultValue: @autoclosure () -> T) -> T
func ??<T>(optional: T?, defaultValue: @autoclosure () -> T?) -> T?
```
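Note the `@autoclosure` in both signatures: it makes the default value lazy, so it is only evaluated when the optional actually turns out to be nil. A small sketch of that behavior (`expensiveDefault` and the counter are my own illustration):

```swift
// @autoclosure wraps the right-hand argument in a closure, so the
// default is only computed when the left-hand optional is nil.
var callCount = 0
func expensiveDefault() -> Int {
    callCount += 1
    return 42
}

let present: Int? = 7
let x = present ?? expensiveDefault()   // default not evaluated; x == 7
let missing: Int? = nil
let y = missing ?? expensiveDefault()   // default evaluated once; y == 42
```

After both lines run, `callCount` is 1: the expensive default was computed only for the nil case.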
Given these signatures, it is clear that the compiler will promote `T` to the closest common ancestor of the types passed to the operator, up to `Any`. For example:
```swift
let i: Int? = 3
let s: String? = "3"
let a: Any? = i ?? s
```
This compiles and works, but the type of `a` is `Any` (which is actually a protocol). In some cases you need to provide explicit type annotations for the compiler, and in some cases you don't; if the types have a common ancestor that is not a protocol, an annotation seems to be unnecessary. You might think that `??` gets special treatment from the compiler, but it doesn't. You can easily roll your own.
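To illustrate that last point, here is a sketch of a hand-rolled coalescing operator; the name `|||` and the `Animal`/`Dog`/`Cat` hierarchy are my own invention:

```swift
infix operator |||: NilCoalescingPrecedence

// Same shape as the first ?? overload: unwrap if possible,
// otherwise evaluate the (lazy) default.
func ||| <T>(optional: T?, defaultValue: @autoclosure () -> T) -> T {
    if let value = optional { return value }
    return defaultValue()
}

class Animal {}
class Dog: Animal {}
class Cat: Animal {}

let maybeDog: Dog? = nil
// T is inferred as Animal, the closest common ancestor of Dog and Cat.
let pet = maybeDog ||| Cat()
```

Here `pet` has the static type `Animal`, exactly as it would with the built-in operator; the promotion comes from ordinary generic inference, not from compiler magic.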
Coming to your attempt: you have overcomplicated it. (As I did when I faced a similar problem.) You only need one type parameter.
If we wrote the `+` operator for arrays as a named function, it would look like this:
```swift
func joinArrays<T>(array1: [T], array2: [T]) -> [T] {
    return array1 + array2
}

class Base {}
class Derived1: Base {}
class Derived2: Base {}

let a1 = [Derived1(), Derived1()]
let a2 = [Derived2(), Derived2()]
let a = joinArrays(array1: a1, array2: a2)
```

The type of `a` is `Array<Base>`, since `Base` is the closest common ancestor of the element types.
I have used this "type promotion" to build all kinds of complex coalescing / monadic operators à la Haskell. The only gotcha was that Swift does not support covariance of generic type parameters.
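A sketch of that limitation, with an illustrative `Box` type of my own: `Array` gets compiler-blessed covariance, but a user-defined generic type does not.

```swift
class Base {}
class Derived: Base {}

struct Box<T> { let value: T }

let derivedArray: [Derived] = [Derived()]
let baseArray: [Base] = derivedArray          // OK: Array is special-cased

let derivedBox = Box(value: Derived())
// let baseBox: Box<Base> = derivedBox        // error: no generic covariance
let baseBox = Box<Base>(value: derivedBox.value)  // re-wrap explicitly
```

The commented-out line does not compile, even though the analogous array assignment does; you have to re-wrap the value yourself.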