EDIT:
Although the question has been updated, an algebraic approach can still be used to simplify the problem. You still don't need to worry about 3-D matrices; the result reduces to:
output = mean(v.^2).*A.^2 + 2.*mean(v.*w).*A.*B + mean(w.^2).*B.^2;
If your matrices and vectors are large, this solution will give you much better performance, since it requires far less memory than solutions based on bsxfun or repmat.
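For reference, here is a sketch of the 3-D approach that the one-liner replaces (variable names follow the answer; the reshapes assume v and w are row vectors of length d):

```matlab
% Build the full m-by-n-by-d array explicitly, then average over dim 3.
% Memory cost: m*n*d elements, versus only m*n for the algebraic version.
M = bsxfun(@times, A, reshape(v, 1, 1, [])) + ...
    bsxfun(@times, B, reshape(w, 1, 1, []));
output = mean(M.^2, 3);
```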
Explanation:
Assuming M is the m-by-n-by-d matrix you would get before averaging over the third dimension, the vector spanning the third dimension at each (i,j) is:
M(i,j,:) = A(i,j).*v + B(i,j).*w;
In other words, the vector v scaled by A(i,j) plus the vector w scaled by B(i,j). Applying element-wise squaring gives:
M(i,j,:).^2 = (A(i,j).*v + B(i,j).*w).^2
            = (A(i,j).*v).^2 + ...
              2.*A(i,j).*B(i,j).*v.*w + ...
              (B(i,j).*w).^2;
Now, when you take the average over the third dimension, each element output(i,j) becomes:
output(i,j) = mean(M(i,j,:).^2)
            = mean((A(i,j).*v).^2 + ...
                   2.*A(i,j).*B(i,j).*v.*w + ...
                   (B(i,j).*w).^2)
            = sum((A(i,j).*v).^2 + ...
                  2.*A(i,j).*B(i,j).*v.*w + ...
                  (B(i,j).*w).^2)/d
            = sum((A(i,j).*v).^2)/d + ...
              sum(2.*A(i,j).*B(i,j).*v.*w)/d + ...
              sum((B(i,j).*w).^2)/d
            = A(i,j).^2.*mean(v.^2) + ...
              2.*A(i,j).*B(i,j).*mean(v.*w) + ...
              B(i,j).^2.*mean(w.^2);
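As a quick numerical sanity check of the identity, you can compare the direct 3-D computation against the algebraic shortcut on random data (the sizes below are arbitrary, chosen only for illustration):

```matlab
m = 4; n = 5; d = 6;
A = rand(m, n);  B = rand(m, n);
v = rand(1, d);  w = rand(1, d);

% Direct 3-D computation: build M, square, average over dim 3
M = bsxfun(@times, A, reshape(v, 1, 1, d)) + ...
    bsxfun(@times, B, reshape(w, 1, 1, d));
direct = mean(M.^2, 3);

% Algebraic shortcut from the answer
fast = mean(v.^2).*A.^2 + 2.*mean(v.*w).*A.*B + mean(w.^2).*B.^2;

max(abs(direct(:) - fast(:)))  % difference should be on the order of eps
```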