I don't have a profiling framework set up, but I would certainly change the code to reuse a single temporary vector instead of constructing a new one on every iteration. I would also hoist the initial intersection out of the loop. In addition, using std::back_inserter() ensures the results are appended at the end rather than inserted at the front:
    std::vector<unsigned> func() {
        std::vector<std::vector<unsigned>> t = some_initialization();
        if (t.empty()) {
            return std::vector<unsigned>();
        }
        std::vector<unsigned> intersectedValues(t[0]);
        std::vector<unsigned> tempIntersectedSubjects;
        for (std::vector<std::vector<unsigned>>::size_type i(1u);
             i < t.size() && !intersectedValues.empty(); ++i) {
            std::set_intersection(t[i].begin(), t[i].end(),
                                  intersectedValues.begin(), intersectedValues.end(),
                                  std::back_inserter(tempIntersectedSubjects));
            std::swap(intersectedValues, tempIntersectedSubjects);
            tempIntersectedSubjects.clear();
        }
        return intersectedValues;
    }
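One precondition the snippet leaves implicit: std::set_intersection is only valid on sorted ranges, so the vectors produced by some_initialization() must already be sorted. If that is not guaranteed, each inner vector would need a one-time sort up front, roughly like this (sort_inputs is a name I'm introducing for illustration):

    #include <algorithm>
    #include <vector>

    // Illustrative helper: std::set_intersection requires sorted input ranges,
    // so sort each inner vector once before intersecting, if not already sorted.
    void sort_inputs(std::vector<std::vector<unsigned>>& t) {
        for (std::vector<unsigned>& v : t) {
            std::sort(v.begin(), v.end());
        }
    }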
I think this code has a decent chance of being faster. When intersecting many sets, it may also be worthwhile to proceed pairwise: instead of keeping one running set and intersecting each input into it, first intersect adjacent pairs of sets, and then repeat the process on the resulting intersections:
    std::vector<std::vector<unsigned>> intersections(
        std::vector<std::vector<unsigned>> const& t) {
        std::vector<std::vector<unsigned>> r;
        std::vector<std::vector<unsigned>>::size_type i(0);
        for (; i + 1 < t.size(); i += 2) {
            r.push_back(intersect(t[i], t[i + 1]));
        }
        if (i < t.size()) {
            r.push_back(t[i]);
        }
        return r;
    }

    std::vector<unsigned> func(std::vector<std::vector<unsigned>> const& t) {
        if (t.empty()) {
            return std::vector<unsigned>();
        }
        std::vector<std::vector<unsigned>> r(intersections(t));
        return r.size() == 1 ? r[0] : func(r);
    }
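The code above calls an intersect() helper for two sorted vectors that isn't shown here; a minimal sketch of what it is assumed to do, wrapping std::set_intersection:

    #include <algorithm>
    #include <iterator>
    #include <vector>

    // Assumed helper: intersection of two sorted vectors of unsigned.
    std::vector<unsigned> intersect(std::vector<unsigned> const& a,
                                    std::vector<unsigned> const& b) {
        std::vector<unsigned> out;
        std::set_intersection(a.begin(), a.end(), b.begin(), b.end(),
                              std::back_inserter(out));
        return out;
    }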
Of course, you would not implement it with this recursion: you would use a Stepanov-style binary counter to hold the intermediate sets. Note that this approach assumes the result is most likely non-empty; if you expect the intersection to come out empty, it may not be an improvement, since the simple loop above can stop as soon as the running intersection becomes empty.
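To illustrate the binary-counter idea (my sketch, not code from the answer): slot k of the counter is either vacant or holds the intersection of 2^k of the inputs; adding a new input "carries" through occupied slots, and whatever remains in the slots is combined at the end. It reuses the assumed intersect() helper, repeated here so the sketch is self-contained:

    #include <algorithm>
    #include <iterator>
    #include <vector>

    // Assumed helper (same as above): intersection of two sorted vectors.
    std::vector<unsigned> intersect(std::vector<unsigned> const& a,
                                    std::vector<unsigned> const& b) {
        std::vector<unsigned> out;
        std::set_intersection(a.begin(), a.end(), b.begin(), b.end(),
                              std::back_inserter(out));
        return out;
    }

    // Sketch of a binary-counter reduction: counter[k] is either vacant (empty)
    // or holds the intersection of 2^k inputs. Each new input is combined with
    // occupied slots ("carrying") until a vacant slot is found.
    std::vector<unsigned> intersect_all(std::vector<std::vector<unsigned>> const& t) {
        std::vector<std::vector<unsigned>> counter;
        for (std::vector<unsigned> x : t) {  // x is a working copy of the input
            std::size_t k = 0;
            while (!x.empty() && k < counter.size() && !counter[k].empty()) {
                x = intersect(counter[k], x);  // carry: combine and move up a slot
                counter[k].clear();
                ++k;
            }
            if (x.empty()) {
                return x;  // an empty partial intersection makes the whole result empty
            }
            if (k == counter.size()) {
                counter.push_back(x);
            } else {
                counter[k] = x;
            }
        }
        // Combine the partial intersections left in the slots.
        std::vector<unsigned> result;
        bool first = true;
        for (std::vector<unsigned> const& slot : counter) {
            if (slot.empty()) {
                continue;  // vacant slot
            }
            if (first) {
                result = slot;
                first = false;
            } else {
                result = intersect(result, slot);
            }
        }
        return result;
    }

The counter keeps the intermediate results balanced in the same way as the recursive pairing above, but without materialising a fresh vector of vectors at every level.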