We run an OSGi container hosting several products, one of which is ours.
We have a set of performance tests, and there is a strange problem: every restart of the OSGi container leads to a performance deviation of up to 400% in some of those tests.
After some investigation, I was able to track the problem down to this method:
public static Method getMethodForSlotKey(Class<?> cls, String slotKey, String methodType) {
    Method[] methods = cls.getMethods();
    if (methods != null && methods.length > 0) {
        for (Method method : methods) {
            String methName = method.getName();
            if (methName.startsWith(methodType)) {
                IDataAnnotation annot = method.getAnnotation(IDataAnnotation.class);
                if (annot != null) {
                    String annotSlotKey = annot.SlotKey();
                    if (annotSlotKey != null && annotSlotKey.equals(slotKey)) {
                        Class<?>[] paramTypes = method.getParameterTypes();
                        // for now, check length == 1 for setter and 0 for getter.
                        int len = SET_TXT.equals(methodType) ? 1 : 0;
                        if (paramTypes != null && paramTypes.length == len) {
                            return method;
                        }
                    }
                }
            }
        }
    }
    return null;
}
This method basically does reflection and string comparison.
I then cached the results of this method, and the deviation instantly dropped to 10-20%. The method is called very often, so the improvement itself is not surprising.
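For reference, the caching I applied looks roughly like this. This is a simplified sketch, not the original code: the class name, annotation, and `SET_TXT` constant from my project are replaced by a plain name-based lookup, and the cache key format is my own choice.

```java
import java.lang.reflect.Method;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class MethodCache {
    // one entry per (class, methodType, slotKey) combination
    private static final Map<String, Method> CACHE = new ConcurrentHashMap<>();

    public static Method getCached(Class<?> cls, String slotKey, String methodType) {
        String key = cls.getName() + '#' + methodType + '#' + slotKey;
        // computeIfAbsent runs the reflective scan only on the first miss;
        // if the lookup returns null, no mapping is stored
        return CACHE.computeIfAbsent(key, k -> lookup(cls, slotKey, methodType));
    }

    // Simplified stand-in for the reflective scan in the question:
    // matches on method name instead of the IDataAnnotation check.
    private static Method lookup(Class<?> cls, String slotKey, String methodType) {
        for (Method m : cls.getMethods()) {
            String name = m.getName();
            if (name.startsWith(methodType) && name.endsWith(slotKey)) {
                return m;
            }
        }
        return null;
    }
}
```

After the first call per key, subsequent calls skip `cls.getMethods()` entirely and return the cached `Method` instance.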
However, I don't understand why the non-cached version shows such a high deviation when the only difference is a restart of OSGi / the JVM. What exactly can happen during a restart? Are there any known performance differences between class loaders? Is it possible that OSGi loads libraries in a different order between restarts?
I am looking for an explanation that makes this make sense.
UPDATE
It turns out this call:
Method[] methods = cls.getMethods();
causes the deviation. I still do not understand why, so if anyone does, I would be glad to hear about it.
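One thing worth knowing about this call: `Class.getMethods()` returns a fresh defensive copy of the `Method[]` array on every invocation, so each call allocates and copies even when the JVM has the underlying reflection data cached. A minimal check of that behavior:

```java
import java.lang.reflect.Method;

public class GetMethodsCopy {
    // Two consecutive calls on the same Class object should yield
    // distinct array instances, showing that each call does real work.
    public static boolean returnsFreshArray(Class<?> cls) {
        Method[] a = cls.getMethods();
        Method[] b = cls.getMethods();
        return a != b;
    }
}
```

This does not by itself explain the restart-to-restart variance, but it does show why calling `getMethods()` in a hot path is never free, and therefore why caching removed most of the cost.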
sveri