A manual approach is to create a filter for the servlets in question: measure the time each execution takes and store it somewhere. This is good because the profiling overhead is only as large as the code you write in the filter, and I also found it very useful for finding pages and servlets that take a long time to fully load.
Here's how filters work.
- URL is called
- The filter intercepts the request and saves the URL and the current time (startTime)
- The filter invokes the target servlet
- The servlet runs (servlet, page or JSP)
- Control returns to the filter; elapsed time = currentTime - startTime
- Record or save the obtained data for future comparison (e.g. append it to a CSV file)
- The filter finishes.
To do this, create a class that implements the javax.servlet.Filter interface.
import java.io.IOException;
import javax.servlet.*;
import javax.servlet.http.HttpServletRequest;

public class ProfilingFilter implements Filter {

    public void init(FilterConfig fc) {}

    public void destroy() {}

    public void doFilter(ServletRequest req, ServletResponse resp, FilterChain chain)
            throws IOException, ServletException {
        long startTime = System.currentTimeMillis();
        String servletName = ((HttpServletRequest) req).getRequestURI();
        chain.doFilter(req, resp);
        long executionTime = System.currentTimeMillis() - startTime;
        // record servletName and executionTime here (e.g. append them to a CSV file)
    }
}
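A minimal sketch of the recording step mentioned above (the class name, method name, and the `profiling.csv` file name are my own assumptions, not part of the original): the filter could append one line per measured request to a CSV file.

```java
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;

public class CsvProfilingLog {

    // Appends one "url,milliseconds" row per measured request.
    // synchronized so concurrent requests don't interleave their lines.
    public static synchronized void record(String servletName, long executionTime) {
        try (PrintWriter out = new PrintWriter(new FileWriter("profiling.csv", true))) {
            out.println(servletName + "," + executionTime);
        } catch (IOException e) {
            // Never let profiling break the request itself.
            e.printStackTrace();
        }
    }

    public static void main(String[] args) {
        record("/orders/list", 42);
    }
}
```

The filter's doFilter would simply call `CsvProfilingLog.record(servletName, executionTime)` after computing the elapsed time.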
Now declare it in your web.xml:
<filter>
    <filter-name>profilingFilter</filter-name>
    <filter-class>com.awesome.ProfilingFilter</filter-class>
</filter>
And finally, the mapping:
<filter-mapping>
    <filter-name>profilingFilter</filter-name>
    <url-pattern>/*</url-pattern>
</filter-mapping>
Remember that this mapping matches ALL requests, so you will profile HTML pages, JSPs, servlets, images, and everything else.
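If you only care about dynamic content, you can narrow the mapping instead of profiling every request. A sketch, assuming your servlets are served under a /servlet/ path prefix (adjust the patterns to your own layout):

```xml
<filter-mapping>
    <filter-name>profilingFilter</filter-name>
    <url-pattern>/servlet/*</url-pattern>
</filter-mapping>
<filter-mapping>
    <filter-name>profilingFilter</filter-name>
    <url-pattern>*.jsp</url-pattern>
</filter-mapping>
```

This keeps static resources (images, CSS, plain HTML) out of your measurements.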
We found it very useful for spotting the parts of the application that took too long to respond or were unexpectedly heavy.
You can even leave this in production for a day or two to collect real data; the performance impact will only be as big as the time your code spends saving the profiling data.
Some ideas: save the statistics in a Map inside the ServletContext, and have another servlet display those statistics. That way you don't even need to touch the disk.
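A minimal sketch of such an in-memory statistics holder (the class and method names are my invention): the filter calls record(...) after each request, and a reporting servlet can read the averages back without any disk access.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.LongAdder;

public class ProfilingStats {

    // url -> accumulated milliseconds and hit count; safe under concurrent requests
    private static final Map<String, LongAdder> totalMillis = new ConcurrentHashMap<>();
    private static final Map<String, LongAdder> hits = new ConcurrentHashMap<>();

    public static void record(String url, long millis) {
        totalMillis.computeIfAbsent(url, k -> new LongAdder()).add(millis);
        hits.computeIfAbsent(url, k -> new LongAdder()).increment();
    }

    public static long averageMillis(String url) {
        LongAdder total = totalMillis.get(url);
        LongAdder count = hits.get(url);
        return (total == null || count == null || count.sum() == 0)
                ? 0
                : total.sum() / count.sum();
    }

    public static void main(String[] args) {
        record("/orders/list", 10);
        record("/orders/list", 20);
        System.out.println(averageMillis("/orders/list")); // prints 15
    }
}
```

In a real webapp you would either stash an instance of this in the ServletContext or just use the static fields directly, and have a /stats servlet iterate over the map and render an HTML table.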