We use a dynamic script tag with the JSONP mechanism to make cross-domain Ajax calls. The widget interface is very simple: it calls the web search service, passing the search criteria the user provided, then receives the results and displays them dynamically.
Note: For those not familiar with the dynamic-script-tag/JSONP method of making Ajax-style requests to a service that returns JSON-formatted data, I can explain how it works if you think it may be relevant to the problem.
The WCF service is hosted in IIS. It is RESTful, so the first thing we do when a user clicks search is generate a URL containing the criteria. It looks like this...
https://.../service.svc?criteria=john+smith
Then we dynamically create an HTML script tag with its src attribute set to the above URL, which makes the request to our service. The result comes back, and we process it to display the search results.
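For anyone unfamiliar with the technique, here is a minimal sketch of a JSONP request via a dynamic script tag. The service URL, the `callback` query parameter, and the `handleResults` function are illustrative assumptions, not the actual widget code:

```javascript
// Build the request URL. encodeURIComponent turns "john smith" into
// "john%20smith"; the callback parameter name is an assumption here.
function buildSearchUrl(baseUrl, criteria, callbackName) {
  return baseUrl +
    '?criteria=' + encodeURIComponent(criteria) +
    '&callback=' + encodeURIComponent(callbackName);
}

// Inject a <script> tag; setting src triggers the cross-domain GET.
function jsonpRequest(url) {
  var script = document.createElement('script');
  script.src = url;
  document.body.appendChild(script);
}

// The service wraps its JSON response in a call to this global function.
function handleResults(data) {
  // ... render data into the widget ...
}

// Usage (guarded so the sketch also runs outside a browser):
if (typeof document !== 'undefined') {
  jsonpRequest(buildSearchUrl('https://example.com/service.svc',
                              'john smith', 'handleResults'));
}
```

The browser executes the returned script as soon as it loads, which invokes the globally defined callback with the JSON payload.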
All this works fine, but we noticed that when using IE, the service receives each request from the client twice. I used Fiddler to monitor the traffic leaving the browser, and sure enough, I see two requests with the following addresses...
The second request has some kind of identifier appended to it, and this identifier is different on every request.
I immediately suspected caching: appending a random number to the end of a URL is one of the classic ways to defeat browser caching. To test this theory, I adjusted the cache settings in IE.
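For comparison, the classic cache-busting trick looks like the snippet below. The parameter name `_` is just a common convention (jQuery uses it, for example); it is not necessarily what IE appends here:

```javascript
// Append a unique value so the browser treats every request as a fresh URL
// and cannot serve it from cache. The "_" parameter name is a convention.
function addCacheBuster(url) {
  var sep = url.indexOf('?') === -1 ? '?' : '&';
  return url + sep + '_=' + Date.now() + Math.random().toString(36).slice(2);
}
```

Each call produces a different URL, which is exactly the pattern visible in the second request Fiddler captured.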
I set "Check for newer versions of stored pages" to "Never", and each request was then made only once. Even the one with the random number at the end.
I set this option back to its default of "Automatically", and the double requests immediately started again.
Interestingly, I do not receive both responses back on the client. I found this link where someone suggests it might be a bug in IE. The fact that this does not happen for me in Firefox supports that theory.
- Can anyone confirm that this is a bug in IE? Or is it by design?
- Does anyone know how I can stop this?
Some of the more obscure searches my users run take enough processing time that doubling it would be a very bad idea. I really want to avoid this if at all possible :-)