I have a WCF service that I configure to run under IIS 7. It exposes a set of streaming operations for data transmission. When I host the service myself in a console application, everything works fine. But when the client connects to the IIS-hosted service, the transfer appears to be buffered, and the client eventually disconnects. I used Fiddler to determine that this client timeout occurs before the HTTP request is executed.
Here is the server binding:
var binding = new CustomBinding();
binding.Elements.Add( new TextMessageEncodingBindingElement()
{
    MessageVersion = MessageVersion.Soap12WSAddressing10
} );
var secBinding = SecurityBindingElement.CreateUserNameOverTransportBindingElement();
secBinding.AllowInsecureTransport = true;
binding.Elements.Add( secBinding );
binding.Elements.Add( new HttpTransportBindingElement()
{
    TransferMode = TransferMode.Streamed,
    MaxReceivedMessageSize = Int32.MaxValue,
} );
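For reference, this is roughly how the binding is used in the self-hosted (console) case that works. The service type, contract type, and base address below are placeholders, not my actual names:

// Minimal sketch of the self-hosted case that works.
// StreamingService, IStreamingService and the address are placeholders.
using ( var host = new ServiceHost( typeof( StreamingService ),
                                    new Uri( "http://localhost:8080/streaming" ) ) )
{
    // Single endpoint using the custom streamed binding shown above
    host.AddServiceEndpoint( typeof( IStreamingService ), binding, "" );
    host.Open();
    Console.WriteLine( "Service running, press Enter to stop." );
    Console.ReadLine();
}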
And the client binding:
var binding = new CustomBinding();
binding.Elements.Add( new TextMessageEncodingBindingElement()
{
    MessageVersion = MessageVersion.Soap12WSAddressing10
} );
var secBinding = SecurityBindingElement.CreateUserNameOverTransportBindingElement();
secBinding.AllowInsecureTransport = true;
binding.Elements.Add( secBinding );
binding.Elements.Add( new HttpTransportBindingElement()
{
    TransferMode = TransferMode.Streamed,
    MaxReceivedMessageSize = Int32.MaxValue,
    MaxBufferSize = 400
} );
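The client consumes the binding roughly like this (the contract type, endpoint address, and credentials are placeholders):

// Sketch of the client side, assuming a ChannelFactory over the binding above
var factory = new ChannelFactory<IStreamingService>(
    binding,
    new EndpointAddress( "http://server/StreamingService.svc" ) );
// UserNameOverTransport security expects client credentials
factory.Credentials.UserName.UserName = "user";      // placeholder
factory.Credentials.UserName.Password = "password";  // placeholder
var client = factory.CreateChannel();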
To clarify: the connection times out because the stream is intentionally infinite; the server is supposed to read only the first few bytes and then close the stream.
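To illustrate the intended behaviour, the contract looks roughly like this (simplified, names are placeholders): the client passes an effectively endless stream, and the service only consumes the beginning of it before closing it.

[ServiceContract]
public interface IStreamingService
{
    // A single Stream parameter so the request body can be streamed
    [OperationContract]
    void Transmit( Stream data );
}

public class StreamingService : IStreamingService
{
    public void Transmit( Stream data )
    {
        var header = new byte[16];
        data.Read( header, 0, header.Length );  // read only the first few bytes
        data.Close();                           // then close the stream
    }
}

Self-hosted, this behaves as expected; under IIS the request never seems to reach the operation, which matches the buffering suspicion.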