I understand from the MSDN docs that the DataReceived event is not necessarily fired once for each byte received.
But does anyone know exactly what mechanism causes the event to fire?
Is there a timer that is reset each time a byte arrives, and that must expire (say, 10 ms with no new bytes) before the event fires?
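For context, here is a minimal sketch of how I subscribe to the event (the port name and baud rate are placeholders for my actual settings). The only related setting I can find documented is ReceivedBytesThreshold, which as I read it only sets a minimum byte count before the event is raised, not a maximum:

```csharp
using System;
using System.IO.Ports;

class ThresholdDemo
{
    static void Main()
    {
        // Port name and baud rate are placeholders; adjust for your setup.
        using var port = new SerialPort("COM1", 9600);

        // The documented knob: the event is raised once at least this many
        // bytes are available in the receive buffer. As far as I can tell,
        // it does NOT promise one event per threshold-sized chunk.
        port.ReceivedBytesThreshold = 1;

        port.DataReceived += (sender, e) =>
        {
            var sp = (SerialPort)sender;
            // BytesToRead may be larger than the threshold by the time the
            // handler runs, so read what is actually there.
            var buffer = new byte[sp.BytesToRead];
            sp.Read(buffer, 0, buffer.Length);
            Console.WriteLine($"Got {buffer.Length} bytes");
        };

        port.Open();
        Console.ReadLine(); // keep the process alive while data arrives
    }
}
```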
I ask because I am trying to write an application that reads XML data coming from the serial port.
Since my laptop has no serial ports, I use a virtual serial port emulator. (I know, I know - there's nothing I can do about that at the moment.)
When I pass data through the emulated port to my application, the event fires once per XML record (about 1,500 bytes). Excellent. But when a colleague in another office tries it with two computers connected by an actual cable, the DataReceived event fires repeatedly, once every 10 or so bytes of XML, which completely breaks the application.
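For what it's worth, the workaround I'm experimenting with is to treat each DataReceived call as delivering an arbitrary chunk: append whatever is available to a buffer and only hand data off once a complete record is present. A sketch of that idea (the `</record>` end tag is a stand-in for whatever actually terminates one of my records):

```csharp
using System.IO.Ports;
using System.Text;

class XmlRecordReader
{
    private readonly StringBuilder _buffer = new StringBuilder();
    private const string RecordEnd = "</record>"; // placeholder end tag

    public void Attach(SerialPort port)
    {
        port.DataReceived += OnDataReceived;
    }

    private void OnDataReceived(object sender, SerialDataReceivedEventArgs e)
    {
        var port = (SerialPort)sender;
        // Drain whatever happens to be available; depending on timing this
        // may be one byte, ten bytes, or a whole record.
        _buffer.Append(port.ReadExisting());

        // Hand off every complete record; keep any trailing partial data.
        int end;
        while ((end = _buffer.ToString().IndexOf(RecordEnd)) >= 0)
        {
            int len = end + RecordEnd.Length;
            string record = _buffer.ToString(0, len);
            _buffer.Remove(0, len);
            HandleRecord(record);
        }
    }

    private void HandleRecord(string xml)
    {
        // Parse/process one complete XML record here.
        System.Console.WriteLine($"Record: {xml.Length} chars");
    }
}
```

Even with this buffering in place, I'd still like to understand what actually governs when the event fires.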