What triggers the DataReceived event of the .NET SerialPort class?

I understand from the MSDN docs that the DataReceived event will not necessarily fire once for each byte.

But does anyone know what exactly is the mechanism that makes the event fire?

Is there a timer that is reset every time a byte is received, and which must reach some threshold, say 10 ms without another byte arriving, before the event fires?

I ask because I am trying to write an application that reads XML data coming from the serial port.

Since my laptop does not have a serial port, I am using a virtual serial port emulator. (I know, I know; there's nothing I can do about that ATM.)

When I pass data through the emulated port to my application, the event fires once per XML record (about 1,500 bytes). Excellent. But when a colleague in another office tries it with two computers connected by an actual cable, the DataReceived event fires repeatedly, after every 10 or so bytes of XML, which completely breaks the application.

2 answers

DataReceived can fire at any time when one or more bytes are ready to read. Exactly when it fires depends on the OS and the drivers, and there will also be a slight delay between the data arriving and the event firing in .NET.

You should not rely on the timing of DataReceived events for your control flow.

Instead, parse the underlying protocol, and if you have not yet received a full message, wait for more data. If you receive more than one message's worth of data, don't forget to keep whatever is left over after parsing the first message, because it will be the beginning of the next one.
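A minimal sketch of this buffer-and-parse approach. The `</record>` terminator and the class name are hypothetical framing choices for illustration; the original post doesn't say how its XML records are delimited:

```csharp
using System;
using System.Collections.Generic;
using System.Text;

// Accumulates incoming chunks and extracts complete messages, keeping any
// partial trailing data for the next call. The "</record>" terminator is
// an assumed framing marker, not something from the original post.
class MessageAssembler
{
    private const string Terminator = "</record>";
    private readonly StringBuilder _buffer = new StringBuilder();

    // Feed one received chunk; returns every complete message now available.
    public List<string> Feed(string chunk)
    {
        _buffer.Append(chunk);
        var messages = new List<string>();
        string text = _buffer.ToString();
        int end;
        while ((end = text.IndexOf(Terminator, StringComparison.Ordinal)) >= 0)
        {
            int cut = end + Terminator.Length;
            messages.Add(text.Substring(0, cut));
            text = text.Substring(cut); // leftover is the start of the next message
        }
        _buffer.Clear();
        _buffer.Append(text);
        return messages;
    }
}
```

However the driver chops the data up, feeding the chunks in order yields the same sequence of complete records, so the application behaves identically on the emulator and on the real cable.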


As Mark Byers noted, it depends on the OS and drivers. At the lowest level, the standard RS-232 chip (for the life of me I can't remember the designation of the one that everyone copied to make "the standard") raises an interrupt when it has data in its incoming register. The "bottom end" of the driver has to go get that data (which could be any amount up to the size of the chip's buffer), store it in the driver's buffer, and signal to the OS that it has data. Only at that point can the .NET framework begin to find out that data is available.

Depending on when the OS signals the application that opened the serial port (which is an OS-level operation and provides the "real" link from the .NET framework to the OS/driver-level implementation), there could literally be any amount of data > 1 byte in the buffer, because the driver's bottom end could have loaded more data in the meantime. My bet is that on your system the driver provides a generous buffer and signals only after a significant pause in the data stream, while your colleague's system signals much more often.

Mark Byers' advice to parse the protocol is spot on. I implemented a similar system over TCP sockets, and the only way to deal with the situation is to buffer the data until you get a full protocol message, then hand the complete message off to the application.
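The same buffering idea wired directly into a DataReceived handler might look like the sketch below. The port name, baud rate, and the `</record>` end-of-message marker are all placeholder assumptions:

```csharp
using System;
using System.IO.Ports;
using System.Text;

class SerialXmlReceiver
{
    // Port name and settings are placeholders; adjust for your hardware.
    private readonly SerialPort _port = new SerialPort("COM1", 9600);
    private readonly StringBuilder _buffer = new StringBuilder();

    public void Start()
    {
        _port.DataReceived += OnDataReceived;
        _port.Open();
    }

    // May fire once per record, once per handful of bytes, or anything in
    // between, depending on the driver; the handler must work either way.
    private void OnDataReceived(object sender, SerialDataReceivedEventArgs e)
    {
        // Drain everything the driver has made available so far.
        _buffer.Append(_port.ReadExisting());

        // Assumed framing: a record ends at "</record>".
        string text = _buffer.ToString();
        int end;
        while ((end = text.IndexOf("</record>", StringComparison.Ordinal)) >= 0)
        {
            int cut = end + "</record>".Length;
            OnCompleteRecord(text.Substring(0, cut));
            text = text.Substring(cut);
        }
        _buffer.Clear();
        _buffer.Append(text); // keep the partial tail for the next event
    }

    private void OnCompleteRecord(string xml)
    {
        Console.WriteLine("Got a complete record (" + xml.Length + " chars)");
    }
}
```

One design note: DataReceived is raised on a thread-pool thread, so if `OnCompleteRecord` touches UI or shared state, it needs to marshal to the right thread or lock appropriately.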


Source: https://habr.com/ru/post/1301864/
