I know Laravel's queue drivers such as redis and beanstalkd, and I've read that you can increase the number of workers for beanstalkd, etc. However, I'm just not sure whether these solutions fit my scenario. Here is what I need:
I listen to an XML feed over a socket connection, and the data just keeps coming, rapidly and forever. I receive dozens of XML documents per second.
I read the data from this socket line by line, and as soon as I hit the XML closing tag, I hand the buffer off to another process for parsing. I simply base64-encode the XML and launch a separate PHP process for each document: `shell_exec('php parse.php ' . $base64XML);`
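The loop looks roughly like this (a minimal sketch: the in-memory stream and the `<doc>` tag stand in for my real socket and feed format, and the `shell_exec` call to my `parse.php` is commented out):

```php
<?php
// Stand-in for the socket: an in-memory stream holding two small documents.
// In the real code this is the feed's socket resource.
$feed = fopen('php://memory', 'r+');
fwrite($feed, "<doc><id>1</id></doc>\n<doc><id>2</id></doc>\n");
rewind($feed);

$jobs   = [];
$buffer = '';
while (($line = fgets($feed)) !== false) {
    $buffer .= $line;
    // When the closing tag arrives, hand the document off and reset the buffer.
    if (strpos($line, '</doc>') !== false) {
        $jobs[] = base64_encode($buffer);
        // Fire-and-forget one parser process per document:
        // shell_exec('php parse.php ' . end($jobs) . ' > /dev/null 2>&1 &');
        $buffer = '';
    }
}
fclose($feed);
```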
This let me parse the endless XML data pretty quickly, a kind of manual threading. Now I'd like to do the same thing with Laravel, but I wonder whether there is a better way. I believe `Artisan::call('command')` does not run in the background. Of course, I could still use `shell_exec` from Laravel, but I'd like to know whether I can benefit from Beanstalkd or a similar solution instead.
So, the real question is: how do I set the number of queue workers for the beanstalkd or redis drivers? Say, 20 workers running simultaneously. More if possible.
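From what I've read, the usual way to get N parallel workers seems to be a process manager such as Supervisor, where `numprocs` controls the worker count (the program name and options below are placeholders from my reading, not a working setup). Is this the intended approach, or is there something built into the queue drivers themselves?

```ini
[program:feed-worker]
command=php artisan queue:work beanstalkd --sleep=1 --tries=3
numprocs=20
process_name=%(program_name)s_%(process_num)02d
autostart=true
autorestart=true
```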
A slightly less important question: how many workers is too many? If I had a high-performance dedicated server that could handle the load just fine, would creating 500 workers with these tools cause any code-level problems?