So far I have written a Perl server that runs continuously in the background; when it receives an incoming connection it forks, and the child process handles that single connection. What I ultimately want is to accept incoming connections from PHP through a socket, run the commands sent over it, and send the output back. I have this working 100% with a client written in Perl, but it does not work with PHP.
Rather than pasting the whole script, here is the section that actually does the sending and receiving:
print "Binding to port ...\n";
my $server = IO::Socket::INET->new(
    Listen    => 1,
    LocalAddr => $_server,
    LocalPort => $_port,
    Proto     => 'tcp',
    Reuse     => 1,
    Type      => SOCK_STREAM,
) or die "Can't bind: $@\n";

$SIG{CHLD} = 'IGNORE';    # reap exited children automatically

my $process = fork();
if ($process) {
    exit(0);    # parent exits; the forked child keeps serving
} else {
    while (1) {
        while (my $client = $server->accept()) {
            $client->autoflush(1);
            my $con_handle = fork();
            if ($con_handle) {
                print "Child Spawned [$con_handle]\n";
            } else {
                while (defined(my $line = <$client>)) {
                    my $command = `$line`;
                    print $client $command;
                }
                exit(0);
            }
        }
    }
}
As I said, this works fine with the Perl client, both locally and remotely, but it does not work with PHP: either the server receives the command but the response never makes it back, or the server runs the command but the client cannot read the response.
Here is the PHP client that has come closest to working for me:
$handle = fsockopen("tcp://xx.xx.xx.xx", 1234);
fwrite($handle, "ls\n");     // the server reads line-by-line, so terminate the command with a newline
echo fread($handle, 8192);   // fread() requires an explicit length argument
fclose($handle);
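For what it's worth, the two things to check on the PHP side are that the command ends with a newline (the server's `<$client>` read blocks until it sees one) and that `fread()` gets its mandatory length argument. Here is a minimal sketch of that write/read pattern, exercised locally over a socket pair so it runs without the remote server (the `echo hello` command and the use of `stream_socket_pair()`, which is POSIX-only, are just for illustration):

```php
<?php
// Create a connected pair of stream sockets so the pattern can be
// demonstrated in one process, without the remote Perl server.
list($client, $server) = stream_socket_pair(
    STREAM_PF_UNIX, STREAM_SOCK_STREAM, STREAM_IPPROTO_IP
);

// Client side: send a newline-terminated command, as the server's
// line-oriented <$client> read expects.
fwrite($client, "echo hello\n");

// Server side: read one line, run it, and write the output back.
$line   = fgets($server);
$output = shell_exec($line);
fwrite($server, $output);
fclose($server);             // closing lets the client's read hit EOF

// Client side: stream_get_contents() reads until EOF, which avoids
// guessing a length for fread() when the peer closes the socket.
$reply = stream_get_contents($client);
fclose($client);

echo $reply;                 // prints "hello" followed by a newline
```

If the server keeps the connection open for further commands (as the Perl loop above does), there is no EOF to read until, so the client would instead need a fixed-length `fread()`, a read loop with a timeout, or some framing of the response.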
Here is the working Perl client (the opening portion):
#!/usr/bin/perl -w
use strict;
use IO::Socket;

my ($host, $port, $kidpid, $handle, $line);

unless (@ARGV == 2) { die "usage: $0 host port" }
($host, $port) = @ARGV;
If it would help, I can post the entire server.