Using cURL Output

I want to record live tweets on a specific topic using the Twitter Streaming API with cURL in PHP.

Here is the code:

<?php
$username = "xxxxx";
$password = "xxxxx";

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'https://stream.twitter.com/1/statuses/filter.json?track=SEARCH_PARAMETER');
curl_setopt($ch, CURLOPT_USERPWD, $username . ":" . $password);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);

$result = curl_exec($ch);
$jsonOBJ = json_decode($result);
curl_close($ch);

print_r($jsonOBJ);
?>

My problem: if I set CURLOPT_RETURNTRANSFER to 0, I can see the tweets in the terminal, but I cannot store the result in $jsonOBJ and print it.

Please help!

+6
4 answers

UPDATE: see the new code at the end of this post. It turned out to be fairly easy to do with cURL; I just made a mistake the first time I tried it.

I was not able to get the Twitter Streaming API to work using cURL with CURLOPT_READFUNCTION, but I did get it working with fsockopen() and fread(). I'm not sure why the read function did not work, since I have used it before, but it probably has something to do with the fact that the response data is streamed rather than sent with chunked transfer encoding. In any case, my read function was never called, so I could not process the data.

The method I use (a rough sketch follows the list):

  • Connect with fsockopen() to ssl://stream.twitter.com
  • Send a basic HTTP request for the stream data using fputs()
  • Read the HTTP response headers and make sure there are no errors
  • Read data with fread() in an infinite loop
  • Every time a chunk of data is read, pass it to an internal buffer function
  • The buffer function appends the new data to the buffer
  • The buffer function then tries to process all the messages in the buffer (if there are one or more complete messages)
  • As each message is processed, the buffer shrinks until it is empty; the function then returns and more data is read
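
The original fsockopen() code isn't posted, but a rough sketch of the read loop and buffering described above might look like this (placeholder credentials and search term, minimal error handling; the newline-delimited message framing is an assumption about the stream format):

<?php
// Rough sketch of the loop described above -- not the author's actual code.
// Credentials and search term are placeholders; error handling is minimal.
$username = 'xxxxx';
$password = 'xxxxx';
$track    = urlencode('SEARCH_PARAMETER');

$fp = fsockopen('ssl://stream.twitter.com', 443, $errno, $errstr, 30);
if (!$fp) {
    die("Connection failed: $errstr ($errno)\n");
}

// Basic HTTP request for the stream (HTTP/1.0 so the response is not chunked)
$auth = base64_encode("$username:$password");
fputs($fp, "GET /1/statuses/filter.json?track=$track HTTP/1.0\r\n");
fputs($fp, "Host: stream.twitter.com\r\n");
fputs($fp, "Authorization: Basic $auth\r\n");
fputs($fp, "\r\n");

// Skip the response headers (a real implementation should check the status line)
while (($line = fgets($fp)) !== false && trim($line) !== '') {
    // headers ignored in this sketch
}

// Read forever, buffering until complete messages are available
$buffer = '';
while (!feof($fp)) {
    $buffer .= fread($fp, 8192);

    // Messages are assumed to be newline-delimited JSON; process every complete one
    while (($pos = strpos($buffer, "\n")) !== false) {
        $message = trim(substr($buffer, 0, $pos));
        $buffer  = substr($buffer, $pos + 1);   // buffer shrinks as messages are consumed

        if ($message !== '') {
            $tweet = json_decode($message);
            if (isset($tweet->text)) {
                // a user callback would be invoked here instead of echoing
                echo $tweet->text . "\n";
            }
        }
    }
}

fclose($fp);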

I ran it for several hours without a dropped connection and processed more than 30,000 messages without errors.

I basically implemented a callback system, so that every time a complete message is read from the buffer, the JSON message is passed to a user callback and the application can do whatever it needs with it (for example, insert it into a database).

I don't have any short snippets ready to publish here, but if you want the code, contact me through the site in my profile using the contact form and I will be happy to share it. Maybe we can work together if anyone is interested. I just did this for fun; I have no commercial interest in Twitter. Eventually I'll put it on GitHub.

EDIT:

Below is the cURL code that connects to the streaming API and passes JSON messages to a callback function as they arrive. This example uses gzip encoding to save bandwidth.

<?php

$USERNAME = 'youruser';
$PASSWORD = 'yourpass';
$QUERY    = 'nike';

/**
 * Called every time a chunk of data is read, this will be a json encoded message
 *
 * @param resource $handle The curl handle
 * @param string   $data   The data chunk (json message)
 */
function writeCallback($handle, $data)
{
    /*
    echo "-----------------------------------------------------------\n";
    echo $data;
    echo "-----------------------------------------------------------\n";
    */

    $json = json_decode($data);
    if (isset($json->user) && isset($json->text)) {
        echo "@{$json->user->screen_name}: {$json->text}\n\n";
    }

    return strlen($data);
}

$ch = curl_init();

curl_setopt($ch, CURLOPT_URL, 'https://stream.twitter.com/1/statuses/filter.json?track=' . urlencode($QUERY));
curl_setopt($ch, CURLOPT_USERPWD, "$USERNAME:$PASSWORD");
curl_setopt($ch, CURLOPT_WRITEFUNCTION, 'writeCallback');
curl_setopt($ch, CURLOPT_TIMEOUT, 20);                // disconnect after 20 seconds for testing
curl_setopt($ch, CURLOPT_VERBOSE, 1);                 // debugging
curl_setopt($ch, CURLOPT_ENCODING, 'gzip, deflate');  // req'd to get gzip
curl_setopt($ch, CURLOPT_USERAGENT, 'tstreamer/1.0'); // req'd to get gzip

curl_exec($ch); // commence streaming

$info = curl_getinfo($ch);
var_dump($info);
+4

I am also working on the same thing. The problem is that this is a stream, so the connection stays alive until you kill it (i.e. curl_exec() never returns), which is why CURLOPT_RETURNTRANSFER never gives you anything to store.

Try looking at CURLOPT_PROGRESSFUNCTION and CURLOPT_READFUNCTION; they may point you in the right direction.
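
For illustration, here is a minimal sketch (placeholder credentials, query, and chunk limit) that uses CURLOPT_WRITEFUNCTION, the callback cURL invokes for received data; returning a value different from the chunk length tells cURL to abort, which is one way to get curl_exec() to return at all:

<?php
// Sketch: stop the stream after a fixed number of received chunks by aborting
// from the write callback. Returning a value different from strlen($data)
// makes curl_exec() stop with a write error. Credentials and query are placeholders.
$chunks = 0;

$ch = curl_init('https://stream.twitter.com/1/statuses/filter.json?track=nike');
curl_setopt($ch, CURLOPT_USERPWD, 'username:password');
curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($handle, $data) use (&$chunks) {
    $chunks++;
    echo $data;              // handle the received chunk here
    if ($chunks >= 100) {
        return 0;            // abort the transfer so curl_exec() returns
    }
    return strlen($data);    // keep streaming
});
curl_exec($ch);              // returns once the callback aborts (or on error)
curl_close($ch);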

+1

@Reza Sanaie and others who may find this helpful.

I used the Twitter Search API instead, and it gets me recent tweets, so it may be useful. Here is the code:

<?php
$query    = "SEARCH_PARAMETER";
$request  = "http://search.twitter.com/search.json?q=" . urlencode($query);
$response = file_get_contents($request);
$jsonobj  = json_decode($response);
print_r($jsonobj);
?>

I also have a MySQL connection set up to insert into the database, and the script is added to crontab to automate the whole process.
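
The MySQL part isn't shown above; a minimal sketch of that insert step with PDO might look like the following (the table and column names are hypothetical, the credentials are placeholders, and the field names assume the old Search API response format):

<?php
// Sketch only: hypothetical table/column names, placeholder credentials.
// Field names (results, id_str, from_user, text, created_at) assume the old Search API format.
$query    = "SEARCH_PARAMETER";
$response = file_get_contents("http://search.twitter.com/search.json?q=" . urlencode($query));
$jsonobj  = json_decode($response);

$pdo  = new PDO('mysql:host=localhost;dbname=tweets_db;charset=utf8', 'dbuser', 'dbpass');
$stmt = $pdo->prepare(
    'INSERT IGNORE INTO tweets (tweet_id, screen_name, text, created_at) VALUES (?, ?, ?, ?)'
);
// INSERT IGNORE assumes a unique index on tweet_id so repeated cron runs skip duplicates

foreach ($jsonobj->results as $tweet) {
    $stmt->execute(array(
        $tweet->id_str,
        $tweet->from_user,
        $tweet->text,
        date('Y-m-d H:i:s', strtotime($tweet->created_at)),
    ));
}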

+1

I'm just throwing an answer out there as I leave for the day; it seems like it should work.

Below is a function to which I pass a URL and some XML data; it returns an associative array with a true/false success flag and the response as a string.

function do_curl($url, $data)
{
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_POST, 1);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_POSTFIELDS, $data);

    $result = curl_exec($ch);

    $curl_return = array();
    if (!is_string($result)) {
        $curl_return['STATUS'] = FALSE;
        $curl_return['ERRMSG'] = curl_error($ch);
    } else {
        $curl_return['STATUS']   = TRUE;
        $curl_return['RESPONSE'] = $result;
    }

    curl_close($ch);

    return $curl_return;
}
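
For example, a hypothetical call might look like the following (the endpoint URL and XML payload are placeholders). Keep in mind that against the streaming endpoint curl_exec() still will not return until the connection closes, so this shape suits request/response APIs better:

<?php
// Hypothetical usage of do_curl(); the URL and XML payload are placeholders.
$xml    = '<request><query>nike</query></request>';
$result = do_curl('https://api.example.com/endpoint', $xml);

if ($result['STATUS']) {
    echo $result['RESPONSE'];
} else {
    echo 'cURL error: ' . $result['ERRMSG'] . "\n";
}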
0

Source: https://habr.com/ru/post/914247/

